WO2022262521A1 - Data presentation method and apparatus, computer device, storage medium, computer program product, and computer program - Google Patents

Data presentation method and apparatus, computer device, storage medium, computer program product, and computer program

Info

Publication number
WO2022262521A1
Authority
WO
WIPO (PCT)
Prior art keywords
navigation
area
map
target
information
Application number
PCT/CN2022/093971
Other languages
English (en)
French (fr)
Inventor
田真
李斌
欧华富
刘旭
Original Assignee
上海商汤智能科技有限公司
Application filed by 上海商汤智能科技有限公司
Publication of WO2022262521A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28Databases characterised by their database models, e.g. relational or object models
    • G06F16/284Relational databases
    • G06F16/285Clustering or classification
    • G06F16/287Visualization; Browsing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25Integrating or interfacing systems involving database management systems
    • G06F16/252Integrating or interfacing systems involving database management systems between a Database Management System and a front-end application
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models

Definitions

  • the present disclosure relates to but not limited to the field of augmented reality technology, and in particular relates to a data presentation method, device, computer equipment, storage medium, computer program product and computer program.
  • To facilitate tourists' visits to a scenic area, QR codes or guide boards are usually set up at its various attractions; tourists can scan the QR code at an attraction to open its introduction page and learn about the attraction through that page, or directly read the introductory text on the guide board to obtain the guide information and related historical stories of the attraction.
  • In this way, the guide content that can be presented to the user is relatively limited.
  • Embodiments of the present disclosure at least provide a data presentation method, device, computer equipment, storage medium, computer program product, and computer program.
  • An embodiment of the present disclosure provides a data presentation method, including: upon scanning a guide ticket, obtaining the location of an augmented reality (AR) device; displaying, in the AR environment of the AR device, a virtual guide map corresponding to the guide ticket; upon detecting that the AR device is located in any navigation area identified on the virtual guide map, acquiring first historical navigation information of the AR device; and displaying, in the AR environment based on the first historical navigation information, presentation effect content corresponding to the navigation area in the virtual guide map.
  • An embodiment of the present disclosure also provides a data display device, including:
  • the scanning module is configured to obtain, upon scanning the guide ticket, the location of the augmented reality (AR) device; the first display module is configured to display, in the AR environment of the AR device, the virtual guide map corresponding to the guide ticket;
  • the acquisition module is configured to acquire the first historical navigation information of the AR device upon detecting that the AR device is located in any navigation area identified on the virtual guide map; the second display module is configured to display, in the AR environment based on the first historical navigation information, the presentation effect content corresponding to the navigation area in the virtual guide map.
  • An embodiment of the present disclosure also provides a computer device, including a processor and a memory, the memory storing machine-readable instructions executable by the processor, and the processor being configured to execute the machine-readable instructions stored in the memory; when the machine-readable instructions are executed by the processor, the steps in any one of the above data presentation methods are performed.
  • An embodiment of the present disclosure also provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is run, the steps in any one of the above data presentation methods are executed.
  • An embodiment of the present disclosure provides a computer program product including computer-readable code; when the computer-readable code runs on a device, a processor in the device executes some or all of the steps of the data presentation method in any embodiment of the present disclosure.
  • An embodiment of the present disclosure provides a computer program configured to store computer-readable instructions, and when the computer-readable instructions are executed, the computer executes some or all steps of the data presentation method in any embodiment of the present disclosure.
  • In the embodiments of the present disclosure, by scanning the guide ticket, the AR device can enter the AR environment conveniently and quickly and display the virtual guide map corresponding to the guide ticket in the AR environment.
  • If it is detected that the AR device is located in any navigation area identified on the virtual guide map, the presentation effect content corresponding to that navigation area in the virtual guide map is displayed in the AR environment according to the first historical navigation information of the AR device, thereby improving the diversity of the content displayed to users in each navigation area.
  • FIG. 1 is a schematic diagram of the implementation flow of a data display method provided by an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a display interface including presentation effect content provided by an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of an implementation flow of a navigation route display method in a data display method provided by an embodiment of the present disclosure
  • FIG. 4 is a schematic diagram of a display interface including a navigation route provided by an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of a display interface including a display map and presentation effect content provided by an embodiment of the present disclosure
  • FIG. 6 is a schematic diagram of a display interface including a display map and presentation effect content provided by an embodiment of the present disclosure
  • FIG. 7 is a schematic diagram of a display interface including a relative positional relationship between a user and a target scene provided by an embodiment of the present disclosure
  • FIG. 8 is a schematic diagram of a display interface including directional information provided by an embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram of an implementation flow of a data presentation method provided by an embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of a data display device provided by an embodiment of the present disclosure.
  • FIG. 11 is a schematic structural diagram of a computer device provided by an embodiment of the present disclosure.
  • It has been found through research that, to facilitate tourists' visits, QR codes or guide boards are usually set up at the various attractions of a scenic area; tourists can scan the QR code at an attraction to open its introduction page and learn about the attraction through that page, or directly read the introductory text on the guide board to obtain the guide information and related historical stories of the attraction. In this way, the guide content that can be presented to users is relatively limited.
  • Based on the above research, embodiments of the present disclosure provide a data presentation method. By scanning the guide ticket, the AR device can enter the AR environment conveniently and quickly and display the virtual guide map corresponding to the guide ticket in the AR environment; if it is detected that the AR device is located in any navigation area identified on the virtual guide map, the presentation effect content corresponding to that navigation area in the virtual guide map is displayed in the AR environment according to the first historical navigation information of the AR device, thereby improving the diversity of the content displayed to users in each navigation area.
  • The execution subject of the data presentation method provided in the embodiments of the present disclosure may be, for example, an AR device, which is a smart device capable of supporting AR functions; for example, AR devices include, but are not limited to, mobile phones, tablets, and AR glasses.
  • the data presentation method may be implemented by a processor invoking computer-readable instructions stored in a memory.
  • Referring to FIG. 1, which is a schematic flowchart of a data presentation method provided by an embodiment of the present disclosure, the method includes steps S101 to S104, wherein:
  • S101: Upon scanning the guide ticket, obtain the location of the augmented reality (AR) device.
  • S102: Display, in the AR environment of the AR device, the virtual guide map corresponding to the guide ticket.
  • In implementation, the current position of the AR device can be obtained from a positioning sensor on the AR device; alternatively, a real-time scene image captured by the AR device can be obtained, and the positioning information of the AR device can be determined based on the real-time scene image and a constructed three-dimensional scene model of the target scene corresponding to the guide ticket.
  • the guide ticket may include, for example, an electronic ticket or a paper ticket of any target scene; the target scene may include at least one of scenic spots, museums, exhibition halls, memorial halls, and other scenes that users can visit.
  • After the user uses the AR device to scan at least one of the text, pattern, or identification code on the guide ticket of any target scene, the augmented reality (AR) environment can be activated.
  • The AR environment can be implemented, for example, through a web terminal or a mini-program deployed on the AR device.
  • After the AR environment is started, a pre-built virtual guide map of the target scene can be displayed in the AR environment.
  • The pre-constructed virtual guide map of the target scene may be a high-precision map generated by processing multiple pre-collected images of the target scene using structure-from-motion (SFM) technology; it may also be a two-dimensional or three-dimensional map with certain design elements, and the form of the virtual guide map can be determined according to the actual application scenario.
  • In the virtual guide map, different navigation areas can also be marked.
  • The virtual guide map of the target scene generally identifies at least one navigation area contained in the target scene; the navigation area may include, for example, any attraction area in a scenic spot, or an exhibition area within a museum, exhibition hall, or memorial hall, such as a calligraphy exhibition area or a painting exhibition area.
  • The first historical navigation information may include at least one of the following: the operation information on the first guide materials corresponding to a navigation area that were displayed for the user of the AR device when the user visited that navigation area in the target scene, and the target preference type of those first guide materials. Each navigation area in the target scene is provided with multiple corresponding guide materials, which may include, for example, at least one of various types of guide videos, guide animations, virtual models of the navigation areas, guide voice, and guide text introduction information; the types of the guide videos, guide animations, virtual models, and guide text introductions may include, but are not limited to, at least one of cartoon, animation, and other types, and the type of the guide voice may include, but is not limited to, at least one of a cheerful female voice, a deep male voice, a childlike voice, and the like. The first guide materials include the guide materials that have already been displayed to the user of the AR device.
  • The operation information on the first guide materials corresponding to the navigation area may include at least one of first operation information and second operation information. The first operation information is used to indicate that the user of the AR device is interested in the first guide material and may include, for example, but is not limited to, at least one of a follow operation, a like operation, and a share operation; the second operation information is used to indicate that the user of the AR device is not interested, or has a low degree of interest, in the first guide material and may include, for example, but is not limited to, at least one of a skip operation, a close operation, and a fast-forward operation.
  • Exemplarily, by analyzing the operation information on the first guide materials corresponding to the navigation area, the user's degree of interest in each first guide material displayed, and the target preference type of the first guide materials the user prefers, can be determined. When the user performs operations with positive sentiment on a first guide material, such as following, liking, or sharing, it is determined that the user is interested in that material, and the material, or other guide materials of the same type, can be pushed to the user again to match the user's points of interest and improve the navigation experience; when the user performs operations with negative sentiment, such as skipping, closing, or fast-forwarding, it is determined that the user is not interested in that material, and other guide materials are pushed to the user instead.
  • For example, if the user of the AR device closes the cartoon-style guide video, guide animation, virtual model, and guide text introduction displayed for the navigation area, it is determined that the user is not interested in cartoon-style materials, and other types of guide videos, guide animations, and virtual models of the navigation area can be pushed to the user instead (a minimal sketch of this push logic follows).
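The push logic described above can be illustrated with a minimal, hypothetical sketch; this is not code from the patent, and the operation names and the material record layout are illustrative assumptions.

```python
# Hypothetical sketch of the push logic described above; operation names and the
# material record layout are assumptions, not part of the patent.
POSITIVE_OPS = {"follow", "like", "share"}        # first operation information
NEGATIVE_OPS = {"skip", "close", "fast_forward"}  # second operation information

def next_materials(first_material, operations, all_materials):
    """Choose which guide materials to push on the user's next visit to this area."""
    if operations & POSITIVE_OPS:
        # User showed interest: keep the material and add others of the same type.
        same_type = [m for m in all_materials
                     if m["type"] == first_material["type"] and m is not first_material]
        return [first_material] + same_type
    if operations & NEGATIVE_OPS:
        # User showed no interest: push materials other than the first one.
        return [m for m in all_materials if m is not first_material]
    return list(all_materials)

materials = [{"name": "video_a", "type": "cartoon"}, {"name": "model_b", "type": "3d_model"}]
# A cartoon-style video was closed, so the other material is recommended instead.
print(next_materials(materials[0], {"close"}, materials))
```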
  • In implementation, the current location of the AR device can be determined from a positioning sensor on the AR device; the positioning sensor may include, for example, a Global Positioning System (GPS) receiver, an inertial measurement unit (IMU), and the like.
  • Alternatively, the positioning information of the AR device can be determined by acquiring a real-time scene image captured by the AR device and using the real-time scene image together with the constructed 3D scene model of the target scene corresponding to the guide ticket. In implementation, feature points in the real-time scene image can be extracted and matched against the feature point cloud included in the 3D scene model, so as to determine the positioning information of the AR device at the time the real-time scene image was captured.
  • the positioning information may include at least one of position information and orientation information.
  • the position information may be the coordinate information of the AR device in the coordinate system corresponding to the 3D scene model; the orientation information may be the Euler angle corresponding to the AR device.
  • The three-dimensional scene model of the target scene corresponding to the guide ticket can be constructed according to the following steps: collect multiple frames of scene images at different positions, different angles, and different times in the target scene, and perform feature point extraction on each frame to obtain the point cloud set corresponding to that frame; then use the point cloud sets corresponding to the multiple frames to obtain the feature point cloud corresponding to the target scene, which constitutes the 3D scene model (a simplified sketch of this aggregation step is shown below).
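A simplified sketch of the per-frame feature extraction and aggregation is shown below; it assumes OpenCV is available and only stands in for a full SFM pipeline, which would additionally recover camera poses and triangulate true 3D points.

```python
# Simplified sketch only: extract feature points from each collected frame and merge
# them into a feature set representing the target scene. A real SFM pipeline would
# also estimate camera poses and triangulate a genuine 3D point cloud.
import cv2
import numpy as np

def build_feature_model(frame_paths):
    orb = cv2.ORB_create(nfeatures=2000)
    descriptors_all = []
    for path in frame_paths:
        image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if image is None:
            continue
        _, descriptors = orb.detectAndCompute(image, None)
        if descriptors is not None:
            descriptors_all.append(descriptors)
    # The merged descriptor set stands in for the scene's "feature point cloud".
    return np.vstack(descriptors_all) if descriptors_all else np.empty((0, 32), np.uint8)
```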
  • By matching the real-time scene image against the 3D scene model constructed in this way, the current position of the AR device can be determined more accurately.
  • After the current location of the AR device is determined, it can be determined, based on that location and the position information of each navigation area identified on the virtual guide map, whether the AR device is located within any of the identified navigation areas. If it is detected that the location information corresponding to the current location of the AR device is consistent with the location information of any navigation area identified on the virtual guide map, it is determined that the AR device is located within that navigation area, and the first historical navigation record of the AR device's past visits to that area is obtained, so that guide materials matching the points of interest of the user of the AR device can be determined based on the first historical navigation record and the presentation effect content corresponding to the navigation area can be generated from those materials. A simple sketch of this containment check is given below.
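A simple sketch of the containment check follows; it assumes, purely for illustration, that each navigation area identified on the virtual guide map is described by a center point and a radius, and the history lookup is a hypothetical placeholder.

```python
# Illustrative sketch: the area representation (center + radius) and the history
# lookup are assumptions, not details fixed by the patent.
import math

def find_navigation_area(device_xy, areas):
    """Return the navigation area whose region contains the device position, or None."""
    for area in areas:
        cx, cy = area["center"]
        if math.hypot(device_xy[0] - cx, device_xy[1] - cy) <= area["radius"]:
            return area
    return None

def load_first_history(device_id, area_name):
    # Placeholder for retrieving the device's first historical navigation
    # information for this area (e.g. from local storage or a backend service).
    return {"device": device_id, "area": area_name, "operations": set()}

areas = [{"name": "castle", "center": (120.0, 80.0), "radius": 15.0}]
hit = find_navigation_area((118.0, 83.0), areas)
if hit is not None:
    history = load_first_history("ar-001", hit["name"])
```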
  • In implementation, the presentation effect content can be displayed in combination with the video or image captured by the AR device. For example, if the image captured by the AR device includes a navigation object in the navigation area, the navigation object can be recognized in the image, its position in the image determined, and the display position of the presentation effect content determined based on that position.
  • When the guide ticket is scanned, an image of the guide ticket is obtained; the position of the guide ticket in the image can then be used as a reference to determine the display position of the presentation effect content. For example, the position of the ticket in the image determines a display plane or display space, and the position of that display plane or space serves as the display position of the presentation effect content.
  • the position of the AR device in the target scene can also be determined according to the image captured by the AR device and the pre-generated three-dimensional scene model of the target scene. Then, based on the position, the display position for presenting the effect content is determined.
  • the presentation effect content is displayed in the AR environment according to the display position.
  • the presentation effect content may be displayed at the front end of the image captured by the AR device.
  • the presentation effect content may be generated based on the navigation material determined by the first historical navigation information; the presentation effect content may include but not limited to at least one of target rendering content and audio content, and the target rendering content may include but not limited to At least one of video, image, animation, virtual model, AR special effect, and text corresponding to the navigation area; wherein, the virtual model may include, for example, a two-dimensional model or a three-dimensional model.
  • the audio content may include, but not limited to, at least one of music material corresponding to the navigation area, voice explaining scenic spots related to the navigation area, and the like.
  • In some embodiments, the explanation content corresponding to the navigation area in the virtual guide map can be displayed in the AR environment. When the user chooses to start playing the audio, keywords in the audio are matched with the configured explanation content and AR special effects; the corresponding information is looked up in real time according to the keywords in the audio, and the different explanation contents are matched to divided time periods. For example, explanation content A may be matched to the time period from 0 to 1200 milliseconds according to its keywords (a minimal sketch of this matching is given below).
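A minimal sketch of the keyword/time-segment matching is shown below; the segment table and keyword sets are illustrative assumptions, with only the 0-1200 ms example coming from the text above.

```python
# Illustrative sketch: keyword sets and segment boundaries are assumptions; only the
# 0-1200 ms example segment mirrors the description above.
SEGMENTS = [
    {"start_ms": 0,    "end_ms": 1200, "keywords": {"castle", "history"}, "content": "explanation A"},
    {"start_ms": 1200, "end_ms": 3000, "keywords": {"garden"},            "content": "explanation B"},
]

def content_for(position_ms, spoken_keywords):
    """Pick the explanation content for the current audio position and keywords."""
    for segment in SEGMENTS:
        if segment["start_ms"] <= position_ms < segment["end_ms"] \
                and segment["keywords"] & set(spoken_keywords):
            return segment["content"]
    return None

print(content_for(500, ["castle"]))  # -> "explanation A"
```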
  • the AR special effects can be combined with user information.
  • When the AR device scans the QR code, it can also obtain user information, such as device number information, which can be used in the AR special effects; the original AR special effects and the special effects corresponding to the user information can be superimposed and displayed at the same time, or played one after the other.
  • The AR device can also obtain an AR photographing template related to the navigation area and prompt the user to take photos to check in or to record their mood in words, while supporting one-click sharing of the check-in content with friends. The check-in records made during the tour can also be displayed.
  • Different AR camera special effect templates can be pushed for users to choose according to the number of times the user visits the navigation area.
  • In the case where the first historical navigation information includes the first operation information on the first guide material corresponding to the navigation area, a first target guide material whose matching degree with that first guide material is greater than a preset matching threshold can be determined from the second guide materials corresponding to the navigation area; based on the first target guide material, the presentation effect content corresponding to the navigation area is generated and displayed in the virtual guide map in the AR environment.
  • The second guide materials may include the guide materials corresponding to the navigation area other than the first guide material. In implementation, multiple tags can be set for each guide material corresponding to the navigation area, such as tags describing the content of the material (for example, cultural relics, natural scenery, history, or architecture); the tags may also include at least one tag characterizing the display type of the material, such as cartoon or animation.
  • The first target guide material can be determined as follows: according to the tags of the first guide material, candidate guide materials having the same tags as the first guide material are determined from the second guide materials of the navigation area other than the first guide material; the matching degree between the first guide material and each candidate guide material is determined by calculating the number of identical tags between them as a percentage of the total number of tags; and, according to this matching degree, candidate guide materials whose matching degree with the first guide material is greater than a preset matching-degree threshold are determined as first target guide materials. The first guide material in which the user showed interest may also be used as a first target guide material. A sketch of this matching computation follows.
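The matching-degree computation can be sketched as follows; interpreting "percentage of the total number of tags" as the share of identical tags over the union of the two tag sets is an assumption, and the tag values and threshold are illustrative.

```python
# Illustrative sketch: "total number of tags" is interpreted here as the union of the
# two tag sets; the tag values and the 0.5 threshold are assumptions.
def matching_degree(first_tags, candidate_tags):
    first_tags, candidate_tags = set(first_tags), set(candidate_tags)
    total = len(first_tags | candidate_tags)
    return len(first_tags & candidate_tags) / total if total else 0.0

def first_target_materials(first_material, second_materials, threshold=0.5):
    targets = [first_material]  # the material the user already liked stays selected
    for material in second_materials:
        if matching_degree(first_material["tags"], material["tags"]) > threshold:
            targets.append(material)
    return targets

liked = {"name": "bell_tower_video", "tags": ["architecture", "history", "cartoon"]}
others = [{"name": "garden_animation", "tags": ["nature", "animation"]},
          {"name": "palace_model", "tags": ["architecture", "history", "cartoon", "3d"]}]
print([m["name"] for m in first_target_materials(liked, others)])
# -> ['bell_tower_video', 'palace_model']
```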
  • In the case where the first historical navigation information includes the second operation information on the first guide material corresponding to the navigation area, second guide materials other than the first guide material can be determined from the guide materials corresponding to the navigation area; based on the second guide materials, the presentation effect content corresponding to the navigation area is generated and displayed in the virtual guide map in the AR environment.
  • Alternatively, based on the display progress information of the first guide materials, a second target guide material that has not yet been displayed to the user can be determined from the first guide materials; based on that second target guide material, the presentation effect content corresponding to the navigation area is generated and displayed in the virtual guide map in the AR environment.
  • In the case where the first historical navigation information includes the target preference type of the first guide materials, a third target guide material belonging to the target preference type can be determined from the guide materials corresponding to the navigation area; based on the third target guide material, the presentation effect content corresponding to the navigation area in the virtual guide map is generated and displayed in the AR environment. A combined sketch of these selection rules follows.
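The two selection rules above can be combined into a short sketch; the field names are illustrative assumptions.

```python
# Illustrative sketch of the selection rules above; field names are assumptions.
def second_target_materials(area_materials, shown_ids):
    """Materials of the area that have not yet been displayed to the user."""
    return [m for m in area_materials if m["id"] not in shown_ids]

def third_target_materials(area_materials, target_preference_type):
    """Materials of the area whose type matches the user's target preference type."""
    return [m for m in area_materials if m["type"] == target_preference_type]
```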
  • For example, the target scene corresponding to the guide ticket is a certain mountain, and the navigation areas included in the target scene include a certain building, the Mood for Love scenic spot, the Happy World scenic spot, and a geological park scenic spot. If it is determined, based on the implementation described above, that the AR device is located in the building, the first historical navigation information of the AR device is obtained, and the guide materials determined include a virtual model of the building and a brief introduction to its history; the presentation effect content corresponding to the building is then generated based on these guide materials and displayed in the AR environment. The display interface may be as shown in FIG. 2, in which a virtual model 21 of the building and brief history information 22 of the building are displayed.
  • In the embodiments of the present disclosure, by scanning the guide ticket, the AR device can enter the AR environment conveniently and quickly and display the virtual guide map corresponding to the guide ticket in the AR environment.
  • Based on the first historical navigation information, the presentation effect content corresponding to the navigation area in the virtual guide map is displayed in the AR environment, thereby improving the diversity of the content presented to users in each navigation area.
  • In some embodiments, a navigation route matching the user's points of interest can also be recommended to the user; for the display of such a route, refer to S301 to S303 shown in FIG. 3: S301, in response to a navigation event being triggered, acquire second historical navigation information of the user; S302, generate a navigation route for multiple navigation areas based on the second historical navigation information; S303, display the navigation route in the AR environment.
  • the second historical navigation information may include but not limited to at least one of the historical navigation records of each navigation area in the target scene and the number of visits of each navigation area among the plurality of navigation areas.
  • In implementation, the current location of the AR device can be obtained from the positioning sensor of the AR device; based on the current location, it can be determined whether the AR device is located in the target scene corresponding to the virtual guide map, and if so, it is determined that the navigation event is triggered. Alternatively, based on the real-time scene image captured by the AR device, the position of the AR device in the 3D scene model corresponding to the target scene can be determined, and it can then be determined whether the AR device is located in the target scene corresponding to the virtual guide map, so as to determine whether the navigation event is triggered.
  • In implementation, a navigation event may also be triggered in order to re-plan the navigation route for the user according to the current location of the AR device. For example, the current location of the AR device is obtained, and it is determined, based on that location, whether the position offset between the AR device and the most recently generated navigation route is greater than a preset position offset threshold; if so, it is determined that the navigation event is triggered. After the navigation event is triggered, second historical navigation information, such as the user's historical navigation records of each navigation area in the target scene and the number of times the user has visited each navigation area, can be obtained.
  • In the case where the second historical navigation information includes the historical navigation records of the navigation areas, a first target navigation area for which the corresponding navigation task has not been triggered can be determined based on those records; based on the position information of the first target navigation area in the target scene and the current location of the AR device, a navigation route from the current location of the AR device to the first target navigation area is generated. In this way, the user can obtain, along the navigation route, at least one of the guide video, guide animation, virtual model, and guide voice corresponding to the first target navigation area, which enriches the information display forms of each navigation area of the target scene and allows the user to fully understand the first target navigation area, improving the user's understanding of the target scene. The first target navigation area includes, for example, a navigation area in which the target rendering content is shown to the user.
  • In the case where the second historical navigation information includes the number of visits to each of the multiple navigation areas, a plurality of second target navigation areas whose number of visits is greater than a preset visit-count threshold can be determined based on those numbers; based on the position information of the second target navigation areas in the target scene and the current location of the AR device, a navigation route from the current location of the AR device to the second target navigation areas is generated. Here, the navigation areas that the user frequently browses are treated as navigation areas matching the user's points of interest, and a navigation route from the current location of the AR device to those areas is generated, so that the user can browse the areas of interest again along the route, meeting the user's needs.
  • In implementation, a first distance between each pair of target navigation areas may be determined based on the position information of each target navigation area in the target scene, together with a second distance between each target navigation area and the current location of the AR device; based on the first distances and the second distances, a navigation route with the shortest total distance from the current location of the AR device through each target navigation area is generated. A simple sketch of one such ordering strategy follows.
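One possible ordering strategy is a greedy nearest-neighbour pass over the first and second distances, sketched below; the patent does not prescribe a particular routing algorithm, and exact optimisation would also be feasible for a small number of areas.

```python
# Illustrative greedy sketch only; the patent does not fix a specific routing algorithm.
import math

def plan_route(device_xy, target_areas):
    """Visit the nearest unvisited target area next, starting from the device position."""
    remaining = list(target_areas)
    route, current = [], device_xy
    while remaining:
        nearest = min(remaining, key=lambda a: math.dist(current, a["position"]))
        route.append(nearest["name"])
        current = nearest["position"]
        remaining.remove(nearest)
    return route

areas = [{"name": "rafting", "position": (10.0, 2.0)},
         {"name": "castle",  "position": (4.0, 8.0)},
         {"name": "floats",  "position": (12.0, 9.0)}]
print(plan_route((0.0, 0.0), areas))  # prints the visiting order
```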
  • the target navigation area may include at least one of a first target navigation area and a second target navigation area.
  • When a target navigation area is one that requires queuing, the current number of people queuing in each target navigation area and the average play time per person can also be monitored; the queuing time required for each target navigation area is determined from these values, and a navigation route from the current location of the AR device through the target navigation areas is generated in ascending order of required queuing time, reducing the user's queuing time and improving the tour experience. A sketch of this ordering is given below.
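A sketch of the queue-aware ordering is shown below; estimating the wait as the current queue length multiplied by the average per-person play time follows the description above, while the monitoring inputs themselves are assumed to come from elsewhere.

```python
# Illustrative sketch: queue counts and per-person play times are assumed inputs.
def order_by_queue(areas):
    def estimated_wait(area):
        # Estimated queuing time = people currently queuing * average play time per person.
        return area["queue_count"] * area["avg_play_minutes"]
    return sorted(areas, key=estimated_wait)

areas = [{"name": "castle", "queue_count": 30, "avg_play_minutes": 2.0},
         {"name": "rafting", "queue_count": 10, "avg_play_minutes": 4.0}]
print([a["name"] for a in order_by_queue(areas)])  # -> ['rafting', 'castle']
```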
  • In implementation, the navigation route can be identified in the virtual guide map of the target scene based on the position information, in the target scene, of each navigation area included in the route, and a display map containing the navigation route can be generated and displayed on the AR device. The display interface for displaying the navigation route in the AR environment may be as shown in FIG. 4: the route runs from the rafting navigation area 311 to the castle navigation area 312 and then to the parade float navigation area 313, and an arrow 310 can be used to indicate the navigation route in the display interface.
  • When the user visits each navigation area along the navigation route and triggers the display of the presentation effect content of a navigation area, the AR device, in response to the event triggering the presentation effect content display, zooms out the display map, displays the zoomed-out display map in a first area of the AR device, and displays the presentation effect content in a second area of the AR device.
  • The presentation effect content of the navigation area may include, but is not limited to, at least one of the guide video, guide voice, guide animation, virtual model, and text information of the navigation area. The first area of the AR device may include, but is not limited to, part of the graphical user interface display area in an edge region of the AR device, such as the upper-left, lower-right, lower-left, or upper-right corner; the second area of the AR device may include, but is not limited to, part of the graphical user interface display area in the central region of the AR device.
  • For example, the display interface may be as shown in FIG. 5: the display map marked with the navigation route is zoomed out, the reduced display map 321 is displayed in the upper-left screen display area of the AR device, and the virtual model 322 of the castle and the corresponding text introduction information 323 of the castle are displayed in part of the GUI display area in the central region of the AR device.
  • Alternatively, the AR device may respond to the event triggering the presentation effect content display by obtaining the presentation effect content of the navigation area and superimposing it on the display map. For example, as shown in FIG. 6, the virtual model 322 of the castle and the corresponding text introduction information 323 are superimposed in front of the navigation route of FIG. 4 for display.
  • In addition, when the user triggers the zoomed-out display map, the AR device responds to the user's trigger operation by zooming the display map back in and displaying the enlarged display map in the second area of the AR device.
  • In some embodiments, in response to scanning the guide ticket, the current location of the AR device can be obtained from the positioning sensor on the AR device; based on the current location of the AR device and the position information of the target scene corresponding to the virtual guide map, the relative positional relationship between the current position of the AR device and the target scene is shown using the virtual guide map of the target scene. Alternatively, based on the real-time scene image captured by the AR device, the position of the AR device in the 3D scene model corresponding to the target scene can be determined, and the relative positional relationship between the current position of the AR device and the target scene determined from it. For example, the display interface may be as shown in FIG. 7, which displays the current location of the AR device relative to the target scene.
  • In implementation, in response to the AR device being located in the target scene, a real-time scene image captured by the AR device can be obtained as a detection image. The feature points in the detection image are matched with the feature point cloud contained in the 3D scene model of the target scene to determine the positioning information of the AR device at the time the real-time scene image was captured (a sketch of this pose-estimation step is given below). Based on that positioning information and the position information, in the virtual guide map, of each navigation area in the target scene, it is determined whether the AR device is located in any of the multiple navigation areas. If it is determined that the AR device is located in any navigation area, the first historical navigation information of the AR device is obtained, and the presentation effect content corresponding to that navigation area in the virtual guide map is displayed in the AR environment based on the first historical navigation information; for the implementation, refer to S103 to S104 in the data presentation method provided in the above embodiments.
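The pose-estimation step can be sketched with a standard PnP solver once 2D feature points in the detection image have been matched to 3D points of the scene model's feature point cloud; the camera intrinsics and the matched correspondences are assumed to be available, and the patent does not mandate this particular solver.

```python
# Illustrative sketch: assumes matched 2D-3D correspondences and known camera intrinsics.
import cv2
import numpy as np

def estimate_pose(points_3d, points_2d, camera_matrix):
    """points_3d: Nx3 model points; points_2d: Nx2 image points (already matched)."""
    ok, rvec, tvec, _ = cv2.solvePnPRansac(
        np.asarray(points_3d, np.float32),
        np.asarray(points_2d, np.float32),
        camera_matrix, None)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)        # rotation of the model frame into the camera frame
    position = (-rotation.T @ tvec).ravel()  # device position in model coordinates
    return position, rvec                    # rvec encodes the orientation
```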
  • In some embodiments, in response to the AR device being outside the navigation areas, the AR device may generate, based on the current position of the AR device and the second position information of a third target navigation area among the multiple navigation areas, pointing information that points from the current location to the third target navigation area. The pointing information is used to guide the user to move from the current position of the AR device to the third target navigation area and may include, for example, at least one of navigation guidance text, a navigation guidance route map, a navigation guidance symbol, and navigation guidance voice.
  • the third target navigation area may include, for example, at least one of the first target navigation area and the second target navigation area.
  • The third target navigation area can also represent a navigation area of interest selected by the user from the multiple navigation areas in the target scene, and may include, for example, any one of the multiple navigation areas.
  • Fig. 8 is a schematic diagram of a display interface of an AR device provided by an embodiment of the present disclosure.
  • As shown in FIG. 8, the display interface of the AR device displays pointing information 51; for example, the pointing information may include the navigation guidance text "move forward 50 meters first, then turn right, and move 20 meters after turning right". The pointing information may also include the navigation guidance route map 52 located at the upper-left position in FIG. 8, which guides the user from the current position 521 of the AR device to the position 522 of the third target navigation area (a sketch of generating such guidance text is given below).
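Generating such guidance text can be sketched as follows; the coordinates, the device heading convention, and the turn threshold are illustrative assumptions rather than values from the patent.

```python
# Illustrative sketch: coordinates, heading convention and thresholds are assumptions.
import math

def pointing_info(device_xy, device_heading_deg, target_xy):
    dx, dy = target_xy[0] - device_xy[0], target_xy[1] - device_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy))              # 0 deg = map "north"
    relative = (bearing - device_heading_deg + 180) % 360 - 180
    if abs(relative) < 20:
        turn = "go straight"
    elif relative > 0:
        turn = "turn right"
    else:
        turn = "turn left"
    return f"{turn} and move about {distance:.0f} meters"

print(pointing_info((0.0, 0.0), 0.0, (20.0, 45.0)))  # -> "turn right and move about 49 meters"
```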
  • FIG. 9 is a schematic flowchart of a data presentation method provided by an embodiment of the present disclosure. Those skilled in the art can understand that the writing order of the steps does not imply a strict execution order or constitute any limitation on the implementation process; the specific execution order of the steps should be determined by their functions and possible internal logic.
  • An embodiment of the present disclosure also provides a data presentation apparatus corresponding to the data presentation method; since the principle of the apparatus in the embodiments of the present disclosure is similar to that of the above data presentation method, the implementation of the apparatus may refer to the implementation of the method.
  • the device includes: a scanning module 1001 , a first display module 1002 , an acquisition module 1003 , and a second display module 1004 ;
  • the scanning module 1001 is configured to scan the guide ticket to obtain the location of the augmented reality AR device;
  • the first display module 1002 is configured to display a virtual guide map corresponding to the guide ticket in the AR environment of the AR device;
  • the obtaining module 1003 is configured to detect that the AR device is located in any navigation area identified by the virtual navigation map, and obtain the first historical navigation information of the AR device;
  • the second display module 1004 is configured to display, in the AR environment, the presentation effect content corresponding to the navigation area in the virtual navigation map based on the first historical navigation information.
  • the presentation effect content includes at least one of the following: target rendering content, audio content, and the form of the target rendering content includes at least one of video, image, animation, virtual model, and text.
  • the first historical navigation information includes at least one of the following: operation information on the first navigation material corresponding to the navigation area; target preference type of the first navigation material.
  • In a possible implementation, the first historical navigation information includes the first operation information on the first guide material corresponding to the navigation area, the first operation information indicating that the user has a degree of interest in the first guide material. The second display module 1004 is further configured to: determine, from the second guide materials corresponding to the navigation area and according to the first guide material corresponding to the first operation information, a first target guide material whose matching degree with the first guide material is greater than a preset matching threshold; and display, in the AR environment based on the first target guide material, the presentation effect content corresponding to the navigation area in the virtual guide map.
  • the first operation information includes: at least one of an operation of adding attention, an operation of liking, and an operation of sharing.
  • In a possible implementation, the first historical navigation information includes the second operation information on the first guide material corresponding to the navigation area, the second operation information indicating that the user has no degree of interest in the first guide material. The second display module 1004 is further configured to: determine, from the guide materials corresponding to the navigation area, second guide materials other than the first guide material; and display, in the AR environment based on the second guide materials, the presentation effect content corresponding to the navigation area in the virtual guide map.
  • the second operation information includes: at least one of a skip operation, a close operation, and a fast-forward operation.
  • In a possible implementation, the first historical navigation information includes the target preference type of the first guide materials; the second display module 1004 is further configured to: determine, from the guide materials corresponding to the navigation area based on the target preference type, a third target guide material belonging to the target preference type; and display, in the AR environment based on the third target guide material, the presentation effect content corresponding to the navigation area in the virtual guide map.
  • In a possible implementation, the data presentation apparatus further includes a third display module configured to: acquire second historical navigation information of the user in response to a navigation event being triggered; generate, based on the second historical navigation information, a navigation route for multiple navigation areas; and display the navigation route in the AR environment.
  • In a possible implementation, the navigation event being triggered includes at least one of the following: obtaining the current location of the AR device and determining, based on it, that the position offset between the AR device and the most recently generated navigation route is greater than a preset position offset threshold; or obtaining the current location of the AR device and determining, based on it, that the AR device is located in the target scene corresponding to the virtual guide map.
  • In a possible implementation, the second historical navigation information includes historical navigation records of the navigation areas; the third display module is further configured to: determine, from the multiple navigation areas based on the historical navigation records, a first target navigation area for which the corresponding navigation task has not been triggered; and generate the navigation route based on the position information of the first target navigation area in the target scene and the current location of the AR device.
  • In a possible implementation, the second historical navigation information includes the number of visits to each of the multiple navigation areas; the third display module is further configured to: determine, based on the number of visits to each of the multiple navigation areas, a plurality of second target navigation areas whose number of visits is greater than a preset visit-count threshold; and generate the navigation route based on the position information of the second target navigation areas in the target scene and the current location of the AR device.
  • In a possible implementation, the third display module is further configured to: determine, based on the current location of the AR device and the position information of each navigation area in the target scene, the distance between the current location and each navigation area; and generate the navigation route based on the distances.
  • In a possible implementation, the third display module is further configured to: monitor the current number of people queuing in each navigation area and the average play time per person; determine the estimated queuing time corresponding to each navigation area; and generate the navigation route based on the estimated queuing time corresponding to each navigation area.
  • In a possible implementation, the third display module is further configured to: identify the navigation route in the virtual guide map of the target scene based on the position information, in the target scene, of each navigation area included in the navigation route, so as to obtain a display map; and display the display map on the AR device.
  • In a possible implementation, the data presentation apparatus further includes a fourth display module configured to: in response to a presentation effect content display event being triggered, zoom out the display map and display the zoomed-out display map in the first area of the AR device; and display the presentation effect content in the second area of the AR device.
  • the fourth display module is further configured to, in response to a trigger operation on the reduced display map, zoom in on the reduced display map, and display the enlarged display map in the second area.
  • the fourth display module is further configured to superimpose and display the presentation effect content and the presentation map in response to triggering a presentation effect content display event.
  • the scanning module is further configured to: obtain the current location of the AR device in response to scanning the guide ticket;
  • and the data presentation apparatus further includes a fifth display module configured to show, based on the current position of the AR device and the position information of the target scene corresponding to the virtual guide map, the relative positional relationship between the current position of the AR device and the target scene.
  • In a possible implementation, the acquisition module is further configured to: acquire a detection image in response to the AR device being located in the target scene; and determine, based on the detection image, whether the AR device is located in any one of the multiple navigation areas.
  • In a possible implementation, the fifth display module is further configured to: in response to the AR device being outside the navigation areas, generate, based on the current location of the AR device and the second position information of a third target navigation area among the multiple navigation areas, pointing information pointing to the third target navigation area; and display the pointing information.
  • In a possible implementation, the fifth display module is further configured to determine the third target navigation area from the multiple navigation areas based on the navigation route and at least one of the current location of the AR device, the first historical navigation information, and the second historical navigation information.
  • Referring to FIG. 11, which is a schematic structural diagram of a computer device 1100 provided by an embodiment of the present disclosure, the computer device includes a processor 1101, a memory 1102, and a bus 1103.
  • The memory 1102 is used to store execution instructions and includes an internal memory 11021 and an external memory 11022; the internal memory 11021, also called main memory, is used to temporarily store computation data of the processor 1101 and data exchanged with the external memory 11022 such as a hard disk, and the processor 1101 exchanges data with the external memory 11022 through the internal memory 11021.
  • When the computer device runs, the processor 1101 communicates with the memory 1102 through the bus 1103, causing the processor 1101 to execute the following instructions: upon scanning the guide ticket, obtain the location of the augmented reality (AR) device; display, in the AR environment of the AR device, the virtual guide map corresponding to the guide ticket; upon detecting that the AR device is located in any navigation area identified on the virtual guide map, acquire first historical navigation information of the AR device; and display, in the AR environment based on the first historical navigation information, the presentation effect content corresponding to the navigation area in the virtual guide map.
  • Embodiments of the present disclosure also provide a computer-readable storage medium, on which a computer program is stored, and when the computer program is run by a processor, the steps of the data presentation method described in the foregoing method embodiments are executed.
  • the storage medium may be a volatile or non-volatile computer-readable storage medium.
  • An embodiment of the present disclosure also provides a computer program; the computer program includes computer-readable code, and when the computer-readable code is read and executed by a computer, some or all of the steps of the data presentation method described in the embodiments of the present disclosure are implemented.
  • Embodiments of the present disclosure also provide a computer program product; the computer program product carries program code, and the instructions included in the program code can be used to execute the steps of the data presentation method described in the above method embodiments; for details, refer to the above method embodiments.
  • the above-mentioned computer program product may be specifically implemented by means of hardware, software or a combination thereof.
  • the computer program product is embodied as a computer storage medium, and in other embodiments, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK) and the like.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some communication interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the functions are realized in the form of software function units and sold or used as independent products, they can be stored in a non-volatile computer-readable storage medium executable by a processor.
  • the technical solution of the embodiments of the present disclosure is essentially or the part that contributes to the prior art or the part of the technical solution can be embodied in the form of a software product, and the computer software product is stored in a storage medium , including several instructions to make a computer device (which may be a personal computer, a server, or a network device, etc.) execute all or part of the steps of the methods described in various embodiments of the present disclosure.
  • The aforementioned storage media include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • Embodiments of the present disclosure provide a data presentation method and apparatus, a computer device, a storage medium, a computer program product, and a computer program. The method includes: upon scanning a guide ticket, obtaining the location of an augmented reality (AR) device; displaying, in the AR environment of the AR device, a virtual guide map corresponding to the guide ticket; upon detecting that the AR device is located in any navigation area identified on the virtual guide map, acquiring first historical navigation information of the AR device; and displaying, in the AR environment based on the first historical navigation information, the presentation effect content corresponding to the navigation area in the virtual guide map.
  • In this way, by scanning the guide ticket, the AR device can enter the AR environment conveniently and quickly and display the presentation effect content corresponding to the navigation area in the virtual guide map in the AR environment, increasing the variety of content displayed to users in each navigation area.

Abstract

A data presentation method and apparatus, a computer device, a storage medium, a computer program product, and a computer program, wherein the method includes: upon scanning a guide ticket, obtaining the location of an augmented reality (AR) device (S101); displaying, in the AR environment of the AR device, a virtual guide map corresponding to the guide ticket (S102); upon detecting that the AR device is located in any navigation area identified on the virtual guide map, acquiring first historical navigation information of the AR device (S103); and displaying, in the AR environment based on the first historical navigation information, presentation effect content corresponding to the navigation area in the virtual guide map (S104).

Description

Data presentation method and apparatus, computer device, storage medium, computer program product, and computer program
Cross-Reference to Related Application
The present disclosure is based on, and claims priority to, Chinese patent application No. 202110679377.4, filed on June 18, 2021 and entitled "Data Presentation Method and Apparatus, Computer Device, and Storage Medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to, but is not limited to, the field of augmented reality technology, and in particular to a data presentation method and apparatus, a computer device, a storage medium, a computer program product, and a computer program.
Background
To facilitate tourists' visits to a scenic area, QR codes or guide boards are usually set up at its various attractions; tourists can scan the QR code at an attraction to open its introduction page and learn about the attraction through that page, or directly read the introductory text on the guide board to obtain the guide information and related historical stories of the attraction. In this way, the guide content that can be presented to users is relatively limited.
Summary
Embodiments of the present disclosure provide at least a data presentation method and apparatus, a computer device, a storage medium, a computer program product, and a computer program.
An embodiment of the present disclosure provides a data presentation method, including:
upon scanning a guide ticket, obtaining the location of an augmented reality (AR) device; displaying, in the AR environment of the AR device, a virtual guide map corresponding to the guide ticket; upon detecting that the AR device is located in any navigation area identified on the virtual guide map, acquiring first historical navigation information of the AR device; and displaying, in the AR environment based on the first historical navigation information, presentation effect content corresponding to the navigation area in the virtual guide map.
An embodiment of the present disclosure further provides a data presentation apparatus, including:
a scanning module, configured to scan a tour ticket and acquire the position of an AR device; a first display module, configured to display, in the AR environment of the AR device, a virtual tour map corresponding to the tour ticket; an acquisition module, configured to, upon detecting that the AR device is located in any tour area identified on the virtual tour map, acquire first historical tour information of the AR device; and a second display module, configured to display, in the AR environment and based on the first historical tour information, presentation effect content corresponding to the tour area in the virtual tour map.
An embodiment of the present disclosure further provides a computer device, including a processor and a memory. The memory stores machine-readable instructions executable by the processor, and the processor is configured to execute the machine-readable instructions stored in the memory. When the machine-readable instructions are executed by the processor, the processor performs the steps of any one of the data presentation methods described above.
An embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored; when the computer program is run, the steps of any one of the data presentation methods described above are performed.
An embodiment of the present disclosure provides a computer program product including computer-readable code; when the computer-readable code runs on a device, a processor in the device executes some or all of the steps of the data presentation method in any embodiment of the present disclosure.
An embodiment of the present disclosure provides a computer program configured to store computer-readable instructions which, when executed, cause a computer to perform some or all of the steps of the data presentation method in any embodiment of the present disclosure.
In the embodiments of the present disclosure, scanning the tour ticket allows the AR device to enter the AR environment conveniently and quickly, and the virtual tour map corresponding to the tour ticket is displayed in the AR environment. If the AR device is detected to be located in any tour area identified on the virtual tour map, the presentation effect content corresponding to that tour area in the virtual tour map is displayed in the AR environment according to the first historical tour information of the AR device, which increases the variety of content displayed to users in each tour area.
For descriptions of the effects of the above data presentation apparatus, computer device, computer-readable storage medium, computer program product, and computer program, reference may be made to the description of the above data presentation method. To make the above objects, features, and advantages of the present disclosure clearer and easier to understand, some embodiments are described in detail below with reference to the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the technical solutions of the embodiments of the present disclosure more clearly, the drawings required by the embodiments are briefly introduced below. The drawings are incorporated into and constitute a part of the specification; they illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the technical solutions of the present disclosure. It should be understood that the following drawings show only some embodiments of the present disclosure and therefore should not be regarded as limiting the scope; a person of ordinary skill in the art may obtain other related drawings from these drawings without creative effort.
FIG. 1 is a schematic flowchart of a data presentation method provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a display interface containing presentation effect content provided by an embodiment of the present disclosure;
FIG. 3 is a schematic flowchart of a tour route display method in a data presentation method provided by an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a display interface containing a tour route provided by an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a display interface containing a display map and presentation effect content provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of another display interface containing a display map and presentation effect content provided by an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a display interface containing the relative positional relationship between a user and a target scene provided by an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a display interface containing pointing information provided by an embodiment of the present disclosure;
FIG. 9 is a schematic flowchart of a data presentation method provided by an embodiment of the present disclosure;
FIG. 10 is a schematic architectural diagram of a data presentation apparatus provided by an embodiment of the present disclosure;
FIG. 11 is a schematic structural diagram of a computer device provided by an embodiment of the present disclosure.
DETAILED DESCRIPTION
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. The components of the embodiments generally described and illustrated herein may be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments is not intended to limit the scope of the claimed disclosure but merely represents exemplary embodiments. All other embodiments obtained by those skilled in the art without creative effort based on the embodiments of the present disclosure fall within the scope of protection of the present disclosure.
It has been found through research that, to facilitate sightseeing, two-dimensional codes or guide boards are usually set up at the individual attractions of a scenic area. A visitor can scan the two-dimensional code placed at an attraction to open an introduction page for that attraction, or directly read the introductory text on the guide board, so as to learn the guide information and related historical stories of the attraction. In this way, the guide content that can be presented to the user is relatively limited.
Based on the above research, an embodiment of the present disclosure provides a data presentation method. By scanning a tour ticket, the AR device can enter the AR environment conveniently and quickly, and the virtual tour map corresponding to the tour ticket is displayed in the AR environment. If the AR device is detected to be located in any tour area identified on the virtual tour map, the presentation effect content corresponding to that tour area in the virtual tour map is displayed in the AR environment according to the first historical tour information of the AR device, which increases the variety of content displayed to users in each tour area.
The defects of the above solutions and the discovery of the above problems are results obtained by the inventors through practice and careful study; therefore, the process of discovering the above problems and the solutions proposed below in the embodiments of the present disclosure should all be regarded as contributions made by the inventors to the present disclosure.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it does not need to be further defined or explained in subsequent drawings.
To facilitate understanding of the present embodiments, a data presentation method disclosed in an embodiment of the present disclosure is first described in detail. The execution subject of the data presentation method provided by the embodiments of the present disclosure may be, for example, an AR device, that is, a smart device capable of supporting AR functions, including but not limited to a mobile phone, a tablet, AR glasses, and the like. In some possible implementations, the data presentation method may be implemented by a processor invoking computer-readable instructions stored in a memory.
Referring to FIG. 1, which is a schematic flowchart of a data presentation method provided by an embodiment of the present disclosure, the method includes steps S101 to S104:
S101: A tour ticket is scanned, and the position of the augmented reality (AR) device is acquired.
S102: A virtual tour map corresponding to the tour ticket is displayed in the AR environment of the AR device.
In implementation, the current position of the AR device may be acquired from a positioning sensor on the AR device. Alternatively, a real-time scene image captured by the AR device may be acquired, and the positioning information of the AR device may be determined based on the real-time scene image and a constructed three-dimensional scene model of the target scene corresponding to the tour ticket; for implementation, reference may be made to the way the position of the AR device is determined in S103.
The tour ticket may include, for example, an electronic ticket or a paper ticket for any target scene; the target scene may include at least one of the scenes that users can visit, such as a scenic area, a museum, an exhibition hall, or a memorial hall.
In implementation, after the user uses the AR device to scan at least one of the text, pattern, or identification code on the tour ticket of any target scene, the augmented reality (AR) environment can be launched.
Here, the AR environment may be implemented, for example, by a web client or a mini program deployed on the AR device. After the AR environment is launched, a pre-constructed virtual tour map of the target scene may be displayed in it. The pre-constructed virtual tour map may be a high-precision map of the target scene generated by processing multiple pre-collected images of the target scene with Structure from Motion (SFM) technology, or it may be a two-dimensional or three-dimensional map with certain design elements; the form of the virtual tour map may be determined according to the actual application scenario. The virtual tour map may also identify, for example, different tour areas.
The data presentation method provided by the embodiment of the present disclosure further includes:
S103: Upon detecting that the AR device is located in any tour area identified on the virtual tour map, first historical tour information of the AR device is acquired.
In implementation, the virtual tour map of the target scene generally identifies at least one tour area contained in the target scene. The tour area may include, for example, any attraction area in a scenic spot, or an exhibition area in a museum, exhibition hall, or memorial hall; for example, it may be the calligraphy exhibition area or the painting exhibition area in a calligraphy and painting gallery, or a certain attraction area.
The first historical tour information may include at least one of the following: operation information, by the user of the AR device, on the first tour material corresponding to a tour area that was displayed to the user while visiting that area, and the target preference type of the first tour material displayed to the user. Each tour area of the target scene is provided with multiple corresponding tour materials, which may include, for example, at least one of various types of tour videos, tour animations, virtual models of the tour areas, tour audio, and tour text introductions. The types of the tour videos, tour animations, virtual models, and text introductions may include, but are not limited to, at least one of cartoon, anime, and other types; the types of tour audio may include, but are not limited to, at least one of a cheerful female voice, a deep male voice, a childlike voice, and other types. The first tour material includes tour material that has already been displayed to the user of the AR device.
The operation information on the first tour material corresponding to a tour area may include at least one of first operation information and second operation information. The first operation information indicates that the user of the AR device is interested in the first tour material and may include, but is not limited to, at least one of a follow operation, a like operation, and a share operation. The second operation information indicates that the user is not interested, or has low interest, in the first tour material and may include, but is not limited to, at least one of a skip operation, a close operation, and a fast-forward operation. Exemplarily, by analyzing the operation information on the first tour material, the degree of interest of the user in each displayed first tour material and the target preference type of material the user prefers can be determined. When the user performs operations with positive sentiment on the displayed first tour material, such as following, liking, or sharing, it is determined that the user is interested in that material; the same first tour material, or tour material of the same type, may then be pushed to the user again, so as to match the user's points of interest and improve the tour experience. When the user performs operations with negative sentiment, such as skipping, closing, or fast-forwarding, it is determined that the user is not interested in that material, and other tour materials besides the first tour material may be pushed to the user instead, so as to match the user's points of interest and improve the tour experience.
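As a minimal illustration of how the operation information above might be turned into an interest signal, the following Python sketch classifies each material by counting positive and negative operations. The operation names and the simple majority rule are assumptions for illustration, not part of the disclosed method.

```python
from collections import defaultdict

# Operations treated as positive (first operation information) and
# negative (second operation information) signals, per the description.
POSITIVE_OPS = {"follow", "like", "share"}
NEGATIVE_OPS = {"skip", "close", "fast_forward"}

def interest_by_material(operation_log):
    """operation_log: iterable of (material_id, operation) pairs.

    Returns a dict mapping material_id -> True (interested),
    False (not interested), or None (no clear signal)."""
    score = defaultdict(int)
    for material_id, op in operation_log:
        if op in POSITIVE_OPS:
            score[material_id] += 1
        elif op in NEGATIVE_OPS:
            score[material_id] -= 1
    return {m: (s > 0 if s != 0 else None) for m, s in score.items()}

if __name__ == "__main__":
    log = [("castle_video_cartoon", "skip"),
           ("castle_model_3d", "like"),
           ("castle_model_3d", "share")]
    print(interest_by_material(log))
    # {'castle_video_cartoon': False, 'castle_model_3d': True}
```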
Exemplarily, if the user of the AR device closes the cartoon-type tour video, tour animation, virtual models, and text introduction displayed for a tour area, it is determined that the user is not interested in the cartoon type, and tour videos, animations, and virtual models of other types, such as anime or natural scenery, may be pushed to the user instead.
When determining the position of the AR device, in order to reduce the amount of data processing and quickly determine the current position, the current position of the AR device may be determined from a positioning sensor on the AR device; the positioning sensor may include, for example, a Global Positioning System (GPS) receiver or an inertial measurement unit (IMU).
Alternatively, a real-time scene image captured by the AR device may be acquired, and the positioning information of the AR device may be determined based on the real-time scene image and the constructed three-dimensional scene model of the target scene corresponding to the tour ticket. Here, after the real-time scene image captured by the AR device is acquired, feature points in the real-time scene image may be extracted and matched against the feature point cloud included in the three-dimensional scene model to determine the positioning information of the AR device at the time the image was captured. The positioning information may include at least one of position information and orientation information; for example, the position information may be the coordinates of the AR device in the coordinate system of the three-dimensional scene model, and the orientation information may be the Euler angles of the AR device.
In another embodiment, the three-dimensional scene model of the target scene corresponding to the tour ticket may be constructed as follows: multiple frames of scene images of the target scene are collected at different positions, from different angles, and at different times; feature points are extracted from each frame to obtain a point cloud set corresponding to each frame; and the point cloud sets of the multiple frames are used to obtain the feature point cloud of the target scene, which constitutes the three-dimensional scene model.
Alternatively, scene videos of the target scene may be collected at different positions, from different angles, and at different times; multiple video frames are obtained from the collected videos, feature points are extracted from each frame to obtain a point cloud set for each frame, and the point cloud sets of the multiple frames are used to obtain the three-dimensional scene model of the target scene.
Here, by using the real-time scene image captured by the AR device and the constructed three-dimensional scene model, the current position of the AR device can be determined relatively accurately.
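A minimal sketch of the image-to-model localization step described above, assuming the three-dimensional scene model stores one binary descriptor per 3D point, recorded when the model was built. The use of ORB features and a PnP solver via OpenCV is an illustrative choice, not something the disclosure mandates.

```python
import numpy as np
import cv2

def localize(query_image, model_points_3d, model_descriptors, camera_matrix):
    """Estimate the AR device pose from one real-time scene image.

    model_points_3d:    (N, 3) float32 array, feature point cloud of the scene model
    model_descriptors:  (N, 32) uint8 array, one ORB descriptor per 3D point
    camera_matrix:      (3, 3) intrinsic matrix of the AR device camera
    Returns (rotation_vector, translation_vector) or None on failure."""
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(query_image, None)
    if descriptors is None:
        return None

    # Match 2D query descriptors against the descriptors stored with the model.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, model_descriptors)
    if len(matches) < 6:
        return None

    image_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    object_pts = np.float32([model_points_3d[m.trainIdx] for m in matches])

    # Robust Perspective-n-Point: position and orientation in model coordinates.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_pts, image_pts, camera_matrix, None)
    return (rvec, tvec) if ok else None
```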
After the current position of the AR device is determined, whether the AR device is located in any tour area identified on the virtual tour map may be determined based on the current position of the AR device and the position information of each tour area identified on the virtual tour map. If the position information corresponding to the current position of the AR device is detected to be consistent with the position information of any tour area identified on the virtual tour map, it is determined that the AR device is located in that tour area, and the first historical tour record of the AR device for past visits to that tour area is acquired, so that tour material matching the points of interest of the user of the AR device can be determined on the basis of the first historical tour record, and the presentation effect content corresponding to that tour area can be generated on the basis of the tour material.
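One way to implement the containment check above, assuming each tour area is described by a polygon in map coordinates, is a standard ray-casting test; the data layout is hypothetical.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon given as [(x0, y0), ...]?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does the horizontal ray from (x, y) cross edge (x1, y1)-(x2, y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def find_tour_area(device_xy, tour_areas):
    """tour_areas: dict mapping area name -> polygon in map coordinates."""
    for name, polygon in tour_areas.items():
        if point_in_polygon(device_xy[0], device_xy[1], polygon):
            return name
    return None  # the device is outside every identified tour area

areas = {"castle": [(0, 0), (10, 0), (10, 10), (0, 10)]}
print(find_tour_area((4, 5), areas))   # 'castle'
print(find_tour_area((20, 5), areas))  # None
```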
S104: Based on the first historical tour information, the presentation effect content corresponding to the tour area in the virtual tour map is displayed in the AR environment.
In implementation, the presentation effect content may be displayed in combination with the video or image captured by the AR device. For example, if the image captured by the AR device includes a tour object in the tour area, the position of the tour object in the image may be determined from the captured image, and the display position of the presentation effect content may be determined on that basis. As another example, when the tour ticket is scanned, an image of the tour ticket is acquired; the position of the tour ticket in the image may be used as a reference to determine the display position of the presentation effect content. Here, a display plane or display space may be determined from the position of the tour ticket in the image, and the position of that display plane or space is the display position of the presentation effect content.
In addition, the position of the AR device in the target scene may be determined from the image captured by the AR device and the pre-generated three-dimensional scene model of the target scene, and the display position of the presentation effect content may then be determined from that position.
After the display position of the presentation effect content is determined by any of the above methods, the presentation effect content is displayed in the AR environment at that display position. When the presentation effect content is displayed in combination with the image captured by the AR device, it may, for example, be displayed in front of the captured image.
The presentation effect content may be generated based on the tour material determined from the first historical tour information. The presentation effect content may include, but is not limited to, at least one of target rendering content and audio content. The target rendering content may include, but is not limited to, at least one of video, images, animation, virtual models, AR special effects, and text corresponding to the tour area, where the virtual model may be, for example, a two-dimensional or three-dimensional model. The audio content may include, but is not limited to, at least one of music material corresponding to the tour area and audio narration of stories related to the tour area.
For example, when the presentation effect content includes audio content, after the user reaches any tour area (such as a specific scenic spot) and the AR device satisfies the conditions for entering AR and for narration, the narration content corresponding to that tour area in the virtual tour map may be displayed in the AR environment. When the user chooses to start playing the audio, keywords in the audio are matched against the configured narration content and AR special effects; the corresponding narration content is searched in real time from the keywords in the audio, and different narration contents are matched to pre-divided time segments. For example, according to the keywords, the segment from 0 ms to 1200 ms is matched to narration content A; at this point the AR special effect corresponding to narration content A may be displayed to the user in the AR environment in combination with the virtual tour map, and the user is guided in an AR manner to look toward the corresponding attraction. When the audio of narration content A finishes playing, the user is prompted that narration content A is complete and guided to tap the next audio segment; when all audio content has been played, the tour ends. In addition, while guiding the user to look toward the corresponding attraction in an AR manner, it may be detected whether the user is looking at the attraction; if not, the user is prompted to look at it. If the user is looking at it, it is detected whether the distance between the user and the attraction is smaller than a set distance threshold, and if it is not, the user is prompted to move closer to the attraction. In this way, besides providing AR audio narration according to the narration content of the tour area where the user currently is, the virtual tour map can be combined with the AR audio tour, which increases the practicality and flexibility of the AR audio tour.
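The keyword-to-segment matching described in this example could be sketched as below; the segment boundaries, keyword table, and content identifiers are all hypothetical.

```python
# Hypothetical narration schedule: (start_ms, end_ms, keywords, content_id).
NARRATION_SEGMENTS = [
    (0, 1200, {"castle", "history"}, "content_A"),
    (1200, 4000, {"garden", "fountain"}, "content_B"),
]

def match_segment(elapsed_ms, spoken_keywords):
    """Pick the narration content whose time segment contains elapsed_ms
    and whose keywords overlap the keywords recognized in the audio."""
    for start, end, keywords, content_id in NARRATION_SEGMENTS:
        if start <= elapsed_ms < end and keywords & spoken_keywords:
            return content_id
    return None

print(match_segment(600, {"castle"}))  # content_A
```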
As another example, when the presentation effect content includes AR special effects, the AR special effects may be combined with user information. When the AR device scans a two-dimensional code, user information such as a device identifier may also be acquired, and a special effect corresponding to the user information may be superimposed on the original AR special effect; the two may be displayed superimposed at the same time, or played one after the other. In addition, the tour route in the virtual tour map may be updated according to the number of visits to each tour area, tour routes may be recommended according to the user's history (for example, preferentially recommending routes the user has toured many times), and the scenic-area stories of scenic-area postcards may be updated. For example, if the user skipped the AR narration when previously passing through certain tour areas, the AR story content will be changed the next time the user passes by, or other tour routes will be recommended to the user.
As yet another example, when the presentation effect content includes an AR photo special-effect template related to the tour area, the AR device may, when the user arrives at a tour area, acquire the AR photo template related to that area and prompt the user to take a check-in photo or record his or her mood in text, while supporting one-tap sharing of the check-in content with friends. When the user finishes the tour and reaches the end point, the check-in records of the tour are displayed. Different AR photo special-effect templates may be pushed for the user to choose from according to the number of times the user has visited the tour area. For example, when the user checks in at the tour area for the first time, information such as "This is your first visit" may be shown in the AR photo template; when the user has checked in at the area many times, the historical check-in records may be used to generate a check-in album of the area and push it to the user, who can save it to the photo album or share it with friends.
In implementation, when the first historical tour information includes first operation information on the first tour material corresponding to the tour area, it is determined that the user of the AR device is interested in the first tour material. In this case, according to the first tour material corresponding to the first operation information, a first target tour material whose matching degree with the first tour material is greater than a preset matching threshold may be determined from the second tour material corresponding to the tour area; based on the first target tour material, the presentation effect content corresponding to the tour area is generated and displayed in the AR environment in the virtual tour map.
The second tour material may include the tour materials corresponding to the tour area other than the first tour material. In implementation, each tour material of a tour area may be provided with multiple labels characterizing its type, for example labels describing the content of the material (cultural relics, natural scenery, history, architecture, and so on) and at least one label characterizing the display type of the material (cartoon, anime, and so on). In some implementations, the first target tour material may be determined as follows: according to the labels of the first tour material, candidate tour materials having the same labels as the first tour material are determined from the second tour material (the materials of the tour area other than the first tour material); the matching degree between the first tour material and a candidate tour material is determined by computing the number of labels they share as a percentage of the total number of labels; according to this matching degree, candidate tour materials whose matching degree with the first tour material is greater than the preset matching threshold are taken as the first target tour material. The first tour material in which the user of the AR device has shown interest may also be taken as the first target tour material.
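The label-overlap computation just described might look like the following sketch, where the matching degree is taken as the number of shared labels as a percentage of the total number of distinct labels; the threshold value and the label names are assumptions.

```python
def matching_degree(labels_a, labels_b):
    """Shared labels as a percentage of the total number of distinct labels."""
    labels_a, labels_b = set(labels_a), set(labels_b)
    total = labels_a | labels_b
    if not total:
        return 0.0
    return 100.0 * len(labels_a & labels_b) / len(total)

def first_target_materials(first_material_labels, second_materials, threshold=50.0):
    """second_materials: dict material_id -> label set.
    Returns the candidate materials whose matching degree exceeds the threshold."""
    return [mid for mid, labels in second_materials.items()
            if matching_degree(first_material_labels, labels) > threshold]

candidates = {
    "temple_history_video": {"architecture", "history"},
    "lake_photo_gallery": {"natural_scenery"},
}
print(first_target_materials({"architecture", "history", "cartoon"}, candidates))
# ['temple_history_video']
```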
In a possible implementation, when the first historical tour information includes second operation information on the first tour material corresponding to the tour area, it is determined that the user of the AR device is not interested in the first tour material. In this case, second tour material other than the first tour material may be determined from the tour materials corresponding to the tour area; based on the second tour material, the presentation effect content corresponding to the tour area is generated and displayed in the AR environment in the virtual tour map.
In another possible implementation, when the first historical tour information includes at least one of a skip operation and a close operation on the first tour material corresponding to the tour area, second target tour material that has not yet been shown to the user may be determined from the first tour material according to the display progress information of the first tour material at the time the user skipped or closed it; based on this second target tour material, the presentation effect content corresponding to the tour area is generated and displayed in the AR environment in the virtual tour map.
In another possible implementation, when the first historical tour information includes the target preference type of the first tour material, third target tour material belonging to the target preference type may be determined from the tour materials corresponding to the tour area based on that target preference type; based on the third target tour material, the presentation effect content corresponding to the tour area is generated and displayed in the AR environment in the virtual tour map. Exemplarily, suppose the target scene corresponding to the tour ticket is a certain mountain whose tour areas include a certain building, a "Flowery Years" area, a "Happy World" area, and a geological park area. If, based on the foregoing implementation, the AR device is determined to be located inside the building, the acquired first historical tour information of the AR device includes the virtual model of the building and brief historical information about the building; based on this first historical tour information, the tour material matching the user's points of interest is determined to include the virtual model of the building and the brief historical information, so that the presentation effect content corresponding to the building is generated from this material and displayed in the AR environment. The display interface of the presentation effect content may be as shown in FIG. 2, in which a virtual model 21 of the building and brief historical information 22 about the building are displayed.
In the embodiments of the present disclosure, scanning the tour ticket allows the AR device to enter the AR environment conveniently and quickly, and the virtual tour map corresponding to the tour ticket is displayed in the AR environment. If the AR device is detected to be located in any tour area identified on the virtual tour map, the presentation effect content corresponding to that tour area in the virtual tour map is displayed in the AR environment according to the first historical tour information of the AR device, which increases the variety of content displayed to users in each tour area.
In a possible implementation, when the target scene corresponding to the tour ticket includes multiple tour areas, a tour route matching the user's points of interest may be recommended according to second historical tour information of the user's tours of the target scene; see S301 to S303 shown in FIG. 3:
S301: In response to a tour event being triggered, second historical tour information of the user is acquired.
The second historical tour information may include, but is not limited to, at least one of the historical tour records of each tour area in the target scene and the number of visits to each of the multiple tour areas.
In implementation, when the user scans the tour ticket with the AR device, the current position of the AR device may be acquired from the positioning sensor of the AR device, and whether the current position lies within the target scene corresponding to the virtual tour map is determined; if so, it is determined that the tour event is triggered. Alternatively, with reference to the foregoing embodiments, the position of the AR device on the three-dimensional scene model of the target scene may be determined from the real-time scene image captured by the AR device, and from this it is determined whether the AR device lies within the target scene corresponding to the virtual tour map, so as to determine whether the tour event is triggered. In addition, when the user tours the target scene along a tour route, the tour event may also be triggered if the user does not travel along the path planned by that route, so that a new tour route can be planned from the current position of the AR device. In some implementations this can be done as follows: the current position of the AR device is acquired; based on the current position, it is determined whether the position offset between the AR device and the most recently generated tour route is greater than a preset position-offset threshold; if so, the tour event is determined to be triggered. After the tour event is triggered, second historical tour information, such as the user's historical tour records for each tour area of the target scene and the user's number of visits to each tour area, may be acquired.
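A sketch of the route-deviation check, assuming the most recently generated tour route is a polyline in map coordinates and the offset is the distance from the device to the nearest point of that polyline; the threshold value is illustrative.

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b (all 2D tuples)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def route_deviation(device_xy, route_points):
    """Offset between the AR device and the most recently generated tour route."""
    return min(point_segment_distance(device_xy, a, b)
               for a, b in zip(route_points, route_points[1:]))

def tour_event_triggered(device_xy, route_points, offset_threshold=20.0):
    return route_deviation(device_xy, route_points) > offset_threshold

route = [(0, 0), (50, 0), (50, 80)]
print(tour_event_triggered((20, 30), route))  # True: 30 m from the route
```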
S302: Based on the second historical tour information, a tour route for the multiple tour areas is generated.
In implementation, when the second historical tour information includes the historical tour records of the tour areas, a first target tour area whose corresponding tour task has never been triggered may be determined from the historical tour records; based on the position information of the first target tour area in the target scene and the current position of the AR device, a tour route from the current position of the AR device to the first target tour area is generated, so that the user can follow the route and obtain at least one of the tour video, tour animation, virtual model, and tour audio corresponding to the first target tour area. This enriches the forms in which the information of the tour areas of the target scene is displayed; the user can fully understand the first target tour area through its various forms of presentation, which improves the user's understanding of the target scene. Here, the first target tour area includes, for example, a tour area for which target rendering content has never been displayed to the user.
In a possible implementation, when the second historical tour information includes the number of visits to each of the multiple tour areas, multiple second target tour areas whose number of visits is greater than a preset visit-count threshold may be determined; based on the position information of the second target tour areas in the target scene and the current position of the AR device, a tour route from the current position of the AR device to the second target tour areas is generated. Here, the tour areas that the user often visits are taken as tour areas matching the user's points of interest, and a tour route from the current position of the AR device to these areas is generated, so that the user can revisit the areas of interest along the route, satisfying the user's needs.
When there are multiple target tour areas, the first distances between the target tour areas and the second distances between each target tour area and the current position of the AR device may be determined from the position information of the target tour areas in the target scene and the current position of the AR device; based on the first distances and second distances, a tour route with the shortest total length from the current position of the AR device through the target tour areas is generated. Here, the target tour areas may include at least one of the first target tour areas and the second target tour areas.
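For a small number of target tour areas, the shortest route from the device position through all target areas can be found by brute force over visiting orders, as in this sketch; the coordinates are straight-line map coordinates and the example data are hypothetical.

```python
import itertools
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def shortest_tour_route(device_xy, target_areas):
    """target_areas: dict area name -> (x, y) position in the target scene.
    Returns (ordered area names, total route length)."""
    best_order, best_length = None, float("inf")
    for order in itertools.permutations(target_areas):
        stops = [device_xy] + [target_areas[name] for name in order]
        length = sum(dist(a, b) for a, b in zip(stops, stops[1:]))
        if length < best_length:
            best_order, best_length = list(order), length
    return best_order, best_length

areas = {"castle": (120, 40), "rafting": (30, 60), "parade": (90, 110)}
print(shortest_tour_route((0, 0), areas))
```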
In addition, when the target tour areas are areas that require queuing, the current number of people queuing at each target tour area and the average play duration per person may be monitored; from these, the queuing time required for each target tour area is determined, and a tour route from the current position of the AR device to the target tour areas is generated in ascending order of the required queuing time, so as to reduce the user's queuing time and improve the tour experience.
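A sketch of the queue-time ordering, with the estimated queuing time taken as the current queue length multiplied by the average play duration per person; both the inputs and this simple estimate are assumptions for illustration.

```python
def estimated_queue_minutes(queue_length, avg_play_minutes):
    """Rough estimate: everyone ahead of you plays for the average duration."""
    return queue_length * avg_play_minutes

def order_by_queue_time(area_stats):
    """area_stats: dict area name -> (current queue length, average play minutes).
    Returns area names in ascending order of estimated queuing time."""
    return sorted(area_stats,
                  key=lambda name: estimated_queue_minutes(*area_stats[name]))

stats = {"castle": (120, 2.0), "rafting": (40, 5.0), "parade": (10, 3.0)}
print(order_by_queue_time(stats))  # ['parade', 'rafting', 'castle']
```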
S303: The tour route is displayed in the AR environment.
After the tour route is determined, the tour route may be marked on the virtual tour map of the target scene based on the position information, in the target scene, of each tour area included in the route, producing a display map containing the tour route; the display map is then displayed on the AR device. Exemplarily, when the target scene is an amusement park, the display interface showing the tour route in the AR environment may be as shown in FIG. 4: the user's current position is in the rafting tour area 311, and the tour route generated through S301 to S303 above goes from the rafting tour area 311 to the castle tour area 312 and then to the parade-float tour area 313; the tour route may be represented in the display interface by an arrow 310.
When the user tours the tour areas along the tour route, if the user triggers the display of the presentation effect content of a tour area while visiting it, the AR device responds to the triggered presentation-effect display event by shrinking the display map and showing the shrunken display map in a first region of the AR device, while showing the presentation effect content in a second region of the AR device. Here, the presentation effect content of the tour area may include, but is not limited to, at least one of the tour video, tour audio, tour animation, virtual model, and textual information of the tour area. The first region of the AR device may include, for example, but is not limited to, a part of the graphical user interface near an edge of the AR device, such as the upper-left, lower-right, lower-left, or upper-right corner; the second region may include, for example, but is not limited to, a part of the graphical user interface in the central region of the AR device. Exemplarily, if the user tours the target scene along the route in FIG. 4 and triggers the display of the castle's presentation effect content, the display interface may be as shown in FIG. 5: the display map marked with the tour route is shrunk and shown as display map 321 in the upper-left screen region of the AR device, and the castle's virtual model 322 and the castle's textual introduction 323 are shown in part of the graphical user interface in the central region of the AR device.
In addition, while the user tours the tour areas along the tour route, if the display of a tour area's presentation effect content is triggered while visiting it, the AR device may respond to the triggered presentation-effect display event by acquiring the presentation effect content of that tour area and displaying it superimposed on the display map. Exemplarily, if the user tours the target scene along the route in FIG. 4 and triggers the display of the castle's presentation effect content, the display interface may be as shown in FIG. 6: the castle's textual introduction 323 and virtual model 322 are acquired and superimposed in front of the tour route of FIG. 4 for display.
In implementation, when the user triggers the shrunken display map, the AR device responds to the trigger operation by enlarging the shrunken display map and showing the enlarged display map in the second region of the AR device.
In a possible implementation, in response to the tour ticket being scanned, the current position of the AR device may be acquired from the positioning sensor on the AR device; based on the current position of the AR device and the position information of the target scene corresponding to the virtual tour map, the relative positional relationship between the current position of the AR device and the target scene is displayed using the virtual tour map of the target scene. Alternatively, with reference to the foregoing embodiments, the position of the AR device on the three-dimensional scene model of the target scene may be determined from the real-time scene image captured by the AR device, and the relative positional relationship between the AR device's current position and the target scene determined from it. If, based on the current position of the AR device and the position information of the target scene corresponding to the virtual tour map, the AR device is determined to be outside the target scene, the display interface may be as shown in FIG. 7, displaying the relative direction and distance between the current position 41 of the AR device and the position 42 of the target scene.
If, based on the current position of the AR device and the position information of the target scene corresponding to the virtual tour map, the user is determined to be inside the target scene, a real-time scene image captured by the AR device may be acquired as a detection image; feature points are extracted from the detection image and matched against the feature point cloud contained in the three-dimensional scene model of the target scene to determine the positioning information of the AR device when it captured the image. Based on this position information and the position information of each tour area of the target scene in the virtual tour map, it is determined whether the AR device is located in any of the multiple tour areas; if so, the first historical tour information of the AR device is acquired and, based on it, the presentation effect content corresponding to the tour area in the virtual tour map is displayed in the AR environment; for implementation, see S103 to S104 of the data presentation method provided in the foregoing embodiments.
If the AR device is determined to be outside every tour area of the target scene, pointing information directed from the current position of the AR device to a third target tour area may be generated based on the current position of the AR device and second position information of the third target tour area among the multiple tour areas. The pointing information is used to guide the user from the current position of the AR device to the third target tour area and may include, for example, at least one of navigation guidance text, a navigation guidance route map, navigation guidance symbols, and navigation guidance speech.
In implementation, a third target tour area matching the user's points of interest may be determined from the multiple tour areas based on at least one of the current position of the AR device, the first historical tour information, the second historical tour information, and the generated tour route; the third target tour area may include, for example, at least one of the first target tour area and the second target tour area. In addition, the third target tour area may also represent a tour area of interest selected by the user from the multiple tour areas of the target scene, and may be, for example, any of the multiple tour areas contained in the target scene.
After the pointing information directed from the current position of the AR device to the third target tour area is generated, the pointing information may be displayed through the AR device so that, based on it, the AR device can be moved to the third target tour area. FIG. 8 is a schematic diagram of a display interface of an AR device provided by an embodiment of the present disclosure. As shown in FIG. 8, pointing information 51 is displayed in the interface; for example, it may include navigation guidance text such as "Move forward 50 meters, then turn right and move 20 meters after the turn", and may also include, at the upper left of FIG. 8, a navigation guidance route map 52 for guiding the user from the current position 521 of the AR device to the position 522 of the third target tour area.
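A minimal sketch of generating textual pointing information from the device position and the target area position, using straight-line distance and a compass-style bearing; the coordinate convention (x east, y north) and the wording are illustrative assumptions.

```python
import math

def pointing_information(device_xy, target_xy):
    """Return guidance text from the device position to the target tour area."""
    dx = target_xy[0] - device_xy[0]   # east offset
    dy = target_xy[1] - device_xy[1]   # north offset
    distance = math.hypot(dx, dy)
    bearing = (math.degrees(math.atan2(dx, dy)) + 360) % 360  # 0 deg = north
    directions = ["north", "northeast", "east", "southeast",
                  "south", "southwest", "west", "northwest"]
    heading = directions[int((bearing + 22.5) // 45) % 8]
    return f"The target tour area is about {distance:.0f} m to the {heading}."

print(pointing_information((0, 0), (35, 35)))
# The target tour area is about 49 m to the northeast.
```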
FIG. 9 is a schematic flowchart of a data presentation method provided by an embodiment of the present disclosure. As shown in FIG. 9, the method includes:
S901: In response to the tour ticket being scanned, the current position of the AR device is acquired.
S902: Based on the current position of the AR device and the position information of the target scene corresponding to the virtual tour map, it is determined whether the current position of the AR device lies within the target scene.
S903: In response to the current position of the AR device being outside the target scene, the relative positional relationship between the current position of the AR device and the target scene is displayed.
S904: In response to the current position of the AR device being inside the target scene, a detection image is acquired.
S905: Based on the detection image, it is determined whether the AR device is located in any of the multiple tour areas.
S906: In response to the current position of the AR device being inside a tour area, the first historical tour information of the AR device is acquired, and based on it, the presentation effect content corresponding to the tour area in the virtual tour map is displayed in the AR environment.
S907: In response to the current position of the AR device being outside the tour areas, pointing information directed to a third target tour area is generated based on the current position of the AR device and the second position information of the third target tour area among the multiple tour areas, and the pointing information is displayed.
Those skilled in the art can understand that, in the above methods of the specific implementations, the order in which the steps are written does not imply a strict order of execution or constitute any limitation on the implementation process; the specific order of execution of the steps should be determined by their functions and possible internal logic.
Based on the same inventive concept, an embodiment of the present disclosure further provides a data presentation apparatus corresponding to the data presentation method. Since the principle of the apparatus in the embodiments of the present disclosure is similar to that of the above data presentation method, the implementation of the apparatus may refer to the implementation of the method.
Referring to FIG. 10, which is a schematic diagram of a data presentation apparatus provided by an embodiment of the present disclosure, the apparatus includes: a scanning module 1001, a first display module 1002, an acquisition module 1003, and a second display module 1004.
The scanning module 1001 is configured to scan a tour ticket and acquire the position of the AR device.
The first display module 1002 is configured to display, in the AR environment of the AR device, the virtual tour map corresponding to the tour ticket.
The acquisition module 1003 is configured to, upon detecting that the AR device is located in any tour area identified on the virtual tour map, acquire the first historical tour information of the AR device.
The second display module 1004 is configured to display, in the AR environment and based on the first historical tour information, the presentation effect content corresponding to the tour area in the virtual tour map.
In some embodiments, the presentation effect content includes at least one of target rendering content and audio content, and the form of the target rendering content includes at least one of video, image, animation, virtual model, and text.
In some embodiments, the first historical tour information includes at least one of: operation information on the first tour material corresponding to the tour area; and the target preference type of the first tour material.
In some embodiments, the first historical tour information includes first operation information on the first tour material corresponding to the tour area, the first operation information indicating interest in the first tour material; the second display module 1004 is further configured to: according to the first tour material corresponding to the first operation information, determine, from the second tour material corresponding to the tour area, first target tour material whose matching degree with the first tour material is greater than a preset matching threshold; and, based on the first target tour material, display in the AR environment the presentation effect content corresponding to the tour area in the virtual tour map.
In some embodiments, the first operation information includes at least one of a follow operation, a like operation, and a share operation.
In some embodiments, the first historical tour information includes second operation information on the first tour material corresponding to the tour area, the second operation information indicating no interest in the first tour material; the second display module 1004 is further configured to: determine, from the tour materials corresponding to the tour area, second tour material other than the first tour material; and, based on the second tour material, display in the AR environment the presentation effect content corresponding to the tour area in the virtual tour map.
In some embodiments, the second operation information includes at least one of a skip operation, a close operation, and a fast-forward operation.
In some embodiments, the first historical tour information includes the target preference type of the first tour material; the second display module 1004 is further configured to: determine, based on the target preference type and from the tour materials corresponding to the tour area, third target tour material belonging to the target preference type; and, based on the third target tour material, display in the AR environment the presentation effect content corresponding to the tour area in the virtual tour map.
In some embodiments, there are multiple tour areas, and the data presentation apparatus further includes a third display module configured to: in response to a tour event being triggered, acquire second historical tour information of the user; generate, based on the second historical tour information, a tour route for the multiple tour areas; and display the tour route in the AR environment.
In some embodiments, the triggering of the tour event includes at least one of: acquiring the current position of the AR device and determining, based on it, that the position offset between the AR device and the most recently generated tour route is greater than a preset position-offset threshold; and acquiring the current position of the AR device and determining, based on it, that the current position of the AR device lies within the target scene corresponding to the virtual tour map.
In some embodiments, the second historical tour information includes the historical tour records of the tour areas; the third display module is further configured to: determine, from the multiple tour areas based on the historical tour records, a first target tour area whose corresponding tour task has never been triggered; and generate the tour route based on the position information of the first target tour area in the target scene and the current position of the AR device.
In some embodiments, the second historical tour information includes the number of visits to each of the multiple tour areas; the third display module is further configured to: determine, based on the numbers of visits, multiple second target tour areas whose number of visits is greater than a preset visit-count threshold; and generate the tour route based on the position information of the second target tour areas in the target scene and the current position of the AR device.
In some embodiments, the third display module is further configured to: determine, based on the current position of the AR device and the position information of each tour area in the target scene, the distance between the current position of the AR device and each tour area; and generate a tour route based on the distances.
In some embodiments, the third display module is further configured to: monitor the current number of people queuing at each tour area and the average play duration per person; determine the estimated queuing time of each tour area from the current number of people queuing and the average play duration per person; and generate a tour route based on the estimated queuing times of the tour areas.
In some embodiments, the third display module is further configured to: mark the tour route on the virtual tour map of the target scene based on the position information, in the target scene, of each tour area included in the tour route, to obtain a display map; and display the display map on the AR device.
In some embodiments, the data presentation apparatus further includes a fourth display module configured to, in response to a triggered presentation-effect display event, shrink the display map, display the shrunken display map in a first region of the AR device, and display the presentation effect content in a second region of the AR device.
In some embodiments, the fourth display module is further configured to, in response to a trigger operation on the shrunken display map, enlarge the shrunken display map and display the enlarged display map in the second region.
In some embodiments, the fourth display module is further configured to, in response to a triggered presentation-effect display event, display the presentation effect content superimposed on the display map.
In some embodiments, the scanning module is further configured to acquire the current position of the AR device in response to the tour ticket being scanned; the data presentation apparatus further includes a fifth display module configured to display, based on the current position of the AR device and the position information of the target scene corresponding to the virtual tour map, the relative positional relationship between the current position of the AR device and the target scene.
In some embodiments, the acquisition module is further configured to acquire a detection image in response to the AR device being located within the target scene, and to determine, based on the detection image, whether the AR device is located in any of the multiple tour areas.
In some embodiments, the fifth display module is further configured to, in response to the AR device being located outside the tour areas, generate pointing information directed to a third target tour area based on the current position of the AR device and second position information of the third target tour area among the multiple tour areas, and to display the pointing information.
In some embodiments, the fifth display module is further configured to determine the third target tour area from the multiple tour areas based on at least one of the current position of the AR device, the first historical tour information and second historical tour information, and the tour route.
For descriptions of the processing flow of each module in the apparatus and of the interaction flow between the modules, reference may be made to the relevant descriptions in the above method embodiments.
Based on the same technical concept, an embodiment of the present disclosure further provides a computer device. Referring to FIG. 11, which is a schematic structural diagram of a computer device 1100 provided by an embodiment of the present disclosure, the computer device includes a processor 1101, a memory 1102, and a bus 1103. The memory 1102 is used to store execution instructions and includes an internal memory 11021 and an external memory 11022; the internal memory 11021 is used to temporarily store operational data in the processor 1101 and data exchanged with the external memory 11022 such as a hard disk, and the processor 1101 exchanges data with the external memory 11022 through the internal memory 11021. When the computer device 1100 runs, the processor 1101 and the memory 1102 communicate through the bus 1103, so that the processor 1101 executes the following instructions: scanning a tour ticket and acquiring the position of the AR device; displaying, in the AR environment of the AR device, the virtual tour map corresponding to the tour ticket; upon detecting that the AR device is located in any tour area identified on the virtual tour map, acquiring the first historical tour information of the AR device; and, based on the first historical tour information, displaying in the AR environment the presentation effect content corresponding to the tour area in the virtual tour map. For the specific processing flow of the processor 1101, reference may be made to the description of the above method embodiments.
An embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, the steps of the data presentation method described in the above method embodiments are performed. The storage medium may be a volatile or non-volatile computer-readable storage medium.
An embodiment of the present disclosure further provides a computer program including computer-readable code; when the computer-readable code is read and executed by a computer, some or all of the steps of the data presentation method described in the embodiments of the present disclosure are implemented.
An embodiment of the present disclosure further provides a computer program product carrying program code; the instructions included in the program code may be used to perform the steps of the data presentation method described in the above method embodiments, for which reference may be made to the above method embodiments. The computer program product may be implemented by hardware, software, or a combination thereof. In some embodiments, the computer program product is embodied as a computer storage medium; in other embodiments, it is embodied as a software product, such as a software development kit (SDK).
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems and apparatuses described above may refer to the corresponding processes in the foregoing method embodiments. In the several embodiments provided in the present disclosure, it should be understood that the disclosed devices, apparatuses, methods, storage media, computer program products, and computer programs may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of units is only a division by logical function, and there may be other divisions in actual implementation; as another example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through communication interfaces; the indirect coupling or communication connection between apparatuses or units may be electrical, mechanical, or in other forms. Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of these embodiments.
In addition, the functional units in the embodiments of the present disclosure may be integrated into one processing unit, may exist separately as physical units, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on this understanding, the technical solutions of the embodiments of the present disclosure, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or some of the steps of the methods described in the various embodiments of the present disclosure. The aforementioned storage media include media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the embodiments described above are merely specific implementations of the present disclosure, intended to illustrate rather than limit its technical solutions, and the scope of protection of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that anyone familiar with the art may, within the technical scope disclosed herein, still modify the technical solutions recorded in the foregoing embodiments, readily conceive of changes, or make equivalent replacements of some of the technical features; such modifications, changes, or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure and shall all be covered by the scope of protection of the present disclosure.
INDUSTRIAL APPLICABILITY
Embodiments of the present disclosure provide a data presentation method, apparatus, computer device, storage medium, computer program product, and computer program, where the method includes: scanning a tour ticket and acquiring the position of an augmented reality (AR) device; displaying, in the AR environment of the AR device, a virtual tour map corresponding to the tour ticket; upon detecting that the AR device is located in any tour area identified on the virtual tour map, acquiring first historical tour information of the AR device; and, based on the first historical tour information, displaying in the AR environment the presentation effect content corresponding to the tour area in the virtual tour map. According to the embodiments of the present disclosure, the AR device can enter the AR environment conveniently and quickly by scanning the tour ticket, and the presentation effect content corresponding to the tour area in the virtual tour map is displayed in the AR environment, which increases the variety of content displayed to users in each tour area.

Claims (20)

  1. A data presentation method, comprising:
    scanning a tour ticket and acquiring a position of an augmented reality (AR) device;
    displaying, in an AR environment of the AR device, a virtual tour map corresponding to the tour ticket;
    upon detecting that the AR device is located in any tour area identified on the virtual tour map, acquiring first historical tour information of the AR device; and
    displaying, in the AR environment and based on the first historical tour information, presentation effect content corresponding to the tour area in the virtual tour map.
  2. The data presentation method according to claim 1, wherein the presentation effect content comprises at least one of target rendering content and audio content, and a form of the target rendering content comprises at least one of video, image, animation, virtual model, and text.
  3. The data presentation method according to claim 1 or 2, wherein the first historical tour information comprises at least one of: operation information on first tour material corresponding to the tour area; and a target preference type of the first tour material.
  4. The data presentation method according to any one of claims 1 to 3, wherein the first historical tour information comprises first operation information on first tour material corresponding to the tour area, the first operation information indicating interest in the first tour material; and the displaying, in the AR environment and based on the first historical tour information, the presentation effect content corresponding to the tour area in the virtual tour map comprises: determining, according to the first tour material corresponding to the first operation information and from second tour material corresponding to the tour area, first target tour material whose matching degree with the first tour material is greater than a preset matching threshold; and displaying, in the AR environment and based on the first target tour material, the presentation effect content corresponding to the tour area in the virtual tour map.
  5. The data presentation method according to any one of claims 1 to 4, wherein the first historical tour information comprises second operation information on first tour material corresponding to the tour area, the second operation information indicating no interest in the first tour material; and the displaying, in the AR environment and based on the first historical tour information, the presentation effect content corresponding to the tour area in the virtual tour map comprises: determining, from tour materials corresponding to the tour area, second tour material other than the first tour material; and displaying, in the AR environment and based on the second tour material, the presentation effect content corresponding to the tour area in the virtual tour map.
  6. The data presentation method according to any one of claims 1 to 5, wherein the first historical tour information comprises a target preference type of first tour material; and the displaying, in the AR environment and based on the first historical tour information, the presentation effect content corresponding to the tour area in the virtual tour map comprises: determining, based on the target preference type and from tour materials corresponding to the tour area, third target tour material belonging to the target preference type; and displaying, in the AR environment and based on the third target tour material, the presentation effect content corresponding to the tour area in the virtual tour map.
  7. The data presentation method according to any one of claims 1 to 6, wherein there are a plurality of tour areas; and the data presentation method further comprises: in response to a tour event being triggered, acquiring second historical tour information of a user; generating, based on the second historical tour information, a tour route for the plurality of tour areas; and displaying the tour route in the AR environment.
  8. The data presentation method according to claim 7, wherein the tour event being triggered comprises at least one of: acquiring a current position of the AR device, and determining, based on the current position of the AR device, that a position offset between the AR device and a most recently generated tour route is greater than a preset position-offset threshold; and acquiring the current position of the AR device, and determining, based on the current position of the AR device, that the current position of the AR device lies within a target scene corresponding to the virtual tour map.
  9. The data presentation method according to claim 7 or 8, wherein the second historical tour information comprises historical tour records of the tour areas; and the generating, based on the second historical tour information, the tour route for the plurality of tour areas comprises: determining, from the plurality of tour areas based on the historical tour records of the tour areas, a first target tour area whose corresponding tour task has not been triggered; and generating the tour route based on position information of the first target tour area in the target scene and the current position of the AR device.
  10. The data presentation method according to any one of claims 7 to 9, wherein the second historical tour information comprises a number of visits to each of the plurality of tour areas; and the generating, based on the second historical tour information, the tour route for the plurality of tour areas comprises: determining, based on the numbers of visits to the tour areas, a plurality of second target tour areas whose number of visits is greater than a preset visit-count threshold; and generating the tour route based on position information of the second target tour areas in the target scene and the current position of the AR device.
  11. The data presentation method according to any one of claims 1 to 10, further comprising: determining, based on the current position of the AR device and position information of each tour area in the target scene, a distance between the current position of the AR device and each tour area, and generating a tour route based on the distances; or monitoring a current number of people queuing at each tour area and an average play duration per person, determining, from the current number of people queuing and the average play duration per person, an estimated queuing time corresponding to each tour area, and generating a tour route based on the estimated queuing times of the tour areas.
  12. The data presentation method according to any one of claims 7 to 10, wherein the displaying the tour route in the AR environment comprises: marking the tour route on the virtual tour map of the target scene based on position information, in the target scene, of each tour area included in the tour route, to obtain a display map; and displaying the display map on the AR device.
  13. The data presentation method according to claim 12, further comprising: in response to a triggered presentation-effect-content display event, shrinking the display map, displaying the shrunken display map in a first region of the AR device, and displaying the presentation effect content in a second region of the AR device; or, in response to a trigger operation on the shrunken display map, enlarging the shrunken display map and displaying the enlarged display map in the second region.
  14. The data presentation method according to any one of claims 1 to 13, wherein the scanning the tour ticket and acquiring the position of the augmented reality (AR) device comprises: in response to the tour ticket being scanned, acquiring a current position of the AR device; the detecting that the AR device is located in any tour area identified on the virtual tour map comprises: in response to the AR device being located within the target scene, acquiring a detection image, and determining, based on the detection image, whether the AR device is located in any of the plurality of tour areas; and the method further comprises: displaying, based on the current position of the AR device and position information of the target scene corresponding to the virtual tour map, a relative positional relationship between the current position of the AR device and the target scene.
  15. The data presentation method according to claim 14, further comprising: determining a third target tour area from the plurality of tour areas based on at least one of the current position of the AR device, the first historical tour information and second historical tour information, and a tour route; in response to the AR device being located outside the tour area, generating, based on the current position of the AR device and second position information of the third target tour area among the plurality of tour areas, pointing information directed to the third target tour area; and displaying the pointing information.
  16. A data presentation apparatus, comprising:
    a scanning module, configured to scan a tour ticket and acquire a position of an augmented reality (AR) device;
    a first display module, configured to display, in an AR environment of the AR device, a virtual tour map corresponding to the tour ticket;
    an acquisition module, configured to, upon detecting that the AR device is located in any tour area identified on the virtual tour map, acquire first historical tour information of the AR device; and
    a second display module, configured to display, in the AR environment and based on the first historical tour information, presentation effect content corresponding to the tour area in the virtual tour map.
  17. A computer device, comprising a processor and a memory, the memory storing machine-readable instructions executable by the processor, the processor being configured to execute the machine-readable instructions stored in the memory, wherein, when the machine-readable instructions are executed by the processor, the processor performs the steps of the data presentation method according to any one of claims 1 to 15.
  18. A computer-readable storage medium having a computer program stored thereon, wherein, when the computer program is run by a computer device, the computer device performs the steps of the data presentation method according to any one of claims 1 to 15.
  19. A computer program comprising computer-readable code, wherein, when the computer-readable code runs on a device, a processor in the device performs the steps for implementing the data presentation method according to any one of claims 1 to 15.
  20. A computer program product configured to store computer-readable instructions which, when executed, cause a computer to perform the steps of the data presentation method according to any one of claims 1 to 15.
PCT/CN2022/093971 2021-06-18 2022-05-19 数据展示方法、装置、计算机设备、存储介质、计算机程序产品及计算机程序 WO2022262521A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110679377.4 2021-06-18
CN202110679377.4A CN113282687A (zh) 2021-06-18 2021-06-18 数据展示方法、装置、计算机设备及存储介质

Publications (1)

Publication Number Publication Date
WO2022262521A1 true WO2022262521A1 (zh) 2022-12-22

Family

ID=77285052

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/093971 WO2022262521A1 (zh) 2021-06-18 2022-05-19 数据展示方法、装置、计算机设备、存储介质、计算机程序产品及计算机程序

Country Status (3)

Country Link
CN (1) CN113282687A (zh)
TW (1) TW202314535A (zh)
WO (1) WO2022262521A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113282687A (zh) * 2021-06-18 2021-08-20 北京市商汤科技开发有限公司 数据展示方法、装置、计算机设备及存储介质
CN114661398A (zh) * 2022-03-22 2022-06-24 上海商汤智能科技有限公司 一种信息展示方法、装置、计算机设备和存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109067839A (zh) * 2018-06-29 2018-12-21 北京小米移动软件有限公司 推送游览指导信息、创建景点信息数据库的方法及装置
US20190325509A1 (en) * 2018-04-20 2019-10-24 Walmart Apollo, Llc Systems and methods for generating virtual items
CN111551188A (zh) * 2020-06-07 2020-08-18 上海商汤智能科技有限公司 一种导航路线生成的方法及装置
CN111639818A (zh) * 2020-06-05 2020-09-08 上海商汤智能科技有限公司 一种路线规划方法、装置、计算机设备及存储介质
CN112365596A (zh) * 2020-11-28 2021-02-12 包头轻工职业技术学院 一种基于增强现实的旅游导览系统
CN113282687A (zh) * 2021-06-18 2021-08-20 北京市商汤科技开发有限公司 数据展示方法、装置、计算机设备及存储介质

Also Published As

Publication number Publication date
CN113282687A (zh) 2021-08-20
TW202314535A (zh) 2023-04-01

Similar Documents

Publication Publication Date Title
US10769438B2 (en) Augmented reality
US10127734B2 (en) Augmented reality personalization
WO2022262521A1 (zh) 数据展示方法、装置、计算机设备、存储介质、计算机程序产品及计算机程序
CN111638796A (zh) 虚拟对象的展示方法、装置、计算机设备及存储介质
CN103620600B (zh) 用于实现虚拟标记的方法及设备
US11417365B1 (en) Methods, systems and apparatuses for multi-directional still pictures and/or multi-directional motion pictures
US9317962B2 (en) 3D space content visualization system
TW202119362A (zh) 一種擴增實境資料呈現方法、電子設備及儲存介質
US8489993B2 (en) Storage medium storing information processing program, information processing apparatus and information processing method
US20170153787A1 (en) Injection of 3-d virtual objects of museum artifact in ar space and interaction with the same
Romli et al. Mobile augmented reality (AR) marker-based for indoor library navigation
CN112684894A (zh) 增强现实场景的交互方法、装置、电子设备及存储介质
CN113359986B (zh) 增强现实数据展示方法、装置、电子设备及存储介质
US10102226B1 (en) Optical devices and apparatuses for capturing, structuring, and using interlinked multi-directional still pictures and/or multi-directional motion pictures
CN111625100A (zh) 图画内容的呈现方法、装置、计算机设备及存储介质
WO2022252688A1 (zh) 增强现实数据呈现方法、装置、电子设备及存储介质
CN113359985A (zh) 数据展示方法、装置、计算机设备以及存储介质
WO2022262389A1 (zh) 交互方法、装置、计算机设备及程序产品、存储介质
Bousbahi et al. Mobile augmented reality adaptation through smartphone device based hybrid tracking to support cultural heritage experience
CN114967914A (zh) 一种虚拟显示方法、装置、设备以及存储介质
CN111652986B (zh) 舞台效果呈现方法、装置、电子设备及存储介质
TW202248901A (zh) 瓶體的特效呈現方法、設備和電腦可讀儲存媒體
Bovcon et al. “Atlas 2012” Augmented Reality: A Case Study in the Domain of Fine Arts
Beder Language learning via an android augmented reality system
Yuan Design guidelines for mobile augmented reality reconstruction

Legal Events

Date Code Title Description
121 — EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22824003; Country of ref document: EP; Kind code of ref document: A1)
NENP — Non-entry into the national phase (Ref country code: DE)