CN111190485B - Information display method, information display device, electronic equipment and computer readable storage medium


Info

Publication number
CN111190485B
CN111190485B
Authority
CN
China
Prior art keywords
information
target
terminal equipment
scene
augmented reality
Prior art date
Legal status
Active
Application number
CN201911379231.7A
Other languages
Chinese (zh)
Other versions
CN111190485A (en)
Inventor
石盛传
欧华富
侯欣如
Current Assignee
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN201911379231.7A
Publication of CN111190485A
Application granted
Publication of CN111190485B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Abstract

The disclosure provides an information display method and apparatus, an electronic device, and a computer-readable storage medium. A terminal device determines its target positioning position information and target shooting angle information in an augmented reality scene based on its initial positioning information, initial shooting angle information, and movement information in the real scene, and then presents target information in the augmented reality scene based on the target positioning position information and target shooting angle information, thereby improving the accuracy of the presented augmented reality scene.

Description

Information display method, information display device, electronic equipment and computer readable storage medium
Technical Field
The disclosure relates to the technical field of computers, and in particular to an information display method, an information display apparatus, an electronic device, and a computer-readable storage medium.
Background
Augmented reality (AR) technology fuses virtual objects with a real scene. Using sensing technology, AR places a virtual object precisely within the environment of the real scene, integrates the virtual object and that environment into a whole by means of a display device, and presents the user with a new environment that has a realistic sensory effect.
Within this technology, how to fuse virtual objects with real objects so that the resulting augmented reality scene appears realistic and accurate to the user is currently the subject of extensive research.
Disclosure of Invention
In view of this, the present disclosure provides an information display method and apparatus, an electronic device, and a computer-readable storage medium, for implementing accurate presentation of an augmented reality scene in a terminal device.
In a first aspect, the present disclosure discloses an information display method, applied to a terminal device, including:
acquiring initial positioning information of a terminal device in a real scene, initial shooting angle information of the terminal device and movement information of the terminal device in the real scene;
determining target positioning position information and target shooting angle information of the terminal equipment in an augmented reality scene based on the initial positioning information, the initial shooting angle information and the movement information;
presenting target information in the augmented reality scene in the terminal device based on the target positioning position information and the target shooting angle information; the target information comprises real scene information acquired by the terminal equipment and virtual scene information integrated in a real scene.
According to this method, after the initial positioning information and initial shooting angle information of the terminal device in the real scene at the start of positioning are obtained, the real scene information and the virtual scene information matching the real scene at the end of positioning are determined in combination with the movement information gathered during positioning, which improves the accuracy of the presented augmented reality scene.
In some possible implementations, the presenting, in the terminal device, the target information in the augmented reality scene based on the target positioning location information and the target shooting angle information includes:
determining virtual scene information to be presented in real scene information acquired by the terminal equipment based on the target positioning position information and the target shooting angle information;
and synthesizing the virtual scene information and the real scene information acquired by the terminal equipment to obtain the target information, and presenting the target information in the augmented reality scene.
Because the terminal device is limited by its shooting angle and cannot capture the scene in every direction, this implementation screens out the virtual scene information lying in the shooting direction of the terminal device, that is, the virtual scene information the terminal device would actually capture, based on the target positioning position information and the target shooting angle information. That virtual scene information is synthesized with the real scene information acquired by the terminal device and displayed in the augmented reality scene, which improves the accuracy of the augmented reality scene displayed in the terminal device.
In some possible implementations, the presenting, in the terminal device, the target information in the augmented reality scene based on the target positioning location information and the target shooting angle information includes:
sending the target positioning position information and the target shooting angle information to a server, and receiving virtual scene information, returned by the server, that needs to be presented in the real scene currently shot by the terminal equipment; synthesizing the returned virtual scene information with the real scene information to obtain the target information, and presenting the target information in the augmented reality scene of the terminal equipment; or alternatively,
sending the target positioning position information and the target shooting angle information to a server, receiving target information, returned by the server, obtained by synthesizing the virtual scene information and the real scene information, and presenting the target information in the augmented reality scene of the terminal equipment.
In this implementation, the target positioning position information and the target shooting angle information are sent to a server, so that the server can determine the virtual scene information lying in the shooting direction of the terminal device based on them, which improves the accuracy of the augmented reality scene displayed in the terminal device. In addition, determining the virtual scene information or the target information on the server speeds up information processing to a certain extent and reduces the occupation of processor resources on the terminal device.
In some possible implementations, the initial positioning information includes plane coordinate information of the terminal device in the real scene;
the obtaining the initial positioning information of the terminal equipment in the real scene comprises the following steps:
acquiring a picture of a target area where the terminal equipment is located, wherein the picture is shot by the terminal equipment;
and determining plane coordinate information of the terminal equipment in a geographic coordinate system in a real scene based on the picture and a two-dimensional plane map comprising the target area.
Based on the association between the picture shot by the terminal device and the two-dimensional planar map, this implementation can accurately determine the plane coordinate information of the terminal device in the geographic coordinate system of the real scene.
In some possible implementations, the determining plane coordinate information of the terminal device in a geographic coordinate system in a real scene based on the picture and a two-dimensional plane map including the target area includes:
determining map coordinate information of the terminal equipment on the two-dimensional planar map based on the picture and the two-dimensional planar map comprising the target area;
and determining plane coordinate information of the terminal equipment under a geographic coordinate system in the real scene based on the scale of the two-dimensional plane map and the real scene and the map coordinate information of the terminal equipment on the two-dimensional plane map.
Because a known scale relates the two-dimensional planar map to the real scene, once the map coordinate information of the terminal device on the two-dimensional planar map is determined, the plane coordinate information of the terminal device in the geographic coordinate system of the real scene can be determined according to that scale. This approach improves the accuracy of the determined plane coordinate information.
In some possible implementations, the information display method further includes:
establishing a communication connection with a target entity object in the augmented reality scene; responding to a first user trigger instruction aimed at the target entity object, and controlling, based on the communication connection, the target entity object to execute a first response operation corresponding to the first user trigger instruction; or alternatively,
responding to a first user trigger instruction aimed at the target entity object, and sending a first control instruction to a server controlling the target entity object, so that the server, based on the first control instruction, controls the target entity object to execute a first response operation corresponding to the first control instruction.
After the terminal device is mapped into the augmented reality scene, these steps allow the terminal device to control the target entity object in the augmented reality scene.
In some possible implementations, the information display method further includes:
and responding to a second user trigger instruction for a target virtual object in the augmented reality scene, determining a presentation effect of the target virtual object in the augmented reality scene based on the second user trigger instruction, and updating the currently presented augmented reality scene based on the presentation effect.
After the terminal device is mapped into the augmented reality scene, these steps allow the terminal device to control the target virtual object in the augmented reality scene and to present, in the augmented reality scene, the state of the target virtual object after the corresponding operation is executed.
In some possible implementations, the information display method further includes:
responding to a third user trigger instruction aiming at a target virtual object and a target entity object in the augmented reality scene, wherein the third user trigger instruction is an instruction for jointly controlling the target virtual object and the target entity object;
and determining the presentation effect of the target virtual object and the target entity object in the augmented reality scene based on the third user trigger instruction, and updating the currently presented augmented reality scene based on the presentation effect.
After the terminal device is mapped into the augmented reality scene, these steps allow the terminal device to jointly control the target virtual object and the target entity object in the augmented reality scene and to present, in the augmented reality scene, the presentation effect after the corresponding operation is executed.
In some possible implementations, the movement information includes plane coordinate change information of the terminal device in a real scene and altitude change information in the real scene; the initial positioning information comprises plane coordinate information and height information of the terminal equipment in a real scene;
the determining the target positioning position information of the terminal equipment in the augmented reality scene comprises the following steps:
determining the updated plane coordinate information of the terminal equipment in the real scene when the current positioning period is ended based on the plane coordinate information of the terminal equipment in the real scene and the plane coordinate change information of the terminal equipment in the current positioning period when the current positioning period is started; determining updated height information of the terminal equipment in the real scene when the current positioning period is ended based on the height information of the terminal equipment in the real scene at the beginning of the current positioning period and the height change information of the terminal equipment in the current positioning period;
and determining target positioning position information of the terminal equipment in the augmented reality scene when the current positioning period is finished based on the updated plane coordinate information and the updated height information of the terminal equipment in the real scene.
Because the position of the terminal device in the real scene may change during positioning, relying only on the plane coordinate information and other information available at the start of the positioning period cannot accurately determine the target positioning position information of the terminal device in the augmented reality scene at the end of the period. By combining the movement information of the terminal device within the positioning period, this implementation determines the updated plane coordinate information and updated height information of the terminal device in the real scene, and from these accurately determines the target positioning position information, which in turn improves the accuracy of the determined virtual scene information.
In some possible implementations, the movement information includes shooting angle change information of the terminal device in a plane coordinate system;
determining target shooting angle information of the terminal equipment in an augmented reality scene comprises the following steps:
determining the updated shooting angle information of the terminal equipment under the plane coordinate system when the current positioning period is ended based on the initial shooting angle information of the terminal equipment under the plane coordinate system at the start of the current positioning period and the shooting angle change information of the terminal equipment under the plane coordinate system;
and determining target shooting angle information of the terminal equipment in the augmented reality scene based on the updated shooting angle information of the terminal equipment in the plane coordinate system.
Since the shooting direction, and thus the shooting angle information, of the terminal device may change during positioning, relying only on the shooting angle information at the start of the positioning period cannot accurately determine the target shooting angle information of the terminal device in the augmented reality scene at the end of the period. By combining the shooting angle change information of the terminal device within the positioning period, this implementation determines the updated shooting angle information of the terminal device in the plane coordinate system, and from it accurately determines the target shooting angle information, improving the accuracy of the determined virtual scene information.
In a second aspect, the present disclosure discloses an information display apparatus applied to a terminal device, including:
an information acquisition unit, configured to acquire initial positioning information of a terminal device in a real scene, initial shooting angle information of the terminal device, and movement information of the terminal device in the real scene;
an information processing unit, configured to determine target positioning position information and target shooting angle information of the terminal device in an augmented reality scene based on the initial positioning information, the initial shooting angle information, and the movement information;
an augmented reality processing unit, configured to present target information in the augmented reality scene in the terminal device based on the target positioning position information and the target shooting angle information; the target information comprises real scene information acquired by the terminal equipment and virtual scene information integrated in a real scene.
In one possible implementation manner, the augmented reality processing unit is configured to, when presenting, in the terminal device, target information in the augmented reality scene based on the target positioning position information and target shooting angle information:
determining virtual scene information to be presented in real scene information acquired by the terminal equipment based on the target positioning position information and the target shooting angle information;
and synthesizing the virtual scene information and the real scene information acquired by the terminal equipment to obtain target information, and presenting the target information in the augmented reality scene of the terminal equipment.
In one possible implementation manner, the augmented reality processing unit is configured to, when presenting, in the terminal device, target information in the augmented reality scene based on the target positioning position information and target shooting angle information:
sending the target positioning position information and the target shooting angle information to a server, and receiving virtual scene information, returned by the server, that needs to be presented in the real scene currently shot by the terminal equipment; synthesizing the returned virtual scene information with the real scene information to obtain the target information, and presenting the target information in the augmented reality scene of the terminal equipment; or alternatively,
sending the target positioning position information and the target shooting angle information to a server, receiving target information, returned by the server, obtained by synthesizing the virtual scene information and the real scene information, and presenting the target information in the augmented reality scene of the terminal equipment.
In a possible implementation manner, the initial positioning information includes plane coordinate information of the terminal device in a real scene;
the information acquisition unit is used for, when acquiring initial positioning information of the terminal equipment in a real scene:
acquiring a picture of a target area where the terminal equipment is located, wherein the picture is shot by the terminal equipment;
and determining plane coordinate information of the terminal equipment in a geographic coordinate system in a real scene based on the picture and a two-dimensional plane map comprising the target area.
In a possible implementation manner, the information obtaining unit is configured to, when determining plane coordinate information of the terminal device in a geographic coordinate system in a real scene based on the picture and a two-dimensional plane map including the target area:
determining map coordinate information of the terminal equipment on the two-dimensional planar map based on the picture and the two-dimensional planar map comprising the target area;
and determining plane coordinate information of the terminal equipment under a geographic coordinate system in the real scene based on the scale of the two-dimensional plane map and the real scene and the map coordinate information of the terminal equipment on the two-dimensional plane map.
In a possible implementation manner, the movement information includes plane coordinate change information of the terminal device in a real scene and altitude change information in the real scene; the initial positioning information comprises plane coordinate information and height information of the terminal equipment in a real scene;
the information processing unit is configured to, when determining target positioning position information of the terminal equipment in an augmented reality scene:
determining the updated plane coordinate information of the terminal equipment in the real scene when the current positioning period is ended based on the plane coordinate information of the terminal equipment in the real scene and the plane coordinate change information of the terminal equipment in the current positioning period when the current positioning period is started; determining updated height information of the terminal equipment in the real scene when the current positioning period is ended based on the height information of the terminal equipment in the real scene at the beginning of the current positioning period and the height change information of the terminal equipment in the current positioning period;
and determining target positioning position information of the terminal equipment in the augmented reality scene when the current positioning period is finished based on the plane coordinate information and the height information updated by the terminal equipment in the real scene.
In a possible implementation manner, the movement information includes shooting angle change information of the terminal device under a plane coordinate system;
the information processing unit is configured to, when determining target shooting angle information of the terminal equipment in an augmented reality scene:
determining the updated shooting angle information of the terminal equipment under the plane coordinate system when the current positioning period is ended based on the initial shooting angle information of the terminal equipment under the plane coordinate system and the shooting angle change information of the terminal equipment under the plane coordinate system when the current positioning period is started;
and determining target shooting angle information of the terminal equipment in the augmented reality scene based on the updated shooting angle information of the terminal equipment in the plane coordinate system.
In one possible embodiment, the information display device further includes a communication control unit configured to:
establishing a communication connection with a target entity object in the augmented reality scene; responding to a first user trigger instruction aimed at the target entity object, and controlling, based on the communication connection, the target entity object to execute a first response operation corresponding to the first user trigger instruction; or alternatively,
responding to a first user trigger instruction aimed at the target entity object, and sending a first control instruction to a server controlling the target entity object, so that the server, based on the first control instruction, controls the target entity object to execute a first response operation corresponding to the first control instruction.
In some embodiments, the communication control unit is further configured to:
and responding to a second user trigger instruction for a target virtual object in the augmented reality scene, determining a presentation effect of the target virtual object in the augmented reality scene based on the second user trigger instruction, and updating the currently presented augmented reality scene based on the presentation effect.
In some embodiments, the communication control unit is further configured to:
responding to a third user trigger instruction aiming at a target virtual object and a target entity object in the augmented reality scene, wherein the third user trigger instruction is an instruction for jointly controlling the target virtual object and the target entity object;
and determining the presentation effect of the target virtual object and the target entity object in the augmented reality scene based on the third user trigger instruction, and updating the currently presented augmented reality scene based on the presentation effect.
In a third aspect, the present disclosure also provides an electronic device, including: a processor, a memory and a bus, said memory storing machine readable instructions executable by said processor, said processor and said memory communicating over the bus when the electronic device is running, said machine readable instructions when executed by said processor performing the steps of the information display method as described above.
The electronic equipment is any one of the following equipment:
a smartphone; a tablet computer; augmented reality (AR) glasses.
In a fourth aspect, the present disclosure also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the information display method as described above.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
To illustrate the technical solutions of the embodiments of the present disclosure more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present disclosure and therefore should not be considered limiting of its scope; a person of ordinary skill in the art may obtain other related drawings from these drawings without inventive effort.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the technical aspects of the disclosure.
Fig. 1 shows a flowchart of an information display method provided by an embodiment of the present disclosure;
fig. 2 is a flowchart illustrating presentation of target information in the augmented reality scene in a terminal device based on target positioning position information and target shooting angle information in another information display method provided by an embodiment of the present disclosure;
fig. 3 is a flowchart illustrating determination of plane coordinate information of a terminal device in a real scene in still another information display method provided by an embodiment of the present disclosure;
fig. 4 shows a schematic diagram of a terminal device on a two-dimensional planar map in an embodiment of the disclosure;
fig. 5 is a flowchart illustrating a joint control of a target virtual object and a target physical object in an augmented reality scene by a terminal device in still another information display method according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram showing a structure of an information display device according to an embodiment of the present disclosure;
fig. 7 shows a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments are described below with reference to the accompanying drawings. It should be understood that the drawings in the present disclosure are for illustration and description only and are not intended to limit its scope of protection, and that the schematic drawings are not drawn to scale. A flowchart, as used in this disclosure, illustrates operations implemented according to some embodiments of the present disclosure. It should be understood that the operations of a flowchart may be implemented out of order, and that steps without a logical context may be performed in reverse order or concurrently. Moreover, guided by the content of this disclosure, those skilled in the art may add one or more other operations to a flowchart or remove one or more operations from it.
In addition, the described embodiments are only some, but not all, of the embodiments of the present disclosure. The components of the embodiments of the present disclosure, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure provided in the accompanying drawings is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
In order to enable one skilled in the art to use the present disclosure, the following embodiments are presented in connection with a specific application scenario "scene display under augmented reality". It will be apparent to those having ordinary skill in the art that the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. While the present disclosure is primarily described in the context of scene display under augmented reality, it should be understood that this is but one exemplary embodiment.
It should be noted that the term "comprising" will be used in embodiments of the present disclosure to indicate the presence of the stated features hereinafter, but not to preclude the addition of further features.
When a scene combining the virtual and the real is displayed in a terminal device such as a smartphone or smart glasses, the terminal device can be positioned in the real scene, virtual scene information matching the positioning position can be determined based on the obtained positioning information and shooting angle information, and that virtual scene information can be fused with the real scene information shot at the positioning position to present an augmented reality scene. Because the terminal device may move during positioning, virtual scene information determined directly from the positioning information and shooting angle information captured at the start of positioning may be inaccurate. The present disclosure therefore combines the positioning information and shooting angle information at the start of positioning with the movement information gathered during positioning to determine virtual scene information matching the real scene shot after the terminal device has moved, improving the accuracy of the presented augmented reality scene.
The information display method in the embodiments of the present disclosure will be described in detail below.
The information display method shown in fig. 1 is performed by the terminal device or by a processor in the terminal device. Specifically, the method comprises the following steps:
s110, acquiring initial positioning information of the terminal equipment in a real scene, initial shooting angle information of the terminal equipment and movement information of the terminal equipment in the real scene.
The terminal device has a shooting function and is capable of presenting an augmented reality scene; examples include smart glasses, smartphones, and tablet computers.
Here, the terminal device requires a short period of time from the start of positioning to the end of positioning; this period is called the positioning period. The initial positioning information may be the positioning information of the terminal device in the real scene at the start of the current positioning period, and the initial shooting angle information may be the shooting angle information of the terminal device in the real scene at the start of the current positioning period.
The initial positioning information may include plane coordinate information and altitude information of the terminal device in the real scene.
The plane coordinate information of the terminal device in the real scene may refer to plane coordinate information of the terminal device in a geographic coordinate system in the real scene.
The initial shooting angle information is used for representing the shooting direction of the terminal equipment, and can be represented by the included angle between the shooting direction and each coordinate axis.
The terminal device may move between the start and the end of positioning, and this movement can change the initial positioning information and initial shooting angle information during the positioning period; therefore, to improve the accuracy of positioning the terminal device, the movement information of the terminal device during the positioning period must be acquired.
The movement information may include positioning change information of the terminal device in the real scene and shooting angle change information of the terminal device in the plane coordinate system, where the positioning change information may include plane coordinate change information and height change information of the terminal device in the real scene. The terminal device may or may not move during positioning: when it does not move, the change values corresponding to the change information are 0; when it moves, they may take any other value.
The movement information can be acquired with a six-degrees-of-freedom (6DoF) tracking device; when such a device is used, its 6DoF state may be reset after each positioning completes.
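As a rough illustration only, and not part of the disclosure, the sketch below accumulates hypothetical per-sample 6DoF deltas into the movement information for one positioning period and resets when the period completes; the `MovementInfo` fields and the `on_sample` input are assumptions.

```python
from dataclasses import dataclass

@dataclass
class MovementInfo:
    """Movement accumulated over one positioning period (fields are assumed)."""
    dx: float = 0.0    # plane coordinate change along X, metres
    dz: float = 0.0    # plane coordinate change along Z, metres
    dh: float = 0.0    # height change, metres
    dyaw: float = 0.0  # shooting angle change in the plane coordinate system, degrees

class SixDofTracker:
    """Accumulates per-sample 6DoF deltas; reset() mirrors the 6DoF reset
    performed after each positioning completes."""
    def __init__(self) -> None:
        self.movement = MovementInfo()

    def on_sample(self, dx: float, dz: float, dh: float, dyaw: float) -> None:
        # Called once per sensor sample with that sample's deltas.
        self.movement.dx += dx
        self.movement.dz += dz
        self.movement.dh += dh
        self.movement.dyaw += dyaw

    def reset(self) -> None:
        self.movement = MovementInfo()
```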
S120, determining target positioning position information and target shooting angle information of the terminal equipment in an augmented reality scene based on the initial positioning information, the initial shooting angle information and the movement information.
Here, the updated positioning information of the terminal device in the real scene and its updated shooting angle information in the plane coordinate system may be determined from the initial positioning information and initial shooting angle information at the start of the positioning period together with the movement information during the period; coordinate conversion is then performed, using the conversion relationship between the coordinate system of the augmented reality scene and the plane coordinate system, to obtain the target positioning position information and target shooting angle information of the terminal device in the augmented reality scene.
Of course, other methods may be used to determine the target positioning position information and target shooting angle information of the terminal device in the augmented reality scene at the end of the positioning period. For example, the positioning position information and shooting angle information of the terminal device in the augmented reality scene may first be determined from the initial positioning information and initial shooting angle information at the start of the period, and the movement information of the terminal device during the period then mapped into the augmented reality scene to obtain the target positioning position information and target shooting angle information at the end of the period.
Determining the target positioning position information and target shooting angle information of the terminal device in the augmented reality scene at the end of the positioning period both positions the terminal device within the augmented reality scene and realizes the coordinate conversion from the plane coordinate system to the coordinate system of the augmented reality scene. The coordinate system of the augmented reality scene here may be a Unity coordinate system.
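A minimal sketch of this update and conversion follows, under the assumption (made for illustration only; the disclosure does not fix the transform) that the AR world origin is anchored at a known point of the geographic plane coordinate system with aligned axes:

```python
def update_pose(x0: float, z0: float, h0: float, yaw0: float,
                dx: float, dz: float, dh: float, dyaw: float) -> tuple:
    """Apply the movement deltas accumulated over the positioning period to
    the initial positioning and shooting angle information."""
    return (x0 + dx, z0 + dz, h0 + dh, (yaw0 + dyaw) % 360.0)

def plane_to_ar(x: float, z: float, h: float, yaw: float,
                origin: tuple = (0.0, 0.0, 0.0)) -> tuple:
    # Placeholder conversion into the AR (e.g. Unity, Y-up) coordinate
    # system: assumes the AR world origin sits at `origin` in the geographic
    # plane coordinate system and that the axes of the two systems align.
    ox, oy, oz = origin
    return (x - ox, h - oy, z - oz, yaw)
```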
S130, presenting target information in the augmented reality scene in the terminal equipment based on the target positioning position information and the target shooting angle information; the target information comprises real scene information acquired by the terminal equipment and virtual scene information integrated in a real scene.
Because the terminal device shoots at a particular angle and cannot capture the scene at every angle around its location, the virtual scene information within the angle range corresponding to the target shooting angle information is determined by combining the target positioning position information and the target shooting angle information, and that virtual scene information is merged into the real scene information acquired by the terminal device to obtain the target information. Combining the target positioning position information and the target shooting angle information improves the accuracy of the determined virtual scene information and, in turn, the accuracy of the augmented reality scene presented in the terminal device.
In this embodiment, after the initial positioning information and initial shooting angle information of the terminal device in the real scene at the start of positioning are obtained, the virtual scene information matching the real scene at the end of positioning is determined in combination with the movement information gathered during positioning, which improves the accuracy of the presented augmented reality scene.
In some embodiments, as shown in fig. 2, the presenting, in the terminal device, the target information in the augmented reality scene based on the target positioning position information and the target shooting angle information includes:
s210, determining virtual scene information to be presented in real scene information acquired by the terminal equipment based on the target positioning position information and the target shooting angle information.
The terminal device has a shooting function, and the scene information displayed in it should be scene information it can shoot; this includes not only the real scene information currently captured but also the virtual scene information located in the augmented reality scene. The real scene information, being shot directly by the terminal device, can be used as-is as the real scene information to be presented in the augmented reality scene. For the virtual scene information, the terminal device theoretically cannot capture virtual scene information behind its shooting direction, that is, outside the shooting field of view corresponding to the target shooting angle; the virtual scene information in front of the terminal device, which the terminal device can present in the augmented reality scene, therefore has to be screened according to the terminal device's positioning information (the target positioning position information) and shooting direction information (the target shooting angle information) in the augmented reality scene.
For example, when screening the virtual scene information to be presented in the augmented reality scene, virtual scene information theoretically lying within the shooting field of view of the terminal device may be screened, based on the target positioning position information and target shooting angle information, from a pre-built three-dimensional model containing the virtual scene information. The three-dimensional model may be stored on a remote server communicatively connected to the terminal device, or in the terminal device itself.
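A minimal sketch of such screening, checking only the horizontal angle between the camera's facing direction and each virtual anchor; the field-of-view value and the yaw convention are assumptions:

```python
import math

def visible_virtual_objects(objects, cam_pos, cam_yaw_deg, fov_deg=60.0):
    """Screen out virtual scene information lying within the terminal's
    shooting field of view (horizontal check only, for brevity).

    objects     -- iterable of (name, (x, z)) anchors in AR plane coordinates
    cam_pos     -- (x, z) target positioning position of the terminal
    cam_yaw_deg -- target shooting angle; 0 degrees is assumed to face +Z
    """
    half_fov = math.radians(fov_deg) / 2.0
    yaw = math.radians(cam_yaw_deg)
    forward = (math.sin(yaw), math.cos(yaw))  # assumed yaw convention
    visible = []
    for name, (ox, oz) in objects:
        vx, vz = ox - cam_pos[0], oz - cam_pos[1]
        dist = math.hypot(vx, vz)
        if dist == 0.0:
            continue
        # Angle between the camera forward direction and the object direction.
        cos_angle = (vx * forward[0] + vz * forward[1]) / dist
        if math.acos(max(-1.0, min(1.0, cos_angle))) <= half_fov:
            visible.append(name)
    return visible
```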
S220, synthesizing the virtual scene information and the real scene information acquired by the terminal equipment to obtain the target information, and presenting the target information in the augmented reality scene.
After the virtual scene information to be displayed is determined, the shot real scene information (either the original shot image or a processed version of it) can be mapped into the augmented reality scene, the mapped real scene information can be synthesized with the virtual scene information to obtain the target information, and the augmented reality scene including the target information can finally be displayed in the terminal device.
Of course, the target information may be synthesized in other ways. For example, the real scene information of all possible shooting areas may be stored in advance as scene information already mapped into the augmented reality scene; then, after the terminal device shoots the real scene information of the current area, the matching real scene information (an original or processed real scene image) and the virtual scene information may be screened from the pre-stored real scene information according to the features of the shot image to obtain the target information. The embodiments of the present disclosure do not limit how the target information is synthesized.
In this embodiment, the terminal device, or a processor in the terminal device, determines the virtual scene information to be presented in the augmented reality scene and synthesizes it with the real scene information shot by the terminal device to obtain the target information presented in the augmented reality scene, improving the realism and accuracy of the augmented reality scene presented in the terminal device.
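Synthesis of the two layers might look like the following alpha blend, a sketch that assumes the screened virtual scene information has already been rendered into an image with a coverage mask:

```python
import numpy as np

def composite(real_frame: np.ndarray, virtual_layer: np.ndarray,
              alpha_mask: np.ndarray) -> np.ndarray:
    """Blend a rendered virtual layer over the captured real scene frame.

    real_frame    -- H x W x 3 uint8 camera image
    virtual_layer -- H x W x 3 uint8 render of the screened virtual scene info
    alpha_mask    -- H x W float32 in [0, 1]; 0 where no virtual content exists
    """
    a = alpha_mask[..., None]
    blended = real_frame.astype(np.float32) * (1.0 - a) \
              + virtual_layer.astype(np.float32) * a
    return blended.astype(np.uint8)
```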
Because the processor resources of the terminal device are limited, and to keep the synthesis of target information efficient, a server connected to the terminal device can instead determine the virtual scene information or synthesize the target information and feed the result back to the terminal device. Specifically, this can be realized by the following steps:
the terminal device sends the target positioning position information and the target shooting angle information to a server and receives virtual scene information, returned by the server, that needs to be presented in the real scene currently shot by the terminal device; it then synthesizes the returned virtual scene information with the real scene information to obtain the target information and presents the target information in the augmented reality scene of the terminal device; or alternatively,
the terminal device sends the target positioning position information and the target shooting angle information to a server, receives target information, returned by the server, obtained by synthesizing the virtual scene information and the real scene information, and presents the target information in the augmented reality scene of the terminal device.
In the above steps, the method by which the server determines the virtual scene information to be presented based on the target positioning position information and the target shooting angle information may be the same as the method used by the terminal device, and the method by which the server synthesizes the virtual scene information and the real scene information into the target information may likewise be the same as the terminal device's; neither is therefore described again here.
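On the terminal side, the first variant might reduce to a single request, as in this sketch; the endpoint URL and the JSON payload schema are purely illustrative assumptions:

```python
import requests  # third-party HTTP client

AR_SERVER = "https://example.com/ar"  # hypothetical server endpoint

def fetch_virtual_scene(x: float, y: float, z: float, yaw: float) -> list:
    """Send the target pose to the server and receive the virtual scene
    information to composite locally (first variant above)."""
    resp = requests.post(f"{AR_SERVER}/virtual-scene",
                         json={"x": x, "y": y, "z": z, "yaw": yaw},
                         timeout=2.0)
    resp.raise_for_status()
    return resp.json()  # assumed: a list of virtual object descriptors
```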
In fact, the plane coordinate information of the terminal device in the real scene often cannot be acquired directly; instead, after initial positioning information of the terminal device in some two-dimensional coordinate system is acquired, it is processed to obtain the plane coordinate information in the real scene. As shown in fig. 3, this can be implemented with the following steps:
s310, acquiring a picture of a target area where the terminal equipment is located, wherein the picture is shot by the terminal equipment.
Here, at the beginning of the positioning period, a picture of the target area where the terminal device is located may be taken with the terminal device.
S320, based on the picture and the two-dimensional plane map comprising the target area, plane coordinate information of the terminal equipment in a geographic coordinate system in a real scene is determined.
In implementation, the map coordinate information of the terminal device on the two-dimensional planar map can first be determined from the association between the picture shot by the terminal device and the two-dimensional planar map; the plane coordinate information of the terminal device in the geographic coordinate system of the real scene can then be determined from the scale between the two-dimensional planar map and the real scene together with that map coordinate information.
Since the two-dimensional planar map is not at the same scale as the real scene, the map coordinate information of the terminal device cannot be used directly as its plane coordinate information in the real scene. In implementation, the scale between the two-dimensional planar map and the real scene is used to convert the map coordinate information of the terminal device into its plane coordinate information in the geographic coordinate system of the real scene.
As shown in fig. 4, the terminal device is located at the position indicated by circle 1 on the two-dimensional planar map, its map coordinates are XZ = (1, 3), and the scale between the two-dimensional planar map and the real scene is 1:5; the plane coordinates of the terminal device in the geographic coordinate system of the real scene are therefore (5, 15) in metres, i.e. the terminal device is 5 metres along the X direction and 15 metres along the Z direction.
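That conversion is just a multiplication by the scale factor; a sketch reproducing the worked example:

```python
def map_to_plane(map_xz: tuple, scale_factor: float) -> tuple:
    """Convert coordinates on the two-dimensional planar map into plane
    coordinates (metres) in the real scene's geographic coordinate system.

    scale_factor -- real-scene units per map unit (5.0 for a 1:5 scale).
    """
    return tuple(c * scale_factor for c in map_xz)

assert map_to_plane((1, 3), 5.0) == (5, 15)  # circle 1 in fig. 4
```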
In implementation, the map coordinate information of the terminal device on the two-dimensional planar map can be determined using visual positioning technology applied to the picture shot by the terminal device and the two-dimensional planar map including the target area. Of course, other methods may be used as well; for example, marker positioning or GPS positioning techniques may determine the map coordinate information of the terminal device on the two-dimensional planar map.
After positioning the terminal device into the augmented reality scene, the terminal device may be utilized to communicate, interact or control with the physical or virtual objects in the augmented reality scene. Specifically, the following steps can be utilized to realize the communication and control between the terminal device and the entity object in the augmented reality scene:
the terminal equipment establishes communication connection with the target entity object in the augmented reality scene; and responding to a first user trigger instruction aiming at the target entity object, and controlling the target entity object to execute a first response operation corresponding to the first user trigger instruction based on the communication connection.
In specific implementation, a user initiates a first user trigger instruction on the terminal device; the terminal device receives the instruction and, based on it, sends a corresponding control instruction to the target entity object so as to control the target entity object to execute the response operation corresponding to the first user trigger instruction.
For example, the terminal device may send a control instruction for controlling the target entity object to move according to a certain route to the target entity object, and after receiving the control instruction, the target entity object moves according to the specified route.
For another example, the terminal device may send a control instruction for controlling the target entity object to play the preset picture to the target entity object, and after the target entity object receives the control instruction, the target entity object may play the preset picture on its own display screen.
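As a concrete sketch of such control instructions only: the JSON message format, port, and action names below are invented for illustration, since the disclosure does not specify a wire protocol for the communication connection.

```python
import json
import socket

def send_control_instruction(host: str, port: int,
                             action: str, params: dict) -> None:
    """Send one control instruction to a target entity object over a TCP
    connection; the message schema is illustrative only."""
    msg = json.dumps({"action": action, "params": params}).encode()
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(msg)

# e.g. instruct the object to move along a specified route, or to play a
# preset picture on its own display screen:
# send_control_instruction("10.0.0.5", 9000, "move_route",
#                          {"waypoints": [[0, 0], [5, 15]]})
# send_control_instruction("10.0.0.5", 9000, "play_picture", {"id": "preset-1"})
```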
After the target entity object performs the response operation, it exhibits the state resulting from that operation; the terminal device can capture the updated real scene information (including that state) directly by shooting, and can therefore use it to update the augmented reality scene information presented in the terminal device.
The above embodiment uses the terminal device itself to control the target entity object; to reduce the occupation of processor resources in the terminal device, the control may instead be realized through a server that is connected to the terminal device and controls the target entity object. This can be implemented with the following steps:
the terminal equipment responds to a first user trigger instruction aiming at the target entity object and sends a first control instruction to a server controlling the target entity object, so that the server controls the target entity object to execute a first response operation corresponding to the first control instruction based on the first control instruction.
Here, the server transmits a corresponding control instruction to the target entity object based on the first control instruction to control the target entity object to perform a response operation corresponding to the first control instruction.
As described above, the server can realize the same control as the terminal device. Similarly, the terminal device may directly acquire updated real scene information through shooting, so that the augmented reality scene information presented in the terminal device may be updated by using the updated real scene information.
According to the embodiment of the disclosure, the terminal equipment can not only control the target entity object in the augmented reality scene, but also realize the interactive operation of the target virtual object in the augmented reality scene, and the specific implementation steps are as follows:
the terminal equipment responds to a second user trigger instruction aiming at a target virtual object in the augmented reality scene, determines the presentation effect of the target virtual object in the augmented reality scene based on the second user trigger instruction, and updates the currently presented augmented reality scene based on the presentation effect.
In implementation, the terminal device may send a control instruction, based on the second user trigger instruction, to the server controlling the target virtual object, so that the server controls the target virtual object to execute the second response operation. Since the target virtual object does not actually exist in the real scene, the server can determine the state of the target virtual object after the operation; that is, the server determines the presentation effect corresponding to the second response operation and feeds that effect back to the terminal device, which presents it in the augmented reality scene.
For example, the terminal device may send the server a second control instruction for making the target virtual object move according to a predetermined rule; on receiving the instruction, the server moves the target virtual object according to that rule, generates the corresponding presentation effect, and feeds it back to the terminal device, which then presents the effect in the augmented reality scene.
Of course, in implementation, the terminal device itself may also control, based on the second user trigger instruction, the target virtual object to execute the second response operation corresponding to the second user trigger instruction, and determine the presentation effect corresponding to that operation.
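A hedged sketch of this virtual-object flow follows; the function names and the instruction and effect payloads are assumed for illustration:

```python
def server_compute_presentation_effect(second_control_instruction: dict) -> dict:
    # The target virtual object does not exist in the real scene, so the server
    # simulates the second response operation and returns the resulting state.
    if second_control_instruction["operation"] == "move":
        return {"object": "target_virtual_object",
                "state": "moved along predetermined rule"}
    return {"object": "target_virtual_object", "state": "idle"}


def terminal_on_second_user_trigger(operation: str, ar_scene: dict) -> dict:
    # The terminal device forwards the trigger as a control instruction and
    # blends the fed-back presentation effect into the presented scene.
    effect = server_compute_presentation_effect({"operation": operation})
    ar_scene["virtual_layers"] = [effect]
    return ar_scene


scene = {"real_frame": "camera_frame_0", "virtual_layers": []}
print(terminal_on_second_user_trigger("move", scene))
```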
According to the embodiments of the present disclosure, the terminal device can control not only an individual target entity object or an individual target virtual object in the augmented reality scene, but also a target virtual object and a target entity object jointly. As shown in fig. 5, the specific implementation steps are as follows:
S510, the terminal device responds to a third user trigger instruction for the target virtual object and the target entity object in the augmented reality scene, where the third user trigger instruction is an instruction for performing associated control on the target virtual object and the target entity object.
The third user trigger instruction may be used to control the target entity object to execute a specific operation on the target virtual object, in which case the corresponding state of the target virtual object may change, and the changed state of the target virtual object is presented as a presentation effect in the augmented reality scene. Of course, the third user trigger instruction may also be used to control the target virtual object to perform a specific operation on the target entity object; in that case the state of the target entity object in the real scene does not change, but its state in the three-dimensional augmented reality space changes and is presented as a presentation effect in the augmented reality scene.
For example, the third user trigger instruction may be used to control the target entity object to transport the target virtual object to a target position; for another example, it may be used to control the target virtual object to perform a cutting operation on the target entity object.
S520, the terminal device determines the presentation effect of the target virtual object and the target entity object in the augmented reality scene based on the third user trigger instruction, and updates the currently presented augmented reality scene based on the presentation effect.
The presentation effect may include the changed state of the target entity object, or the changed state of the target virtual object. For a third user trigger instruction under which the target entity object executes an operation on the target virtual object, in implementation the terminal device sends a third control instruction to the server; the server controls the target virtual object to execute the response operation corresponding to the third control instruction, determines the presentation effect corresponding to that response operation, and finally feeds the determined presentation effect back to the terminal device. The terminal device fuses the currently shot real scene information with the received presentation effect and updates the currently presented augmented reality scene based on the fusion result.
For example, when the third user trigger instruction is used to control the target entity object to transport the target virtual object to a target position, the target entity object moves to the target position in the real scene based on the third control instruction, and the target virtual object is likewise transported to the target position in the three-dimensional augmented reality scene. The terminal device directly obtains the real scene information of the augmented reality scene to be presented (including the state of the target entity object at the target position) by shooting the current real scene, determines the virtual scene information to be blended into the current real scene from the presentation effect fed back by the server (the state of the target virtual object at the target position), and finally fuses the virtual scene information with the real scene information and updates the currently presented augmented reality scene based on the fusion result.
For a third user trigger instruction under which the target virtual object executes an operation on the target entity object, in implementation the server controls the target virtual object to execute the corresponding response operation based on the third control instruction and determines the response effect of the target virtual object after that operation. The state of the target entity object in the real scene does not change, but in the augmented reality scene the target entity object needs to change its state according to the operation performed by the target virtual object, and this changed state serves as its response effect. Finally, the server determines the presentation effect by combining the response effect of the target virtual object with the response effect of the target entity object, and feeds the presentation effect back to the terminal device, so that the terminal device updates its displayed augmented reality scene based on the presentation effect.
For example, when the third user trigger instruction is used to control the target virtual object to perform a cutting operation on the target entity object, the server controls the target virtual object to perform the cutting operation and determines the response effect of the target virtual object after the operation. The state of the target entity object in the real scene is unchanged, but in the augmented reality scene, based on the cutting operation of the target virtual object, the target entity object changes into a state of multiple pieces. The server takes the state of the target virtual object performing the cutting operation and the cut state of the target entity object together as the presentation effect and feeds it back to the terminal device, so that the terminal device updates its displayed augmented reality scene based on the presentation effect.
It should be noted that the above embodiments use the server to determine and feed back the presentation effect; in implementation, the terminal device may also directly determine the presentation effect based on the third user trigger instruction.
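The cutting example above illustrates this server-side determination and terminal-side fusion; the minimal sketch below makes it concrete, with all names and payloads being illustrative assumptions:

```python
def server_joint_presentation_effect(third_control_instruction: dict) -> dict:
    # Response effect of the target virtual object performing the operation.
    virtual_effect = {"target_virtual_object": "performing cutting operation"}
    # Changed state of the target entity object in the AR space only; its
    # state in the real scene is unchanged.
    entity_effect = {"target_entity_object": "rendered as multiple pieces"}
    return {**virtual_effect, **entity_effect}


def terminal_update_scene(real_frame: str, presentation_effect: dict) -> dict:
    # Fuse the currently shot real scene information with the fed-back effect.
    return {"real": real_frame, "overlay": presentation_effect}


effect = server_joint_presentation_effect({"operation": "cut"})
print(terminal_update_scene("camera_frame_42", effect))
```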
In addition to the joint control of a target virtual object and a target entity object in the augmented reality scene, the terminal device can also realize joint control of two or more target virtual objects, or of two or more target entity objects, in the augmented reality scene.
For example, the terminal device may implement associated control of two target entity objects in the augmented reality scene through the following steps:
the user issues a fourth user trigger instruction to the terminal device; the terminal device receives the fourth user trigger instruction and, based on it, sends a corresponding control instruction to the first target entity object so as to control the first target entity object to execute a preset operation on the second target entity object.
Then, the terminal device acquires updated real scene information by shooting the current real scene, and updates the augmented reality scene displayed in the terminal device based on the updated real scene information.
The fourth user trigger instruction is an instruction for performing associated control on the first target entity object and the second target entity object.
Of course, the server may also be used to jointly control two target entity objects. Specifically, the user issues a fifth user trigger instruction to the terminal device; the terminal device sends a corresponding control instruction to the server based on the fifth user trigger instruction, and the server, based on the received control instruction, sends a corresponding control instruction to the first target entity object so as to control it to execute a preset operation on the second target entity object.
For example, the following steps may be used to realize joint control by the terminal device of two target virtual objects in the augmented reality scene:
the user issues a sixth user trigger instruction to the terminal device; the terminal device receives the sixth user trigger instruction and sends a control instruction to the server based on it, so that the server controls the first target virtual object to execute a preset operation on the second target virtual object, and determines the updated states of the first and second target virtual objects based on that preset operation, that is, determines their updated presentation effect. The server then feeds the updated presentation effect back to the terminal device, so that the terminal device updates the displayed augmented reality scene based on the received presentation effect.
The sixth user trigger instruction is an instruction for performing associated control on the first target virtual object and the second target virtual object.
Of course, the terminal device itself may also implement associated control over the two target virtual objects. Specifically, the user issues a seventh user trigger instruction to the terminal device; the terminal device receives the seventh user trigger instruction, controls the first target virtual object to execute a preset operation on the second target virtual object based on it, and determines the updated states of the first and second target virtual objects based on that preset operation, that is, determines their updated presentation effect. The terminal device then updates its displayed augmented reality scene based on the determined presentation effect.
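The routing among these trigger instructions can be summarized in a short sketch; the payload keys and return strings below are invented for illustration and do not appear in the disclosure:

```python
def dispatch_trigger(trigger: dict) -> str:
    # Route by the kinds of the two associated targets.
    kind = (trigger["first_target"], trigger["second_target"])
    if kind == ("entity", "entity"):
        # Fourth/fifth user trigger instructions: the terminal device (or the
        # server) instructs the first entity object to act on the second, and
        # the updated real scene is then captured by shooting.
        return "send control instruction to first target entity object"
    if kind == ("virtual", "virtual"):
        # Sixth/seventh user trigger instructions: the server (or the terminal
        # device) simulates the preset operation and returns the updated
        # presentation effect of both virtual objects.
        return "compute updated presentation effect of both target virtual objects"
    # Mixed case: joint virtual-entity control, as in steps S510/S520 above.
    return "perform joint virtual-entity control"


print(dispatch_trigger({"first_target": "virtual", "second_target": "virtual"}))
```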
Since the positioning position of the terminal device in the real scene may change during the positioning process, determining the virtual scene information only from the initial positioning information at the start of positioning could leave the virtual scene information mismatched with the real scene at the end of positioning. The embodiment of the present disclosure therefore incorporates the movement information during the positioning process when matching the virtual scene. The specific implementation process includes:
The terminal device determines the updated plane coordinate information of the terminal device in the real scene at the end of the current positioning period based on the plane coordinate information of the terminal device in the real scene at the beginning of the current positioning period and the plane coordinate change information of the terminal device within the current positioning period; it determines the updated height information of the terminal device in the real scene at the end of the current positioning period based on the height information of the terminal device in the real scene at the beginning of the current positioning period and the height change information of the terminal device within the current positioning period; and it determines the target positioning position information of the terminal device in the augmented reality scene at the end of the current positioning period based on the updated plane coordinate information and height information of the terminal device in the real scene.
Here, the movement information includes plane coordinate change information of the terminal device in the real scene and height change information of the terminal device in the real scene; the initial positioning information includes plane coordinate information and height information of the terminal device in the real scene.
In implementation, after the updated plane coordinate information and the updated height information are determined, they can be used to determine the updated position information of the terminal device in the real scene; coordinate conversion is then performed on that updated position information to obtain the target positioning position information of the terminal device in the augmented reality scene at the end of the current positioning period.
In addition, since the scale of the augmented reality scene to the real scene is 1:1, the coordinate conversion may be performed on the updated plane coordinate information only; the updated height information is then appended, and the result is placed into the augmented reality scene.
In the above embodiment, the updated plane coordinate information and height information of the terminal device in the real scene are determined by incorporating the movement information of the terminal device within the positioning period, and determining the target positioning position information from this updated information improves its accuracy.
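As a numeric illustration of this update, consider the following minimal sketch; the coordinate values are arbitrary, and real_to_ar() is a placeholder for whatever coordinate conversion a deployment actually uses (here the identity, consistent with the 1:1 scale noted above):

```python
def update_position(initial_xy, delta_xy, initial_height, delta_height):
    # Updated plane coordinates and height at the end of the positioning period.
    x = initial_xy[0] + delta_xy[0]
    y = initial_xy[1] + delta_xy[1]
    h = initial_height + delta_height
    return (x, y, h)


def real_to_ar(position):
    # With the 1:1 scale, only the plane coordinates need converting between
    # the two coordinate systems; the height is appended unchanged.
    x, y, h = position
    return (x, y, h)


target_position = real_to_ar(update_position((3.0, 4.0), (0.5, -0.2), 1.6, 0.1))
print(target_position)  # target positioning position in the augmented reality scene
```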
For the case where the movement information includes shooting angle change information, the terminal device determines the updated shooting angle information of the terminal device in the plane coordinate system at the end of the current positioning period based on the initial shooting angle information of the terminal device in the plane coordinate system at the beginning of the current positioning period and the shooting angle change information of the terminal device in the plane coordinate system, and then determines the target shooting angle information of the terminal device in the augmented reality scene based on the updated shooting angle information.
Determining the target shooting angle information in the augmented reality scene involves two conversions: first, a conversion for the difference in shooting direction between the plane coordinate system and the coordinate system of the augmented reality scene; second, a coordinate conversion of the shooting angle change information within the current positioning period.
The above embodiment incorporates the shooting angle change information of the terminal device within the positioning period to determine the updated shooting angle information of the terminal device in the plane coordinate system, and determining the target shooting angle information from this updated information improves its accuracy.
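A minimal sketch of this two-part conversion follows, assuming shooting angles expressed as yaw in degrees; the axis-offset constant is an assumed example value, not a disclosed parameter:

```python
AR_AXIS_OFFSET_DEG = 90.0  # assumed fixed difference in shooting direction
                           # between the plane coordinate system and the AR scene


def target_shooting_angle(initial_angle_deg: float, angle_change_deg: float) -> float:
    # Updated shooting angle in the plane coordinate system at period end.
    updated = (initial_angle_deg + angle_change_deg) % 360.0
    # Convert into the augmented reality scene by applying the axis offset.
    return (updated + AR_AXIS_OFFSET_DEG) % 360.0


print(target_shooting_angle(30.0, 15.0))  # -> 135.0
```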
Corresponding to the above information display method, an embodiment of the present disclosure further provides an information display apparatus. The apparatus is applied to a terminal device, and the apparatus and its respective units can perform the same method steps as the above information display method and achieve the same beneficial effects, so repeated description is omitted.
Specifically, as shown in fig. 6, an information display apparatus provided by an embodiment of the present disclosure includes: an information acquisition unit 610, an information processing unit 620, and an augmented reality processing unit 630.
An information acquiring unit 610, configured to acquire initial positioning information of a terminal device in a real scene, initial shooting angle information of the terminal device, and movement information of the terminal device in the real scene.
An information processing unit 620, configured to determine target positioning location information and target shooting angle information of the terminal device in an augmented reality scene based on the initial positioning information, the initial shooting angle information, and the movement information.
An augmented reality processing unit 630, configured to present, in the terminal device, target information in the augmented reality scene based on the target positioning position information and target shooting angle information; the target information comprises real scene information acquired by the terminal equipment and virtual scene information integrated in a real scene.
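To illustrate how the three units cooperate, the following hedged Python sketch mirrors the unit labels of fig. 6; the data fields and the arithmetic are simplified assumptions rather than the apparatus implementation:

```python
class InformationAcquisitionUnit:  # unit 610
    def acquire(self) -> dict:
        return {"initial_position": (0.0, 0.0, 1.5), "initial_angle": 0.0,
                "movement": {"d_xy": (1.0, 2.0), "d_h": 0.0, "d_angle": 10.0}}


class InformationProcessingUnit:  # unit 620
    def process(self, info: dict):
        x, y, h = info["initial_position"]
        dx, dy = info["movement"]["d_xy"]
        target_position = (x + dx, y + dy, h + info["movement"]["d_h"])
        target_angle = info["initial_angle"] + info["movement"]["d_angle"]
        return target_position, target_angle


class AugmentedRealityProcessingUnit:  # unit 630
    def present(self, target_position, target_angle) -> None:
        print(f"present AR target information at {target_position}, angle {target_angle}")


info = InformationAcquisitionUnit().acquire()
position, angle = InformationProcessingUnit().process(info)
AugmentedRealityProcessingUnit().present(position, angle)
```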
In some embodiments, the augmented reality processing unit 630 is configured to, when presenting the target information in the augmented reality scene in the terminal device based on the target positioning position information and the target shooting angle information:
determining virtual scene information to be presented in real scene information acquired by the terminal equipment based on the target positioning position information and the target shooting angle information;
And synthesizing the virtual scene information and the real scene information acquired by the terminal equipment to obtain target information, and presenting the target information in the augmented reality scene of the terminal equipment.
In some embodiments, the augmented reality processing unit 630 is configured to, when presenting the target information in the augmented reality scene in the terminal device based on the target positioning position information and the target shooting angle information:
the target positioning position information and the target shooting angle information are sent to a server, and virtual scene information which is returned by the server and needs to be presented in the real scene currently shot by the terminal device is received; the virtual scene information returned by the server is synthesized with the real scene information to obtain the target information, and the target information is presented in the augmented reality scene of the terminal device; or alternatively,
and sending the target positioning position information and the target shooting angle information to a server, receiving the target information which is returned by the server and is obtained by synthesizing the virtual scene information and the real scene information, and presenting the target information in the augmented reality scene of the terminal equipment.
In some embodiments, the initial positioning information includes plane coordinate information of the terminal device in the real scene;
the information acquisition unit 610 is configured to, when acquiring initial positioning information of a terminal device in a real scene:
acquiring a picture of a target area where the terminal equipment is located, wherein the picture is shot by the terminal equipment;
and determining plane coordinate information of the terminal equipment in a geographic coordinate system in a real scene based on the picture and a two-dimensional plane map comprising the target area.
In some embodiments, the information obtaining unit 610, when determining plane coordinate information of the terminal device in a geographic coordinate system in a real scene based on the picture and a two-dimensional plane map including the target area, is configured to:
determining map coordinate information of the terminal equipment on the two-dimensional planar map based on the picture and the two-dimensional planar map comprising the target area;
and determining plane coordinate information of the terminal equipment under a geographic coordinate system in the real scene based on the scale of the two-dimensional plane map and the real scene and the map coordinate information of the terminal equipment on the two-dimensional plane map.
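As a worked example of this scale-based conversion, the sketch below assumes a hypothetical 1:500 map scale and a map origin aligned with the geographic origin:

```python
MAP_SCALE = 500.0  # hypothetical scale: 1 map unit corresponds to 500 real-scene units


def map_to_geographic(map_xy, map_origin_geo=(0.0, 0.0)):
    # Plane coordinate information in the geographic coordinate system of the
    # real scene, obtained from the map coordinates and the scale.
    return (map_origin_geo[0] + map_xy[0] * MAP_SCALE,
            map_origin_geo[1] + map_xy[1] * MAP_SCALE)


print(map_to_geographic((0.02, 0.01)))  # -> (10.0, 5.0)
```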
In some embodiments, the movement information includes plane coordinate change information of the terminal device in a real scene and altitude change information in a real scene; the initial positioning information comprises plane coordinate information and height information of the terminal equipment in a real scene;
the information processing unit 620, when determining target positioning location information of the terminal device in an augmented reality scene, is configured to:
determining the updated plane coordinate information of the terminal equipment in the real scene when the current positioning period is ended based on the plane coordinate information of the terminal equipment in the real scene and the plane coordinate change information of the terminal equipment in the current positioning period when the current positioning period is started; determining updated height information of the terminal equipment in the real scene when the current positioning period is ended based on the height information of the terminal equipment in the real scene at the beginning of the current positioning period and the height change information of the terminal equipment in the current positioning period;
and determining target positioning position information of the terminal equipment in the augmented reality scene when the current positioning period is finished based on the plane coordinate information and the height information updated by the terminal equipment in the real scene.
In some embodiments, the movement information includes shooting angle change information of the terminal device in a plane coordinate system;
the information processing unit 620, when determining target shooting angle information of the terminal device in the augmented reality scene, is configured to:
determining the updated shooting angle information of the terminal equipment under the plane coordinate system when the current positioning period is ended based on the initial shooting angle information of the terminal equipment under the plane coordinate system and the shooting angle change information of the terminal equipment under the plane coordinate system when the current positioning period is started;
and determining target shooting angle information of the terminal equipment in the augmented reality scene based on the updated shooting angle information of the terminal equipment in the plane coordinate system.
In some embodiments, the above information display apparatus further includes a communication control unit 640, where the communication control unit 640 is configured to:
establishing a communication connection with a target entity object in the augmented reality scene; responding to a first user trigger instruction for the target entity object, and controlling the target entity object, based on the communication connection, to execute a first response operation corresponding to the first user trigger instruction; or alternatively,
And responding to a first user trigger instruction aiming at the target entity object, and sending a first control instruction to a server controlling the target entity object so that the server controls the target entity object to execute a first response operation corresponding to the first control instruction based on the first control instruction.
In some embodiments, the functions or units included in the apparatus provided by the embodiments of the present disclosure may be used to perform the methods described in the foregoing method embodiments, and specific implementations thereof may refer to descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
An embodiment of the present disclosure further provides an electronic device, as shown in fig. 7, including: a processor 701, a memory 702, and a bus 703. The memory 702 stores machine-readable instructions executable by the processor 701, and the processor 701 and the memory 702 communicate via the bus 703 when the electronic device runs.
When executed by the processor 701, the machine-readable instructions perform the following steps of the information display method:
acquiring initial positioning information of a terminal device in a real scene, initial shooting angle information of the terminal device and movement information of the terminal device in the real scene;
Determining target positioning position information and target shooting angle information of the terminal equipment in an augmented reality scene based on the initial positioning information, the initial shooting angle information and the movement information;
presenting target information in the augmented reality scene in the terminal device based on the target positioning position information and the target shooting angle information; the target information comprises real scene information acquired by the terminal equipment and virtual scene information integrated in a real scene.
An embodiment of the present disclosure further provides a computer program product corresponding to the above information display method and apparatus, including a computer-readable storage medium storing program code; the instructions included in the program code may be used to execute the method in the foregoing method embodiments, and specific implementation may refer to the method embodiments, which is not repeated here.
The above embodiments of the present disclosure are not limited to a specific applied technical direction; any technical direction combining the virtual and the real may use the technical idea set forth in the embodiments of the present disclosure: combining the initial positioning information, the initial shooting angle information, and the movement information of the terminal device to present target information that fuses a real scene with a virtual scene. The target information may be presented in an augmented reality (Augmented Reality, AR) scene, or may be presented in a virtual reality (Virtual Reality, VR) scene. A virtual reality scene differs from an augmented reality scene in that the virtual reality scene is presented by creating a fully artificial scene based on the target information.
The foregoing description of the embodiments focuses on the differences between them; for parts that are the same or similar, the embodiments may refer to each other, and the description is not repeated here for brevity.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and apparatus described above may refer to the corresponding processes in the method embodiments and are not detailed again in this disclosure. In the several embodiments provided in the present disclosure, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative; the division of the modules is merely a logical function division, and there may be other divisions in actual implementation; for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some communication interfaces, and the indirect coupling or communication connection between apparatuses or modules may be electrical, mechanical, or in other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in essence or a part contributing to the prior art or a part of the technical solution, or in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present disclosure. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
The foregoing is merely a specific embodiment of the disclosure, but the protection scope of the disclosure is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the disclosure, and it should be covered in the protection scope of the disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (20)

1. An information display method, comprising:
acquiring initial positioning information of a terminal device in a real scene, initial shooting angle information of the terminal device and movement information of the terminal device in the real scene; wherein the initial positioning information and the initial shooting angle information are obtained at the beginning of a current positioning period, and the movement information is movement information in the current positioning period;
determining target positioning position information and target shooting angle information of the terminal equipment in an augmented reality scene at the end of the current positioning period based on the initial positioning information, the initial shooting angle information and the movement information;
presenting target information in the augmented reality scene in the terminal device based on the target positioning position information and the target shooting angle information; the target information comprises real scene information acquired by the terminal equipment and virtual scene information integrated in a real scene, and the virtual scene information is matched with the target positioning position information and is positioned in an angle range corresponding to the target shooting angle information.
2. The information display method according to claim 1, wherein the movement information includes plane coordinate change information of the terminal device in a real scene and altitude change information in a real scene; the initial positioning information comprises plane coordinate information and height information of the terminal equipment in a real scene;
the determining the target positioning position information of the terminal equipment in the augmented reality scene comprises the following steps:
determining the updated plane coordinate information of the terminal equipment in the real scene when the current positioning period is ended based on the plane coordinate information of the terminal equipment in the real scene and the plane coordinate change information of the terminal equipment in the current positioning period when the current positioning period is started; determining updated height information of the terminal equipment in the real scene when the current positioning period is ended based on the height information of the terminal equipment in the real scene at the beginning of the current positioning period and the height change information of the terminal equipment in the current positioning period;
and determining target positioning position information of the terminal equipment in the augmented reality scene when the current positioning period is finished based on the plane coordinate information and the height information updated by the terminal equipment in the real scene.
3. The information display method according to claim 1, wherein the movement information includes shooting angle change information of the terminal device in a planar coordinate system;
determining target shooting angle information of the terminal equipment in an augmented reality scene comprises the following steps:
determining the updated shooting angle information of the terminal equipment under the plane coordinate system when the current positioning period is ended based on the initial shooting angle information of the terminal equipment under the plane coordinate system and the shooting angle change information of the terminal equipment under the plane coordinate system when the current positioning period is started;
and determining target shooting angle information of the terminal equipment in the augmented reality scene based on the updated shooting angle information of the terminal equipment in the plane coordinate system.
4. The information display method according to any one of claims 1 to 3, further comprising:
establishing a communication connection with a target entity object in the augmented reality scene; responding to a first user trigger instruction for the target entity object, and controlling the target entity object, based on the communication connection, to execute a first response operation corresponding to the first user trigger instruction; or alternatively,
And responding to a first user trigger instruction aiming at the target entity object, and sending a first control instruction to a server controlling the target entity object so that the server controls the target entity object to execute a first response operation corresponding to the first control instruction based on the first control instruction.
5. The information display method according to any one of claims 1 to 3, further comprising:
and responding to a second user trigger instruction for a target virtual object in the augmented reality scene, determining a presentation effect of the target virtual object in the augmented reality scene based on the second user trigger instruction, and updating the currently presented augmented reality scene based on the presentation effect.
6. The information display method according to any one of claims 1 to 3, further comprising:
responding to a third user trigger instruction aiming at a target virtual object and a target entity object in the augmented reality scene, wherein the third user trigger instruction is an instruction for jointly controlling the target virtual object and the target entity object;
and determining the presentation effect of the target virtual object and the target entity object in the augmented reality scene based on the third user trigger instruction, and updating the currently presented augmented reality scene based on the presentation effect.
7. The information display method according to any one of claims 1 to 6, wherein the presenting, in the terminal device, the target information in the augmented reality scene based on the target positioning position information and the target shooting angle information includes:
determining virtual scene information to be presented in real scene information acquired by the terminal equipment based on the target positioning position information and the target shooting angle information;
and synthesizing the virtual scene information and the real scene information acquired by the terminal equipment to obtain the target information, and presenting the target information in the augmented reality scene.
8. The information display method according to any one of claims 1 to 6, wherein the presenting, in the terminal device, the target information in the augmented reality scene based on the target positioning position information and the target shooting angle information includes:
the target positioning position information and the target shooting angle information are sent to a server, and virtual scene information which is returned by the server and needs to be presented in the real scene currently shot by the terminal device is received; the virtual scene information returned by the server is synthesized with the real scene information to obtain the target information, and the target information is presented in the augmented reality scene of the terminal device; or alternatively,
And sending the target positioning position information and the target shooting angle information to a server, receiving the target information which is returned by the server and is obtained by synthesizing the virtual scene information and the real scene information, and presenting the target information in the augmented reality scene of the terminal equipment.
9. The information display method according to any one of claims 1 to 6, wherein the initial positioning information includes plane coordinate information of the terminal device in a real scene;
the obtaining the initial positioning information of the terminal equipment in the real scene comprises the following steps:
acquiring a picture of a target area where the terminal equipment is located, wherein the picture is shot by the terminal equipment;
and determining plane coordinate information of the terminal equipment in a geographic coordinate system in a real scene based on the picture and a two-dimensional plane map comprising the target area.
10. The information display method according to claim 9, wherein the determining plane coordinate information of the terminal device in a geographic coordinate system in a real scene based on the picture and a two-dimensional plane map including the target area includes:
determining map coordinate information of the terminal equipment on the two-dimensional planar map based on the picture and the two-dimensional planar map comprising the target area;
And determining plane coordinate information of the terminal equipment under a geographic coordinate system in the real scene based on the scale of the two-dimensional plane map and the real scene and the map coordinate information of the terminal equipment on the two-dimensional plane map.
11. An information display apparatus, characterized by being applied to a terminal device, comprising:
an information acquisition unit, configured to acquire initial positioning information of a terminal device in a real scene, initial shooting angle information of the terminal device, and movement information of the terminal device in the real scene; wherein the initial positioning information and the initial shooting angle information are obtained at the beginning of a current positioning period, and the movement information is movement information in the current positioning period;
an information processing unit, configured to determine target positioning position information and target shooting angle information of the terminal device in an augmented reality scene at the end of the current positioning period based on the initial positioning information, the initial shooting angle information, and the movement information;
an augmented reality processing unit, configured to present target information in the augmented reality scene in the terminal device based on the target positioning position information and the target shooting angle information; the target information comprises real scene information acquired by the terminal equipment and virtual scene information integrated in a real scene, and the virtual scene information is matched with the target positioning position information and is positioned in an angle range corresponding to the target shooting angle information.
12. The information display apparatus according to claim 11, wherein the movement information includes plane coordinate change information of the terminal device in a real scene and altitude change information in a real scene; the initial positioning information comprises plane coordinate information and height information of the terminal equipment in a real scene;
the information processing unit is used for, when determining target positioning position information of the terminal equipment in an augmented reality scene, determining target positioning position information of the terminal equipment in the augmented reality scene:
determining the updated plane coordinate information of the terminal equipment in the real scene when the current positioning period is ended based on the plane coordinate information of the terminal equipment in the real scene and the plane coordinate change information of the terminal equipment in the current positioning period when the current positioning period is started; determining updated height information of the terminal equipment in the real scene when the current positioning period is ended based on the height information of the terminal equipment in the real scene at the beginning of the current positioning period and the height change information of the terminal equipment in the current positioning period;
and determining target positioning position information of the terminal equipment in the augmented reality scene when the current positioning period is finished based on the plane coordinate information and the height information updated by the terminal equipment in the real scene.
13. The information display apparatus according to claim 11, wherein the movement information includes shooting angle change information of the terminal device in a planar coordinate system;
the information processing unit determines target shooting angle information of the terminal equipment in an augmented reality scene, and the information processing unit comprises the following steps:
determining the updated shooting angle information of the terminal equipment under the plane coordinate system when the current positioning period is ended based on the initial shooting angle information of the terminal equipment under the plane coordinate system and the shooting angle change information of the terminal equipment under the plane coordinate system when the current positioning period is started;
and determining target shooting angle information of the terminal equipment in the augmented reality scene based on the updated shooting angle information of the terminal equipment in the plane coordinate system.
14. The information display device according to any one of claims 11 to 13, further comprising a communication control unit configured to:
establishing a communication connection with a target entity object in the augmented reality scene; responding to a first user trigger instruction for the target entity object, and controlling the target entity object, based on the communication connection, to execute a first response operation corresponding to the first user trigger instruction; or alternatively,
And responding to a first user trigger instruction aiming at the target entity object, and sending a first control instruction to a server controlling the target entity object so that the server controls the target entity object to execute a first response operation corresponding to the first control instruction based on the first control instruction.
15. The information display device according to claim 14, wherein the communication control unit is further configured to:
and responding to a second user trigger instruction for a target virtual object in the augmented reality scene, determining a presentation effect of the target virtual object in the augmented reality scene based on the second user trigger instruction, and updating the currently presented augmented reality scene based on the presentation effect.
16. The information display device according to claim 14, wherein the communication control unit is further configured to:
responding to a third user trigger instruction aiming at a target virtual object and a target entity object in the augmented reality scene, wherein the third user trigger instruction is an instruction for jointly controlling the target virtual object and the target entity object;
and determining the presentation effect of the target virtual object and the target entity object in the augmented reality scene based on the third user trigger instruction, and updating the currently presented augmented reality scene based on the presentation effect.
17. The information display apparatus according to any one of claims 11 to 16, wherein the augmented reality processing unit, when presenting the target information in the augmented reality scene in the terminal device based on the target positioning position information and target shooting angle information, is configured to:
determining virtual scene information to be presented in real scene information acquired by the terminal equipment based on the target positioning position information and the target shooting angle information;
and synthesizing the virtual scene information and the real scene information acquired by the terminal equipment to obtain target information, and presenting the target information in the augmented reality scene of the terminal equipment.
18. The information display apparatus according to any one of claims 11 to 16, wherein the augmented reality processing unit, when presenting the target information in the augmented reality scene in the terminal device based on the target positioning position information and target shooting angle information, is configured to:
the target positioning position information and the target shooting angle information are sent to a server, and virtual scene information which is returned by the server and needs to be presented in the real scene currently shot by the terminal device is received; the virtual scene information returned by the server is synthesized with the real scene information to obtain the target information, and the target information is presented in the augmented reality scene of the terminal device; or alternatively,
And sending the target positioning position information and the target shooting angle information to a server, receiving the target information which is returned by the server and is obtained by synthesizing the virtual scene information and the real scene information, and presenting the target information in the augmented reality scene of the terminal equipment.
19. An electronic device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating over the bus when the electronic device is running, the processor executing the machine-readable instructions to perform the steps of the information display method according to any one of claims 1 to 10.
20. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the information display method according to any of claims 1 to 10.
CN201911379231.7A 2019-12-27 2019-12-27 Information display method, information display device, electronic equipment and computer readable storage medium Active CN111190485B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911379231.7A CN111190485B (en) 2019-12-27 2019-12-27 Information display method, information display device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111190485A CN111190485A (en) 2020-05-22
CN111190485B true CN111190485B (en) 2023-05-09

Family

ID=70707715

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911379231.7A Active CN111190485B (en) 2019-12-27 2019-12-27 Information display method, information display device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111190485B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112068703B (en) 2020-09-07 2021-11-16 北京字节跳动网络技术有限公司 Target object control method and device, electronic device and storage medium
CN115878191A (en) * 2021-08-06 2023-03-31 华为技术有限公司 Equipment hot plug method and terminal equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108932051A (en) * 2017-05-24 2018-12-04 腾讯科技(北京)有限公司 augmented reality image processing method, device and storage medium
CN109685906A (en) * 2017-10-18 2019-04-26 深圳市掌网科技股份有限公司 Scene fusion method and device based on augmented reality
CN109782901A (en) * 2018-12-06 2019-05-21 网易(杭州)网络有限公司 Augmented reality exchange method, device, computer equipment and storage medium
CN110031880A (en) * 2019-04-16 2019-07-19 杭州易绘科技有限公司 High-precision augmented reality method and apparatus based on Geographic mapping

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103116451B (en) * 2013-01-25 2018-10-26 腾讯科技(深圳)有限公司 A kind of virtual character interactive of intelligent terminal, device and system

Also Published As

Publication number Publication date
CN111190485A (en) 2020-05-22

Similar Documents

Publication Publication Date Title
US9324298B2 (en) Image processing system, image processing apparatus, storage medium having stored therein image processing program, and image processing method
US8933965B2 (en) Method for calculating light source information and generating images combining real and virtual images
CN110716646A (en) Augmented reality data presentation method, device, equipment and storage medium
JP5740884B2 (en) AR navigation for repeated shooting and system, method and program for difference extraction
JP3992629B2 (en) Image generation system, image generation apparatus, and image generation method
CN104571532A (en) Method and device for realizing augmented reality or virtual reality
CN108114471B (en) AR service processing method and device, server and mobile terminal
CN111190485B (en) Information display method, information display device, electronic equipment and computer readable storage medium
US20130278636A1 (en) Object display device, object display method, and object display program
JP6677890B2 (en) Information processing system, its control method and program, and information processing apparatus, its control method and program
CN110892714A (en) Control method, device and equipment of mobile robot and storage medium
KR102197615B1 (en) Method of providing augmented reality service and server for the providing augmented reality service
CN114895796B (en) Space interaction method and device based on panoramic image and application
KR101703013B1 (en) 3d scanner and 3d scanning method
JP6665402B2 (en) Content display terminal, content providing system, content providing method, and content display program
EP4054186A1 (en) Information processing apparatus, information processing method, and program
KR20130137076A (en) Device and method for providing 3d map representing positon of interest in real time
CN108932055B (en) Method and equipment for enhancing reality content
CN113870213A (en) Image display method, image display device, storage medium, and electronic apparatus
CN111899349B (en) Model presentation method and device, electronic equipment and computer storage medium
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
CN111815783A (en) Virtual scene presenting method and device, electronic equipment and storage medium
US11238658B2 (en) AR space image projecting system, AR space image projecting method, and user terminal
CN113412479A (en) Mixed reality display device and mixed reality display method
CN111650953B (en) Aircraft obstacle avoidance processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant