CN108235764A - Information processing method, device, cloud processing equipment and computer program product - Google Patents

Information processing method, device, cloud processing equipment and computer program product

Info

Publication number
CN108235764A
CN108235764A (application CN201780002896.9A)
Authority
CN
China
Prior art keywords
information
building
model
image information
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201780002896.9A
Other languages
Chinese (zh)
Other versions
CN108235764B (en)
Inventor
王恺
廉士国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Robotics Co Ltd
Original Assignee
Cloudminds Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Inc filed Critical Cloudminds Inc
Publication of CN108235764A publication Critical patent/CN108235764A/en
Application granted granted Critical
Publication of CN108235764B publication Critical patent/CN108235764B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28Databases characterised by their database models, e.g. relational or object models
    • G06F16/284Relational databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Library & Information Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

Embodiments of the present invention provide an information processing method, an apparatus, a cloud processing device and a computer program product, relating to the technical field of data processing and improving, to a certain extent, the accuracy and speed of recognizing static objects. The information processing method provided in an embodiment of the present invention includes: obtaining image information shot by a terminal; obtaining the location information of the terminal; retrieving an AR model according to the location information and the image information; and sending the AR model to the terminal.

Description

Information processing method and device, cloud processing equipment and computer program product
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to an information processing method and apparatus, a cloud processing device, and a computer program product.
Background
AR (Augmented Reality) technology simulates entity information (visual information, sound, taste, touch, and the like) that is otherwise difficult to experience within a certain time and space range of the real world and superimposes it onto the real world, so that the information can be perceived by human senses, thereby achieving a sensory experience beyond reality.
AR technology can also enhance the interaction between static objects and the user by presenting different overlay scenes on those objects. However, in the prior art, since the image taken by the user may cover only a part of a static object, the whole object has to be determined from that partial view, which results in a large search amount and consequently low accuracy in identifying the static object.
Disclosure of Invention
The embodiments of the invention provide an information processing method, an information processing apparatus, a cloud processing device and a computer program product, which improve the recognition precision and speed for static objects and enable AR models to be superimposed more accurately.
In a first aspect, an embodiment of the present invention provides an information processing method, including:
acquiring image information shot by a terminal;
acquiring the position information of the terminal;
retrieving an AR model according to the position information and the image information;
and sending the AR model to the terminal.
The above-described aspect and any possible implementation manner further provide an implementation manner, where the retrieving an AR model according to the location information and the image information includes:
determining a retrieval range according to the position information;
retrieving all building information within the retrieval range;
acquiring image information corresponding to each building in the building information;
determining a building contained in the image information according to the image information;
an AR model corresponding to the building is retrieved.
The above-described aspect and any possible implementation manner further provide an implementation manner, where the obtaining of the image information corresponding to each building in the building information includes:
acquiring 3D model information corresponding to each building in the building information; or,
and if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
The above-described aspects and any possible implementations further provide an implementation, and the method further includes:
and establishing at least one relational database according to the building information, the image information and the AR model.
In a second aspect, an embodiment of the present invention further provides an information processing apparatus, including:
the first acquisition unit is used for acquiring image information shot by the terminal;
a second obtaining unit, configured to obtain location information of the terminal;
the retrieval unit is used for retrieving an AR model according to the position information and the image information;
and the sending unit is used for sending the AR model to the terminal.
The above-mentioned aspect and any possible implementation manner further provide an implementation manner, where the retrieving unit is specifically configured to:
determining a retrieval range according to the position information;
retrieving building information within the retrieval range;
acquiring image information corresponding to each building in the building information;
determining a building contained in the image information according to the image information;
an AR model corresponding to the building is retrieved.
The above-mentioned aspect and any possible implementation manner further provide an implementation manner, where the retrieving unit is specifically configured to:
acquiring 3D model information corresponding to each building in the building information; or,
and if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
The above-described aspects and any possible implementations further provide an implementation, where the apparatus further includes:
and the establishing unit is used for establishing at least one relational database according to the building information, the image information and the AR model.
In a third aspect, an embodiment of the present invention further provides a cloud processing device, where the device includes a processor and a memory; the memory is for storing instructions that, when executed by the processor, cause the apparatus to perform the method of any of the first aspects.
In a fourth aspect, embodiments of the present invention further provide a computer program product directly loadable into the internal memory of a computer and containing software code, the computer program being capable of implementing the method according to any of the first aspects when loaded and executed by the computer.
According to the information processing method, apparatus, cloud processing device and computer program product provided by the embodiments of the present invention, the image information shot by the terminal and the position information of the terminal are obtained, and the corresponding AR model is then retrieved by combining the position information with the image information. Retrieving the AR model according to the position information narrows the search range and improves retrieval precision, while determining the corresponding AR model with the image information shot by the terminal improves retrieval efficiency and the recognition rate of static objects, solving the prior-art problems of a large search amount and low accuracy in recognizing static objects.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flowchart of an embodiment of an information processing method according to an embodiment of the present invention;
FIG. 2 is another flowchart of an embodiment of an information processing method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of another structure of an embodiment of an information processing apparatus according to the present invention;
fig. 5 is a schematic structural diagram of an embodiment of a cloud processing device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between related objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
The word "if" as used herein may be interpreted as "at … …" or "when … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
When a user uses a terminal or a wearable device, the camera is often used to shoot videos, images and the like; in an outdoor environment, static objects such as buildings often appear in the camera view, or the user actively shoots them. Various information, such as service information, advertisement information and publicity information, can be attached to the surface of an outdoor building, but replacing such content is time-consuming, labor-intensive and costly. By exploiting the aforementioned usage habits of users with terminals or wearable devices, AR technology can instead be applied to outdoor buildings, bringing benefits such as customization, low cost and a strong sense of immersion.
However, images of outdoor buildings are affected by illumination changes, and when a building is large the acquired image may cover only part of it, which reduces the accuracy of building identification to a certain extent. To solve this problem, an embodiment of the present invention provides an information processing method. Specifically, fig. 1 is a flowchart of an embodiment of the information processing method according to an embodiment of the present invention, and as shown in fig. 1, the method may specifically include the following steps:
101. Acquiring image information shot by the terminal.
In the embodiment of the present invention, the terminal may include smart devices with a camera function, such as a mobile phone, a tablet computer, a notebook computer, a vehicle-mounted device, a wearable device and the like. In addition, in the embodiment of the present invention, the terminal also needs to have a network communication function.
The image information shot by the terminal is obtained by shooting the outdoor environment with a device that has an image acquisition function, for example a camera controlled by the user. The image information may be acquired by a computing device with high computing power that includes at least an arithmetic unit and a wireless transmission unit, for example a local computer or a cloud processing center. The terminal and the computing device can communicate with each other using wireless communication methods such as 2G, 3G, 4G or WiFi.
In the embodiment of the invention, the terminal may actively upload the shot image information to the computing device, which receives it, or the computing device may actively fetch the image information shot by the terminal.
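The embodiment does not prescribe a transport mechanism beyond naming 2G, 3G, 4G and WiFi, so the following is only an illustrative sketch of how the computing device might receive an actively uploaded picture over HTTP; the /upload route and the image/lat/lon field names are assumptions made for this example, not part of the described method.

    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route("/upload", methods=["POST"])
    def upload():
        # The terminal posts the captured picture and, optionally, its position.
        image_bytes = request.files["image"].read()
        lat = request.form.get("lat", type=float)
        lon = request.form.get("lon", type=float)
        # ... hand image_bytes and (lat, lon) to the retrieval steps described below ...
        return jsonify({"status": "received", "bytes": len(image_bytes)})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)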
102. Acquiring the position information of the terminal.
In the embodiment of the present invention, the terminal may be positioned to obtain its location information. Specifically, in one implementation, a Global Positioning System (GPS) may be used to determine the longitude and latitude of the terminal, and the terminal is located by combining this with map information. In another implementation, the location of the terminal may be determined from base station information: the terminal reports the identifier of the base station to which it currently belongs and the signal strength to the positioning device, so that the coordinates of the terminal's current position can be determined.
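The embodiment only states that the base-station identifier and signal strength are reported; how coordinates are derived from them is left open. As one hypothetical realisation, the positioning device could take a signal-strength-weighted centroid of known base-station coordinates, as sketched below (the BASE_STATIONS table is invented for illustration).

    from typing import Dict, Tuple

    # Assumed lookup table: base-station identifier -> (latitude, longitude).
    BASE_STATIONS: Dict[str, Tuple[float, float]] = {
        "bs-001": (39.9042, 116.4074),
        "bs-002": (39.9050, 116.4100),
    }

    def estimate_position(reports: Dict[str, float]) -> Tuple[float, float]:
        # reports maps base-station identifier -> received signal strength (linear scale).
        total = sum(reports.values())
        lat = sum(BASE_STATIONS[bs][0] * s for bs, s in reports.items()) / total
        lon = sum(BASE_STATIONS[bs][1] * s for bs, s in reports.items()) / total
        return lat, lon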
In the embodiment of the invention, the terminal may actively upload its position information to the computing device, which receives it, or the computing device may actively obtain information from the terminal and position it to derive the position information. Acquiring the position information allows the terminal's location to be determined more accurately, greatly reduces the difficulty of the subsequent image recognition, and lays the foundation for improving its accuracy.
103. Retrieving the AR model according to the position information and the image information.
In the embodiment of the present invention, since the position of the terminal may change and the acquired position information may contain a certain error, a retrieval range is first determined according to the position information in order to improve accuracy. In one implementation, this may be an area within a certain distance of the reported location, for example a planar area within 50 meters of it.
Then, the building information within the retrieval range is retrieved. In this process, building locations may be determined from map information or from information provided by a building provider, and the retrieval result includes the information of the buildings falling within the range, such as the name and the number of each building.
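A minimal sketch of these two steps, assuming each catalogued building carries a reference coordinate: compute the great-circle distance from the terminal to each building and keep those within the chosen radius (50 meters here, matching the example above). The building record fields are illustrative.

    import math

    EARTH_RADIUS_M = 6_371_000

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance in metres between two WGS-84 points.
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def buildings_in_range(terminal_lat, terminal_lon, buildings, radius_m=50.0):
        # buildings: iterable of dicts with at least "name", "lat" and "lon" keys.
        return [b for b in buildings
                if haversine_m(terminal_lat, terminal_lon, b["lat"], b["lon"]) <= radius_m]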
Next, the image information corresponding to each building in the building information is acquired. In the embodiment of the present invention, this image information includes 3D model information and 2D picture information, which may be stored in a database in advance. The 3D model information may be provided by the designer of the building, or may be obtained by scanning and reconstructing the building's appearance with a camera or the like, and generally contains the complete three-dimensional shape of the building. The 2D picture information may be collected by users and uploaded to the computing device, and in general may show the overall appearance of a building or only a partial view of it. In the embodiment of the invention, both the 3D model information and the 2D picture information need to be labeled manually to determine the area or position where the AR model is to be superimposed. Therefore, acquiring the image information corresponding to the building information means determining the 3D model information or the 2D picture information of each building according to its name or number. In the embodiment of the invention, the image information corresponding to each building is acquired so that it can be compared with the image information shot by the user, in order to determine which building the user has shot.
Then, the building contained in the shot image information is determined according to the stored image information. In the embodiment of the invention, image comparison is used to determine whether the main subject in the shot image information matches the previously acquired image information of a building; if it does, the building contained in the shot image information can be identified. Specifically, feature points of the main content in the shot image information and feature points of the stored image information are extracted respectively and compared; if the comparison error is within a certain range, the building corresponding to the main content in the shot image information can be regarded as the same building as the one represented by the stored image information.
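The embodiment describes extracting and comparing feature points without naming a particular feature type. Purely as a concrete stand-in, the sketch below uses ORB descriptors and brute-force Hamming matching from OpenCV to score each candidate building's reference picture against the shot image; the distance and match-count thresholds are illustrative.

    import cv2

    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def identify_building(query_img, candidates, min_matches=30, max_distance=50):
        # candidates: dict mapping building name -> stored reference picture (numpy array).
        # Returns the best-matching building name, or None if nothing matches well enough.
        _, q_des = orb.detectAndCompute(query_img, None)
        if q_des is None:
            return None
        best_name, best_score = None, 0
        for name, ref_img in candidates.items():
            _, r_des = orb.detectAndCompute(ref_img, None)
            if r_des is None:
                continue
            good = [m for m in matcher.match(q_des, r_des) if m.distance < max_distance]
            if len(good) > best_score:
                best_name, best_score = name, len(good)
        return best_name if best_score >= min_matches else None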
Finally, the AR model corresponding to the building is retrieved. In the embodiment of the present invention, different AR models, such as pictures, videos, 3D models, animations and the like, may be stored in the cloud processing device in advance. Correspondingly, different AR models correspond to the image information of different buildings, and this correspondence may be matched or marked manually. Once the building has been determined, the corresponding AR model can be obtained by direct retrieval.
It should be noted that, in the embodiment of the present invention, when acquiring the image information corresponding to each building in the building information, the 3D model information corresponding to each building is acquired preferentially; if no 3D model information corresponding to the building information is available, the 2D picture information corresponding to each building is acquired instead. The 3D model is preferred because it represents the complete shape of the building, allows the AR model to be superimposed more precisely, is more robust to changes in local details and in illumination, and facilitates the subsequent pose calculation. When no 3D model exists, 2D pictures are used; their advantage is that they are plentiful and come from a wide range of sources.
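A tiny sketch of this preference order; the get_3d_model and get_2d_pictures accessors are hypothetical placeholders for however the stored, labelled imagery is actually exposed.

    def get_building_imagery(building_id, db):
        # Prefer the complete 3D model; fall back to 2D pictures when no model exists.
        model_3d = db.get_3d_model(building_id)      # hypothetical accessor
        if model_3d is not None:
            return {"kind": "3d_model", "data": model_3d}
        pictures = db.get_2d_pictures(building_id)   # hypothetical accessor
        if pictures:
            return {"kind": "2d_pictures", "data": pictures}
        return None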
104. Sending the AR model to the terminal.
In the embodiment of the invention, after the AR model is sent to the terminal, the terminal starts the superimposition operation according to the received AR model. To superimpose the AR model accurately onto the shot image information, the relative pose between the building in the shot image information and the building in the stored image information is calculated first. Specifically, in one implementation, the feature points of the building in the shot image information are extracted, then the feature points of the building in the stored image information are extracted, the relative positional relationship of the matched feature points is calculated according to a feature-point matching principle, and the relative pose of the building in the shot image information is determined from this relationship.
Then, the AR model is superimposed onto the image information according to the relative pose to form superimposed image information, so that the pose of the AR model is consistent with the pose of the building in the image information. The superimposed image information is then displayed in the display unit.
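For the 2D-picture case, the pose-and-overlay step can be illustrated by estimating a homography from the matched feature points and warping the AR picture into the live frame; with a 3D model, a full pose could instead be recovered with something like cv2.solvePnP. The sketch assumes the AR picture has already been registered to the stored reference picture's pixel coordinates, and is not presented as the patent's prescribed procedure.

    import cv2
    import numpy as np

    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def overlay_ar(frame, ref_img, ar_img):
        # Match feature points between the stored reference picture and the live frame,
        # estimate the reference->frame homography, and paste the warped AR picture on top.
        kp_r, des_r = orb.detectAndCompute(ref_img, None)
        kp_f, des_f = orb.detectAndCompute(frame, None)
        matches = sorted(matcher.match(des_r, des_f), key=lambda m: m.distance)[:100]
        src = np.float32([kp_r[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        h, w = frame.shape[:2]
        warped = cv2.warpPerspective(ar_img, H, (w, h))
        mask = warped.sum(axis=2) > 0          # non-black pixels of the warped overlay
        out = frame.copy()
        out[mask] = warped[mask]
        return out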
According to the information processing method provided by the embodiment of the invention, the image information shot by the terminal and the position information of the terminal are obtained, and the corresponding AR model is then retrieved by combining the position information with the image information. Retrieving the AR model according to the position information narrows the search range and improves retrieval precision, while determining the corresponding AR model with the image information shot by the terminal improves retrieval efficiency and the recognition rate of static objects, solving the prior-art problems of a large search amount and low accuracy in static-object recognition.
When a user holds the terminal in hand or walks, the position or angle of the terminal is not fixed, and the image information shot by the terminal changes accordingly. To enhance the sense of immersion and make the visual result smoother, on the basis of the above, the terminal may also collect inertial measurement data, determine its moving pose from the inertial measurement data, and adjust the position of the AR model accordingly. In the embodiment of the present invention, the moving pose of the terminal may be calculated from the data generated by its Inertial Measurement Unit (IMU) when the position or angle of the terminal changes; specifically, data such as the acceleration change and the angular velocity change of the IMU in the terminal are collected, and the position change of the terminal, i.e. the moving pose, is calculated from these values. The position of the AR model in the superimposed image information is then updated according to the moving pose.
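As a rough illustration of turning IMU samples into a moving pose, the sketch below simply integrates the angular velocity and gravity-compensated acceleration between vision updates; a real system would fuse this drift-prone estimate with the vision-based pose (for example in a Kalman filter), which is omitted here.

    import numpy as np

    class ImuDeadReckoner:
        # Integrates gyroscope and accelerometer samples to track the terminal's pose
        # change between vision-based updates (small-angle, drift-prone illustration).
        def __init__(self):
            self.position = np.zeros(3)      # metres
            self.velocity = np.zeros(3)      # m/s
            self.orientation = np.zeros(3)   # roll, pitch, yaw in radians

        def update(self, accel, gyro, dt):
            # accel: linear acceleration in m/s^2 with gravity already removed; gyro: rad/s.
            self.orientation += np.asarray(gyro, dtype=float) * dt
            self.velocity += np.asarray(accel, dtype=float) * dt
            self.position += self.velocity * dt
            return self.position.copy(), self.orientation.copy()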
Therefore, the technical solution provided by the embodiment of the invention enables the AR model to move along with the image information shot by the user, which improves timeliness, is visually smoother, and enhances the interaction with the user.
In the embodiment of the present invention, a large amount of data is stored in the computing device. Therefore, in order to improve retrieval speed and accuracy, at least one relational database may be established in advance. Specifically, fig. 2 is another flowchart of an embodiment of the information processing method according to an embodiment of the present invention; as shown in fig. 2, before step 101, the information processing method according to the embodiment of the present invention may further include the following step:
100. Establishing at least one relational database according to the building information, the image information and the AR model.
In the embodiment of the invention, the image information of each building can be determined from the map information and from the image information uploaded by users through various channels; the image information is labeled to determine the position or area where the AR model is to be superimposed, and the corresponding AR model is matched for each building, so that either one complete relational database covering the building information, the image information and the AR models is established, or separate relational databases are established.
Moreover, the content of the database can be updated continuously and adjusted according to actual requirements, for example by adding, deleting or modifying entries.
The database is established to improve retrieval precision and efficiency, so that the AR model can be superimposed more accurately on the image information shot by the user.
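The embodiment does not specify how the relational database is laid out beyond relating buildings, their labelled imagery and AR models; the SQLite schema below is one minimal way such a catalogue could look, with all table and column names invented for the sketch.

    import sqlite3

    SCHEMA = """
    CREATE TABLE IF NOT EXISTS building (
        id        INTEGER PRIMARY KEY,
        name      TEXT NOT NULL,
        number    TEXT,
        latitude  REAL NOT NULL,
        longitude REAL NOT NULL
    );
    CREATE TABLE IF NOT EXISTS imagery (
        id             INTEGER PRIMARY KEY,
        building_id    INTEGER NOT NULL REFERENCES building(id),
        kind           TEXT CHECK (kind IN ('3d_model', '2d_picture')),
        uri            TEXT NOT NULL,   -- where the model or picture is stored
        overlay_region TEXT             -- manually labelled area for the AR overlay
    );
    CREATE TABLE IF NOT EXISTS ar_model (
        id          INTEGER PRIMARY KEY,
        building_id INTEGER NOT NULL REFERENCES building(id),
        kind        TEXT,               -- picture, video, 3D model, animation, ...
        uri         TEXT NOT NULL
    );
    """

    def open_catalog(path="ar_catalog.db"):
        conn = sqlite3.connect(path)
        conn.executescript(SCHEMA)
        return conn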
In order to implement the foregoing method, an embodiment of the present invention further provides an information processing apparatus. Fig. 3 is a schematic structural diagram of an embodiment of the information processing apparatus provided in the embodiment of the present invention, and as shown in fig. 3, the apparatus of this embodiment may include: a first obtaining unit 11, a second obtaining unit 12, a retrieval unit 13 and a sending unit 14.
A first obtaining unit 11, configured to obtain image information captured by a terminal;
a second obtaining unit 12, configured to obtain location information of a terminal;
a retrieval unit 13, configured to retrieve an AR model according to the position information and the image information;
a sending unit 14, configured to send the AR model to the terminal.
In a specific implementation process, the retrieving unit 13 is specifically configured to:
determining a retrieval range according to the position information;
retrieving building information within the retrieval range;
acquiring image information corresponding to each building in the building information;
determining a building contained in the image information according to the image information;
an AR model corresponding to the building is retrieved.
In a specific implementation, the acquiring image information corresponding to each building in the building information includes:
acquiring 3D model information corresponding to each building in the building information; or,
and if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
The information processing apparatus provided in the embodiment of the present invention may be used to execute the technical solution of the method embodiment shown in fig. 1, and the implementation principle and the technical effect are similar, which are not described herein again.
Further, an information processing apparatus is further provided in an embodiment of the present invention, fig. 4 is another schematic structural diagram of the information processing apparatus provided in the embodiment of the present invention, and as shown in fig. 4, the apparatus of the embodiment may further include, on the basis of the foregoing content: a building unit 15.
And the establishing unit 15 is used for establishing at least one relational database according to the building information, the image information and the AR model.
The information processing apparatus provided in the embodiment of the present invention may be used to execute the technical solution of the method embodiment shown in fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 5 is a schematic structural diagram of an embodiment of a cloud processing device provided in an embodiment of the present invention, and as shown in fig. 5, the cloud processing device provided in the embodiment of the present invention may specifically include: a processor 21 and a memory 22.
Wherein the memory 22 is configured to store instructions that, when executed by the processor 21, cause the device to perform any of the methods shown in fig. 1 or fig. 2.
The cloud processing device provided in the embodiment of the present invention may be configured to execute the technical solution of the method embodiment shown in fig. 1 or fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
The embodiment of the present invention further provides a computer program product, which can be directly loaded into an internal memory of a computer and contains software codes, and after the computer program is loaded and executed by the computer, the method shown in fig. 1 or fig. 2 can be implemented.
The computer program product provided by the embodiment of the present invention may be used to execute the technical solution of the method embodiment shown in fig. 1 or fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a Processor (Processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. An information processing method characterized by comprising:
acquiring image information shot by a terminal;
acquiring the position information of the terminal;
retrieving an AR model according to the position information and the image information;
and sending the AR model to the terminal.
2. The method of claim 1, wherein the retrieving an AR model based on the location information and the image information comprises:
determining a retrieval range according to the position information;
retrieving all building information within the retrieval range;
acquiring image information corresponding to each building in the building information;
determining a building contained in the image information according to the image information;
an AR model corresponding to the building is retrieved.
3. The method of claim 2, wherein the obtaining image information corresponding to each building in the building information comprises:
acquiring 3D model information corresponding to each building in the building information; or,
and if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
4. The method according to any one of claims 1 to 3, further comprising:
and establishing at least one relational database according to the building information, the image information and the AR model.
5. An information processing apparatus characterized by comprising:
the first acquisition unit is used for acquiring image information shot by the terminal;
a second obtaining unit, configured to obtain location information of the terminal;
the retrieval unit is used for retrieving an AR model according to the position information and the image information;
and the sending unit is used for sending the AR model to the terminal.
6. The apparatus according to claim 5, wherein the retrieving unit is specifically configured to:
determining a retrieval range according to the position information;
retrieving building information within the retrieval range;
acquiring image information corresponding to each building in the building information;
determining a building contained in the image information according to the image information;
an AR model corresponding to the building is retrieved.
7. The apparatus of claim 6, wherein the obtaining image information corresponding to each building in the building information comprises:
acquiring 3D model information corresponding to each building in the building information; or,
and if the 3D model information corresponding to the building information is not acquired, acquiring 2D picture information corresponding to each building in the building information.
8. The apparatus of any one of claims 5 to 7, further comprising:
and the establishing unit is used for establishing at least one relational database according to the building information, the image information and the AR model.
9. A cloud processing device, the device comprising a processor and a memory; the memory is configured to store instructions that, when executed by the processor, cause the apparatus to perform the method of any of claims 1-4.
10. A computer program product directly loadable into the internal memory of a computer and containing software code, which when loaded and executed by the computer is able to carry out the method according to any of claims 1 to 4.
CN201780002896.9A 2017-12-29 2017-12-29 Information processing method and device, cloud processing equipment and computer program product Active CN108235764B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/119708 WO2019127320A1 (en) 2017-12-29 2017-12-29 Information processing method and apparatus, cloud processing device, and computer program product

Publications (2)

Publication Number Publication Date
CN108235764A true CN108235764A (en) 2018-06-29
CN108235764B CN108235764B (en) 2022-09-16

Family

ID=62645531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780002896.9A Active CN108235764B (en) 2017-12-29 2017-12-29 Information processing method and device, cloud processing equipment and computer program product

Country Status (2)

Country Link
CN (1) CN108235764B (en)
WO (1) WO2019127320A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109545003A (en) * 2018-12-24 2019-03-29 北京卡路里信息技术有限公司 A kind of display methods, device, terminal device and storage medium
CN109886191A (en) * 2019-02-20 2019-06-14 上海昊沧系统控制技术有限责任公司 A kind of identification property management reason method and system based on AR

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103107938A (en) * 2013-01-25 2013-05-15 腾讯科技(深圳)有限公司 Information interactive method, server and terminal
US20130194392A1 (en) * 2012-01-26 2013-08-01 Qualcomm Incorporated Mobile Device Configured to Compute 3D Models Based on Motion Sensor Data
CN103489002A (en) * 2013-09-27 2014-01-01 广州中国科学院软件应用技术研究所 Reality augmenting method and system
US20150098607A1 (en) * 2013-10-07 2015-04-09 Hong Kong Applied Science and Technology Research Institute Company Limited Deformable Surface Tracking in Augmented Reality Applications
CN106101574A (en) * 2016-06-28 2016-11-09 广东欧珀移动通信有限公司 Control method, device and the mobile terminal of a kind of image enhaucament reality
CN106354869A (en) * 2016-09-13 2017-01-25 四川研宝科技有限公司 Real-scene image processing method and server based on location information and time periods
CN106373198A (en) * 2016-09-18 2017-02-01 福州大学 Method for realizing augmented reality
CN106446098A (en) * 2016-09-13 2017-02-22 四川研宝科技有限公司 Live action image processing method and server based on location information
CN106529452A (en) * 2016-11-04 2017-03-22 重庆市勘测院 Mobile intelligent terminal building rapid identification method based on building three-dimensional model
US20170243371A1 (en) * 2015-10-30 2017-08-24 Snap Inc. Image based tracking in augmented reality systems

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130194392A1 (en) * 2012-01-26 2013-08-01 Qualcomm Incorporated Mobile Device Configured to Compute 3D Models Based on Motion Sensor Data
CN103107938A (en) * 2013-01-25 2013-05-15 腾讯科技(深圳)有限公司 Information interactive method, server and terminal
CN103489002A (en) * 2013-09-27 2014-01-01 广州中国科学院软件应用技术研究所 Reality augmenting method and system
US20150098607A1 (en) * 2013-10-07 2015-04-09 Hong Kong Applied Science and Technology Research Institute Company Limited Deformable Surface Tracking in Augmented Reality Applications
US20170243371A1 (en) * 2015-10-30 2017-08-24 Snap Inc. Image based tracking in augmented reality systems
CN106101574A (en) * 2016-06-28 2016-11-09 广东欧珀移动通信有限公司 Control method, device and the mobile terminal of a kind of image enhaucament reality
CN106354869A (en) * 2016-09-13 2017-01-25 四川研宝科技有限公司 Real-scene image processing method and server based on location information and time periods
CN106446098A (en) * 2016-09-13 2017-02-22 四川研宝科技有限公司 Live action image processing method and server based on location information
CN106373198A (en) * 2016-09-18 2017-02-01 福州大学 Method for realizing augmented reality
CN106529452A (en) * 2016-11-04 2017-03-22 重庆市勘测院 Mobile intelligent terminal building rapid identification method based on building three-dimensional model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Jinhua: "Video See-Through Augmented Reality System", in "Construction of Smart Learning Environments" *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109545003A (en) * 2018-12-24 2019-03-29 北京卡路里信息技术有限公司 A kind of display methods, device, terminal device and storage medium
CN109886191A (en) * 2019-02-20 2019-06-14 上海昊沧系统控制技术有限责任公司 A kind of identification property management reason method and system based on AR

Also Published As

Publication number Publication date
WO2019127320A1 (en) 2019-07-04
CN108235764B (en) 2022-09-16

Similar Documents

Publication Publication Date Title
US11393173B2 (en) Mobile augmented reality system
US10380410B2 (en) Apparatus and method for image-based positioning, orientation and situational awareness
US9576183B2 (en) Fast initialization for monocular visual SLAM
US9280852B2 (en) Augmented reality virtual guide system
US20130095855A1 (en) Method, System, and Computer Program Product for Obtaining Images to Enhance Imagery Coverage
CN104378735B (en) Indoor orientation method, client and server
CN106233371A (en) Select the panoramic picture for the Annual distribution shown
CN111028358B (en) Indoor environment augmented reality display method and device and terminal equipment
EP2981945A1 (en) Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
CN105023266A (en) Method and device for implementing augmented reality (AR) and terminal device
CN107084740B (en) Navigation method and device
US20190147303A1 (en) Automatic Detection of Noteworthy Locations
US20200387711A1 (en) Indoor augmented reality information display method
CN111833457A (en) Image processing method, apparatus and storage medium
CN114185073A (en) Pose display method, device and system
CN108235764B (en) Information processing method and device, cloud processing equipment and computer program product
TW201823929A (en) Method and system for remote management of virtual message for a moving object
US10108882B1 (en) Method to post and access information onto a map through pictures
US10878278B1 (en) Geo-localization based on remotely sensed visual features
CN109074356A (en) System and method for being optionally incorporated into image in low bandwidth digital map database
CN113450439A (en) Virtual-real fusion method, device and system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20210224

Address after: 201111 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant after: Dalu Robot Co.,Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant before: CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS Co.,Ltd.

CB02 Change of applicant information
CB02 Change of applicant information

Address after: 201111 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai

Applicant after: Dayu robot Co.,Ltd.

Address before: 201111 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant before: Dalu Robot Co.,Ltd.

GR01 Patent grant
GR01 Patent grant