CN111597466A - Display method and device and electronic equipment - Google Patents

Display method and device and electronic equipment

Info

Publication number
CN111597466A
CN111597466A
Authority
CN
China
Prior art keywords
house source
image
building
target
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010369805.9A
Other languages
Chinese (zh)
Inventor
沈冠雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN202010369805.9A priority Critical patent/CN111597466A/en
Publication of CN111597466A publication Critical patent/CN111597466A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9538Presentation of query results
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/16Real estate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Tourism & Hospitality (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Graphics (AREA)
  • Architecture (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Embodiments of the present disclosure disclose a display method, a display apparatus, and an electronic device. In one embodiment, the method comprises: capturing a real building image in real time and displaying the real building image; determining a house source (a property offered for sale or rent) in a target building, where the target building is the building indicated by the real building image; and displaying a target augmented image on the displayed real building image, where the target augmented image is used to indicate the house source. A new display mode can thereby be provided.

Description

Display method and device and electronic equipment
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a display method and apparatus, and an electronic device.
Background
With the development of the internet, users increasingly rely on terminal devices to realize various functions. For example, a user can browse and search house source information through a terminal device, and thereby obtain a large amount of house source information without leaving home. Alternatively, the user can screen out favored house sources from the house source information on the network, and then visit a broker on site to purchase one.
Disclosure of Invention
This disclosure is provided to introduce concepts in a simplified form that are further described below in the detailed description. This disclosure is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The embodiment of the disclosure provides a display method, a display device and electronic equipment.
In a first aspect, an embodiment of the present disclosure provides a display method, the method comprising: capturing a real building image in real time and displaying the real building image; determining a house source in a target building, where the target building is the building indicated by the real building image; and displaying a target augmented image on the displayed real building image, where the target augmented image is used to indicate the house source.
In a second aspect, an embodiment of the present disclosure provides a display apparatus, comprising: a first display unit configured to capture a real building image in real time and display the real building image; a first determining unit configured to determine a house source in a target building, where the target building is the building indicated by the real building image; and a second display unit configured to display a target augmented image on the displayed real building image, where the target augmented image is used to indicate the house source.
In a third aspect, an embodiment of the present disclosure provides an electronic device, comprising: one or more processors; an image capture device configured to capture images; and a storage device configured to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the display method described in the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable medium on which a computer program is stored; when executed by a processor, the program implements the steps of the display method described in the first aspect.
According to the display method, the display apparatus, and the electronic device provided by the embodiments of the present disclosure, a real building image is captured and displayed in real time; the house source in the target building indicated by the real building image is then determined; and finally a target augmented image indicating the house source in the building is displayed on the displayed real building image. A new display mode can thereby be provided: an augmented image indicating the house source is displayed on the real building image captured in real time, giving the user indication information about house sources in the building, so that the user can learn the situation of the house sources while still outside the building, saving the user's time. In addition, the user can intuitively perceive the position of the house source in the building and the external situation of the house source (its position in the building, the environment around the building, and so on), so more, and more accurate, reference information about the house source can be provided to the user.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
FIG. 1 is a flowchart of one embodiment of a display method in accordance with the present disclosure;
fig. 2A and 2B are schematic diagrams of an application scenario of a presentation method according to the present disclosure;
FIG. 3 is a schematic diagram of an application scenario of a presentation method according to the present disclosure;
FIG. 4 is a schematic diagram of an application scenario of a presentation method according to the present disclosure;
FIG. 5 is a schematic diagram of an application scenario of a presentation method according to the present disclosure;
FIG. 6 is a schematic diagram of an application scenario of a presentation method according to the present disclosure;
FIG. 7 is a schematic diagram of an application scenario of a presentation method according to the present disclosure;
FIG. 8 is a schematic structural diagram of one embodiment of a display device according to the present disclosure;
FIG. 9 is an exemplary system architecture to which the presentation method of one embodiment of the present disclosure may be applied;
fig. 10 is a schematic diagram of a basic structure of an electronic device provided according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting; those skilled in the art will understand that they mean "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Referring to fig. 1, a flow of one embodiment of a display method according to the present disclosure is shown. The display method shown in fig. 1 comprises the following steps:
Step 101: a real building image is captured in real time, and the real building image is displayed.
In this embodiment, the executing body of the display method (for example, a terminal device) may capture a real building image in real time and display the real building image.
In this embodiment, the executing body may capture a real-world image of a building through a camera; a three-dimensional building model may then be constructed from that real-world image, either by the executing body itself or by a server in communication with the executing body.
In this embodiment, the executing body can render the three-dimensional building model according to its own pose (position and orientation). As an example, a three-dimensional rendering pipeline can be used to convert the three-dimensional building model into a two-dimensional image, and the converted two-dimensional image is then displayed.
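As a rough illustrative sketch of this pose-dependent rendering (the function name, camera parameters, and pinhole model below are assumptions for illustration, not part of this disclosure), projecting the model's vertices into the two-dimensional image plane can be written as:

```python
import numpy as np

def project_vertices(vertices, rotation, translation, focal, center):
    """Project 3D building-model vertices to 2D pixel coordinates.

    vertices:    (N, 3) model points in world space
    rotation:    (3, 3) world-to-camera rotation (from the device pose)
    translation: (3,)   world-to-camera translation (from the device pose)
    focal:       (fx, fy) focal lengths in pixels
    center:      (cx, cy) principal point in pixels
    """
    cam = vertices @ rotation.T + translation          # world -> camera space
    x = focal[0] * cam[:, 0] / cam[:, 2] + center[0]   # perspective divide
    y = focal[1] * cam[:, 1] / cam[:, 2] + center[1]
    return np.stack([x, y], axis=1)                    # (N, 2) pixel coords
```

In a real pipeline the GPU rasterizer performs this projection; updating `rotation` and `translation` every frame from the device pose is what keeps the rendered model aligned with the live camera view.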
In this embodiment, the real-world image may include a building image. Here, a building image may be understood as an image of the exterior of a building, and the building may have one floor or at least two floors.
Step 102, determining a house source in a target building.
In this embodiment, the executing body may determine the house source in the target building.
Here, the target building may be a building indicated by the real building image.
Here, a house source may refer to a house offered for lease or sale.
As an example, Community X contains Building A and Building B; Building A has three floors and Building B has five floors. There may be 2 houses to be sold in Building A and 3 houses to be rented in Building B.
Typically, houses to be sold or rented may be registered on a brokerage platform, and the server supporting the platform may store the addresses of the house sources.
In this embodiment, the executing body may determine the house source in the target building in various ways, which are not limited herein. As an example, the executing body may present a query interface in which the user inputs a community name and a building number, thereby determining the building identifier of the target building. Here, after determining the building identifier of the target building, the executing body may look up the house sources located in that building in a pre-established house source database and acquire the house source information of those house sources.
As an example, the house source database may store house source addresses together with the corresponding house source information. The executing body may treat every house source whose address contains the building identifier as a house source in the target building, and then acquire the corresponding house source information.
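A minimal sketch of this lookup, with a hypothetical in-memory dictionary standing in for the pre-established house source database (all addresses, field names, and entries below are illustrative assumptions):

```python
# Hypothetical house source database: address -> house source information.
HOUSE_SOURCE_DB = {
    "Community X / Building A / Floor 3 / south": {"layout": "3BR", "area_m2": 96},
    "Community X / Building A / Floor 1 / north": {"layout": "2BR", "area_m2": 74},
    "Community X / Building B / Floor 5 / south": {"layout": "1BR", "area_m2": 52},
}

def house_sources_in_building(building_id, db=HOUSE_SOURCE_DB):
    """Treat every house source whose address contains the building
    identifier as a house source in the target building."""
    return {addr: info for addr, info in db.items() if building_id in addr}
```

The substring match on the address is the simplification here; a production system would more likely match structured address fields (community, building, floor) rather than raw strings.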
Step 103: the target augmented image is displayed on the displayed real building image.
In this embodiment, the executing body may display the target augmented image on the displayed real building image.
Here, the target augmented image may be used to indicate a house source.
The target augmented image may indicate the house source in various ways, which are not limited herein. As an example, the target augmented image may indicate the house source by indicating that a house source exists in the building, indicating the position of the house source, giving a profile of the house source, and so on. It will be appreciated that the specific presentation of the target augmented image may vary with the way the house source is indicated.
It should be noted that, in the display method provided in this embodiment, a real building image is captured and displayed in real time; the house source in the target building indicated by the real building image is then determined; and finally a target augmented image indicating the house source in the building is displayed on the displayed real building image. A new display mode can thereby be provided: an augmented image indicating the house source is displayed on the real building image captured in real time, giving the user indication information about house sources in the building, so that the user can learn the situation of the house sources while still outside the building, saving the user's time. In addition, the user can intuitively perceive the position of the house source in the building and the external situation of the house source (its position in the building, the environment around the building, and so on), so more, and more accurate, reference information about the house source can be provided to the user.
Please refer to fig. 2A and 2B, which illustrate an application scenario of an embodiment of the display method of the present disclosure. In this scenario, a user, Zhang San, can aim the camera of a terminal device at a building in a community to capture a building image. Then, as shown in fig. 2A, a real building image 201 may be displayed on the screen of the terminal device. The terminal device can then determine the house source in the target building in the real building image. Then, as shown in fig. 2B, the real building image 201 and a target augmented image 202 may be displayed on the screen, where the target augmented image may indicate a house source in the target building.
It should be noted that the drawings in the present disclosure show the relevant scenes of the technical solution in a simplified schematic manner; in a real application scenario, the image displayed by the terminal is an image of the real environment, so the degree of realism of the images in the drawings differs greatly from that of the images displayed in actual applications.
In some embodiments, the target augmented image may include house source information.
Here, the house source information may be used to describe the relevant details of the house source. The specific content of the house source information may be set according to the actual situation and is not limited herein.
As an example, the house source information may include, but is not limited to, at least one of the following: the house source address, the orientation of the house source, the floor on which the house source is located, the layout of the house source (three-bedroom, two-bedroom, and the like), the area of the house source, and so on.
It should be noted that, because the target augmented image includes house source information, the user can obtain the house source information of each house source in a building while outside the building, which provides a basis for judgment. In other words, the user can obtain, at one time and outside the building, the house source information of the house sources in the building, and judge whether any of them meets the user's expectations; the user can thereby further decide whether to enter the building for a visit and which house source to visit. This also makes it convenient for the user to plan a viewing route when inspecting house sources on site, improving viewing efficiency and saving the user's time.
In some application scenarios, the presentation form (for example, the presentation position) of the house source information may be set according to an actual situation, and is not limited herein.
Referring to fig. 3, an application scenario of an embodiment of a display method according to the present disclosure is shown. In fig. 3, house source information 301 in the form of an augmented image is displayed on the displayed real building image. For the real building image in fig. 3, refer to the description of fig. 2; details are not repeated here.
In some embodiments, step 103 includes: determining the house source position of the house source in the target building; and displaying the target augmented image at the house source position on the displayed real building image.
In some application scenarios, the floor on which the house source is located in the target building can be determined from the house source address, and that floor is taken as the house source position. It will be appreciated that a house source address typically records the floor on which the house source is located.
In some application scenarios, the position of the house source within its floor can be determined from the orientation recorded in the house source information, and the floor together with the position within the floor is taken as the house source position.
As an example, for a six-floor building with a south-facing house source on the third floor, the house source position may be "third floor, south-facing".
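Deriving such a floor-plus-orientation position could be sketched as follows; the address format and the regular expression are assumptions for illustration:

```python
import re

def house_source_position(address, orientation=None):
    """Build a coarse house source position string ("floor N[, facing D]")
    from an address assumed to embed a token like "Floor 3"."""
    match = re.search(r"Floor\s+(\d+)", address)
    if match is None:
        return None  # the address does not record the floor
    position = f"floor {int(match.group(1))}"
    if orientation:
        position += f", facing {orientation}"
    return position
```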
Here, for the determined house source position, the executing body may determine the corresponding position in the real building image.
Optionally, the position of the house source in the real building image can be determined using the three-dimensional building model. Specifically, each floor in the three-dimensional building model can be associated with a floor identifier; the executing body determines the position of the house source in the three-dimensional building model by matching the house source position against the floor identifiers; finally, using the correspondence between the parts of the three-dimensional building model and the real building image, the executing body maps that position into the real building image, thereby determining the position of the house source in the real building image.
Alternatively, the floors can be identified directly on the real building image through image recognition, and the position of the house source in the real building image can then be determined.
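Either way, once the building's bounding box in the displayed image and its floor count are known, mapping a floor to a vertical image position reduces to simple interpolation. The sketch below assumes equal floor heights (an illustrative simplification):

```python
def floor_center_y(y_top, y_bottom, n_floors, floor):
    """Vertical pixel coordinate of a floor's center within the building's
    bounding box in the image (y grows downward; floor 1 is the ground
    floor). Assumes all floors have equal height."""
    floor_height = (y_bottom - y_top) / n_floors
    return y_bottom - (floor - 0.5) * floor_height
```

The target augmented image for a house source on a given floor would then be anchored at that vertical coordinate.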
Referring to fig. 4, an application scenario of an embodiment of the display method of the present disclosure is illustrated. In fig. 4, the house source position is determined to be the first floor, and a target augmented image 401 is displayed over the first floor of the displayed real building image, where the target augmented image 401 may indicate that there is a house source on the first floor. For the real building image in fig. 4, refer to the description of fig. 2; details are not repeated here.
It should be noted that, by determining the house source position and displaying the target augmented image at that position on the real building image, the house source can be indicated through its position, so that the user can visually see where the house source is located in the target building and judge whether it meets the user's expectations for location; if not, the user can directly rule out the house source without entering the target building to view it, saving the user's time.
In some embodiments, step 103 may include: combining a house-source-information three-dimensional model with the three-dimensional building model according to the house source position to obtain a derived three-dimensional model; and generating and displaying a first video based on the derived three-dimensional model.
Here, the first video includes the real building image and the target augmented image.
Here, the derived three-dimensional model is obtained by combining the house-source-information three-dimensional model with the three-dimensional building model, and the three-dimensional building model corresponds to the real building image.
It can be understood that the first video may be obtained by rendering the derived three-dimensional model according to the current pose of the executing body.
Here, the three-dimensional building model may be established in advance; it may be constructed in real time from the real building images captured in real time; or it may be obtained by modifying an initial model according to the real building images captured in real time.
In some application scenarios, a three-dimensional building model constructed from the real building image can be taken as the three-dimensional building model corresponding to the real building image.
Here, the derived three-dimensional model includes the house-source three-dimensional model and the three-dimensional building model, and the first video may include the real building image and the target augmented image.
Here, the executing body may render the derived three-dimensional model according to its own pose (position and orientation). As an example, a three-dimensional rendering pipeline may be employed to convert the derived three-dimensional model into two-dimensional images, which are then displayed. It is understood that arranging the converted two-dimensional images in time order yields the first video.
It should be noted that the derived three-dimensional model is obtained by combining the house-source-information three-dimensional model with the three-dimensional building model, so the two models can be fused. As a result, while the pose of the executing body changes, the relative position between the house-source-information model and the building model remains unchanged; the user therefore sees house source information and a building whose relative position does not change on the screen, which reduces the jitter of the screen image, helps the user obtain accurate and clear information, and reduces the artificial look of images rendered from the house-source-information three-dimensional model.
In some embodiments, the derived three-dimensional model may be generated as follows: acquire the three-dimensional building model corresponding to the real building image; determine a combination position in the three-dimensional building model according to the house source position, where the combination position is the position at which the house-source-information three-dimensional model is combined with the three-dimensional building model; and combine the house-source-information three-dimensional model with the three-dimensional building model at the combination position to obtain the derived three-dimensional model.
Here, the electronic device that executes this generation step may be the executing body or another electronic device.
Here, the house-source-information three-dimensional model may be added to the three-dimensional building model at the combination position to obtain the derived three-dimensional model.
It should be noted that a combination position is determined according to the house source position, and the house-source-information three-dimensional model is then combined with the three-dimensional building model at that position to obtain the derived three-dimensional model; an accurate derived model can thus be obtained according to the house source position.
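The fixed relative position between the two models can be sketched with a minimal scene graph: attaching the house-source-information model as a child of the building model at the combination position means any pose change applied to the building carries the label with it. The class and function names below are illustrative assumptions:

```python
class ModelNode:
    """Minimal scene-graph node; a child keeps a fixed offset from its parent."""
    def __init__(self, name, offset=(0.0, 0.0, 0.0)):
        self.name = name
        self.offset = offset          # position in the parent's local frame
        self.children = []

    def world_position(self, parent_position=(0.0, 0.0, 0.0)):
        return tuple(p + o for p, o in zip(parent_position, self.offset))

def build_derived_model(building, info_model, combination_position):
    """Combine the house-source-information model with the building model
    at the combination position, yielding one derived model."""
    info_model.offset = combination_position
    building.children.append(info_model)
    return building
```

Because the info model's offset is expressed in the building's local frame, re-rendering the derived model under a new device pose moves both models together, which is the "relatively static" behavior this embodiment aims for.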
Referring to fig. 5, an application scenario of an embodiment of the display method of the present disclosure is illustrated. In fig. 5, the house-source-information three-dimensional model is combined with the three-dimensional building model to obtain a derived three-dimensional model, so that the target augmented image 501 rendered from the house-source-information three-dimensional model and the real building image rendered from the three-dimensional building model remain relatively static.
In some embodiments, step 102 may include: determining the building identifier of the target building according to the device location information and the captured real building image; and determining the house source in the target building according to the building identifier.
Here, the device location information may be the location information of the terminal, which may be obtained by a positioning component in the terminal and is not described in detail here.
Here, the approximate area in which the terminal device is located (for example, which community or neighborhood) can be determined from the device location information.
Here, the executing body can determine the building identifier of the target building from the real building image.
As an example, if a building number or house number appears in the real building image, the building identifier may be determined in combination with the determined approximate area. For example, if "Building A" is visible in the real building image, and the device location information indicates Community X, the building identifier of the target building can be determined as "Community X, Building A".
As an example, street view images of the determined approximate area (for example, Community X) may be acquired and compared with the real building image, so that the building identifier of the target building indicated by the real building image can be determined.
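Combining the two signals might look like the sketch below: the device location narrows the candidates to nearby buildings, and text recognized in the image (for example, a building number) selects among them. The data layout and the distance approximation are assumptions for illustration:

```python
import math

def identify_building(device_location, recognized_text, buildings, radius_m=300.0):
    """Return the identifier of the nearby building whose registered number
    appears in the text recognized from the captured image, or None.

    device_location: (lat, lon) of the terminal
    buildings: dicts like {"id": ..., "number": ..., "lat": ..., "lon": ...}
    """
    lat0, lon0 = device_location

    def distance_m(lat, lon):
        # Equirectangular approximation; adequate over a few hundred meters.
        dy = (lat - lat0) * 111_320.0
        dx = (lon - lon0) * 111_320.0 * math.cos(math.radians(lat0))
        return math.hypot(dx, dy)

    for b in buildings:
        if distance_m(b["lat"], b["lon"]) <= radius_m and b["number"] in recognized_text:
            return b["id"]
    return None
```

This is why the device location matters: the same building number ("Building A") can occur in many communities, and the radius filter disambiguates them.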
Here, after determining the building identifier of the target building, the executing body may look up the house sources located in that building in a pre-established house source database and acquire the house source information of those house sources.
As an example, the house source database may store house source addresses together with the corresponding house source information; the executing body may treat every house source whose address contains the building identifier as a house source in the target building, and then acquire the corresponding house source information.
It should be noted that, because the executing body determines the building identifier automatically and determines the house source in the target building from that identifier, the user does not need to input the building identifier. This reduces the interaction steps and speeds up determining the house source in the target building; further, the target augmented image can be determined and displayed without user operation while the real building image is displayed, which increases the display speed, improves the efficiency with which the user obtains house source information, and saves the user's time.
In some embodiments, the target enhanced image is associated with a house source page opening control, and the method further comprises: in response to detecting a trigger operation on the house source page opening control, sending a house source page acquisition request, wherein the house source page acquisition request is used for acquiring a house source page corresponding to the house source indicated by the target enhanced image; and in response to receiving the house source page, displaying the house source page.
Here, the house source page may show house source information. The house source page may also provide avenues of action for the house source, such as contacting a broker for the house source or making an appointment to view the house source on site.
Here, the house source page opening control may be an independently displayed control, or a control displayed in superposition with the target enhancement information. If the control is independently displayed, text such as "Click this control to open the house source page" may be marked on the control to prompt the user to click it. If the control is displayed in superposition with the target enhancement information, clicking the target enhancement information triggers the house source page opening control.
Here, the execution body may send the house source page acquisition request to a server. It can be understood that, since the house source page opening control is associated with the target enhanced image, and the execution body can determine the house source identifier corresponding to the house source information in the target enhanced image, the execution body can send the house source identifier to the server. The server can use the house source identifier to obtain the house source page of the house source indicated by that identifier. The execution body may then display the house source page in response to receiving it.
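The request flow described here can be sketched as follows, with a plain function call standing in for the network round trip between terminal and server; the names `on_open_control_triggered` and `server_get_page` and the page contents are hypothetical.

```python
# Server-side lookup table from house source identifier to page data.
PAGES = {
    "hs-1001": "<page: 2BR, 3.2M, contact broker, book a viewing>",
}


def server_get_page(house_source_id):
    """Server side: resolve a house source identifier to its page."""
    return PAGES.get(house_source_id)


def on_open_control_triggered(house_source_id, send=server_get_page):
    """Client side: on the trigger operation, send the identifier and
    display the page in response to receiving it.

    `send` stands in for the network round trip to the server.
    """
    page = send(house_source_id)
    if page is not None:
        return page
    return "page unavailable"


print(on_open_control_triggered("hs-1001"))
```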
It should be noted that, by associating the target enhanced image with the house source page opening control, a user who sees the target enhanced image and becomes interested in the house source it indicates can directly trigger the associated house source page opening control, and the terminal can then quickly display the house source page of that house source. The user can thereby quickly acquire detailed information on a house source of interest, which improves the efficiency of acquiring house source information. In addition, a user standing outside a building can discover a house source of interest and immediately view the house source page with its detailed information, so that, in combination with the on-site environment of a region (such as a residential cell), the user can acquire a large amount of house source related information in a short time. This improves the efficiency of viewing house sources on site, provides a detailed basis for the user's judgment, and helps avoid errors in the user's housing decisions.
Referring to fig. 6, fig. 6 illustrates an application scenario according to an embodiment of the presentation method of the present disclosure. In fig. 6, a room source page open control 601 may be presented in association with a target augmented image. By way of example, the room source page open control 601 may be marked with the word "view more". It should be noted that, the target enhanced image in fig. 6 may refer to the description about the target enhanced image in fig. 5, and is not repeated herein.
In some embodiments, the target enhanced image is associated with a marking control, and the method further comprises: in response to detecting a trigger operation on the marking control, establishing a correspondence between the house source identifier of the house source indicated by the target enhanced image and the marking content indicated by the marking control.
Here, the marking control may be used to mark the house source identifier. The specific marking content of the marking control may be set according to the actual application scenario, and is not limited here.
As an example, the marking content may include, but is not limited to, at least one of: favorite, not interested, and the like.
In some application scenarios, the marking control may comprise a house source favorite control. When the user triggers the house source favorite control, the execution body may establish an association between house source favorite indication information (i.e., the marking content corresponding to the house source favorite control) and the house source identifier of the house source indicated by the target enhanced image associated with that control. Optionally, the execution body may further send the association to a server.
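The establishment of this correspondence can be sketched as follows; the `MarkStore` class and its method names are hypothetical, and a real execution body would persist or transmit the association rather than keep it only in memory.

```python
class MarkStore:
    """Illustrative store of house-source-identifier -> marking-content
    correspondences established when a marking control is triggered."""

    def __init__(self):
        self.marks = {}  # house_source_id -> set of marking contents

    def on_mark_control_triggered(self, house_source_id, marking_content):
        # e.g. marking_content == "favorite" for the favorite control,
        # or "not interested" for the not-interested control.
        self.marks.setdefault(house_source_id, set()).add(marking_content)
        # Optionally, the association could also be sent to a server here.

    def with_mark(self, marking_content):
        """Screen out house sources carrying a given mark."""
        return {hs for hs, m in self.marks.items() if marking_content in m}


store = MarkStore()
store.on_mark_control_triggered("hs-1001", "favorite")
store.on_mark_control_triggered("hs-1002", "not interested")
print(store.with_mark("favorite"))  # prints: {'hs-1001'}
```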
It should be noted that, with the above marking control, after a user sees the target enhanced image and forms an opinion about the house source it indicates, the user can trigger the marking control consistent with that opinion. The user can thereby mark browsed information and quickly screen out house sources of interest or of no interest, which facilitates further action. In other words, by providing a way to store information by category, the user can acquire and screen a large amount of house source related information for a region (such as a cell) in a short time, in combination with the on-site environment, improving the efficiency of viewing house sources on site.
Referring to fig. 7, fig. 7 illustrates an application scenario according to an embodiment of the presentation method of the present disclosure. In fig. 7, a first marking control 701 may be presented in association with the target enhanced image. As an example, the first marking control 701 may be marked with the word "favorite", and the user may bookmark the house source by clicking the first marking control. In fig. 7, a second marking control 702 may also be presented in association with the target enhanced image. As an example, the second marking control 702 may be marked with the words "not interested", and the user may stop presentation of the target enhanced image of the house source by clicking the second marking control. It should be noted that, for the target enhanced image in fig. 7, reference may be made to the description of the target enhanced image in fig. 5, which is not repeated here.
With further reference to fig. 8, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of a display apparatus, which corresponds to the embodiment of the method shown in fig. 1, and which may be applied in various electronic devices.
As shown in fig. 8, the display device of this embodiment comprises: a first display unit 801, a first determining unit 802, and a second display unit 803. The first display unit is used for acquiring a real building image in real time and displaying the real building image; the first determining unit is used for determining a house source in a target building, wherein the target building is the building indicated by the real building image; and the second display unit is used for displaying a target enhanced image on the displayed real building image, wherein the target enhanced image is used for indicating the house source.
In this embodiment, specific processing of the first presentation unit 801, the first determination unit 802, and the second presentation unit 803 of the presentation apparatus and technical effects thereof can refer to the related descriptions of step 101, step 102, and step 103 in the corresponding embodiment of fig. 1, respectively, and are not described herein again.
In some embodiments, the target enhancement information includes house source information.
In some embodiments, the displaying the target enhanced image on the displayed real building image comprises: determining a house source position of the house source in the target building; and displaying the target enhanced image at the house source position of the displayed real building image.
In some embodiments, the displaying the target enhanced image on the displayed real building image comprises: according to the house source position, combining a house source information three-dimensional model with a building three-dimensional model to obtain a derived three-dimensional model; and generating and displaying a first video based on the derived three-dimensional model, wherein the first video comprises the real building image and a house source information enhanced image.
In some embodiments, the determining the house source in the target building comprises: determining a building identifier of the target building according to device position information and the acquired real building image; and determining the house source in the target building according to the building identifier.
In some embodiments, the target enhanced image is associated with a house source page opening control; and the apparatus is further configured to: send a house source page acquisition request in response to detecting a trigger operation on the house source page opening control, wherein the house source page acquisition request is used for acquiring a house source page corresponding to the house source indicated by the target enhanced image; and display the house source page in response to receiving the house source page.
In some embodiments, the target enhanced image is associated with a marking control; and the apparatus is further configured to: in response to detecting a trigger operation for the marking control, establish a correspondence between the house source identifier corresponding to the house source indicated by the target enhanced image and the marking content indicated by the marking control.
Referring to fig. 9, fig. 9 illustrates an exemplary system architecture to which the presentation method of one embodiment of the present disclosure may be applied.
As shown in fig. 9, the system architecture may include terminal devices 901, 902, 903, a network 904, and a server 905. Network 904 is the medium used to provide communication links between terminal devices 901, 902, 903 and server 905. Network 904 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The terminal devices 901, 902, 903 may interact with a server 905 over a network 904 to receive or send messages or the like. The terminal devices 901, 902, 903 may have various client applications installed thereon, such as a web browser application, a search-type application, and a news-information-type application. The client application in the terminal devices 901, 902, and 903 may receive an instruction of the user, and complete a corresponding function according to the instruction of the user, for example, add corresponding information to the information according to the instruction of the user.
The terminal devices 901, 902, 903 may be hardware or software. When the terminal devices 901, 902, 903 are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, desktop computers, and the like. When the terminal devices 901, 902, 903 are software, they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., software or software modules for providing distributed services) or as a single piece of software or software module, which is not specifically limited here.
The server 905 may be a server providing various services, for example, a server that receives an information acquisition request sent by the terminal devices 901, 902, 903, acquires, in various ways, the presentation information corresponding to the information acquisition request, and sends relevant data of the presentation information to the terminal devices 901, 902, 903.
It should be noted that the display method provided by the embodiment of the present disclosure may be executed by a terminal device, and accordingly, the display apparatus may be disposed in the terminal devices 901, 902, and 903. In addition, the display method provided by the embodiment of the present disclosure may also be executed by the server 905, and accordingly, the display apparatus may be disposed in the server 905.
It should be understood that the number of terminal devices, networks, and servers in fig. 9 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to fig. 10, a schematic diagram of an electronic device (e.g., the terminal device or the server of fig. 9) suitable for implementing embodiments of the present disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), or a vehicle terminal (e.g., a car navigation terminal), and a stationary terminal such as a digital TV or a desktop computer. The electronic device shown in fig. 10 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 10, the electronic device may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 1001 that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 1002 or a program loaded from a storage device 1008 into a random access memory (RAM) 1003. The RAM 1003 also stores various programs and data necessary for the operation of the electronic device. The processing device 1001, the ROM 1002, and the RAM 1003 are connected to each other by a bus 1004. An input/output (I/O) interface 1005 is also connected to the bus 1004.
Generally, the following devices may be connected to the I/O interface 1005: input devices 1006 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 1007 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 1008 including, for example, magnetic tape, hard disk, and the like; and a communication device 1009. The communication apparatus 1009 may allow the electronic device to perform wireless or wired communication with other devices to exchange data. While fig. 10 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 1009, or installed from the storage means 1008, or installed from the ROM 1002. The computer program, when executed by the processing device 1001, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire real building images in real time, and display the real building images; determine a house source in a target building, wherein the target building is the building indicated by the real building image; and display a target enhanced image on the displayed real building image, wherein the target enhanced image is used for indicating the house source.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not in some cases constitute a limitation of the unit itself; for example, the first display unit may also be described as "a unit that displays a real building image".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of preferred embodiments of the present disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combinations of the features described above, and also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features with similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (10)

1. A display method, comprising:
acquiring a real building image in real time, and displaying the real building image;
determining a house source in a target building, wherein the target building is the building indicated by the real building image;
and displaying a target enhanced image on the displayed real building image, wherein the target enhanced image is used for indicating the house source.
2. The method of claim 1, wherein the target enhancement information comprises house source information.
3. The method of claim 1, wherein the displaying the target enhanced image on the displayed real building image comprises:
determining a house source position of the house source in the target building;
and displaying the target enhanced image at the house source position of the displayed real building image.
4. The method of claim 2, wherein the displaying the target enhanced image on the displayed real building image comprises:
according to the house source position, combining a house source information three-dimensional model with a building three-dimensional model to obtain a derived three-dimensional model;
and generating and displaying a first video based on the derived three-dimensional model, wherein the first video comprises the real building image and a house source information enhanced image.
5. The method of claim 1, wherein the determining the house source in the target building comprises:
determining a building identifier of the target building according to device position information and the acquired real building image;
and determining the house source in the target building according to the building identifier.
6. The method of claim 1, wherein the target enhanced image is associated with a house source page opening control; and
the method further comprises the following steps:
sending a house source page acquisition request in response to detecting a trigger operation on the house source page opening control, wherein the house source page acquisition request is used for acquiring a house source page corresponding to the house source indicated by the target enhanced image;
and responding to the received house source page, and displaying the house source page.
7. The method of claim 1, wherein the target enhanced image is associated with a marking control; and
the method further comprises the following steps:
and in response to detecting a trigger operation for the marking control, establishing a correspondence between the house source identifier corresponding to the house source indicated by the target enhanced image and the marking content indicated by the marking control.
8. A display device, comprising:
the first display unit is used for acquiring a real building image in real time and displaying the real building image;
the first determining unit is used for determining a house source in a target building, wherein the target building is the building indicated by the real building image;
and the second display unit is used for displaying a target enhanced image on the displayed real building image, wherein the target enhanced image is used for indicating the house source.
9. An electronic device, comprising:
one or more processors;
an image acquisition device for acquiring images;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202010369805.9A 2020-04-30 2020-04-30 Display method and device and electronic equipment Pending CN111597466A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010369805.9A CN111597466A (en) 2020-04-30 2020-04-30 Display method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111597466A true CN111597466A (en) 2020-08-28

Family

ID=72183418

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010369805.9A Pending CN111597466A (en) 2020-04-30 2020-04-30 Display method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111597466A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132654A (en) * 2020-09-04 2020-12-25 贝壳技术有限公司 Method, device and storage medium for displaying house source information
CN112232900A (en) * 2020-09-25 2021-01-15 北京五八信息技术有限公司 Information display method and device
CN114357348A (en) * 2021-12-29 2022-04-15 北京有竹居网络技术有限公司 Display method and device and electronic equipment
CN114625983A (en) * 2022-03-28 2022-06-14 北京有竹居网络技术有限公司 House resource information display method and device, electronic equipment and readable storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120042306A (en) * 2010-10-25 2012-05-03 에스케이텔레콤 주식회사 Method for providing realty information and system
US20140282220A1 (en) * 2013-03-14 2014-09-18 Tim Wantland Presenting object models in augmented reality images
US20150109339A1 (en) * 2012-07-19 2015-04-23 Huawei Device Co., Ltd. Method and apparatus for implementing augmented reality
KR101693631B1 (en) * 2015-09-01 2017-01-06 김진열 System for Real estate Providing Information Using Augmented Reality
CN106354758A (en) * 2016-08-17 2017-01-25 北京小米移动软件有限公司 Method and device for processing house information
US20170365019A1 (en) * 2016-06-17 2017-12-21 Guobin He Method and System for Searching Real Estate Information
CN108022306A (en) * 2017-12-30 2018-05-11 华自科技股份有限公司 Scene recognition method, device, storage medium and equipment based on augmented reality
US20180196819A1 (en) * 2017-01-12 2018-07-12 Move, Inc. Systems and apparatuses for providing an augmented reality real estate property interface
KR101921743B1 (en) * 2017-05-23 2019-02-13 부동산일일사 주식회사 Apparatus and method for providing real estate augmented reality services
CN109615482A (en) * 2018-12-21 2019-04-12 万翼科技有限公司 Methods of exhibiting, device and the storage medium of information of lease
CN110968798A (en) * 2019-10-25 2020-04-07 贝壳技术有限公司 House source display method and device, readable storage medium and processor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张明月, "Personalized Marketing" (《个性化营销》), 31 January 2020, page 9 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132654A (en) * 2020-09-04 2020-12-25 贝壳技术有限公司 Method, device and storage medium for displaying house source information
CN112232900A (en) * 2020-09-25 2021-01-15 北京五八信息技术有限公司 Information display method and device
CN114357348A (en) * 2021-12-29 2022-04-15 北京有竹居网络技术有限公司 Display method and device and electronic equipment
CN114625983A (en) * 2022-03-28 2022-06-14 北京有竹居网络技术有限公司 House source information display method and device, electronic equipment and readable storage medium
CN114625983B (en) * 2022-03-28 2023-08-15 北京有竹居网络技术有限公司 House source information display method and device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN111597466A (en) Display method and device and electronic equipment
CN103473253B Detection of geocoded data and user interface therefor
CN111309240B (en) Content display method and device and electronic equipment
CN111597467A (en) Display method and device and electronic equipment
CN112015314A (en) Information display method and device, electronic equipment and medium
CN110619078B (en) Method and device for pushing information
CN111597465A (en) Display method and device and electronic equipment
CN111416756A (en) Protocol testing method, device, computer equipment and storage medium
CN111596991A (en) Interactive operation execution method and device and electronic equipment
CN109767257B (en) Advertisement putting method and system based on big data analysis and electronic equipment
CN111599022A (en) House display method and device and electronic equipment
CN111652675A (en) Display method and device and electronic equipment
CN111710017A (en) Display method and device and electronic equipment
CN114417782A (en) Display method and device and electronic equipment
CN113220752A (en) Display method and device and electronic equipment
CN110618811B (en) Information presentation method and device
CN110619101B (en) Method and apparatus for processing information
CN111798251A (en) Verification method and device of house source data and electronic equipment
CN111597414B (en) Display method and device and electronic equipment
US20230385524A1 (en) Web site preview based on client presentation state
CN114125485B (en) Image processing method, device, equipment and medium
CN113191257B Stroke order detection method and device and electronic equipment
CN114417214A (en) Information display method and device and electronic equipment
CN111460334B (en) Information display method and device and electronic equipment
CN111696214A (en) House display method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination