CN111710048A - Display method and device and electronic equipment - Google Patents

Display method and device and electronic equipment

Info

Publication number
CN111710048A
CN111710048A (application CN202010509749.4A)
Authority
CN
China
Prior art keywords
image
user
outdoor
dimensional
dimensional image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010509749.4A
Other languages
Chinese (zh)
Other versions
CN111710048B (en)
Inventor
吴畏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Youzhuju Network Technology Co Ltd
Original Assignee
Beijing Youzhuju Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Youzhuju Network Technology Co Ltd filed Critical Beijing Youzhuju Network Technology Co Ltd
Priority to CN202010509749.4A priority Critical patent/CN111710048B/en
Publication of CN111710048A publication Critical patent/CN111710048A/en
Application granted granted Critical
Publication of CN111710048B publication Critical patent/CN111710048B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/16Real estate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/10Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes

Abstract

The embodiment of the disclosure discloses a display method, a display device and electronic equipment. One embodiment of the method comprises: displaying an indoor three-dimensional panoramic image, wherein the indoor three-dimensional panoramic image comprises an expandable-field-of-view object image, the expandable-field-of-view object connects the indoor space with the outdoor space, and the expandable-field-of-view object image is associated with an outdoor three-dimensional image; and displaying the outdoor three-dimensional image in response to a preset display condition being met. A new display mode can thereby be provided.

Description

Display method and device and electronic equipment
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a display method and apparatus, and an electronic device.
Background
With the development of the internet, users increasingly use terminal devices to realize various functions. For example, a user can browse and search house source information through a terminal device, and can therefore obtain a large amount of house source information without leaving home. Alternatively, the user can screen out a desirable house source from the house source information on the network and then visit it on site with a broker to complete the purchase.
Virtual Reality (VR) technology comprehensively uses a computer graphics system and various control interfaces to generate an interactive three-dimensional environment on a computer and provide the user with a sense of immersion.
Disclosure of Invention
This summary is provided to introduce concepts in a simplified form that are further described below in the detailed description. It is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The embodiment of the disclosure provides a display method, a display device and electronic equipment.
In a first aspect, an embodiment of the present disclosure provides a display method, where the method includes: displaying an indoor three-dimensional panoramic image, wherein the indoor three-dimensional panoramic image comprises an expandable-field-of-view object image, the expandable-field-of-view object connects the indoor space with the outdoor space, and the expandable-field-of-view object image is associated with an outdoor three-dimensional image; and displaying the outdoor three-dimensional image in response to a preset display condition being met.
In a second aspect, an embodiment of the present disclosure provides a display device, including: a first display unit, configured to display an indoor three-dimensional panoramic image, where the indoor three-dimensional panoramic image includes an expandable-field-of-view object image, the expandable-field-of-view object connects the indoor space with the outdoor space, and the expandable-field-of-view object image is associated with an outdoor three-dimensional image; and a second display unit, configured to display the outdoor three-dimensional image in response to the preset display condition being met.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; a storage device, configured to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the presentation method as described above in the first aspect.
In a fourth aspect, the disclosed embodiments provide a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the steps of the presentation method as described above in the first aspect.
According to the display method, the display device and the electronic equipment provided by the embodiments of the disclosure, the indoor three-dimensional panoramic image can be displayed first, and the outdoor three-dimensional image can then be displayed when the preset display condition is met. A new house display mode can thereby be provided. In addition, by providing an outdoor image for the user to browse, on one hand, the realism of house source display in virtual reality can be improved, enhancing the sense of immersion in the virtual reality scene; on the other hand, more house source information can be provided for the user, improving the efficiency with which the user obtains house source information: the user can obtain comprehensive house source information without visiting the house on site, which saves the user's time.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
FIG. 1 is a flow chart of one embodiment of a presentation method according to the present disclosure;
FIG. 2 is an exemplary application scenario diagram of a presentation method according to the present disclosure;
FIG. 3 is a flow chart of yet another embodiment of a presentation method according to the present disclosure;
FIG. 4 is a schematic structural diagram of one embodiment of a display device according to the present disclosure;
FIG. 5 is an exemplary system architecture to which the presentation method of one embodiment of the present disclosure may be applied;
FIG. 6 is a schematic diagram of a basic structure of an electronic device provided according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting, and should be understood by those skilled in the art as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Referring to fig. 1, a flow of one embodiment of a presentation method according to the present disclosure is shown. The presentation method shown in fig. 1 includes the following steps:
Step 101: display an indoor three-dimensional panoramic image.
In this embodiment, an execution subject of the presentation method (for example, a terminal device) may present the indoor three-dimensional panoramic image.
In the present embodiment, the indoor three-dimensional panoramic image may refer to a three-dimensional panoramic image of an indoor environment.
In some application scenarios, a three-dimensional panoramic image may be created in advance for the indoor environment. As an example, first, images or video in a room may be captured using an image capture device, such as a single camera or multiple cameras in a camera rig. Then, the image capturing device may send the captured indoor images to the image processing device, and the image processing device may process the received images at various angles, for example, perform interpolation, correction, stitching, and the like, to generate an all-angle stereoscopic panoramic image. The full-angle stereoscopic panorama image may be displayed in a virtual reality display device.
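The capture-and-stitch pipeline described above can be illustrated with a short sketch. This is a minimal example assuming OpenCV is available; this disclosure does not name a specific library, and the file names are hypothetical.

```python
# Minimal sketch: stitch multi-angle indoor captures into one panorama.
# Assumes OpenCV (cv2) is installed; image file names are illustrative only.
import cv2

def build_panorama(image_paths):
    images = [cv2.imread(p) for p in image_paths]
    if any(img is None for img in images):
        raise FileNotFoundError("one or more capture files could not be read")
    # The stitcher performs feature matching, correction and blending internally.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, pano = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return pano

if __name__ == "__main__":
    pano = build_panorama(["room_0.jpg", "room_90.jpg", "room_180.jpg", "room_270.jpg"])
    cv2.imwrite("indoor_panorama.jpg", pano)
```

In practice the stitched image would then be projected onto a sphere or cube by the virtual reality display device; that projection step is outside the scope of this sketch.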
In this embodiment, the execution subject may be a device capable of displaying a virtual reality image. As an example, the execution subject may include, but is not limited to, a head-mounted virtual reality display device or a screen-type virtual reality display device. The head-mounted type may include a helmet type, a glasses type, and the like. The screen-type virtual reality display device may include a mobile phone and the like.
In some application scenarios, the execution subject may also play music, release scents, and the like, in cooperation with the displayed virtual reality image, to create the atmosphere of the virtual reality scene.
In this embodiment, the indoor three-dimensional panoramic image may include an expandable-field-of-view object image. Here, the expandable-field-of-view object connects the indoor space with the outdoor space, and the expandable-field-of-view object image is associated with an outdoor three-dimensional image.
Here, connecting the indoor space with the outdoor space may mean that light can pass through the expandable-field-of-view object between the indoor space and the outdoor space.
By way of example, the expandable-field-of-view object may be an entrance door, a window, a door mirror mounted on an entrance door, and the like. It will be appreciated that an entrance door in the open state allows light to pass through it, thus extending the user's field of view.
In some application scenarios, the outdoor three-dimensional image associated with the image of the expandable-field object may be an outdoor environment image outside the expandable-field object.
As an example, the outdoor image associated with the entrance door may be an outdoor environment image, such as an image of a corridor, after the entrance door is opened.
As an example, the outdoor image associated with the window may be an outdoor environment image that the user sees when positioned in front of the window.
The outdoor three-dimensional image may include an outdoor three-dimensional panoramic image or an outdoor three-dimensional half-view image. Here, the outdoor three-dimensional half-view image may be a non-panoramic three-dimensional image. For example, the image viewed through a window may be a half-view image: because of the building body in which the window is located, the environment visible from the window to the outside may cover less than 360 degrees.
Step 102: in response to the preset display condition being met, display the outdoor three-dimensional image.
In this embodiment, the execution subject may display the outdoor three-dimensional image in response to a preset display condition being satisfied.
Here, the specific content of the preset display condition may be set according to an actual application scenario, and is not limited herein.
As an example, the preset display condition may include, but is not limited to, at least one of the following: the user's simulated field of view includes the expandable-field-of-view object image; a predefined entrance door opening operation is detected; or the user's viewpoint position moves outdoors.
Referring to fig. 2, an exemplary application scenario diagram corresponding to the embodiment of fig. 1 is shown. In fig. 2, the terminal device may present an indoor three-dimensional panoramic image. The indoor three-dimensional panoramic image includes a window image. It is understood that the window connects the indoor space with the outdoor space. The window image may be associated with an outdoor three-dimensional image. In fig. 2, the window image 201 is included in the simulated field of view of the user, which can be understood as the preset display condition being met. The execution subject may then present the outdoor three-dimensional image, i.e., the outdoor image seen through the house window, at the window image location; for example, the tree image 202 outside the room may be seen through the window image in fig. 2.
It should be noted that the display method provided in this embodiment may display the indoor three-dimensional panoramic image first, and then display the outdoor three-dimensional image when the preset display condition is met. A new house display mode can thereby be provided. In addition, providing the outdoor image for the user to browse can, on one hand, improve the realism of the display and enhance the sense of immersion in the virtual reality scene; on the other hand, it can provide more information for the user and improve the efficiency with which the user obtains information, so that the user can obtain house source information without visiting the house on site, saving the user's time.
In some embodiments, the step 102 may include: in response to determining that an expandable-field-of-view object image is included in the simulated field of view of the user, presenting, at the expandable-field-of-view object image, the outdoor three-dimensional image associated with the expandable-field-of-view object image.
Here, the simulated field of view of the user may be the image region that the user can see under given conditions. It can be understood that the user watches the three-dimensional panoramic image as if present in the real scene, and can zoom in and out and move in various directions, through body movement or by operating the device, to view the scene. In general, the area that a user can see is limited: given the user's position and orientation, the area the user can see, i.e. the simulated field of view, can be determined according to the viewing angle parameters of human eyes.
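As a concrete illustration of this field-of-view test, the sketch below checks whether an object anchor point (for example, the centre of a window) lies within the simulated field of view defined by a position, a gaze direction and a horizontal viewing angle. The function name and the 110-degree default angle are assumptions made for the example, not values taken from this disclosure.

```python
# Sketch of a simulated field-of-view containment test.
# The viewing-angle default (110 degrees) is an illustrative assumption.
import math

def in_simulated_fov(user_pos, gaze_dir, object_pos, fov_deg=110.0):
    # Vector from the viewer to the object anchor point (e.g. window centre).
    to_obj = tuple(o - u for o, u in zip(object_pos, user_pos))
    norm_obj = math.sqrt(sum(c * c for c in to_obj))
    norm_gaze = math.sqrt(sum(c * c for c in gaze_dir))
    if norm_obj == 0 or norm_gaze == 0:
        return False
    # Angle between the gaze direction and the direction to the object.
    cos_angle = sum(a * b for a, b in zip(gaze_dir, to_obj)) / (norm_gaze * norm_obj)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= fov_deg / 2.0

# A window 3 m in front of the viewer and slightly to the right is visible.
print(in_simulated_fov((0, 0, 1.6), (0, 1, 0), (0.8, 3.0, 1.5)))  # True
```

A renderer would run such a test whenever the observation position or gaze direction changes and, when it returns True for an expandable-field-of-view object, display the associated outdoor three-dimensional image at that object's position.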
In some embodiments, the expandable-field object may comprise a window. The outdoor three-dimensional image may include a three-dimensional image of the exterior of the window.
In some embodiments, the presenting, at the expandable-field-of-view object image, the outdoor three-dimensional image associated with the expandable-field-of-view object image in response to determining that the user's simulated field of view includes the expandable-field-of-view object image may include: in response to determining that the window image is included in the user's simulated field of view, a three-dimensional image of an exterior of the window is presented at the window image location.
It should be noted that, by displaying the three-dimensional image outside the window at the position of the window image, the three-dimensional image of the scene visible through the window can be displayed automatically, without user operation, whenever the window image is included in the simulated field of view of the user. Realism can thereby be improved, as can the richness of the information the image provides.
Alternatively, in some embodiments, the expandable-field-of-view object may include a door mirror. The outdoor three-dimensional image may include a three-dimensional image outside the door mirror. Here, the three-dimensional image outside the door mirror may be obtained from the three-dimensional panoramic image outside the entrance door. It is understood that, for the door mirror, once the user viewpoint position is determined, the simulated field of view of the user can be determined from the user's sight-line direction, and the region of the entrance-door-exterior three-dimensional panoramic image displayed at the door mirror can thereby be determined.
Optionally, the presenting, in response to determining that the simulated field of view of the user includes an expandable-field-of-view object image, the outdoor three-dimensional image associated with the expandable-field-of-view object image at the expandable-field-of-view object image may include: in response to determining that the simulated field of view of the user includes a door mirror image, a three-dimensional image of an exterior of the door mirror is presented at the door mirror image position.
In some embodiments, the expandable-field-of-view object may include an entrance door. Here, the outdoor three-dimensional image may include a three-dimensional panoramic image outside the entrance door.
It is understood that the entrance door is a connection between the inside and outside of the room, and allows people to enter and exit.
In some embodiments, the step 102 may include: in response to detecting a predefined entry door opening operation, at an entry door position, a three-dimensional panoramic image outside the entry door is presented.
Here, the specific implementation manner of the entrance door opening operation may be set according to an actual application scenario, and is not limited herein.
As an example, the entrance door opening operation may be a trigger operation on the displayed entrance door image, for example, a click or slide operation performed on the entrance door image.
As an example, the entrance door opening operation may be a trigger operation on a preset control. For example, a control labeled "open entrance door" may be presented; a click on this control by the user may then be regarded as an entrance door opening operation.
Here, displaying the three-dimensional panoramic image outside the entrance door at the entrance door position can be understood as follows: when the user viewpoint is indoors and the entrance door is open, the user can see the environment image outside the entrance door through the entrance door position.
Optionally, in the displayed three-dimensional image, the entrance door image is an image of an open state of the entrance door.
In some application scenarios, a three-dimensional panoramic image outside the entrance door may be generated in advance.
In some application scenarios, a three-dimensional panoramic image may be created in advance for an outdoor environment outside the entrance door. As an example, first, an image or video of the outdoor environment outside the entrance door may be captured using an image capturing device, such as a single camera or multiple cameras in a camera rig. Then, the image capturing device may send the captured images to the image processing device, and the image processing device may process the received images at the respective angles, for example, perform interpolation, correction, stitching, and the like, to generate an all-angle stereoscopic panorama. The full-angle stereoscopic panorama can be displayed in a virtual reality display device.
In some embodiments, the three-dimensional panoramic image outside the entrance door may correspond to a house source. That is, for each house source, the three-dimensional panoramic image outside its entrance door may be acquired and built for that house source.
It should be noted that the entrance door can be opened or closed, so the user can control the state of the entrance door, and when the entrance door is opened, the three-dimensional panoramic image outside the entrance door can be displayed at the entrance door position. Therefore, when a user views a house virtually, on one hand, opening and closing the door and displaying the environment outside the door can improve the realism of the virtual reality display; on the other hand, by providing the outdoor environment image at the entrance door, outdoor environment information related to the house source can be provided to the user, improving the efficiency of information display and the efficiency with which the user obtains house source information.
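The open/close control described above can be sketched as a small piece of state kept by the presentation logic: a trigger toggles the door state, and the renderer shows the outdoor panorama at the door position only while the door is open. The class and image names below are hypothetical and only illustrate the state handling.

```python
# Illustrative sketch of the entrance-door state toggle; names are hypothetical.
class EntranceDoor:
    def __init__(self, outdoor_panorama):
        self.is_open = False
        self.outdoor_panorama = outdoor_panorama  # panorama outside the door

    def on_open_trigger(self):
        # Called when a click/slide on the door image or the "open entrance
        # door" control is detected; toggles the simulated door state.
        self.is_open = not self.is_open
        return self.current_view()

    def current_view(self):
        # While open, the panorama outside the door is shown at the door
        # position; while closed, only the closed-door image is shown.
        return self.outdoor_panorama if self.is_open else "entrance_door_closed"

door = EntranceDoor(outdoor_panorama="corridor_panorama")
print(door.on_open_trigger())  # corridor_panorama
print(door.on_open_trigger())  # entrance_door_closed
```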
In some embodiments, the method may further include: in response to determining that the user observation position has moved outdoors, displaying the three-dimensional panoramic image outside the entrance door according to the user observation position.
Here, the user observation position may be the simulated position of the user used by the execution subject when displaying virtual reality.
Here, the change of the user viewing position may be achieved in various ways, and is not limited herein.
Optionally, scene controls may be set, for example, controls for the living room, master bedroom, secondary bedroom, outdoor area, and the like. The user may click the master bedroom control to indicate that the observation position has moved to the master bedroom, and the execution subject may display the three-dimensional panoramic image of the master bedroom; the user may then click the outdoor control to indicate that the observation position has moved outdoors, and the execution subject may display the outdoor three-dimensional panoramic image.
Optionally, a scene switching control may be set; for example, each click on the scene switching control may switch the displayed scene among the living room, the master bedroom, the secondary bedroom, the outdoor environment, and the like.
Alternatively, after simulating the opening of the entrance door, the user may change the user observation position through a predefined observation position change operation (e.g., an operation performed on the entrance door image). It is understood that the scenario simulated here may be that the user walks to the entrance door, opens it, walks out through it, and reaches the environment outside the entrance door.
Here, if it is determined that the user observation position is outdoors, the execution subject may present an outdoor panoramic image. The panoramic image seen by the user may be an outdoor panoramic image, specifically the three-dimensional panoramic image outside the entrance door.
Here, displaying the three-dimensional panoramic image outside the entrance door according to the user observation position may mean determining the simulated field of view of the user according to the user observation position, and then displaying the portion of the three-dimensional panoramic image outside the entrance door that falls within the simulated field of view.
It should be noted that, if it is determined that the user observation position is outdoors, the displayed scene can be switched in time to the three-dimensional panoramic image outside the entrance door. The user can thus observe the environment outside the house of the house source in virtual reality, gain a comprehensive understanding of the house source, and view houses more efficiently.
In some embodiments, the outdoor three-dimensional image may have at least one presentation mode, and the outdoor three-dimensional image in each presentation mode corresponds to a lighting condition.
In some embodiments, the step 102 may include: determining the lighting condition according to at least one of: the time point and the house orientation; and displaying the outdoor three-dimensional image in the presentation mode corresponding to the lighting condition.
Optionally, the lighting condition may also be determined according to a user's selection. As an example, a night mode control and a day mode control may be set, and a user clicking on the day mode control may present an outdoor three-dimensional image simulating an outdoor environment during the day.
Here, the presentation mode may include at least one of, but is not limited to: image brightness, image sharpness, object state (e.g., opening and closing of a street lamp).
As an example, the lighting condition may be set to three levels, which are a high lighting condition, a medium lighting condition, and a low lighting condition, respectively.
As an example, a south-facing house may correspond to the high lighting condition at noon, the low lighting condition at night, and the medium lighting condition in the morning and afternoon. The presentation mode corresponding to the high lighting condition may be a high-brightness outdoor three-dimensional image. The presentation mode corresponding to the medium lighting condition may be a medium-brightness outdoor three-dimensional image. The presentation mode corresponding to the low lighting condition may be a low-brightness outdoor three-dimensional image in which the street lamp images are shown in the lit state.
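The mapping just described can be captured in a short sketch that picks a lighting condition from the time point, the house orientation and an optional user selection, and then looks up the corresponding presentation mode. The hour thresholds, brightness values and orientation rule are assumptions made for the example; this disclosure does not prescribe specific values.

```python
# Illustrative lighting-condition / presentation-mode lookup; all numeric
# thresholds and brightness values are assumptions for the example.
def lighting_condition(hour, orientation="south", user_choice=None):
    if user_choice == "day":      # an explicit user selection overrides the rest
        return "high"
    if user_choice == "night":
        return "low"
    if hour >= 20 or hour < 6:    # night time
        return "low"
    if orientation == "south" and 10 <= hour < 14:  # midday sun, south-facing house
        return "high"
    return "medium"

PRESENTATION_MODE = {
    "high":   {"brightness": 1.0, "street_lamps_on": False},
    "medium": {"brightness": 0.6, "street_lamps_on": False},
    "low":    {"brightness": 0.2, "street_lamps_on": True},
}

print(PRESENTATION_MODE[lighting_condition(hour=22)])
# {'brightness': 0.2, 'street_lamps_on': True}
```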
It should be noted that, by setting presentation modes for the outdoor three-dimensional image, various real outdoor environments can be simulated, improving the realism of the virtual reality display.
In some embodiments, the method may further include: in response to determining to change the simulated field of view of the user, determining a pre-change field of view determination parameter and a post-change field of view determination parameter, the field of view determination parameters including a user observation position and a user gaze point; moving the user observation position from the pre-change user observation position to the post-change user observation position along a first Bezier curve between the two; and moving the user gaze point from the pre-change user gaze point to the post-change user gaze point along a second Bezier curve between the two.
Here, a complex smooth curve, i.e., a Bezier curve, can be generated from a small number of control points (for example, the user observation positions before and after the change).
It should be noted that the change of the camera position (i.e., the user observation position) or of the gaze point position (i.e., the user gaze point) can be simulated along a curved path, so that the field of view changes in a way similar to how human eyes sweep across surrounding scenery. The transition of the field of view is therefore smoother, and the realism of the virtual reality scene display can be improved.
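A minimal sketch of this transition is given below: the observation position and the gaze point each follow their own quadratic Bezier curve between the pre-change and post-change values. How the middle control point is chosen is not specified in this disclosure; the lifted midpoint used here is purely an assumption for illustration.

```python
# Sketch of a smooth viewpoint transition along quadratic Bezier curves.
# The control-point choice (lifted midpoint) is an illustrative assumption.
def quadratic_bezier(p0, p1, p2, t):
    # B(t) = (1 - t)^2 * p0 + 2 * (1 - t) * t * p1 + t^2 * p2, per axis.
    return tuple(
        (1 - t) ** 2 * a + 2 * (1 - t) * t * b + t ** 2 * c
        for a, b, c in zip(p0, p1, p2)
    )

def transition(start, end, steps=30, lift=0.5):
    # Hypothetical middle control point: the midpoint raised by `lift`
    # so that the path bends instead of being a straight line.
    mid = tuple((s + e) / 2 for s, e in zip(start, end))
    control = mid[:2] + (mid[2] + lift,)
    return [quadratic_bezier(start, control, end, i / (steps - 1)) for i in range(steps)]

# One curve for the observation position, another for the gaze point.
camera_path = transition(start=(0.0, 0.0, 1.6), end=(4.0, 2.0, 1.6))
gaze_path = transition(start=(0.0, 3.0, 1.5), end=(4.0, 5.0, 1.5))
print(camera_path[0], camera_path[-1])  # endpoints match the pre/post positions
```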
Referring to fig. 3, a flow of another embodiment of a presentation method according to the present disclosure is shown. The presentation method shown in fig. 3 includes the following steps:
Step 301: display the indoor three-dimensional panoramic image.
In this embodiment, an execution subject of the presentation method (for example, a terminal device) may present the indoor three-dimensional panoramic image.
Step 302: in response to determining that the user's simulated field of view includes an expandable-field-of-view object image, present the outdoor three-dimensional image associated with the expandable-field-of-view object image at the expandable-field-of-view object image.
Step 303: in response to detecting the predefined entrance door opening operation, display the three-dimensional panoramic image outside the entrance door at the entrance door position.
Step 304: in response to detecting that the user observation position has moved outdoors, display the three-dimensional panoramic image outside the entrance door according to the user observation position.
It should be noted that the display method provided by the embodiment corresponding to fig. 3 can provide multiple occasions for displaying an outdoor three-dimensional image, and different types of outdoor three-dimensional images to display on those occasions. A more comprehensive technical scheme for outdoor three-dimensional images can therefore be provided, improving the realism of displaying house sources in virtual reality.
It should be noted that, for implementation details and technical effects of this embodiment, reference may be made to the relevant descriptions elsewhere in this application; they are not repeated here.
With further reference to fig. 4, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of a display apparatus, which corresponds to the embodiment of the method shown in fig. 1, and which may be applied in various electronic devices.
As shown in fig. 4, the display device of the present embodiment includes a first display unit 401 and a second display unit 402. The first display unit is configured to display an indoor three-dimensional panoramic image, where the indoor three-dimensional panoramic image includes an expandable-field-of-view object image, the expandable-field-of-view object connects the indoor space with the outdoor space, and the expandable-field-of-view object image is associated with an outdoor three-dimensional image; the second display unit is configured to display the outdoor three-dimensional image in response to the preset display condition being met.
In this embodiment, specific processing of the first display unit 401 and the second display unit 402 of the display apparatus and technical effects thereof can refer to the related descriptions of step 101 and step 102 in the corresponding embodiment of fig. 1, which are not repeated herein.
In some embodiments, the displaying the outdoor three-dimensional image in response to the preset display condition being met includes: in response to determining that an expandable-field-of-view object image is included in the simulated field of view of the user, presenting, at the expandable-field-of-view object image, the outdoor three-dimensional image associated with the expandable-field-of-view object image.
In some embodiments, the expandable-field-of-view object comprises a window, and the outdoor three-dimensional image comprises a three-dimensional image outside the window; and the presenting, in response to determining that an expandable-field-of-view object image is included in the simulated field of view of the user, the outdoor three-dimensional image associated with the expandable-field-of-view object image at the expandable-field-of-view object image includes: in response to determining that the window image is included in the user's simulated field of view, presenting the three-dimensional image outside the window at the window image location.
In some embodiments, the expandable-field-of-view object comprises an entrance door, and the outdoor three-dimensional image comprises a three-dimensional panoramic image outside the entrance door; and the displaying the outdoor three-dimensional image in response to the preset display condition being met includes: in response to detecting the predefined entrance door opening operation, displaying the three-dimensional panoramic image outside the entrance door at the entrance door position.
In some embodiments, the displaying the outdoor three-dimensional image in response to the preset display condition being met includes: in response to determining that the user observation position is located outdoors, displaying the three-dimensional panoramic image outside the entrance door according to the user observation position.
In some embodiments, the outdoor three-dimensional image has at least one presentation mode, and the outdoor three-dimensional image in each presentation mode corresponds to a lighting condition; and the displaying the outdoor three-dimensional image in response to the preset display condition being met includes: determining the lighting condition according to at least one of: a time point, a house orientation, and a user selection; and displaying the outdoor three-dimensional image in the presentation mode corresponding to the lighting condition.
In some embodiments, the above apparatus further comprises: a determination unit (not shown in the figures), configured to determine, in response to determining to change the simulated field of view of the user, a pre-change field of view determination parameter and a post-change field of view determination parameter, the field of view determination parameters including a user observation position and a user gaze point; and an execution unit (not shown in the figures), configured to perform at least one of: moving the user observation position from the pre-change user observation position to the post-change user observation position along a first Bezier curve between the two; and moving the user gaze point from the pre-change user gaze point to the post-change user gaze point along a second Bezier curve between the two.
Referring to fig. 5, fig. 5 illustrates an exemplary system architecture to which the presentation method of one embodiment of the present disclosure may be applied.
As shown in fig. 5, the system architecture may include terminal devices 501, 502, 503, a network 504, and a server 505. The network 504 serves to provide a medium for communication links between the terminal devices 501, 502, 503 and the server 505. Network 504 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The terminal devices 501, 502, 503 may interact with a server 505 over a network 504 to receive or send messages and the like. The terminal devices 501, 502, 503 may have various client applications installed on them, such as a web browser application, a search application, and a news and information application. A client application in the terminal devices 501, 502, 503 may receive a user's instruction and complete the corresponding function according to that instruction, for example, adding corresponding information to presented information according to the user's instruction.
The terminal devices 501, 502, 503 may be hardware or software. When the terminal devices 501, 502, 503 are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop portable computers, desktop computers, and the like. When the terminal devices 501, 502, 503 are software, they may be installed in the electronic devices listed above. They may be implemented as multiple pieces of software or software modules (for example, software or software modules used to provide distributed services) or as a single piece of software or software module. This is not specifically limited here.
The server 505 may be a server providing various services, for example, receiving an information acquisition request sent by the terminal devices 501, 502, 503, obtaining, in various ways, the presentation information corresponding to the information acquisition request, and sending the relevant data of the presentation information to the terminal devices 501, 502, 503.
It should be noted that the display method provided by the embodiment of the present disclosure may be executed by a terminal device, and accordingly, the display apparatus may be disposed in the terminal device 501, 502, 503. In addition, the display method provided by the embodiment of the disclosure can also be executed by the server 505, and accordingly, the display apparatus can be disposed in the server 505.
It should be understood that the number of terminal devices, networks, and servers in fig. 5 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to fig. 6, shown is a schematic diagram of an electronic device (e.g., a terminal device or a server of fig. 5) suitable for use in implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a wearable electronic device, a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic apparatus 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: display an indoor three-dimensional panoramic image, wherein the indoor three-dimensional panoramic image comprises an expandable-field-of-view object image, the expandable-field-of-view object connects the indoor space with the outdoor space, and the expandable-field-of-view object image is associated with an outdoor three-dimensional image; and display the outdoor three-dimensional image in response to the preset display condition being met.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The name of a unit does not, in some cases, constitute a limitation of the unit itself; for example, the first display unit may also be described as "a unit that displays an indoor three-dimensional panoramic image".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only a description of preferred embodiments of the disclosure and of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combination of features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by replacing the above features with features having similar functions disclosed in (but not limited to) this disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (10)

1. A method of displaying, comprising:
displaying an indoor three-dimensional panoramic image, wherein the indoor three-dimensional panoramic image comprises an expandable-field-of-view object image, the expandable-field-of-view object connects the indoor space with the outdoor space, and the expandable-field-of-view object image is associated with an outdoor three-dimensional image;
and displaying the outdoor three-dimensional image in response to the preset display condition being met.
2. The method of claim 1, wherein the displaying the outdoor three-dimensional image in response to the preset display condition being met comprises:
in response to determining that an expandable-field-of-view object image is included in the simulated field of view of the user, presenting, at the expandable-field-of-view object image, the outdoor three-dimensional image associated with the expandable-field-of-view object image.
3. The method of claim 1, wherein the expandable-field-of-view object comprises a window, and the outdoor three-dimensional image comprises a three-dimensional image outside the window; and
the presenting, in response to determining that an expandable-field-of-view object image is included in the simulated field of view of the user, the outdoor three-dimensional image associated with the expandable-field-of-view object image at the expandable-field-of-view object image comprises:
in response to determining that a window image is included in the user's simulated field of view, presenting the three-dimensional image outside the window at the window image location.
4. The method of claim 1, wherein the expandable-field-of-view object comprises an entrance door, and the outdoor three-dimensional image comprises a three-dimensional panoramic image outside the entrance door; and
the displaying the outdoor three-dimensional image in response to the preset display condition being met comprises:
in response to detecting a predefined entrance door opening operation, presenting the three-dimensional panoramic image outside the entrance door at the entrance door position.
5. The method according to claim 4, wherein the displaying the outdoor three-dimensional image in response to the preset display condition being met comprises:
and in response to determining that the user observation position is located outdoors, displaying a three-dimensional panoramic image outside the entrance door according to the user observation position.
6. The method according to any one of claims 1-5, wherein the outdoor three-dimensional image has at least one presentation mode, and the outdoor three-dimensional image in each presentation mode corresponds to a lighting condition; and
the displaying the outdoor three-dimensional image in response to the preset display condition being met comprises:
determining the lighting condition according to at least one of: a time point, a house orientation, and a user selection;
and displaying the outdoor three-dimensional image in the presentation mode corresponding to the lighting condition.
7. The method according to any one of claims 1-5, further comprising:
in response to determining to change the simulated field of view of the user, determining a pre-change field of view determination parameter and a post-change field of view determination parameter, the field of view determination parameters including a user observation position and a user gaze point;
performing at least one of: moving the user observation position from the pre-change user observation position to the post-change user observation position along a first Bezier curve between the two; and moving the user gaze point from the pre-change user gaze point to the post-change user gaze point along a second Bezier curve between the two.
8. A display device, comprising:
a first display unit, configured to display an indoor three-dimensional panoramic image, wherein the indoor three-dimensional panoramic image comprises an expandable-field-of-view object image, the expandable-field-of-view object connects the indoor space with the outdoor space, and the expandable-field-of-view object image is associated with an outdoor three-dimensional image; and
a second display unit, configured to display the outdoor three-dimensional image in response to the preset display condition being met.
9. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202010509749.4A 2020-06-05 2020-06-05 Display method and device and electronic equipment Active CN111710048B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010509749.4A CN111710048B (en) 2020-06-05 2020-06-05 Display method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010509749.4A CN111710048B (en) 2020-06-05 2020-06-05 Display method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN111710048A (en) 2020-09-25
CN111710048B (en) 2023-11-28

Family

ID=72539613

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010509749.4A Active CN111710048B (en) 2020-06-05 2020-06-05 Display method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111710048B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050280706A1 (en) * 2004-08-25 2005-12-22 Chiou-muh Jong Method and apparatus to simulate an outdoor window for panorama viewing from a room
US20100226535A1 (en) * 2009-03-05 2010-09-09 Microsoft Corporation Augmenting a field of view in connection with vision-tracking
CN102496131A (en) * 2011-11-08 2012-06-13 莫健新 Hotel room outdoor landscape display system and method and data generation system and method
CN108960947A (en) * 2017-05-19 2018-12-07 深圳市掌网科技股份有限公司 Show house methods of exhibiting and system based on virtual reality
CN107481316A (en) * 2017-06-30 2017-12-15 百度在线网络技术(北京)有限公司 Indoor and outdoor panorama switching method, device and the computer-readable recording medium of D Urban model
CN108961387A (en) * 2018-05-30 2018-12-07 链家网(北京)科技有限公司 A kind of display methods and terminal device of house virtual three-dimensional model

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112232899A (en) * 2020-09-25 2021-01-15 北京五八信息技术有限公司 Data processing method and device
CN114003322A (en) * 2021-09-16 2022-02-01 北京城市网邻信息技术有限公司 Method, equipment and device for displaying real scene space of house and storage medium
CN117274423A (en) * 2023-08-23 2023-12-22 瑞庭网络技术(上海)有限公司 House source information processing method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111710048B (en) 2023-11-28

Similar Documents

Publication Publication Date Title
EP3465620B1 (en) Shared experience with contextual augmentation
US9992429B2 (en) Video pinning
CN111710048B (en) Display method and device and electronic equipment
CN111510645B (en) Video processing method and device, computer readable medium and electronic equipment
CN111414225B (en) Three-dimensional model remote display method, first terminal, electronic device and storage medium
EP4117313A1 (en) Audio processing method and apparatus, readable medium, and electronic device
CN111599020B (en) House display method and device and electronic equipment
CN114461064B (en) Virtual reality interaction method, device, equipment and storage medium
JP2023528398A (en) Live distribution room creation method, device, electronic device and storage medium
CN110766780A (en) Method and device for rendering room image, electronic equipment and computer readable medium
CN111652675A (en) Display method and device and electronic equipment
US11592906B2 (en) Ocular focus sharing for digital content
CN109688381B (en) VR monitoring method, device, equipment and storage medium
CN111710046A (en) Interaction method and device and electronic equipment
CN111597414B (en) Display method and device and electronic equipment
CN114332224A (en) Method, device and equipment for generating 3D target detection sample and storage medium
JP2022551671A (en) OBJECT DISPLAY METHOD, APPARATUS, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
CN111696214A (en) House display method and device and electronic equipment
US20240078734A1 (en) Information interaction method and apparatus, electronic device and storage medium
US20230394614A1 (en) Image collection method and apparatus, terminal, and storage medium
CN115756176B (en) Application display method, head-mounted display device, and computer-readable medium
CN114417204A (en) Information generation method and device and electronic equipment
CN117354484A (en) Shooting processing method, device, equipment and medium based on virtual reality
CN117632391A (en) Application control method, device, equipment and medium based on virtual reality space
CN114357348A (en) Display method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant