CN112051956A - House source interaction method and device - Google Patents

House source interaction method and device

Info

Publication number
CN112051956A
CN112051956A
Authority
CN
China
Prior art keywords
reference object
target
user
user operation
attribute information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010942227.3A
Other languages
Chinese (zh)
Inventor
Inventor not announced
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing 58 Information Technology Co Ltd
Original Assignee
Beijing 58 Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing 58 Information Technology Co Ltd filed Critical Beijing 58 Information Technology Co Ltd
Priority to CN202010942227.3A priority Critical patent/CN112051956A/en
Publication of CN112051956A publication Critical patent/CN112051956A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • G06Q30/0643Graphical representation of items or shoppers

Abstract

The embodiment of the invention provides a house source interaction method and device. A user terminal can build a three-dimensional house space from the house source data of a physical house source and display it through a graphical user interface. The three-dimensional house space can comprise at least one interaction area, and each interaction area can correspond to at least one reference object. While the user browses the three-dimensional house space, the terminal can detect that the user's current position satisfies a target interaction area and, in response, display the corresponding target reference object in that area. By providing reference objects in the three-dimensional house space, the user's spatial perception of the current house source can be effectively improved while browsing.

Description

House source interaction method and device
Technical Field
The invention relates to the technical field of data processing, in particular to a house source interaction method and a house source interaction device.
Background
With the development of network technology, users can access online services such as online shopping, online house hunting, and online job hunting through user terminals, which greatly improves the convenience and diversity of daily life. For online house hunting, brokers and landlords currently upload house source information to the network for house-hunting users to browse. However, traditional house source information consists of pictures, videos, audio, and the like, so the browsing mode is limited, the user's real spatial perception of the house source is low, and the house-hunting experience is poor.
Disclosure of Invention
The embodiment of the invention provides a house source interaction method, aiming to solve the prior-art problems that the browsing mode for online house source information is limited, the user's spatial perception is low, and the house-hunting experience is poor.
Correspondingly, the embodiment of the invention also provides a house source interaction device, to ensure the implementation and application of the method.
To solve the above problem, an embodiment of the present invention discloses a house source interaction method, where the content displayed through a graphical user interface of a preset terminal at least includes a three-dimensional house space of the house source, the three-dimensional house space includes at least one interaction area, and the interaction area corresponds to at least one reference object. The method includes:
in response to detecting that the current user position satisfies a target interaction region, presenting a first target reference object matching the target interaction region in the target interaction region.
Optionally, the content presented by the graphical user interface further includes a reference object control for the house source, and the method further includes:
in response to a user operation acting on the reference object control, presenting a second target reference object matched with the user operation in the target interaction area, or updating the first target reference object to a third target reference object matched with the user operation.
Optionally, the method further comprises:
and responding to the user operation acting on the first target reference object, and controlling the first target reference object to execute interactive operation corresponding to the user operation.
Optionally, the method further comprises:
in response to a user operation acting on the preset terminal, presenting a reference object definition interface, where the reference object definition interface includes attribute information of a reference object;
in response to a user operation acting on the attribute information, acquiring target attribute information matched with the user operation;
the presenting, in response to detecting that the current user position satisfies a target interaction region, a first target reference object matching the target interaction region in the target interaction region includes:
in response to detecting that the current user position satisfies the target interaction area, presenting, according to the target attribute information, a first target reference object matched with the target interaction area in the target interaction area.
Optionally, the presenting, in response to a user operation on the reference object control, a second target reference object matched with the user operation in the target interaction region includes:
in response to the user operation acting on the reference object control, presenting a reference object selection interface, where the reference object selection interface includes at least one reference object and attribute information for the reference object;
selecting a second target reference object in response to a user operation acting on the reference object;
acquiring first target attribute information for the second target reference object in response to a user operation acting on the attribute information; and
presenting the second target reference object in the three-dimensional house space according to the first target attribute information.
Optionally, the updating, in response to a user operation on the reference object control, the first target reference object to a third target reference object matching the user operation in the target interaction region includes:
acquiring second target attribute information for the first target reference object in response to a user operation acting on the attribute information;
and updating the first target reference object to a third target reference object matched with the second target attribute information in the three-dimensional house space.
Optionally, the presenting, in response to a user operation on the reference object control, a second target reference object corresponding to the user operation in the three-dimensional house space includes:
in response to the user operation acting on the reference object control, selecting a second target reference object for the house source, and presenting an attribute selection interface for the second target reference object;
determining target attribute information for the second target reference object in response to a user operation acting on the attribute selection interface; and
presenting the second target reference object in the three-dimensional house space according to the target attribute information.
Optionally, the first target reference object corresponds to at least one interactive control, and the controlling, in response to a user operation acting on the first target reference object, the first target reference object to perform an interactive operation corresponding to the user operation includes:
and responding to the user operation acted on the interactive control, and controlling the first target reference object to execute the interactive operation corresponding to the user operation.
Optionally, the method further comprises:
and responding to the visual angle switching operation of the user, and carrying out scene angle switching on the three-dimensional house space.
The embodiment of the invention also discloses a house source interaction device, where the content displayed through a graphical user interface of a preset terminal at least comprises a three-dimensional house space of the house source, the three-dimensional house space comprises at least one interaction area, and the interaction area corresponds to at least one reference object. The device comprises:
a first reference object display module, configured to present, in response to detecting that the current user position satisfies a target interaction area, a first target reference object matched with the target interaction area.
Optionally, the content presented by the graphical user interface further includes a reference object control for the house source, and the apparatus further includes:
a second reference object display module, configured to present, in response to a user operation acting on the reference object control, a second target reference object matched with the user operation in the target interaction area, or to update the first target reference object to a third target reference object matched with the user operation.
Optionally, the apparatus further comprises:
an interactive operation execution module, configured to control, in response to a user operation acting on the first target reference object, the first target reference object to perform the interactive operation corresponding to the user operation.
Optionally, the apparatus further comprises:
a reference object definition interface display module, configured to present a reference object definition interface in response to a user operation acting on the preset terminal, where the reference object definition interface includes attribute information of a reference object;
a first attribute information acquisition module, configured to acquire, in response to a user operation acting on the attribute information, target attribute information matched with the user operation;
the first reference object display module is specifically configured to:
in response to detecting that the current user position satisfies the target interaction area, present, according to the target attribute information, a first target reference object matched with the target interaction area in the target interaction area.
Optionally, the second reference object display module comprises:
a reference object selection interface display submodule, configured to present a reference object selection interface in response to the user operation acting on the reference object control, where the reference object selection interface includes at least one reference object and attribute information for the reference object;
a reference object selection submodule, configured to select a second target reference object in response to a user operation acting on the reference object;
a first attribute information selection submodule, configured to acquire first target attribute information for the second target reference object in response to a user operation acting on the attribute information; and
a first reference object display submodule, configured to present the second target reference object in the three-dimensional house space according to the first target attribute information.
Optionally, the second reference object display module comprises:
a second attribute information selection submodule, configured to acquire second target attribute information for the first target reference object in response to a user operation acting on the attribute information; and
a second reference object display submodule, configured to update the first target reference object to a third target reference object matched with the second target attribute information in the three-dimensional house space.
Optionally, the second reference object display module comprises:
a reference object selection submodule, configured to select a second target reference object for the house source in response to a user operation acting on the reference object control, and to present an attribute selection interface for the second target reference object;
an attribute information determination submodule, configured to determine, in response to a user operation acting on the attribute selection interface, target attribute information for the second target reference object; and
a reference object display submodule, configured to present the second target reference object in the three-dimensional house space according to the target attribute information.
Optionally, the first target reference object corresponds to at least one interactive control, and the interactive operation execution module is specifically configured to:
and responding to the user operation acted on the interactive control, and controlling the first target reference object to execute the interactive operation corresponding to the user operation.
Optionally, the apparatus further comprises:
a scene angle switching module, configured to switch the scene angle of the three-dimensional house space in response to a view-angle switching operation of the user.
The embodiment of the invention also discloses an electronic device, which comprises:
one or more processors; and
one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the electronic device to perform the method as described above.
Embodiments of the invention also disclose one or more machine-readable media having instructions stored thereon, which, when executed by one or more processors, cause the processors to perform the method as described above.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, the preset terminal can build a three-dimensional house space from the house source data of a physical house source and display it through a graphical user interface, where the three-dimensional house space can comprise at least one interaction area, and the interaction area can correspond to at least one reference object. When the user browses the three-dimensional house space, the terminal can respond to the user's current position satisfying a target interaction area by displaying the corresponding target reference object in that area, so that, while the user browses the house source, providing reference objects in the three-dimensional house space effectively improves the user's spatial perception of the current house source.
Drawings
FIG. 1 is a flow chart of the steps of a first embodiment of the house source interaction method of the present invention;
FIG. 2 is a flow chart of the steps of a second embodiment of the house source interaction method of the present invention;
FIG. 3 is a schematic diagram of a reference object definition interface in an embodiment of the invention;
FIG. 4 is a schematic diagram of a reference object in an embodiment of the invention;
FIG. 5 is a schematic diagram of a reference object in an embodiment of the invention;
FIG. 6 is a schematic diagram of an interaction control in an embodiment of the invention;
FIG. 7 is a flow chart of the steps of a third embodiment of the house source interaction method of the present invention;
FIG. 8 is a schematic diagram of a reference object control in an embodiment of the invention;
FIG. 9 is a schematic diagram of a reference object selection interface in an embodiment of the invention;
FIG. 10 is a schematic diagram of a reference object in an embodiment of the invention;
FIG. 11 is a schematic diagram of a reference object selection interface in an embodiment of the invention;
FIG. 12 is a schematic diagram of an attribute selection interface in an embodiment of the invention;
FIG. 13 is a block diagram of an embodiment of a house source interaction device according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The house source interaction method in the embodiment of the invention can run on a terminal device or a server. The terminal device may be a local terminal device. When the house source interaction method runs on a server, cloud display can be performed.
In an optional embodiment, cloud display refers to an information presentation manner based on cloud computing. In the cloud display mode, the body that runs the information processing program is separated from the body that presents the information picture: the storage and execution of the house source interaction method are completed on a cloud display server, while a cloud display client receives and sends data and presents the information picture. For example, the cloud display client may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the terminal device that processes the information data is the cloud display server. When browsing commodity information, the user operates the cloud display client to send an operation instruction to the cloud display server; the server retrieves the relevant commodity information according to the instruction, encodes and compresses the data, and returns it to the client over the network; finally, the client decodes the data and outputs the commodity information.
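The round trip described above (client sends an operation instruction; server prepares, encodes, and compresses the data; client decompresses and decodes it for display) can be sketched roughly as follows. This is only an illustration: `zlib` and `json` stand in for whatever codec a real cloud display server would use, and every function and field name (`cloud_server_handle`, `cloud_client_show`, `listing_id`) is an assumption introduced for the example, not something disclosed in the patent.

```python
import json
import zlib

def cloud_server_handle(instruction):
    # Server side: look up the information for the requested house source,
    # then serialize and compress it before sending it back over the network.
    listing_info = {"listing_id": instruction["listing_id"],
                    "rooms": ["living room", "bedroom", "kitchen"]}
    payload = json.dumps(listing_info).encode("utf-8")
    return zlib.compress(payload)

def cloud_client_show(compressed):
    # Client side: decompress and decode the reply before presenting it.
    return json.loads(zlib.decompress(compressed).decode("utf-8"))

reply = cloud_server_handle({"op": "browse", "listing_id": "A-100"})
print(cloud_client_show(reply)["rooms"])  # prints ['living room', 'bedroom', 'kitchen']
```

The point of the split is that the heavy processing stays on the server; the client only needs enough capability to transmit, decode, and display.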
In another alternative embodiment, the terminal device may be a local terminal device. The local terminal device stores an application program and presents the application interface. It interacts with the user through a graphical user interface; that is, the application program is downloaded and installed through the electronic device and run conventionally. The local terminal device may provide the graphical user interface to the user in various ways: for example, the interface may be rendered on a display screen of the terminal, or provided to the user through holographic projection. For instance, the local terminal device may include a display screen for presenting the graphical user interface, which includes an application screen, and a processor for running the application, generating the graphical user interface, and controlling its display on the display screen.
When the preset terminal is a local terminal device, it may be a desktop computer, a notebook computer, a tablet computer, a mobile terminal, a VR (Virtual Reality) device, or another terminal device. A VR device can comprise a computer, a VR headset, a VR controller, and the like; through the virtual three-dimensional house space displayed in the VR headset, a user can roam within a specified area, realistically touring a virtual house source while interacting with it through the VR controller.
The terminal can run application programs, such as lifestyle applications, audio applications, and game applications. Lifestyle applications can be further divided by type, such as house rental and sale applications, home service applications, and leisure and entertainment applications. The embodiment of the present application is illustrated with a lifestyle application running on a mobile terminal, and it is understood that the present invention is not limited thereto.
Referring to fig. 1, which is a flowchart of the steps of the first embodiment of the house source interaction method of the present invention, the content displayed through a graphical user interface of a preset terminal at least includes a three-dimensional house space of the house source, the three-dimensional house space includes at least one interaction area, and the interaction area may correspond to at least one reference object. The preset terminal may be the aforementioned local terminal device, or the aforementioned cloud display client; the following takes the local terminal device (in particular, a mobile terminal) as an example. The method specifically comprises the following steps:
step 101, in response to detecting that the current user position satisfies a target interaction area, displaying a first target reference object matched with the target interaction area in the target interaction area.
As an example, the information processing on the house source may include panoramic image acquisition, augmented reality processing, virtual reality processing, and the like, so that the online house source is no longer displayed only as simple pictures and videos; this both enriches the display modes of online house sources and improves the sense of reality when users hunt for houses online.
A house source may comprise multiple functional spaces, with different functional spaces serving different functions, such as bedrooms, living rooms, dining rooms, bathrooms, and balconies. When the house source is processed and displayed through panoramic acquisition, augmented reality, and virtual reality technologies, a 3D-space-based house source display can be provided, so that the user can browse the interior of the house source in 3D space, improving the realism of online house viewing. However, when browsing in a 3D space, the user cannot accurately sense the size of the space; spatial perception is poor, and even if the space is annotated with simple size information, the user still cannot judge how articles or people would fit and move within it. Therefore, an efficient way is needed to improve the spatial perception of users browsing house sources in a 3D space.
In the embodiment of the present invention, the content displayed on the graphical user interface of the terminal may include the three-dimensional house space of the functional space where the user is currently located; the three-dimensional house space may include the real furniture layout, design style, and the like of the current functional space. At the same time, a floor plan of the house source may be displayed through a floating layer in the graphical user interface, with the user's current location marked on it, so that as the user roams through the house source the marker moves along with the user. In addition, the three-dimensional house space may include at least one interaction area, and each interaction area may correspond to at least one reference object; that is, the terminal may display the corresponding reference object in the interaction area to improve the user's spatial perception of the three-dimensional house space.
For example, for different functional spaces, at least one interaction area can be set up within each functional space, and the reference object can be set according to the function of the space, so that the reference object matches the interaction area in the functional space, ensuring the authenticity and plausibility of the house source display.
In a specific implementation, during house source browsing, the terminal can collect the user's position within the house source space in real time to determine the current user position, then match the current user position against the interaction areas preset in the three-dimensional house space. If the matching succeeds, the reference object corresponding to the matched interaction area is displayed in that target interaction area, so that the user can improve their spatial perception in the 3D space according to the reference object.
The reference object may be a user object selected according to the user's own information, a common article object such as a sofa or a cabinet, or a reference object preset by the terminal, which is not limited in the present invention. By responding to user operations while the user browses the 3D space, a target reference object is provided, so that the user can improve their spatial perception in the 3D space according to the target reference object.
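The position-matching flow of step 101 can be sketched as a small example. This is a minimal illustration under assumed conventions, not the patent's implementation: the function names (`pick_region`, `show_reference_object`), the region dictionaries, and the one-metre threshold are all invented for the sketch.

```python
import math

def pick_region(user_pos, regions, threshold=1.0):
    """Return the first interaction region whose anchor point lies within
    `threshold` metres of the user's current position, or None."""
    ux, uy = user_pos
    for region in regions:
        cx, cy = region["center"]
        if math.hypot(ux - cx, uy - cy) <= threshold:
            return region
    return None

def show_reference_object(region):
    # A real client would render the 3D model in the region; here we just
    # report which reference object would be displayed.
    return region["reference_object"]

regions = [
    {"center": (2.0, 3.0), "reference_object": "sofa-scale person"},
    {"center": (8.0, 1.0), "reference_object": "kitchen-sink person"},
]

hit = pick_region((2.4, 3.3), regions)
if hit is not None:
    print(show_reference_object(hit))  # prints "sofa-scale person"
```

Walking outside every threshold (`pick_region((5.0, 5.0), regions)`) returns `None`, in which case no reference object is drawn.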
In the embodiment of the invention, the preset terminal can build a three-dimensional house space from the house source data of a physical house source and display it through a graphical user interface, where the three-dimensional house space can comprise at least one interaction area, and the interaction area can correspond to at least one reference object. When the user browses the three-dimensional house space, the terminal can respond to the user's current position satisfying a target interaction area by displaying the corresponding target reference object in that area, so that providing reference objects in the three-dimensional house space effectively improves the user's spatial perception of the current house source during browsing.
Referring to fig. 2, which is a flowchart illustrating steps of a second embodiment of the house source interaction method according to the present invention, the content displayed through the graphical user interface of the preset terminal at least includes a three-dimensional house space of the house source, where the three-dimensional house space may include at least one interaction area, and the interaction area may correspond to at least one reference object. The method specifically comprises the following steps:
step 201, in response to a user operation acting on the preset terminal, displaying a reference object definition interface, where the reference object definition interface comprises attribute information of a reference object;
step 202, in response to a user operation acting on the attribute information, acquiring target attribute information matched with the user operation;
in the embodiment of the invention, a house source can comprise several different functional spaces, and different types of furniture can be arranged in different functional spaces; with a single browsing mode the user cannot effectively perceive the current three-dimensional house space. Therefore, in the embodiment of the present invention, anchor points may be placed in the three-dimensional house space and corresponding interaction areas set; different functional spaces may each include at least one interaction area, and each interaction area corresponds to at least one reference object, which is not limited in the present invention.
To set an interaction area, at least one display point can be placed in the functional space, and the interaction area is determined as the circle with that point as its center and a preset distance as its radius; alternatively, the area occupied by furniture in the functional space, or a corresponding decoration area, may be used as the interaction area, which is not limited in the present invention.
The user can first set the target reference object: the terminal can respond to a user operation on the preset terminal by displaying a reference object definition interface that includes the attribute information of a reference object, and then respond to user operations on the attribute information by acquiring the target attribute information matched with those operations, so that the reference object is displayed according to the target attribute information. In this way the user can select attribute information according to personal preference, improving the personalization and diversity of the reference object's attributes.
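The two region definitions just mentioned, a circle around a display point and a furniture footprint, can be sketched as follows. The class and field names are assumptions introduced for illustration only; the disclosure does not prescribe a data model.

```python
import math

class CircularRegion:
    """Interaction area: circle around a display point, radius = preset distance."""
    def __init__(self, center, radius):
        self.center, self.radius = center, radius

    def contains(self, pos):
        return math.dist(pos, self.center) <= self.radius

class FootprintRegion:
    """Interaction area: axis-aligned rectangle covering a piece of furniture."""
    def __init__(self, x_min, y_min, x_max, y_max):
        self.bounds = (x_min, y_min, x_max, y_max)

    def contains(self, pos):
        x, y = pos
        x_min, y_min, x_max, y_max = self.bounds
        return x_min <= x <= x_max and y_min <= y <= y_max

sink_area = CircularRegion(center=(1.0, 1.0), radius=0.8)
sofa_area = FootprintRegion(0.0, 4.0, 2.5, 5.2)
print(sink_area.contains((1.5, 1.2)), sofa_area.contains((1.0, 4.5)))  # prints True True
```

Either shape exposes the same `contains` predicate, so the matching code in step 101 does not need to know which definition was used.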
In an example, referring to fig. 3, which shows a schematic diagram of a reference object definition interface in an embodiment of the present invention, the terminal may display a reference object control; when the user taps it, a reference object definition interface is displayed that contains a user object (a virtual character model) and the user object's attribute information. The user can set the height, weight, body type, and so on of the user object by adjusting the attribute information, and the terminal then displays the user object according to the target attribute information finally selected by the user. The reference object definition interface can be semitransparent, so that while personalizing the reference object the user avoids occluding the three-dimensional house space, improving the house-viewing experience.
It should be noted that the embodiment of the present invention includes but is not limited to the above examples. Under the guidance of the idea of the embodiment of the invention, a user may set the attribute information of several reference objects at once, so that the terminal displays multiple reference objects; the user may also display the reference object definition interface and set the attribute information by voice; and if the preset terminal is a VR terminal, the user may set the attribute information of the reference object with a handheld VR controller, and so on, which is not limited in the present invention.
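A minimal data model for the attribute information set through the definition interface might look like the following sketch. The field names, the value ranges, and the clamping behaviour are all assumptions for illustration; the patent only says that height, weight, body type, and the like are adjustable.

```python
from dataclasses import dataclass

@dataclass
class ReferenceObjectAttrs:
    height_cm: float = 170.0
    weight_kg: float = 65.0
    body_type: str = "average"   # e.g. "slim", "average", "broad"

    def apply_user_operation(self, field, value):
        """Store an attribute edit, clamping numeric values to plausible
        ranges so a slider or text input cannot produce an impossible model."""
        limits = {"height_cm": (100.0, 220.0), "weight_kg": (30.0, 150.0)}
        if field in limits:
            lo, hi = limits[field]
            value = min(max(float(value), lo), hi)
        setattr(self, field, value)

attrs = ReferenceObjectAttrs()
attrs.apply_user_operation("height_cm", 300)   # out of range, clamped to 220.0
attrs.apply_user_operation("body_type", "slim")
print(attrs)
```

The resulting object is the "target attribute information" of step 202; the renderer would scale the virtual character model from these values.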
Step 203, in response to detecting that the current user position meets a target interaction area, displaying a first target reference object matched with the target interaction area in the target interaction area;
when the user browses the three-dimensional house space, the terminal may acquire the user's position in real time. After acquiring the current position information of the user, the terminal may match the position information against a plurality of preset interaction areas; if the matching succeeds, the user is located in the corresponding area, and the associated target reference object may be displayed in the corresponding target interaction area.
In an example, if the distance between the user's position and an interaction area is less than or equal to a preset distance threshold, it may be determined that the user's current position information satisfies the interaction area, and the corresponding target reference object is displayed in the interaction area. For example, referring to fig. 4, which shows a schematic diagram of a reference object in an embodiment of the present invention, in the three-dimensional functional space where a kitchen is located, a first interaction area corresponding to a sink and a second interaction area corresponding to a gas cooker are provided. When the user enters the kitchen, if the distance between the current position and the first interaction area and/or the second interaction area is less than or equal to the preset distance threshold, a corresponding user object is displayed in the first interaction area and/or the second interaction area. By displaying at least one corresponding user object in the kitchen, the user's spatial perception of the kitchen and its sense of reality are improved.
Referring to fig. 5, which shows a schematic diagram of a reference object in an embodiment of the present invention, in the three-dimensional functional space where a living room is located, a corresponding interaction area may be set for a sofa. When the user moves near the interaction area, a corresponding user object may be displayed on the sofa, thereby improving the user's perception of the size of the sofa and of the living room space.
It should be noted that the distance between the user and the interaction area may be the distance between the user's position and the boundary of the interaction area, the distance between the user's position and the center of the interaction area, or the like, which is not limited by the present invention.
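The position-matching step described above can be sketched as follows, assuming each interaction area is modeled as a circle (center plus radius); the geometry, the dictionary layout, and all names are illustrative assumptions, not part of the specification:

```python
import math

def center_distance(user_pos, area_center):
    # Euclidean distance from the user to the center of the area.
    return math.dist(user_pos, area_center)

def boundary_distance(user_pos, area_center, area_radius):
    # Distance from the user to the area boundary; 0.0 if already inside.
    return max(0.0, center_distance(user_pos, area_center) - area_radius)

def matching_areas(user_pos, areas, threshold):
    """Return every interaction area the current user position satisfies."""
    return [a for a in areas
            if boundary_distance(user_pos, a["center"], a["radius"]) <= threshold]

# Two interaction areas in the kitchen example: a sink and a gas cooker.
kitchen = [{"name": "sink", "center": (0.0, 0.0), "radius": 1.0},
           {"name": "gas_cooker", "center": (5.0, 0.0), "radius": 1.0}]
hits = matching_areas((1.5, 0.0), kitchen, threshold=1.0)
print([a["name"] for a in hits])  # ['sink']
```

Whether `boundary_distance` or `center_distance` is compared against the threshold corresponds to the two distance definitions mentioned in the preceding paragraph.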
Step 204, responding to a user operation acting on the first target reference object, and controlling the target reference object to execute an interactive operation corresponding to the user operation;
in an optional embodiment of the present invention, the target reference object may correspond to at least one interactive control, and the terminal controls the reference object to execute an interactive operation corresponding to a user operation in response to the user operation acting on the interactive control. The interactive controls may include a movement control, a rotation control, and a deletion control.
In an example, as shown in fig. 6, which shows a schematic diagram of interactive controls in an embodiment of the present invention, when the user selects a reference object in the three-dimensional house space, the terminal may provide interactive controls such as a rotation control, a movement control, and a deletion control for the reference object, while displaying the attribute parameters of the currently selected reference object. The user may rotate the reference object through the rotation control, move it through the movement control, or delete it through the deletion control; when the reference object is deselected, the interactive controls and the attribute parameters are hidden.
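The control dispatch described above can be sketched as follows; the class, the control names, and the payload shapes are assumptions made for illustration:

```python
class ReferenceObject:
    """Hypothetical minimal state of a displayed reference object."""
    def __init__(self):
        self.position = (0.0, 0.0)  # position in the house space
        self.angle = 0.0            # display angle in degrees
        self.visible = True

def handle_control(obj, control, payload=None):
    """Execute the interactive operation corresponding to the user operation."""
    if control == "move":
        dx, dy = payload
        x, y = obj.position
        obj.position = (x + dx, y + dy)
    elif control == "rotate":
        obj.angle = (obj.angle + payload) % 360
    elif control == "delete":
        obj.visible = False  # hide rather than destroy, so it can be re-added
    return obj

obj = ReferenceObject()
handle_control(obj, "move", (1.0, 2.0))
handle_control(obj, "rotate", 450)
print(obj.position, obj.angle)  # (1.0, 2.0) 90.0
```

The same dispatcher could also serve the drag, slide, and zoom gestures mentioned below by translating each gesture into one of these control operations first.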
In addition, for standard reference objects, the user may also perceive the spatial information of the current house source by superimposing identical or different standard reference objects, so that the user can perceive the spatial information of the house source from different angles, different positions, and the like. This enriches the ways in which the user browses the house source and further improves the user's spatial perception of it.
It should be noted that the embodiments of the present invention include, but are not limited to, the above examples. It can be understood that, under the guidance of the idea of the embodiments of the present invention, a person skilled in the art may configure the interaction according to actual situations; for example, the terminal may respond directly to a user operation on the target reference object: the user may select and drag the target reference object to move its position, rotate it through a sliding operation, adjust its attribute parameters through a zooming operation, and the like, which is not limited by the present invention.
And step 205, responding to the sliding operation acting on the three-dimensional house space, and carrying out scene angle switching on the three-dimensional house space.
In a specific implementation, the user may switch the angle of the scene of the three-dimensional house space through a sliding operation on the three-dimensional house space displayed on the graphical user interface; for example, the current three-dimensional house space may be moved to the left by sliding left, moved to the right by sliding right, and so on. For a VR terminal, the scene angle of the three-dimensional house space may be switched by the user turning his or her head, which is not limited by the present invention.
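One way to map a horizontal slide gesture onto the scene-angle switch of step 205 is to convert the swipe distance into a camera yaw change; the sensitivity constant and function name below are assumptions for the sketch:

```python
# Assumed sensitivity: how many degrees of yaw one pixel of swipe produces.
DEGREES_PER_PIXEL = 0.25

def swipe_to_yaw(current_yaw_deg: float, delta_x_px: float) -> float:
    """A left swipe (negative delta) turns the view left, and vice versa."""
    return (current_yaw_deg + delta_x_px * DEGREES_PER_PIXEL) % 360

yaw = swipe_to_yaw(0.0, -120.0)   # a 120-pixel left swipe
print(yaw)  # 330.0
```

For a VR terminal the same yaw value would instead come from the headset's head-tracking pose rather than from a touch gesture.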
In the embodiment of the present invention, the preset terminal may establish the three-dimensional house space according to house source data of the entity house source and display the three-dimensional house space through the graphical user interface, where the three-dimensional house space may include at least one interaction area, and the interaction area may correspond to at least one reference object.
Referring to fig. 7, which is a flowchart illustrating a third step of an interaction method embodiment of a house source according to the present invention, content displayed through a graphical user interface of a preset terminal at least includes a three-dimensional house space of the house source and a reference object control for the house source, where the three-dimensional house space includes at least one interaction area, and the interaction area corresponds to at least one reference object. The method specifically comprises the following steps:
step 701, in response to detecting that the current user position meets a target interaction region, displaying a first target reference object matched with the target interaction region in the target interaction region;
when the user browses the three-dimensional house space, the terminal may acquire the user's position in real time. After acquiring the current position information of the user, the terminal may match the position information against a plurality of preset interaction areas; if the matching succeeds, the user is located in the corresponding area, and the associated target reference object may be displayed in the corresponding target interaction area.
Step 702, responding to the user operation acting on the reference object control, displaying a second target reference object matched with the user operation in the target interaction area, or updating the first target reference object into a third target reference object matched with the user operation.
In the embodiment of the present invention, after the terminal displays the target reference object, the user may add another target reference object in the target interaction area through the reference object control displayed by the terminal, or change the originally displayed target reference object through the reference object control. This increases the user's initiative, allowing the user to add any number of reference objects to the three-dimensional house space, or to adjust them, according to his or her needs, thereby further improving the user's spatial perception of the 3D space.
In a specific implementation, while the user browses the house source, the terminal may display the target reference object on the screen in response to a user operation. The target reference object may be a user object selected according to the user's own information, or a conventional article object such as a sofa or a cabinet. By responding to user operations in the 3D space that the user is browsing and providing the target reference object, the user's spatial perception in the 3D space is improved.
In an optional embodiment of the present invention, the terminal may respond to a user operation acting on the reference object control, and display a reference object selection interface, where the reference object selection interface includes at least one reference object and attribute information for the reference object; selecting a second target reference object in response to a user operation on the reference object; acquiring first target attribute information for a second target reference object in response to a user operation acting on the attribute information; and displaying the second target reference object in the three-dimensional house space according to the first target attribute information.
In a specific implementation, the content displayed on the graphical user interface of the terminal may further include a reference object control. When the user touches the reference object control, the terminal may display a reference object selection interface in response to the user operation; the reference object selection interface includes at least one reference object and attribute information of the reference object, and the reference objects may include article objects and user objects. An article object may be a conventional furniture object, such as a sofa or a cabinet; a user object may be a virtual human model determined according to the height and body-type information set by the user. After the user selects a target reference object in the reference object selection interface, attribute information corresponding to the target reference object may be provided in the interface for the user to select, thereby realizing personalized customization of the target reference object. After the user sets the attribute information of the target reference object, the terminal may display the target reference object in the three-dimensional house space according to the target attribute information; for example, when the terminal has already displayed a first target reference object, at least one second target reference object set by the user may be added, and so on. By providing the target reference object, the user's spatial perception while browsing the house source in the 3D space is improved.
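The selection flow described above — open the selection interface, pick a reference object, pick its attribute information, then place it in the scene — can be sketched as follows; the catalog contents and every name are illustrative assumptions:

```python
# Hypothetical catalog behind the reference object selection interface:
# each reference object kind offers one selectable attribute with options.
CATALOG = {
    "human":   {"body_type": ["slim", "medium", "large"]},
    "sofa":    {"size": ["single", "folding", "combination"]},
    "cabinet": {"size": ["small", "large"]},
}

def select_reference_object(scene, kind, attribute, choice):
    """Validate the user's choice against the catalog and place the object."""
    options = CATALOG[kind][attribute]
    if choice not in options:
        raise ValueError(f"{choice!r} is not an available {attribute}")
    scene.append({"kind": kind, attribute: choice})
    return scene

scene = []  # reference objects currently displayed in the 3D house space
select_reference_object(scene, "sofa", "size", "folding")
print(scene)  # [{'kind': 'sofa', 'size': 'folding'}]
```

Adding a second target reference object while a first is already displayed is just another `select_reference_object` call appending to the same scene list.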
In an example, as shown in fig. 8, which shows a schematic diagram of a reference object control in an embodiment of the present invention, a reference object control is provided in the graphical user interface of the terminal. When the user touches the reference object control, the terminal may display a reference object selection interface; as shown in fig. 9, which shows a schematic diagram of a reference object selection interface in an embodiment of the present invention, the reference object selection interface includes reference objects such as, but not limited to, a human-shaped reference object, a sofa reference object, a cabinet reference object, and kitchen and bathroom reference objects. When the user selects the human-shaped reference object, attribute information of different body types may be provided in the reference object selection interface, so that the user can select the attribute information that best fits his or her own height and body type, realizing personalized customization of the reference object; combining the preset reference objects provided by the terminal with the personalized reference objects set by the user further improves the user's spatial perception of the 3D space. As shown in fig. 10, which shows a schematic diagram of a reference object in an embodiment of the present invention, after the user determines the attribute information of the human-shaped reference object, the terminal may display the human-shaped reference object in the three-dimensional house space according to the attribute information selected by the user. Providing a human-shaped reference object similar to the user in the three-dimensional house space improves the user's spatial perception while browsing the house source in the 3D space.
In addition, as shown in fig. 11, which shows a schematic diagram of a reference object selection interface in an embodiment of the present invention, when the user selects the sofa reference object, the terminal may provide attribute information of different sizes in the reference object selection interface for the user to select, for example, sofa reference objects of different sizes such as single, folding, and combination sofas. The terminal may then provide the corresponding sofa reference object in the three-dimensional house space according to the attribute information selected by the user, so that the user can plan the placement of articles in the house source according to the reference object, thereby improving spatial perception of the house source.
In another optional embodiment of the present invention, the terminal may further acquire, in response to a user operation acting on the attribute information, second target attribute information for the first target reference object, and then update the first target reference object in the three-dimensional house space to a third target reference object matching the second target attribute information. Specifically, in one case, if the target reference object is a default reference object of the terminal, the user may set its attributes to realize a personalized display of the reference object; in another case, if the target reference object is displayed according to attribute information previously set by the user, the user may set the attribute information again to adjust it. By adjusting the attributes of the reference object, the user can perceive the 3D space at different angles, scales, and the like, further improving the user's spatial perception.
In another optional embodiment of the present invention, the terminal may, in response to a user operation acting on the reference object control, select a second target reference object for the house source and display an attribute selection interface for the second target reference object; acquire target attribute information for the second target reference object in response to a user operation acting on the attribute selection interface; and then display the second target reference object in the three-dimensional house space according to the target attribute information. The attribute selection interface may include a size attribute and an angle attribute, among others.
In a specific implementation, the terminal may further provide a reference object control for a standard reference object. When the user touches the reference object control, the terminal may display an attribute selection interface for the standard reference object, so that the user can select corresponding attribute information, such as a size attribute and an angle attribute, in the attribute selection interface to personalize the standard reference object. After the user finishes setting the attribute information, the terminal may display the standard reference object in the three-dimensional house space according to the target attribute information; by providing the standard reference object, the user's spatial perception while browsing the house source in the 3D space is improved.
In an example, as shown in fig. 12, which shows a schematic diagram of an attribute selection interface in an embodiment of the present invention, after the user touches the reference object control, the terminal may present a pop-up window for adding a standard reference object. In the pop-up window, the user can adjust the size and the display angle of the standard reference object. As shown in fig. 8, which shows a second schematic diagram of a reference object in an embodiment of the present invention, after the user adjusts the reference object, the standard reference object is displayed according to the attribute information set by the user, so that the user can plan the placement of articles in the house source according to the reference object, improving spatial perception of the house source.
It should be noted that, unlike the reference object control in the preceding optional embodiment, the reference object control in this optional embodiment may provide regular geometric reference objects such as a cube, a cuboid, and a cylinder, whereas the reference object control in the preceding optional embodiment may provide concrete reference objects such as user objects and article objects. The former may be used to perceive the spatial dimensions of the house source, while the latter may be used to perceive the user's movement within the house source and the placement of furniture in it; using different reference objects improves the user's spatial perception of the house source from different angles.
In addition, the embodiments of the present invention are described taking a reference object control provided by a mobile terminal as an example. It can be understood that, under the guidance of the idea of the embodiments of the present invention, a person skilled in the art may also provide reference objects for the user in other ways; for example, when the terminal is a VR terminal, a reference object control may be provided in the virtual house source environment where the user is located, and when the user interacts with the reference object control, different reference objects are provided for the user to select from, and the selected reference object is displayed in the virtual house source environment, thereby improving the user's spatial perception and immersion while browsing the house source in the 3D space, which is not limited by the present invention.
It should be noted that, for simplicity of description, the method embodiments are described as a series of action combinations; however, those skilled in the art will recognize that the present invention is not limited by the described order of actions, as some steps may, in accordance with the embodiments of the present invention, be performed in other orders or concurrently. Furthermore, those skilled in the art will also appreciate that the embodiments described in the specification are preferred embodiments, and the actions involved are not necessarily required by the present invention.
Referring to fig. 13, which shows a block diagram of an embodiment of an interaction apparatus for a house source according to the present invention, the content displayed through the graphical user interface of a preset terminal at least includes a three-dimensional house space of the house source, where the three-dimensional house space includes at least one interaction area, and the interaction area corresponds to at least one reference object. The preset terminal may be the aforementioned local terminal device or the aforementioned cloud display client; the local terminal device (in particular, a mobile terminal) is taken as an example below. The apparatus specifically comprises the following modules:
a first reference object display module 1301, configured to, in response to detecting that the current user position satisfies the target interaction region, display a first target reference object matching the target interaction region in the target interaction region.
In an optional embodiment of the present invention, the content displayed by the graphical user interface further includes a reference object control for the house source, and the apparatus further comprises:
and the second reference object display module is used for responding to the user operation acting on the reference object control, displaying a second target reference object matched with the user operation in the target interaction area, or updating the first target reference object into a third target reference object matched with the user operation.
In an optional embodiment of the present invention, further comprising:
and the interactive operation execution module is used for responding to the user operation acting on the first target reference object and controlling the first target reference object to execute the interactive operation corresponding to the user operation.
In an optional embodiment of the present invention, further comprising:
the reference object definition interface display module is used for responding to user operation acting on the preset terminal and displaying a reference object definition interface, and the reference object definition interface comprises attribute information of a reference object;
the first attribute information acquisition module is used for responding to user operation acting on the attribute information and acquiring target attribute information matched with the user operation;
The first reference object display module 1301 is specifically configured to:
and in response to the fact that the current user position is detected to meet the target interaction area, displaying a first target reference object matched with the target interaction area in the target interaction area according to the target attribute information.
In an optional embodiment of the invention, the second reference object display module comprises:
the reference object selection interface display sub-module is used for responding to the user operation acting on the reference object control and displaying a reference object selection interface, and the reference object selection interface comprises at least one reference object and attribute information aiming at the reference object;
a reference object selection sub-module for selecting a second target reference object in response to a user operation acting on the reference object;
a first attribute information selection submodule configured to acquire first target attribute information for the second target reference object in response to a user operation acting on the attribute information;
and the first reference object display submodule is used for displaying the second target reference object in the three-dimensional house space according to the first target attribute information.
In an optional embodiment of the invention, the second reference object display module comprises:
a second attribute information selection submodule for acquiring second target attribute information for the first target reference object in response to a user operation acting on the attribute information;
and the second reference object display submodule is used for updating the first target reference object into a third target reference object matched with the second target attribute information in the three-dimensional house space.
In an optional embodiment of the invention, the second reference object display module comprises:
a reference object selection submodule for selecting a second target reference object for the house source in response to a user operation on the reference object control, and presenting a property selection interface for the second target reference object;
an attribute information determination submodule configured to determine, in response to a user operation acting on the attribute selection interface, target attribute information for the second target reference object;
and the reference object display submodule is used for displaying the second target reference object in the three-dimensional house space according to the target attribute information.
In an optional embodiment of the present invention, the first target reference object corresponds to at least one interactive control, and the interactive operation execution module is specifically configured to:
and responding to the user operation acted on the interactive control, and controlling the first target reference object to execute the interactive operation corresponding to the user operation.
In an optional embodiment of the present invention, further comprising:
and the scene angle switching module is used for responding to the visual angle switching operation of the user and switching the scene angle of the three-dimensional house space.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
An embodiment of the present invention further provides an electronic device, including:
one or more processors; and
one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the electronic device to perform methods as described in embodiments of the invention.
Embodiments of the invention also provide one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the methods described in embodiments of the invention.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The house source interaction method and the house source interaction device provided by the invention are described in detail, and the principle and the implementation mode of the invention are explained by applying specific examples, and the description of the examples is only used for helping to understand the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (12)

1. An interaction method for a house source is characterized in that contents displayed through a graphical user interface of a preset terminal at least comprise a three-dimensional house space of the house source, the three-dimensional house space comprises at least one interaction area, and the interaction area corresponds to at least one reference object, and the method comprises the following steps:
in response to detecting that the current user position satisfies a target interaction region, presenting a first target reference object matching the target interaction region in the target interaction region.
2. The method of claim 1, wherein the content presented by the graphical user interface further comprises a reference object control for the house source, the method further comprising:
in response to a user operation acting on the reference object control, displaying, in the target interaction region, a second target reference object matching the user operation, or updating the first target reference object to a third target reference object matching the user operation.
3. The method of claim 1, further comprising:
in response to a user operation acting on the first target reference object, controlling the first target reference object to perform an interactive operation corresponding to the user operation.
4. The method of any of claims 1 to 3, further comprising:
in response to a user operation acting on the preset terminal, displaying a reference object definition interface, wherein the reference object definition interface comprises attribute information of a reference object;
in response to a user operation acting on the attribute information, acquiring target attribute information matching the user operation;
the presenting, in response to detecting that the current user position satisfies a target interaction region, a first target reference object matching the target interaction region in the target interaction region includes:
and in response to detecting that the current user position satisfies the target interaction region, displaying, in the target interaction region according to the target attribute information, a first target reference object matching the target interaction region.
5. The method of claim 2, wherein the displaying, in the target interaction region in response to the user operation acting on the reference object control, a second target reference object matching the user operation comprises:
in response to the user operation acting on the reference object control, displaying a reference object selection interface, wherein the reference object selection interface comprises at least one reference object and attribute information for the reference object;
selecting a second target reference object in response to a user operation acting on the reference object;
acquiring first target attribute information for the second target reference object in response to a user operation acting on the attribute information;
and displaying the second target reference object in the three-dimensional house space according to the first target attribute information.
6. The method of claim 5, wherein the updating, in the target interaction region in response to the user operation acting on the reference object control, the first target reference object to a third target reference object matching the user operation comprises:
acquiring second target attribute information for the first target reference object in response to a user operation acting on the attribute information;
and updating, in the three-dimensional house space, the first target reference object to a third target reference object matching the second target attribute information.
7. The method of claim 2, wherein the displaying, in the three-dimensional house space in response to the user operation acting on the reference object control, a second target reference object corresponding to the user operation comprises:
in response to the user operation acting on the reference object control, selecting a second target reference object for the house source, and displaying an attribute selection interface for the second target reference object;
determining target attribute information for the second target reference object in response to a user operation acting on the attribute selection interface;
and displaying the second target reference object in the three-dimensional house space according to the target attribute information.
8. The method of claim 3, wherein the first target reference object corresponds to at least one interactive control, and wherein controlling the first target reference object to perform an interactive operation corresponding to the user operation in response to the user operation on the first target reference object comprises:
in response to a user operation acting on the interactive control, controlling the first target reference object to perform the interactive operation corresponding to the user operation.
9. The method of claim 1 or 2, further comprising:
in response to a view-angle switching operation by the user, switching the scene angle of the three-dimensional house space.
10. An interaction device for a house source, characterized in that content displayed through a graphical user interface of a preset terminal at least comprises a three-dimensional house space of the house source, the three-dimensional house space comprises at least one interaction region, and each interaction region corresponds to at least one reference object, the device comprising:
a reference object display module, configured to display a first target reference object matching a target interaction region in response to detecting that the current user position satisfies the target interaction region.
11. An electronic device, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the electronic device to perform the method of any of claims 1-9.
12. One or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the method of any of claims 1-9.
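The core logic of claim 1 — when the current user position satisfies (enters) a target interaction region of the three-dimensional house space, a reference object matching that region is displayed — can be sketched as follows. This is a minimal illustrative model only, not the patented implementation: all class and member names (`InteractionRegion`, `HouseSpace`, `on_user_position`) and the rectangular region geometry are hypothetical simplifications.

```python
# Illustrative sketch of claim 1's position-triggered display.
# Regions are simplified to axis-aligned rectangles on the floor plan;
# a real system would use the 3D house-space geometry.
from dataclasses import dataclass, field


@dataclass
class InteractionRegion:
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    reference_object: str  # reference object matched to this region

    def contains(self, x: float, y: float) -> bool:
        # Claim 1's "current user position satisfies a target interaction region"
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


@dataclass
class HouseSpace:
    regions: list
    displayed: dict = field(default_factory=dict)  # region name -> shown object

    def on_user_position(self, x: float, y: float) -> dict:
        """When the user position falls inside a region, show (once) the
        first target reference object matching that region."""
        for region in self.regions:
            if region.contains(x, y) and region.name not in self.displayed:
                self.displayed[region.name] = region.reference_object
        return self.displayed


living_room = InteractionRegion("living_room", 0, 5, 0, 4, reference_object="sofa")
space = HouseSpace(regions=[living_room])
print(space.on_user_position(2.0, 1.0))  # user inside the region: sofa is shown
```

Claims 2 and 4–7 layer onto the same model: a reference object control and attribute selection interface would replace or re-style the entry in `displayed` according to user-chosen target attribute information.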
CN202010942227.3A 2020-09-09 2020-09-09 House source interaction method and device Pending CN112051956A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010942227.3A CN112051956A (en) 2020-09-09 2020-09-09 House source interaction method and device

Publications (1)

Publication Number Publication Date
CN112051956A 2020-12-08

Family

ID=73609953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010942227.3A Pending CN112051956A (en) 2020-09-09 2020-09-09 House source interaction method and device

Country Status (1)

Country Link
CN (1) CN112051956A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112764629A (en) * 2021-01-28 2021-05-07 北京城市网邻信息技术有限公司 Augmented reality interface display method, device, equipment and computer readable medium
CN114463104A (en) * 2022-04-12 2022-05-10 贝壳技术有限公司 Method, apparatus and computer program product for processing VR scenarios
CN113918044B (en) * 2021-09-14 2022-06-14 北京城市网邻信息技术有限公司 Information display method and device, electronic equipment and readable medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120157123A1 (en) * 2010-12-15 2012-06-21 Google Inc. Peer-to-peer location service
CN103248671A (en) * 2013-03-13 2013-08-14 王龙飞 Service delivery method, device and server
CN111610997A (en) * 2020-05-26 2020-09-01 北京市商汤科技开发有限公司 AR scene content generation method, display system and device



Similar Documents

Publication Publication Date Title
US11422671B2 (en) Defining, displaying and interacting with tags in a three-dimensional model
US10755485B2 (en) Augmented reality product preview
US9940404B2 (en) Three-dimensional (3D) browsing
US11348315B2 (en) Generating and presenting a 3D virtual shopping environment
US20180225885A1 (en) Zone-based three-dimensional (3d) browsing
WO2019126002A1 (en) Recommending and presenting products in augmented reality
US20150185825A1 (en) Assigning a virtual user interface to a physical object
CN112051956A (en) House source interaction method and device
KR20230108352A (en) Matching content to a spatial 3d environment
US20130024819A1 (en) Systems and methods for gesture-based creation of interactive hotspots in a real world environment
CN112232900A (en) Information display method and device
CN112068751A (en) House resource display method and device
CN113178015A (en) House source interaction method and device, electronic equipment and storage medium
CN112068754B (en) House resource display method and device
CN112596694B (en) Method and device for processing house source information
CN112053204A (en) House resource display method and device
CN114442872A (en) Layout and interaction method of virtual user interface and three-dimensional display equipment
CN117337443A (en) Customized virtual store
US20220254114A1 (en) Shared mixed reality and platform-agnostic format
JP7287509B2 (en) Method and apparatus for displaying item information in current space and media
CN111494948B (en) Editing method of game lens, electronic equipment and storage medium
WO2020259328A1 (en) Interface generation method, computer device and storage medium
CN112651801B (en) Method and device for displaying house source information
CN115624740A (en) Virtual reality equipment, control method, device and system thereof, and interaction system
US20200327699A1 (en) Image processing device, image providing server, image display method, and image provision method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination