CN111710047A - Information display method and device and electronic equipment

Information display method and device and electronic equipment

Info

Publication number: CN111710047A
Application number: CN202010509747.5A
Authority: CN (China)
Prior art keywords: virtual, user, difference, determining, terminal device
Legal status: Pending (the listed status is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 吴畏
Current assignee: Beijing Youzhuju Network Technology Co Ltd
Original assignee: Beijing Youzhuju Network Technology Co Ltd
Application filed by Beijing Youzhuju Network Technology Co Ltd


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

Embodiments of the present disclosure disclose an information display method, an information display apparatus, and an electronic device. The method includes: while first virtual three-dimensional space information of a target building is displayed in a terminal device at a first viewing angle of a user, monitoring a posture adjustment operation performed by the user on the terminal device; in response to detecting the posture adjustment operation performed on the terminal device by the user, determining a second viewing angle of the user based on the posture adjustment operation; and displaying second virtual three-dimensional space information of the target building at the second viewing angle. Because the viewing angle is switched by monitoring the user's posture adjustment operation, the user does not need to touch any virtual viewing-angle adjustment button on the display interface of the terminal device, which improves the user experience.

Description

Information display method and device and electronic equipment
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to an information display method and apparatus, and an electronic device.
Background
While browsing building information through a terminal device, a user can achieve the effect of touring the building in person by browsing a virtual scene provided by a terminal device that applies virtual reality (VR) technology.
While browsing the virtual scene, the user can adjust the virtual scene by switching the viewing angle, so as to browse every corner of the building.
Disclosure of Invention
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Embodiments of the present disclosure provide an information display method, an information display apparatus, and an electronic device, which achieve viewing-angle switching by monitoring a user's posture adjustment operation, without the user touching any virtual viewing-angle adjustment button on the display interface of the terminal device, thereby improving the user experience.
In a first aspect, an embodiment of the present disclosure provides an information display method, the method including: while first virtual three-dimensional space information of a target building is displayed in a terminal device at a first viewing angle of a user, monitoring a posture adjustment operation performed by the user on the terminal device; in response to detecting the posture adjustment operation performed on the terminal device by the user, determining a second viewing angle of the user based on the posture adjustment operation; and displaying second virtual three-dimensional space information of the target building at the second viewing angle.
In a second aspect, an embodiment of the present disclosure provides an information display apparatus, including: a monitoring module, configured to monitor a posture adjustment operation performed by a user on a terminal device while first virtual three-dimensional space information of a target building is displayed in the terminal device at a first viewing angle of the user; a determining module, configured to determine, in response to detecting the posture adjustment operation performed on the terminal device by the user, a second viewing angle of the user based on the posture adjustment operation; and a display module, configured to display second virtual three-dimensional space information of the target building at the second viewing angle.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; and a storage device on which one or more programs are stored, where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the information display method of the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable medium on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the information display method according to the first aspect.
According to the information display method, the information display apparatus, and the electronic device provided by the embodiments of the present disclosure, while first virtual three-dimensional space information of a target building is displayed in a terminal device at a first viewing angle of a user, a posture adjustment operation performed by the user on the terminal device is monitored; then, in response to detecting the posture adjustment operation performed by the user on the terminal device, a second viewing angle of the user is determined based on the posture adjustment operation; finally, second virtual three-dimensional space information of the target building is displayed at the second viewing angle. Because the viewing angle is switched by monitoring the user's posture adjustment operation, the user does not need to touch any virtual viewing-angle adjustment button on the display interface of the terminal device, which improves the user experience.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
FIG. 1 is a flowchart of one embodiment of an information display method according to the present disclosure;
FIG. 2 is a flowchart of one embodiment of determining a second viewing angle according to the present disclosure;
FIG. 3 is a schematic structural diagram of one embodiment of an information display apparatus according to the present disclosure;
FIG. 4 is an exemplary system architecture to which the information display method of one embodiment of the present disclosure may be applied;
FIG. 5 is a schematic diagram of a basic structure of an electronic device provided according to an embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting, and those skilled in the art will understand that they should be read as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict.
Referring to FIG. 1, a flowchart of an embodiment of an information display method according to the present disclosure is shown. As shown in FIG. 1, the information display method includes the following steps 101 to 103.
Step 101: while first virtual three-dimensional space information of a target building is displayed in a terminal device at a first viewing angle of a user, monitor a posture adjustment operation performed by the user on the terminal device.
The terminal device includes a mobile device that applies virtual reality technology, for example, a mobile phone or a tablet computer.
An application for browsing virtual three-dimensional space information of the target building may be installed in the terminal device, and the user can browse the virtual three-dimensional space information of the target building through the application. Alternatively, the user can browse the virtual three-dimensional space information of the target building through a browser.
When the virtual three-dimensional space information of the target building is presented in the terminal device, the user can roam in the virtual three-dimensional space.
In practice, the process of constructing the virtual three-dimensional space of the target building includes organizing collected picture information of the building and constructing the virtual three-dimensional space corresponding to the building from that picture information. It should be noted that constructing a virtual three-dimensional space of a building from multiple pictures of the building is a well-known technology that is widely applied and researched at present, and is not described here again.
When the user is in the virtual three-dimensional space of the target building, the terminal device may display the first virtual three-dimensional space information of the target building at the user's current viewing angle (the first viewing angle). For example, if the target building is house listing A, the terminal device may display information of the living room at the first viewing angle. The information of the living room may include specific scene information such as a tea table and a sofa, and may also include actual floor-height information of the living room; for example, if the floor height is 3 m, display information representing "floor height 3 m" may be shown. Here, the information of the living room can be regarded as the first virtual three-dimensional space information of house listing A.
The virtual three-dimensional space may include virtual information of each part of the corresponding real space. When the user is browsing the first virtual three-dimensional space information and wants to browse information of other parts of the virtual three-dimensional space, the user may perform a posture adjustment operation on the terminal device. The posture adjustment operation may include operations such as flicking or rotating the terminal device in any direction. In some application scenarios, the posture adjustment operation may be, relative to the posture of the terminal device at the current time, swinging the terminal device to the right, swinging it to the left, tilting it forward, tilting it backward, and the like.
In practice, the terminal device may monitor adjustments to its posture by monitoring changes in the parameters of its associated sensors, which may include, for example, a gyroscope and a gravitational acceleration sensor. It should be noted that determining the posture of a terminal device from changes in the parameters of its sensors is a well-known technology that is widely studied and applied at present, and is not described here again.
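By way of illustration only (this sketch is not part of the patent text), on an Android terminal device such monitoring could be built on the platform sensor framework; the class name, callback shape, and sampling rate below are assumptions:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Hypothetical monitor that watches the rotation-vector sensor and reports the
// device's orientation angles (azimuth, pitch, roll, in radians) on every change.
class PostureMonitor(
    context: Context,
    private val onPostureChanged: (FloatArray) -> Unit
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val rotationSensor: Sensor? =
        sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR)

    fun start() {
        rotationSensor?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val rotationMatrix = FloatArray(9)
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        val orientation = FloatArray(3) // [azimuth, pitch, roll]
        SensorManager.getOrientation(rotationMatrix, orientation)
        onPostureChanged(orientation)
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```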
Step 102: in response to detecting the posture adjustment operation performed by the user on the terminal device, determine a second viewing angle of the user based on the posture adjustment operation.
After detecting a posture adjustment operation, the terminal device can analyze it to determine the specific posture adjustment operation performed by the user. For example, if the terminal device detects that its posture has rotated 20° clockwise relative to the previous posture, this can be regarded as detecting that the user performed the specific posture adjustment operation of rotating the terminal device to the right.
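A minimal sketch of such analysis, assuming the orientation change has already been reduced to yaw and pitch deltas in degrees; the 15° dead zone and the operation names are assumptions, not part of the patent:

```kotlin
import kotlin.math.abs

// Hypothetical classification of a posture change into one of the specific
// operations discussed above. Deltas are relative to the previous posture;
// positive yaw is taken as rightward, positive pitch as forward.
enum class PostureOp { SWING_LEFT, SWING_RIGHT, TILT_FORWARD, TILT_BACKWARD, NONE }

fun classify(deltaYawDeg: Float, deltaPitchDeg: Float, deadZoneDeg: Float = 15f): PostureOp =
    when {
        abs(deltaYawDeg) < deadZoneDeg && abs(deltaPitchDeg) < deadZoneDeg -> PostureOp.NONE
        abs(deltaYawDeg) >= abs(deltaPitchDeg) ->
            if (deltaYawDeg > 0) PostureOp.SWING_RIGHT else PostureOp.SWING_LEFT
        deltaPitchDeg > 0 -> PostureOp.TILT_FORWARD
        else -> PostureOp.TILT_BACKWARD
    }
```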
After the terminal device has determined the specific posture adjustment operation performed by the user, the second viewing angle of the user can be determined based on that operation. The second viewing angle here may be the viewing angle of the user in the virtual three-dimensional space after the posture adjustment. For example, while browsing the living room of house listing A, if the user wants to browse the dining room behind the living room, the user may flick the terminal device to the left or to the right. The terminal device then detects this specific posture adjustment operation, and the viewing angle after the posture adjustment may be regarded as the second viewing angle. While adjusting from the living room to the dining room, the terminal device can detect the adjusted postures in real time, and the user viewing angle corresponding to each posture can be regarded as the second viewing angle at that moment. For example, when adjusting from the viewing angle corresponding to the living room, swinging the terminal device 30° to the right may yield a second viewing angle corresponding to the corridor; continuing to swing the terminal device to the right may then yield a second viewing angle corresponding to the dining room.
In some application scenarios, a fixed second viewing angle may also be set for each posture adjustment operation. For example, for the posture adjustment operation of adjusting the terminal device to the right, a unique second viewing angle may be determined; that is, whether the terminal device is adjusted rightward by 10°, 20°, or 50°, the corresponding second viewing angle may be the same.
Step 103: display second virtual three-dimensional space information of the target building at the second viewing angle.
After the terminal device determines the second viewing angle, it may display the second virtual three-dimensional space information at that viewing angle. Like the first virtual three-dimensional space information described above, the second virtual three-dimensional space information may show virtual scene information corresponding to a certain part of the virtual three-dimensional space, or size information representing the actual floor height, actual floor width, and the like of the target building.
In the prior art, while browsing a target building on a terminal device that applies virtual reality technology, a user generally adjusts the browsing viewing angle by operating virtual adjustment buttons displayed on the terminal interface. When the target building is browsed on a terminal device with a large display screen, such as a computer, the browsing viewing angle can be adjusted quickly and accurately. However, if the target building is browsed on a terminal device with a smaller display screen, such as a mobile phone, the adjustment buttons may be touched by mistake because of the smaller screen, so that the adjusted viewing angle is not the user's target viewing angle, which degrades the user experience.
In this embodiment, while first virtual three-dimensional space information of a target building is displayed in a terminal device at a first viewing angle of a user, a posture adjustment operation performed by the user on the terminal device is monitored; then, in response to detecting the posture adjustment operation performed by the user on the terminal device, a second viewing angle of the user is determined based on the posture adjustment operation; finally, second virtual three-dimensional space information of the target building is displayed at the second viewing angle. Because the viewing angle is switched by monitoring the user's posture adjustment operation, the user does not need to touch any virtual viewing-angle adjustment button on the display interface of the terminal device, which improves the user experience.
Referring to FIG. 2, a flowchart of an embodiment of determining the second viewing angle according to the present disclosure is shown. As shown in FIG. 2, step 102 may include the following steps 201 to 203.
Step 201: determine a reference posture of the terminal device corresponding to the second viewing angle.
When determining the second viewing angle according to the posture adjustment operation, the terminal device can first determine its posture before the posture adjustment, so that the specific action the user performed on the terminal device can be determined.
In some optional implementations, the reference posture of the terminal device may be set according to the application scenario of the terminal device; for example, the posture of the terminal device when placed horizontally may be used as its reference posture, or the posture when placed vertically may be used as its reference posture. In practice, the reference posture within a period of time that includes the current moment may be determined according to the duration of each posture of the terminal device; for example, the posture with the longest duration within that period is determined as the reference posture.
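A minimal sketch of the duration rule, assuming posture samples are quantized to whole-degree yaw/pitch values so near-identical orientations pool their durations; the data shape and quantization are assumptions:

```kotlin
// Hypothetical helper: pick as the reference posture the orientation that
// persisted longest within the recent time window.
data class PostureSample(val yawDeg: Float, val pitchDeg: Float, val durationMs: Long)

fun referencePosture(window: List<PostureSample>): PostureSample? =
    window
        .groupBy { Pair(it.yawDeg.toInt(), it.pitchDeg.toInt()) } // quantize to 1°
        .maxByOrNull { (_, samples) -> samples.sumOf { it.durationMs } }
        ?.value
        ?.first()
```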
In some alternative implementations, the reference posture may be the posture of the terminal device before the posture adjustment operation. For example, if the terminal device is placed horizontally before the user performs the posture adjustment operation, the reference posture may be regarded as a horizontal posture; if the terminal device is placed at an incline before the operation, the reference posture may be regarded as an inclined posture with the same inclination angle.
Step 202: determine the termination posture of the terminal device after the posture adjustment operation is performed.
After the user performs the posture adjustment operation, the current posture of the terminal device may be regarded as the termination posture. For example, if the terminal device is in a horizontal posture after the user performs the posture adjustment operation, that horizontal posture may be regarded as the termination posture at the current time.
In some application scenarios, the terminal device may detect the reference posture and the termination posture through its sensors. That is, when the relevant parameter detected by a sensor is within a preset range, the current posture adjustment operation may be regarded as ended; the terminal device is then in a stationary state, and its posture at the current time may be regarded as the termination posture. A relevant parameter within the preset range may be, for example, an angular velocity of 0 rad/s.
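A minimal sketch of such a stationarity check, assuming gyroscope readings arrive as a three-axis angular-velocity array in rad/s; the tolerance and hold time are assumptions:

```kotlin
import kotlin.math.abs

// Hypothetical detector: the posture adjustment counts as ended once the
// angular velocity stays within the preset range (near 0 rad/s) long enough.
class TerminationDetector(
    private val maxAngularVelocity: Float = 0.05f, // rad/s tolerance around 0
    private val holdTimeMs: Long = 300L
) {
    private var stillSinceMs = -1L

    /** Feed each gyroscope reading; returns true once the device is stationary. */
    fun update(angularVelocity: FloatArray, nowMs: Long): Boolean {
        val moving = angularVelocity.any { abs(it) > maxAngularVelocity }
        if (moving) {
            stillSinceMs = -1L
            return false
        }
        if (stillSinceMs < 0) stillSinceMs = nowMs
        return nowMs - stillSinceMs >= holdTimeMs
    }
}
```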
Step 203: determine the second viewing angle according to the posture difference between the termination posture and the reference posture.
When the terminal device changes from the reference posture to the termination posture as the user performs the posture adjustment operation, the two postures may differ in relative direction and relative angle; these differences may be regarded as the posture difference. For example, if the reference posture of the terminal device is a horizontal posture and the termination posture is rotated 30° clockwise relative to the reference posture, the 30° clockwise rotation can be regarded as the posture difference between the termination posture and the reference posture.
After the posture difference between the termination posture and the reference posture is determined, the second viewing angle may be determined based on the posture difference. For example, if the posture difference is a 30° clockwise rotation, the user can be considered to have performed a posture adjustment operation of rotating the terminal device 30° clockwise. Then, corresponding to this posture difference, the position of the previous viewing angle in the virtual three-dimensional space is taken as a reference position and rotated by a corresponding angle to obtain the second viewing angle at the current moment. Here, a correspondence between the posture difference and the rotation of the viewing angle in the virtual three-dimensional space may be set; for example, each 1° of clockwise rotation of the terminal device may correspond to a 5° rotation of the viewing angle in the virtual three-dimensional space. That is, if the detected posture difference is a 30° clockwise rotation, the position of the current viewing angle may be used as the reference position, and the viewing angle rotated clockwise by 150° may be determined as the second viewing angle.
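A minimal sketch of that correspondence, assuming the viewpoint position stays fixed and only the yaw of the viewing angle changes; the 1°-to-5° gain is the example value above:

```kotlin
// Hypothetical mapping: each 1° of device rotation turns the viewing angle
// by 5° about the current position in the virtual three-dimensional space.
const val VIEW_DEGREES_PER_DEVICE_DEGREE = 5f

fun secondViewingAngleYaw(currentViewYawDeg: Float, postureDiffDeg: Float): Float {
    val raw = currentViewYawDeg + postureDiffDeg * VIEW_DEGREES_PER_DEVICE_DEGREE
    return ((raw % 360f) + 360f) % 360f // normalize to [0, 360)
}
```

With these values, secondViewingAngleYaw(0f, 30f) returns 150f, matching the example of a 30° posture difference producing a 150° viewing-angle rotation.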
In some application scenarios, the reference posture is the posture of the terminal device when the first virtual three-dimensional space information is displayed at the first viewing angle of the user.
That is, when the user browses the first virtual three-dimensional space information at the first viewing angle, the posture of the terminal device at the current time may be determined as the reference posture. The first viewing angle and the reference posture thus form a correspondence, and the user has a sense of direction when performing the posture adjustment operation. For example, when roaming in the virtual three-dimensional space, if the user wants to browse the left part of the current virtual scene, a posture adjustment operation of rotating to the left may be performed on the terminal device.
In some alternative implementations, the posture difference includes a first posture difference, in a horizontal plane, between the termination posture and the reference posture; and step 203 may include step 2031.
Step 2031: determine, according to the direction and magnitude corresponding to the first posture difference, a second viewing angle corresponding to the first virtual three-dimensional subspace shown at the first viewing angle, where the first virtual three-dimensional subspace is a subspace of the virtual three-dimensional space of the target building.
A spatial rectangular coordinate system may be established, with the posture of the terminal device at the current time taken as the reference posture (for example, the posture of the terminal device when it lies flat on a desktop) and the center of gravity corresponding to the reference posture located at the origin of the coordinate system. In the horizontal coordinate plane formed by the horizontal axis and the longitudinal axis, if the termination posture differs from the reference posture relative to the longitudinal axis, that difference can be regarded as the first posture difference described above.
In some application scenarios, the first posture difference may include an angle difference and may also include a direction difference. For example, in the horizontal coordinate plane described above, the left side of the reference posture may be set as the negative half of the longitudinal axis, and the right side as the positive half. Then, if the termination posture differs from the reference posture by a first posture difference of 30° toward the positive half of the longitudinal axis (with the opening of the change angle toward the negative half of the longitudinal axis), the user can be regarded as having performed a posture adjustment operation of 30° to the right on the terminal device. If the termination posture differs by a first posture difference of 30° toward the negative half of the longitudinal axis (with the opening of the change angle toward the positive half of the longitudinal axis), the user can be regarded as having performed a posture adjustment operation of 30° to the left.
After detecting the corresponding posture adjustment operation, the terminal device may adjust the viewing angle within the current virtual three-dimensional subspace (i.e., the first virtual three-dimensional subspace) according to a preset rule to obtain the second viewing angle. The preset rule may be, for example, that after a posture adjustment operation of 30° to the left is detected, the viewing angle is rotated 30° counterclockwise within the current virtual three-dimensional subspace to obtain the corresponding second viewing angle.
For example, suppose the current target building is house listing A and the first virtual three-dimensional subspace displayed at the first viewing angle is the living room, with the first viewing angle near the tea table of the living room; the first virtual three-dimensional space information displayed on the terminal device may then include the sofa and the wall behind the sofa. After a posture adjustment operation of swinging the terminal device 30° to the right is detected, the first viewing angle may be taken as a reference viewing angle, and the viewing angle after rotating 30° clockwise in the virtual three-dimensional space is determined as the second viewing angle. The virtual three-dimensional space information of house listing A at the second viewing angle may then include the television in the living room and the wall behind the television.
In some alternative implementations, the posture difference includes a second posture difference, in the vertical direction, between the termination posture and the reference posture; and step 203 may include step 2032.
Step 2032: determine, according to the direction and magnitude corresponding to the second posture difference, a second viewing angle for displaying a second virtual three-dimensional subspace corresponding to the second posture difference, where the second virtual three-dimensional subspace is a subspace of the virtual three-dimensional space of the target building.
In the spatial rectangular coordinate system established above, a posture difference relative to the vertical axis, in the vertical coordinate plane formed by the longitudinal axis and the vertical axis, may be regarded as the second posture difference described above.
In some application scenarios, the second posture difference may likewise include an angle difference and a direction difference. For example, in the vertical coordinate plane described above, the rear of the reference posture may be set as the negative half of the vertical axis, and the front as the positive half. Then, if the termination posture differs from the reference posture by a second posture difference of 30° toward the positive half of the vertical axis (with the opening of the change angle toward the negative half of the vertical axis), the user can be regarded as having performed a posture adjustment operation of tilting the terminal device forward by 30°. If the termination posture differs by a second posture difference of 30° toward the negative half of the vertical axis (with the opening of the change angle toward the positive half of the vertical axis), the user can be regarded as having performed a posture adjustment operation of tilting the terminal device backward by 30°.
After detecting the corresponding posture adjustment operation, the terminal device may adjust the viewing angle in the virtual three-dimensional space according to a preset rule to obtain the second viewing angle. The preset rule may be, for example: when a forward-tilting operation is detected, the preset viewing angle of the previous virtual three-dimensional subspace browsed by the user is determined as the second viewing angle; when a backward-tilting operation is detected, the view switches to the preset viewing angle of the next virtual three-dimensional subspace, which is determined as the second viewing angle. A preset viewing angle here is a viewing angle set in advance for switching from the current virtual three-dimensional subspace (e.g., the first virtual three-dimensional subspace) to another virtual three-dimensional subspace (the second virtual three-dimensional subspace) when the user performs a forward- or backward-tilting operation. For example, in house listing A, a preset viewing angle a for the living room, a preset viewing angle b for the dining room, and a preset viewing angle c for the bedroom may be set. Suppose the user enters the virtual scene of the living room after browsing the virtual scene of the dining room. The first virtual three-dimensional space information at the user's first viewing angle may then include the virtual scene of the living room; when a forward-tilting operation is detected, the preset viewing angle b may be determined as the second viewing angle, and when a backward-tilting operation is detected, the preset viewing angle c may be determined as the second viewing angle.
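A minimal sketch of such a preset rule, modeling the preset viewing angles as an ordered list so that tilting forward steps back to the previously browsed subspace and tilting backward advances to the next one; the list model and identifiers are simplifying assumptions, not the patent's definition:

```kotlin
// Hypothetical navigator over preset viewing angles, e.g.
// SubspaceNavigator(listOf("a: living room", "b: dining room", "c: bedroom")).
class SubspaceNavigator(private val presetViewingAngles: List<String>) {
    private var index = 0 // index of the currently browsed subspace

    fun onTiltForward(): String { // return to the previous subspace's preset angle
        if (index > 0) index--
        return presetViewingAngles[index]
    }

    fun onTiltBackward(): String { // switch to the next subspace's preset angle
        if (index < presetViewingAngles.lastIndex) index++
        return presetViewingAngles[index]
    }
}
```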
In other application scenarios, the detected posture difference may include both the first posture difference and the second posture difference, i.e., the current posture difference has a component in the horizontal plane as well as in the vertical direction. In this case, a first candidate second viewing angle corresponding to the first virtual three-dimensional subspace shown at the first viewing angle, and a second candidate second viewing angle for showing the second virtual three-dimensional subspace corresponding to the second posture difference, may both be determined. The priorities of the first and second candidate second viewing angles may then be determined, and the second viewing angle determined according to priority. For example, in the virtual three-dimensional space of house listing A, suppose the first candidate second viewing angle is in the living room, the second candidate second viewing angle is in the kitchen, the last virtual three-dimensional subspace browsed by the user is the bedroom, and the preset viewing angle of the bedroom is viewing angle a. Suppose it is detected that the user rotated the terminal device to the right while tilting it forward. Then, if the preset priority of the first candidate second viewing angle is higher than that of the second candidate, the viewing angle may first be rotated to the right by the corresponding angle within the current virtual three-dimensional subspace and then switched to the preset viewing angle a; if the priority of the first candidate is lower than that of the second candidate, the view may switch directly to the preset viewing angle a. The preset viewing angle a may then be regarded as the current second viewing angle.
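A minimal sketch of the priority rule, with the two candidate adjustments passed in as callbacks; the names and the configurable priority order are assumptions:

```kotlin
// Hypothetical resolution when both a horizontal and a vertical posture
// difference are detected at once.
enum class Candidate { ROTATE_IN_PLACE, SWITCH_SUBSPACE }

fun resolve(priority: List<Candidate>, rotate: () -> Unit, switchSubspace: () -> Unit) {
    if (priority.first() == Candidate.ROTATE_IN_PLACE) {
        rotate()          // first rotate within the current subspace,
        switchSubspace()  // then switch to the preset viewing angle
    } else {
        switchSubspace()  // switch directly to the preset viewing angle
    }
}
```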
Referring to FIG. 3, a schematic structural diagram of an embodiment of an information display apparatus according to the present disclosure is shown. As shown in FIG. 3, the information display apparatus includes a monitoring module 301, a determining module 302, and a display module 303. The monitoring module 301 is configured to monitor a posture adjustment operation performed by a user on a terminal device while first virtual three-dimensional space information of a target building is displayed in the terminal device at a first viewing angle of the user; the determining module 302 is configured to determine, in response to detecting the posture adjustment operation performed on the terminal device by the user, a second viewing angle of the user based on the posture adjustment operation; and the display module 303 is configured to display second virtual three-dimensional space information of the target building at the second viewing angle.
It should be noted that for the specific processing of the monitoring module 301, the determining module 302, and the display module 303 of the information display apparatus, and the technical effects thereof, reference may be made to the descriptions of steps 101 to 103 in the embodiment corresponding to FIG. 1, which are not repeated here.
In some optional implementations of this embodiment, the determining module 302 is further configured to: determine a reference posture of the terminal device corresponding to the second viewing angle; determine a termination posture of the terminal device after the posture adjustment operation is performed; and determine the second viewing angle based on the posture difference between the termination posture and the reference posture.
In some optional implementations of this embodiment, the reference posture is the posture of the terminal device when the first virtual three-dimensional space information is displayed at the first viewing angle of the user.
In some optional implementations of this embodiment, the posture difference includes a first posture difference, in a horizontal plane, between the termination posture and the reference posture; and the determining module 302 is further configured to: determine, according to the direction and magnitude corresponding to the first posture difference, a second viewing angle corresponding to the first virtual three-dimensional subspace shown at the first viewing angle, where the first virtual three-dimensional subspace is a subspace of the virtual three-dimensional space of the target building.
In some optional implementations of this embodiment, the posture difference includes a second posture difference, in the vertical direction, between the termination posture and the reference posture; and the determining module 302 is further configured to: determine, according to the direction and magnitude corresponding to the second posture difference, a second viewing angle for displaying a second virtual three-dimensional subspace corresponding to the second posture difference, where the second virtual three-dimensional subspace is a subspace of the virtual three-dimensional space of the target building.
Referring to FIG. 4, an exemplary system architecture to which the information display method of an embodiment of the present disclosure may be applied is shown.
As shown in FIG. 4, the system architecture may include terminal devices 401, 402, 403, a network 404, and a server 405. The network 404 serves as a medium for providing communication links between the terminal devices 401, 402, 403 and the server 405. The network 404 may include various connection types, such as wired links, wireless communication links, or fiber optic cables. The terminal devices and the server may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication (e.g., a communication network) in any form or medium. Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The terminal devices 401, 402, 403 may interact with the server 405 over the network 404 to receive or send messages and the like. Various client applications, such as video distribution applications, search applications, and news and information applications, may be installed on the terminal devices 401, 402, 403.
The terminal devices 401, 402, and 403 may be hardware or software. When they are hardware, they may be various electronic devices that have a display screen and support web browsing, including but not limited to smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop portable computers, desktop computers, and the like. When the terminal devices 401, 402, and 403 are software, they can be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (for example, software or software modules used to provide distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
The server 405 may be a server capable of providing various services; for example, it may receive a virtual three-dimensional space information acquisition request sent by the terminal devices 401, 402, and 403, analyze and process the request, and send the processing result (for example, the virtual three-dimensional space information corresponding to the acquisition request) to the terminal devices 401, 402, and 403.
It should be noted that the information display method provided by the embodiments of the present disclosure may be executed by a terminal device, and accordingly, the information display apparatus may be disposed in the terminal device.
It should be understood that the number of terminal devices, networks, and servers in fig. 4 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 5, a block diagram of an electronic device (e.g., the server in FIG. 4) suitable for implementing embodiments of the present disclosure is shown. The electronic device shown in FIG. 5 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 5, the electronic device may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 502 or a program loaded from a storage device 508 into a random access memory (RAM) 503. The RAM 503 also stores various programs and data necessary for the operation of the electronic device. The processing device 501, the ROM 502, and the RAM 503 are connected to one another through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, and a gyroscope; output devices 507 including, for example, a liquid crystal display (LCD), speakers, and vibrators; storage devices 508 including, for example, a magnetic tape and a hard disk; and a communication device 509. The communication device 509 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While FIG. 5 illustrates an electronic device having various devices, it should be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 501.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: while first virtual three-dimensional space information of a target building is displayed in a terminal device at a first viewing angle of a user, monitor a posture adjustment operation performed by the user on the terminal device; in response to detecting the posture adjustment operation performed by the user on the terminal device, determine a second viewing angle of the user based on the posture adjustment operation; and display second virtual three-dimensional space information of the target building at the second viewing angle.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or any combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or by hardware. The name of a unit does not, in some cases, constitute a limitation of the unit itself; for example, the display module may also be described as "a module that displays second virtual three-dimensional space information of the target building at the second viewing angle".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only a description of the preferred embodiments of the present disclosure and of the technical principles employed. Those skilled in the art should understand that the scope of the disclosure involved herein is not limited to technical solutions formed by the particular combination of the technical features described above; it should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the concept of the disclosure, for example, a technical solution formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the present disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (12)

1. An information display method, comprising:
monitoring gesture adjustment operation executed by a user on a terminal device when first virtual three-dimensional space information of a target building is displayed at a first visual angle of the user in the terminal device;
in response to monitoring gesture adjustment operation performed on the terminal equipment by a user, determining a second visual angle of the user based on the gesture adjustment operation;
and displaying second virtual three-dimensional space information of the target building at the second view angle.
2. The method of claim 1, wherein determining a second perspective of the user based on the pose adjustment operation comprises:
determining a reference posture of the terminal device corresponding to the second visual angle;
determining a termination posture of the terminal device after the posture adjustment operation is executed;
determining the second perspective based on a pose difference between the termination pose and the reference pose.
3. The method of claim 2, wherein the reference gesture is a gesture of the terminal device while presenting the first virtual three-dimensional spatial information from the first perspective of the user.
4. The method of claim 2, wherein the pose difference comprises a first pose difference in a horizontal plane at the termination pose and the reference pose; and
the determining the second perspective from the pose difference between the termination pose and the reference pose comprises:
determining a second visual angle corresponding to a first virtual three-dimensional subspace shown by the first visual angle according to the corresponding direction and the size of the first posture difference; wherein the first virtual three-dimensional subspace is a subspace of a virtual three-dimensional space of the target building.
5. The method of claim 2, wherein the pose difference comprises a second pose difference, in a vertical direction, between the termination pose and the reference pose; and
determining the second viewing angle based on the pose difference between the termination pose and the reference pose comprises:
determining, according to the direction and magnitude of the second pose difference, a second viewing angle for displaying a second virtual three-dimensional subspace corresponding to the second pose difference; wherein the second virtual three-dimensional subspace is a subspace of the virtual three-dimensional space of the target building.
6. An information display apparatus, comprising:
a monitoring module configured to monitor a pose adjustment operation performed by a user on a terminal device while first virtual three-dimensional space information of a target building is displayed on the terminal device at a first viewing angle of the user;
a determining module configured to determine, in response to detecting the pose adjustment operation performed by the user on the terminal device, a second viewing angle of the user based on the pose adjustment operation; and
a display module configured to display second virtual three-dimensional space information of the target building at the second viewing angle.
7. The apparatus of claim 6, wherein the determining module is further configured to:
determine a reference pose of the terminal device corresponding to the second viewing angle;
determine a termination pose of the terminal device after the pose adjustment operation is performed; and
determine the second viewing angle based on a pose difference between the termination pose and the reference pose.
8. The apparatus of claim 7, wherein the reference pose is a pose of the terminal device while the first virtual three-dimensional space information is presented at the first viewing angle of the user.
9. The apparatus of claim 7, wherein the pose difference comprises a first pose difference, in a horizontal plane, between the termination pose and the reference pose; and
the determining module is further configured to:
determine, according to the direction and magnitude of the first pose difference, a second viewing angle corresponding to a first virtual three-dimensional subspace shown at the first viewing angle; wherein the first virtual three-dimensional subspace is a subspace of a virtual three-dimensional space of the target building.
10. The apparatus of claim 7, wherein the pose difference comprises a second pose difference, in a vertical direction, between the termination pose and the reference pose; and
the determining module is further configured to:
determine, according to the direction and magnitude of the second pose difference, a second viewing angle for displaying a second virtual three-dimensional subspace corresponding to the second pose difference; wherein the second virtual three-dimensional subspace is a subspace of the virtual three-dimensional space of the target building.
11. An electronic device, comprising:
one or more processors; and
a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
12. A computer-readable medium having a computer program stored thereon which, when executed by a processor, implements the method of any one of claims 1-5.
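
For illustration only, the following Python sketch (no code appears in the patent itself; every name, such as DevicePose, BuildingViewer and on_pose_adjusted, is hypothetical) traces the flow of method claims 1-3: the device pose captured while the first viewing angle is shown serves as the reference pose, the pose after the user's adjustment operation is the termination pose, and the second viewing angle is derived from their pose difference.

from dataclasses import dataclass

@dataclass
class DevicePose:
    # Device attitude as Euler angles in degrees (assumed model).
    yaw: float    # rotation in the horizontal plane
    pitch: float  # rotation in the vertical direction

@dataclass
class ViewingAngle:
    azimuth: float    # horizontal look direction in the virtual space
    elevation: float  # vertical look direction in the virtual space

class BuildingViewer:
    # Sketch of claims 1-3: map a device pose change to a new viewing angle.

    def __init__(self, first_viewing_angle: ViewingAngle, reference_pose: DevicePose):
        # Claim 3: the reference pose is the device pose captured while the
        # first virtual three-dimensional space information is displayed.
        self.viewing_angle = first_viewing_angle
        self.reference_pose = reference_pose

    def on_pose_adjusted(self, termination_pose: DevicePose) -> ViewingAngle:
        # Claim 2: compute the pose difference between the termination pose
        # and the reference pose.
        d_yaw = termination_pose.yaw - self.reference_pose.yaw
        d_pitch = termination_pose.pitch - self.reference_pose.pitch
        # Claim 1: determine the second viewing angle from the adjustment and
        # display the second virtual three-dimensional space information at it.
        self.viewing_angle = ViewingAngle(
            azimuth=self.viewing_angle.azimuth + d_yaw,
            elevation=self.viewing_angle.elevation + d_pitch,
        )
        return self.viewing_angle

A caller would construct BuildingViewer when the first viewing angle is rendered and invoke on_pose_adjusted from whatever sensor callback reports the device pose after the user's gesture; the additive mapping from pose difference to viewing angle is one plausible reading, not the only one the claims cover.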
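
Claims 4 and 5 split the pose difference into a horizontal-plane component and a vertical component with distinct effects. The sketch below (again hypothetical; the threshold and the stepwise subspace selection are assumptions, not taken from the patent) pans the viewing angle within the currently shown subspace for the horizontal component, and selects a different subspace of the building, such as another room or floor, when the vertical component is large enough.

def resolve_second_viewing_angle(d_yaw_deg: float, d_pitch_deg: float,
                                 current_subspace: int, num_subspaces: int,
                                 switch_threshold_deg: float = 20.0):
    # Claim 4: the direction and magnitude of the first (horizontal-plane)
    # pose difference set the pan within the first virtual 3D subspace.
    azimuth_offset = d_yaw_deg
    # Claim 5: the direction and magnitude of the second (vertical) pose
    # difference select a second virtual 3D subspace of the target building.
    if abs(d_pitch_deg) >= switch_threshold_deg:
        step = 1 if d_pitch_deg > 0 else -1
        subspace = max(0, min(num_subspaces - 1, current_subspace + step))
    else:
        subspace = current_subspace
    return azimuth_offset, subspace

Whether the vertical difference switches between rooms, floors, or some other partition of the building's virtual three-dimensional space is left open by the claims; the threshold here merely illustrates that both the direction and the magnitude of the difference participate in the determination.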
CN202010509747.5A 2020-06-05 2020-06-05 Information display method and device and electronic equipment Pending CN111710047A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010509747.5A CN111710047A (en) 2020-06-05 2020-06-05 Information display method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111710047A (en) 2020-09-25

Family

ID=72539207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010509747.5A Pending CN111710047A (en) 2020-06-05 2020-06-05 Information display method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111710047A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150154803A1 (en) * 2009-10-12 2015-06-04 Metaio Gmbh Method for representing virtual information in a view of a real environment
CN106249918A (en) * 2016-08-18 2016-12-21 南京几墨网络科技有限公司 Virtual reality image display method and device, and terminal equipment applying the same
CN107122107A (en) * 2017-04-26 2017-09-01 网易(杭州)网络有限公司 Visual angle regulating method, device, medium and electronic equipment in virtual scene
CN108499105A (en) * 2018-04-16 2018-09-07 腾讯科技(深圳)有限公司 Method, apparatus and storage medium for viewing angle adjustment in a virtual environment
CN110825333A (en) * 2018-08-14 2020-02-21 广东虚拟现实科技有限公司 Display method, display device, terminal equipment and storage medium
CN111158469A (en) * 2019-12-12 2020-05-15 广东虚拟现实科技有限公司 Visual angle switching method and device, terminal equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113608616A (en) * 2021-08-10 2021-11-05 深圳市慧鲤科技有限公司 Virtual content display method and device, electronic equipment and storage medium
CN114566132A (en) * 2022-02-28 2022-05-31 北京京东方显示技术有限公司 Parameter processing method and device, electronic equipment and computer readable storage medium
CN115016721A (en) * 2022-05-09 2022-09-06 北京城市网邻信息技术有限公司 Information display method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN111710047A (en) Information display method and device and electronic equipment
CN112488783B (en) Image acquisition method and device and electronic equipment
EP3561667B1 (en) Method for displaying 2D application in VR device, and terminal
CN107329671B (en) Model display method and device
US11586255B2 (en) Method and apparatus for adjusting view for target device, electronic device and medium
CN111414225A (en) Three-dimensional model remote display method, first terminal, electronic device and storage medium
US11561651B2 (en) Virtual paintbrush implementing method and apparatus, and computer readable storage medium
CN114168250A (en) Page display method and device, electronic equipment and storage medium
CN111652675A (en) Display method and device and electronic equipment
CN113989470A (en) Picture display method and device, storage medium and electronic equipment
US20230421857A1 (en) Video-based information displaying method and apparatus, device and medium
CN111582993A (en) Method and device for acquiring target object, electronic equipment and storage medium
CN111726666A (en) Video display control method and device
CN111597414B (en) Display method and device and electronic equipment
WO2021244651A1 (en) Information display method and device, and terminal and storage medium
CN114116081B (en) Interactive dynamic fluid effect processing method and device and electronic equipment
CN109472873B (en) Three-dimensional model generation method, device and hardware device
JP2022551671A (en) OBJECT DISPLAY METHOD, APPARATUS, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
CN111696214A (en) House display method and device and electronic equipment
WO2021185047A1 (en) Sticker processing method and apparatus
WO2023030079A1 (en) Article display method and apparatus, and electronic device and storage medium
CN110070600B (en) Three-dimensional model generation method, device and hardware device
CN111105345B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN114359362A (en) House resource information acquisition method and device and electronic equipment
CN114549435A (en) Method and device for detecting virtual reality camera and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination