CN108499102B - Information interface display method and device, storage medium and electronic equipment - Google Patents
- Publication number
- CN108499102B CN108499102B CN201810299608.7A CN201810299608A CN108499102B CN 108499102 B CN108499102 B CN 108499102B CN 201810299608 A CN201810299608 A CN 201810299608A CN 108499102 B CN108499102 B CN 108499102B
- Authority
- CN
- China
- Prior art keywords
- information
- dimensional entity
- information interface
- interface
- virtual object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8082—Virtual reality
Abstract
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for displaying an information interface in virtual reality, a storage medium, and an electronic device. The method may include: in response to a request to display information, determining, according to a preset rule, a virtual object in a virtual reality scene for displaying the information, and acquiring model information of a three-dimensional entity of the virtual object; generating an information interface containing the information according to the model information, and projecting the information interface onto the surface of the three-dimensional entity; and controlling the information interface to scale adaptively according to the three-dimensional entity so as to fit the surface of the three-dimensional entity. Because the information interface conforms closely to the surface of the three-dimensional entity and matches its size and shape, the sense of separation between the information interface and the virtual object is greatly reduced, their sense of unity is greatly improved, and the user's immersion and sense of reality when interacting with the virtual object are enhanced.
Description
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for displaying an information interface in virtual reality, a storage medium, and an electronic device.
Background
With the rapid development of mobile communication technology, more and more games are set in virtual reality scenes. In a game based on a virtual reality scene, the user interacts with a virtual object in the scene, for example by controlling a line of sight, to acquire information about the virtual object, and then interacts further according to that information. The playability and explorability of the game therefore depend on how the information of the virtual object is displayed in the virtual reality scene.
At present, in a virtual reality scene, the information of a virtual object is usually displayed in the form of an information interface; that is, the information interface serves as a carrier for the information. Information interfaces include non-plot information interfaces and spatial information interfaces. A non-plot information interface is displayed at a fixed position in the virtual reality scene and is used, for example, to display information such as the user's health value and score; a spatial information interface is an information interface that is displayed beside a virtual object, in the form of a square panel, after the virtual object is triggered.
Obviously, a non-plot information interface, because it is always displayed at a fixed position in the virtual reality scene, has an obvious sense of separation from the virtual object, which undermines the immersion of the user's interaction with the virtual object. A spatial information interface, although displayed beside the virtual object, changes its display position with the user's viewing angle, so it too has an obvious sense of separation from the virtual object and likewise undermines the immersion of the user's interaction with the virtual object.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an information interface display method and apparatus in virtual reality, a storage medium, and an electronic device, so as to overcome, at least to some extent, the problem that an information interface is obviously separated from a virtual object and thereby undermines the immersion of the user's interaction with the virtual object.
In one aspect of the present disclosure, an information interface display method in virtual reality is provided, including:
responding to a request for displaying information, determining a virtual object for displaying the information in a virtual reality scene according to a preset rule, and acquiring model information of a three-dimensional entity of the virtual object;
generating an information interface comprising the information according to the model information, and projecting the information interface to the surface of the three-dimensional entity;
and controlling the information interface to perform self-adaptive scaling according to the three-dimensional entity so as to fit the surface of the three-dimensional entity.
In an exemplary embodiment of the present disclosure, the model information includes maximum contour information of the three-dimensional entity;
the generating an information interface including the information according to the model information includes:
and generating an information interface comprising the information according to the maximum outline information of the three-dimensional entity.
In an exemplary embodiment of the disclosure, the controlling the information interface to adaptively zoom according to the three-dimensional entity to conform to the surface of the three-dimensional entity includes:
and analyzing the form information of the three-dimensional entity, and controlling the information interface to perform self-adaptive scaling according to the form information of the three-dimensional entity so as to attach to the surface of the three-dimensional entity.
In an exemplary embodiment of the present disclosure, the responding to the request for presentation information includes:
detecting whether the quasi-center ray moves to an information display triggering area or not;
and responding to a request for displaying information when the quasi-center ray is detected to move to the information display triggering area.
In an exemplary embodiment of the present disclosure, the information includes character information and/or picture information.
According to one aspect of the present disclosure, there is provided an information interface display apparatus in virtual reality, including:
the acquisition module is used for responding to a request for displaying information, determining a virtual object for displaying the information in a virtual reality scene according to a preset rule, and acquiring model information of a three-dimensional entity of the virtual object;
the projection module is used for generating an information interface comprising the information according to the model information and projecting the information interface to the surface of the three-dimensional entity;
and the zooming module is used for controlling the information interface to carry out self-adaptive zooming according to the three-dimensional entity so as to fit the surface of the three-dimensional entity.
In an exemplary embodiment of the present disclosure, the model information includes maximum contour information of the three-dimensional entity;
the generating an information interface including the information according to the model information includes:
and generating an information interface comprising the information according to the maximum outline information of the three-dimensional entity.
In an exemplary embodiment of the disclosure, the controlling the information interface to adaptively zoom according to the three-dimensional entity to conform to the surface of the three-dimensional entity includes:
and analyzing the form information of the three-dimensional entity, and controlling the information interface to perform self-adaptive scaling according to the form information of the three-dimensional entity so as to attach to the surface of the three-dimensional entity.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method for displaying an information interface in virtual reality as described in any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the above information interface display method in virtual reality by executing the executable instructions.
According to the information interface display method and apparatus in virtual reality, the storage medium, and the electronic device provided in the exemplary embodiments, an information interface containing the information is generated according to model information of a three-dimensional entity of a virtual object, the information interface is projected onto the surface of the three-dimensional entity, and the information interface is controlled to scale adaptively according to the three-dimensional entity so as to fit its surface. On one hand, because the information interface containing the information is projected onto the surface of the three-dimensional entity of the virtual object, that is, displayed on that surface rather than beside the three-dimensional entity or at a fixed position in the virtual reality scene, space in the virtual reality scene is saved compared with the prior art, the scene is less cluttered, and the information display is more intuitive. On the other hand, because the information interface is projected onto the surface of the three-dimensional entity and is controlled to scale adaptively according to the three-dimensional entity, the interface conforms more closely to the surface and better matches the entity's size and shape, so the sense of separation between the information interface and the virtual object is greatly reduced, their sense of unity is greatly improved, and the user's immersion and sense of reality when interacting with the virtual object are enhanced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 is a flow chart of an information interface display method in virtual reality according to the present disclosure;
FIG. 2 is a schematic illustration of projecting an information interface provided in an exemplary embodiment of the present disclosure;
FIG. 3 is an illustration of an effect of adaptive zooming on an information interface provided in an exemplary embodiment of the present disclosure;
FIG. 4 is a block diagram of an information interface display apparatus in virtual reality according to the present disclosure;
FIG. 5 is a block diagram of an electronic device in an exemplary embodiment of the disclosure;
FIG. 6 is a schematic diagram illustrating a program product in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, in one or more modules combining software and hardware, or in different networks and/or processor devices and/or microcontroller devices.
The exemplary embodiment first discloses a virtual reality information interface display method, which is applied to an intelligent terminal capable of presenting virtual objects and virtual reality scenes. The intelligent terminal may be, for example, virtual reality glasses, a virtual reality helmet, and the like, which is not particularly limited in this exemplary embodiment. Referring to fig. 1, the information interface display method in virtual reality may include the following steps:
step S110, responding to a request for displaying information, determining a virtual object for displaying the information in a virtual reality scene according to a preset rule, and acquiring model information of a three-dimensional entity of the virtual object;
step S120, generating an information interface comprising the information according to the model information, and projecting the information interface to the surface of the three-dimensional entity;
step S130, controlling the information interface to perform adaptive scaling according to the three-dimensional entity so as to fit the surface of the three-dimensional entity.
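As an aid to reading, the three steps above can be sketched as a minimal Python example. All names, the data structure, and the reduction of "form information" to a list of cross-section widths are hypothetical illustrations, not the patent's implementation:

```python
# Hypothetical sketch of steps S110-S130; the entity is assumed to have
# already been chosen by the preset rule of step S110.
from dataclasses import dataclass, field

@dataclass
class Entity3D:
    name: str
    # widths of cross-sections from top to bottom, standing in for the
    # entity's "form information"
    section_widths: list = field(default_factory=list)

    def max_contour_width(self):
        # maximum contour information: the widest cross-section
        return max(self.section_widths)

def generate_interface(info, entity):
    # S120: size the interface from the entity's maximum contour
    return {"text": info, "width": entity.max_contour_width()}

def adaptive_scale(interface, entity):
    # S130: fit each interface sub-region to the matching surface sub-region
    interface["region_widths"] = list(entity.section_widths)
    return interface

def show_info(info, entity):
    ui = generate_interface(info, entity)  # S120
    return adaptive_scale(ui, entity)      # S130

# a bulb-like entity: narrow at top and bottom, wide in the middle
bulb = Entity3D("bulb", section_widths=[2.0, 5.0, 3.0])
ui = show_info("HP: 80", bulb)
```

The widest section drives the interface size, while the full width list drives the per-region fit, mirroring the two roles model information plays in steps S120 and S130.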
In the information interface display method in virtual reality provided in the present exemplary embodiment, on one hand, because the information interface containing the information is projected onto the surface of the three-dimensional entity of the virtual object, that is, displayed on that surface rather than beside the three-dimensional entity or at a fixed position in the virtual reality scene, space in the virtual reality scene is saved compared with the prior art, the scene is less cluttered, and the information display is more intuitive. On the other hand, because the information interface is projected onto the surface of the three-dimensional entity and is controlled to scale adaptively according to the three-dimensional entity, the interface conforms more closely to the surface and better matches the entity's size and shape, so the sense of separation between the information interface and the virtual object is greatly reduced, their sense of unity is greatly improved, and the user's immersion and sense of reality when interacting with the virtual object are enhanced.
Next, each step in the information interface presentation method in virtual reality will be described in detail with reference to fig. 1.
In step S110, in response to the request for displaying information, a virtual object for displaying the information is determined in a virtual reality scene according to a preset rule, and model information of a three-dimensional entity of the virtual object is obtained.
In the present exemplary embodiment, the information may be related to the virtual object used to display it, or unrelated to that virtual object. Determining, according to a preset rule, the virtual object in the virtual reality scene for displaying the information may proceed as follows: first determine whether the displayed information is associated with some virtual object in the virtual reality scene. If it is, that associated virtual object is determined as the virtual object for displaying the information. If it is not associated with any virtual object in the scene, then any virtual object in the scene may be determined as the virtual object for displaying the information, or the virtual object located in the middle of the scene, or the virtual object with the largest volume; the present exemplary embodiment imposes no particular limitation on this. The above manners of determining the virtual object for displaying the information are merely exemplary and are not intended to limit the present disclosure.
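One possible form of this preset rule can be sketched as follows. The name-based association and the largest-volume fallback are assumptions for illustration; the patent leaves the rule open:

```python
# Hypothetical preset rule: prefer the object the information is associated
# with; otherwise fall back to the largest-volume object in the scene.
def choose_display_object(info_target, objects):
    for obj in objects:
        if obj["name"] == info_target:  # information associated with this object
            return obj
    # not associated with any object in the scene: pick the largest by volume
    return max(objects, key=lambda o: o["volume"])

objects = [
    {"name": "bulb", "volume": 1.0},
    {"name": "table", "volume": 9.0},
]
```

For instance, information about the bulb would be shown on the bulb itself, while unassociated information would fall back to the table as the largest object.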
Model information of the three-dimensional entity of the virtual object used to display the information may be acquired by an acquisition module. The model information of the three-dimensional entity may include the maximum contour information, height, length, width, volume, shape, and the like of the three-dimensional entity, and the present exemplary embodiment imposes no particular limitation on this. The virtual object may be a virtual item, a virtual character, or the like, which is likewise not particularly limited in this exemplary embodiment.
The user may control the intelligent terminal through the quasi-center (crosshair) ray to trigger a request to display information, so that the intelligent terminal responds to the request. Taking the quasi-center ray as an example, responding to the request to display information may include: detecting whether the quasi-center ray has moved into an information display trigger area; and responding to the request to display information when the quasi-center ray is detected to have moved into the information display trigger area. In the present exemplary embodiment, the virtual reality scene may include a plurality of information display trigger areas, each used for displaying different information. An information display trigger area may be the area enclosed by the outline of a virtual object in the virtual reality scene, or an information display control in the scene. The developer may establish a one-to-one correspondence between the information display trigger areas and the pieces of displayed information. For example, when the trigger areas are the areas enclosed by the outlines of virtual objects, the information of each virtual object corresponds to the area enclosed by that object's outline; that is, when the quasi-center ray moves into the area enclosed by a virtual object's outline, the request to display that object's information is responded to. It should be noted that this correspondence between trigger areas and displayed information is merely exemplary and is not intended to limit the present disclosure.
For example, when the information display trigger area is the area enclosed by the outline of a virtual object, the area enclosed by one virtual object's outline may instead be associated with the information of another virtual object, which is not particularly limited in the present exemplary embodiment.
Next, the process of responding to a request to display information is described taking virtual reality glasses as the intelligent terminal. The user may control the movement of the quasi-center ray by rotating the head or by operating a physical control on the virtual reality glasses; when the quasi-center ray moves into an information display trigger area, the request to display the information corresponding to that trigger area is responded to.
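The trigger detection can be illustrated with a minimal hit test. Reducing the outline-enclosed areas to 2D bounding boxes and the quasi-center ray to a single hit point are simplifying assumptions; a real engine would use a 3D raycast:

```python
# Hypothetical hit test: respond to a display request when the quasi-center
# ray's hit point enters an information display trigger area.
def ray_hits_region(ray_point, region):
    x, y = ray_point
    x0, y0, x1, y1 = region["bounds"]  # bounding box standing in for the outline
    return x0 <= x <= x1 and y0 <= y <= y1

def respond_to_ray(ray_point, regions):
    for region in regions:
        if ray_hits_region(ray_point, region):
            return region["info"]  # request to display this region's information
    return None  # ray is not in any trigger area: no request

regions = [
    {"bounds": (0, 0, 10, 10), "info": "bulb description"},
    {"bounds": (20, 0, 30, 10), "info": "score panel"},
]
```

Each trigger area maps one-to-one to a piece of information, matching the correspondence the developer establishes above.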
In step S120, an information interface including the information is generated according to the model information, and the information interface is projected to the surface of the three-dimensional entity.
In the present exemplary embodiment, the information may include character information and/or picture information. The character information may include numerical, textual, and alphabetic information. The size of the character information and picture information may be determined according to the area of the information interface and the visible surface of the three-dimensional entity of the virtual object. The shape of the information interface may be set according to the shape of the virtual object or set by a developer; for example, the information interface may be square, rectangular, circular, or the like, which is not particularly limited in the present exemplary embodiment. The size, color, and transparency of the information interface may be set by a developer.
The model information may include the maximum contour information of the three-dimensional entity, and the maximum contour information may include the position and area of the largest-area region of the three-dimensional entity's outline. For example, in fig. 2, the virtual object is a bulb; since the outline of the three-dimensional entity 201 of the bulb is narrow at the top and wide in the middle, the maximum contour information of the three-dimensional entity 201 corresponds to its middle region.
Based on this, the generating an information interface including the information according to the model information may include: and generating an information interface comprising the information according to the maximum outline information of the three-dimensional entity.
In the present exemplary embodiment, for example, as shown in fig. 2, an information interface 202 including information may be generated from maximum outline information of a three-dimensional entity 201 of a bulb. That is, an information interface 202 corresponding to the area of the region with the largest area in the outline of the bulb may be generated according to the position and the area of the region with the largest area in the outline of the three-dimensional entity 201 of the bulb, and information may be set in the information interface 202.
The specific process of projecting the information interface onto the surface of the three-dimensional entity may be as follows: set a reference point on the surface of the three-dimensional entity, set a reference point in the information interface, and project the information interface onto the surface of the three-dimensional entity with the two reference points aligned.
The specific process may also be: project the information interface onto the surface of the three-dimensional entity according to the maximum contour information of the three-dimensional entity; that is, set a reference point in the largest-area region of the three-dimensional entity's outline, set a reference point in the information interface, and project the information interface onto the surface with the two reference points aligned.
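The reference-point alignment amounts to a translation that makes the two anchors coincide. A minimal 2D sketch (the names and the 2D simplification are illustrative assumptions):

```python
# Hypothetical sketch of the reference-point projection in step S120:
# translate the interface so its reference point coincides with the
# reference point placed on the entity's surface.
def project(interface_anchor, surface_anchor, interface_points):
    dx = surface_anchor[0] - interface_anchor[0]
    dy = surface_anchor[1] - interface_anchor[1]
    # shift every point of the interface by the anchor offset
    return [(x + dx, y + dy) for (x, y) in interface_points]

corners = [(0, 0), (4, 0), (4, 2), (0, 2)]   # a 4x2 interface
# anchor the interface at its centre (2, 1) and place it at (10, 5),
# e.g. the centre of the largest-area region of the outline
projected = project((2, 1), (10, 5), corners)
```

On a real mesh the same idea would be applied in 3D, typically via UV or decal projection, with the surface anchor lying in the largest-area region of the outline.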
In step S130, the information interface is controlled to perform adaptive scaling according to the three-dimensional entity so as to fit the surface of the three-dimensional entity.
In the exemplary embodiment, in order to make the information interface fit the surface of the three-dimensional entity more closely, so that the interface and the entity read as one whole, the information interface is controlled to scale adaptively according to the three-dimensional entity after being projected onto its surface. The adaptive scaling process may include: analyzing the form information of the three-dimensional entity, and controlling the information interface to scale adaptively according to that form information so as to fit the surface of the three-dimensional entity. In the present exemplary embodiment, the surface of the three-dimensional entity may be divided into a plurality of sub-regions whose areas may be set by a developer, which is not particularly limited here; the information interface is likewise divided into a plurality of sub-regions whose areas may also be set by a developer; and a correspondence is established between each sub-region of the three-dimensional entity's surface and each sub-region of the information interface. The form of the three-dimensional entity is then analyzed to obtain its form information, which includes the form information of each sub-region of the surface; finally, each sub-region of the information interface is scaled according to the form information of the matching sub-region of the surface. Fig. 3 shows the effect after the information interface is adaptively scaled according to the three-dimensional entity 201 of the bulb in fig. 2: the information interface 202 fits the surface of the three-dimensional entity 201 very closely and, visually, appears as one piece with it.
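The sub-region correspondence can be illustrated with per-sub-region scale factors, under the simplifying assumption that each sub-region's form is reduced to a single width:

```python
# Hypothetical sketch: each information-interface sub-region is scaled by
# the ratio of the matching surface sub-region's width to its own width,
# so the interface follows the entity's form region by region.
def scale_factors(interface_widths, surface_widths):
    # one-to-one correspondence between interface and surface sub-regions
    assert len(interface_widths) == len(surface_widths)
    return [s / i for i, s in zip(interface_widths, surface_widths)]

# a flat 3-region interface fitted onto a bulb-like profile
factors = scale_factors([4.0, 4.0, 4.0], [2.0, 5.0, 3.0])
```

Regions over the narrow neck of the bulb shrink while the region over the wide middle expands, which is what makes the interface hug the surface in fig. 3.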
In addition, when the form of the virtual object's three-dimensional entity changes, the information interface is controlled to change with it; that is, the interface is adaptively re-scaled, in the manner described above, according to the changed form of the three-dimensional entity, so that it continues to fit the surface closely and the user experience is improved. When the user's viewing angle changes, the display position of the information interface on the surface of the virtual object is changed accordingly, and the interface is adaptively scaled according to the form of the three-dimensional entity at the new display position, so that the user can still see the virtual object's information after the viewing angle changes and the interface remains closely fitted to the surface at its new position, giving the user a better experience.
In summary, the information interface containing the information is projected onto the surface of the three-dimensional entity of the virtual object; that is, it is displayed on the surface of the entity itself rather than beside it or at a fixed position in the virtual reality scene. Compared with the prior art, this saves space in the virtual reality scene, making the scene simpler and the information display more intuitive. In addition, because the information interface is controlled to scale adaptively according to the three-dimensional entity, it fits the surface, size, and shape of the entity more closely, which greatly reduces the sense of separation between the interface and the virtual object, improves their visual unity, and enhances the user's immersion and sense of reality when interacting with the virtual object.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
In an exemplary embodiment of the present disclosure, an information interface display apparatus in virtual reality is further provided. The apparatus is applied to an intelligent terminal capable of presenting a virtual reality scene and a virtual object; the intelligent terminal may be, for example, virtual reality glasses or a virtual reality helmet, which is not particularly limited in this exemplary embodiment. As shown in fig. 4, the information interface presentation apparatus 400 in virtual reality may include an acquisition module 401, a projection module 402, and a scaling module 403, wherein:
the acquisition module 401 may be configured to respond to a request for displaying information, determine a virtual object for displaying the information in a virtual reality scene according to a preset rule, and acquire model information of a three-dimensional entity of the virtual object;
the projection module 402 may be configured to generate an information interface including the information according to the model information, and project the information interface onto the surface of the three-dimensional entity;
the scaling module 403 may be configured to control the information interface to perform adaptive scaling according to the three-dimensional entity so as to fit the surface of the three-dimensional entity.
In an exemplary embodiment of the present disclosure, the model information may include maximum contour information of the three-dimensional entity. On this basis, generating an information interface including the information according to the model information may include: generating the information interface according to the maximum contour information of the three-dimensional entity.
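One simple way to realize "maximum contour information" is the largest planar extent of the entity's mesh; the sketch below uses an axis-aligned bounding box of the vertices as a stand-in. The vertex list, the `margin` parameter, and the dictionary-based panel are all assumptions of this illustration, not the disclosed implementation.

```python
def max_contour_extent(vertices):
    """Return (width, height): the axis-aligned x/y extents of the
    entity's vertices, used here as its maximum contour."""
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    return (max(xs) - min(xs), max(ys) - min(ys))

def make_interface(text, vertices, margin=0.9):
    """Size an interface panel so it fits within the maximum contour."""
    w, h = max_contour_extent(vertices)
    return {"text": text, "width": w * margin, "height": h * margin}

bulb_vertices = [(-0.5, -1.0, 0.0), (0.5, 1.0, 0.0), (0.0, 0.0, 0.4)]
panel = make_interface("wattage: 60 W", bulb_vertices)
# the panel spans 90% of the bulb's 1.0 x 2.0 maximum contour
```

Sizing from the maximum contour keeps the generated interface from overhanging the entity before the finer per-region scaling is applied.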
In an exemplary embodiment of the present disclosure, controlling the information interface to adaptively scale according to the three-dimensional entity so as to fit its surface may include: analyzing form information of the three-dimensional entity, and controlling the information interface to perform adaptive scaling according to the form information so as to fit the surface of the three-dimensional entity.
The specific details of each module of the above information interface display apparatus in virtual reality have already been described in detail in the corresponding information interface display method in virtual reality, and are therefore not repeated here.
It should be noted that although several modules or units of the apparatus for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, various aspects of the present invention may be embodied as a system, a method, or a program product. Accordingly, various aspects of the invention may take the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 500 according to this embodiment of the invention is described below with reference to fig. 5. The electronic device 500 shown in fig. 5 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 5, the electronic device 500 is embodied in the form of a general purpose computing device. The components of the electronic device 500 may include, but are not limited to: at least one processing unit 510, at least one storage unit 520, a bus 530 connecting the various system components (including the storage unit 520 and the processing unit 510), and a display unit 540.
The storage unit stores program code executable by the processing unit 510, so that the processing unit 510 performs the steps according to various exemplary embodiments of the present invention described in the "exemplary methods" section above. For example, the processing unit 510 may perform step S110 shown in fig. 1: in response to a request for displaying information, determining a virtual object for displaying the information in a virtual reality scene according to a preset rule, and acquiring model information of a three-dimensional entity of the virtual object; step S120: generating an information interface including the information according to the model information, and projecting the information interface onto the surface of the three-dimensional entity; and step S130: controlling the information interface to perform adaptive scaling according to the three-dimensional entity so as to fit the surface of the three-dimensional entity.
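Steps S110 through S130 can be composed into a single flow. The sketch below is illustrative only: the `VirtualObject` class, the "first object in the scene" preset rule, and the fixed 90% shrink stand in for the disclosure's preset rule, projection, and adaptive scaling.

```python
class VirtualObject:
    """Minimal stand-in for a scene object carrying a 3D entity."""
    def __init__(self, name, extent):
        self.name = name
        self.extent = extent  # (width, height) of its surface patch

def show_information(text, scene_objects):
    """End-to-end flow of steps S110 to S130."""
    # S110: determine the target object by a preset rule (here: the
    # first object in the scene) and obtain its model information.
    obj = scene_objects[0]
    # S120: generate an interface sized from the model information and
    # associate it with (project it onto) the object's surface.
    interface = {"text": text, "size": obj.extent, "target": obj.name}
    # S130: adaptive scaling, here a uniform 90% shrink so the panel
    # stays within the surface.
    interface["size"] = tuple(0.9 * d for d in obj.extent)
    return interface

ui = show_information("brightness: 80%", [VirtualObject("bulb", (2.0, 1.0))])
# ui is an interface panel projected onto the "bulb" object
```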
The storage unit 520 may include a readable medium in the form of a volatile storage unit, such as a random access memory (RAM) unit 5201 and/or a cache unit 5202, and may further include a read-only memory (ROM) unit 5203.
The electronic device 500 may also communicate with one or more external devices 570 (e.g., keyboard, pointing device, Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 500, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 500 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 550. Also, the electronic device 500 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 560. As shown, the network adapter 560 communicates with the other modules of the electronic device 500 over the bus 530. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 500, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions that cause a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 6, a program product 600 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++, as well as conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the latter case, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.
Claims (10)
1. An information interface display method in virtual reality is characterized by comprising the following steps:
responding to a request for displaying information, determining a virtual object for displaying the information in a virtual reality scene according to a preset rule, and acquiring model information of a three-dimensional entity of the virtual object;
generating an information interface comprising the information according to the model information, and projecting the information interface to the surface of the three-dimensional entity;
controlling the information interface to perform self-adaptive scaling according to the three-dimensional entity so as to be attached to the surface of the three-dimensional entity;
when the viewing angle of a user changes, controlling the information interface to change its display position on the surface of the three-dimensional entity along with the change of the user's viewing angle, and performing adaptive scaling according to the form of the three-dimensional entity at the changed display position, so that the user can still see the information of the virtual object after the viewing angle changes and the information interface remains attached to the surface of the three-dimensional entity.
2. The information interface presentation method of claim 1, wherein the model information comprises maximum contour information of the three-dimensional entity;
the generating an information interface including the information according to the model information includes:
and generating an information interface comprising the information according to the maximum outline information of the three-dimensional entity.
3. The method for displaying an information interface according to claim 1, wherein the controlling the information interface to adaptively zoom according to the three-dimensional entity to fit the surface of the three-dimensional entity comprises:
and analyzing the form information of the three-dimensional entity, and controlling the information interface to perform self-adaptive scaling according to the form information of the three-dimensional entity so as to attach to the surface of the three-dimensional entity.
4. The information interface presentation method of claim 1, wherein said responding to the request for presentation information comprises:
detecting whether a crosshair ray moves into an information display trigger area;
and responding to the request for displaying information when the crosshair ray is detected to have moved into the information display trigger area.
5. An information interface display method as claimed in any one of claims 1 to 4, wherein the information includes text information and/or picture information.
6. An information interface display device in virtual reality is characterized by comprising:
the acquisition module is used for responding to a request for displaying information, determining a virtual object for displaying the information in a virtual reality scene according to a preset rule, and acquiring model information of a three-dimensional entity of the virtual object;
the projection module is used for generating an information interface comprising the information according to the model information and projecting the information interface to the surface of the three-dimensional entity;
the zooming module is used for controlling the information interface to perform self-adaptive zooming according to the three-dimensional entity so as to be attached to the surface of the three-dimensional entity;
the scaling module is further configured to, when the viewing angle of the user changes, control the information interface to change its display position on the surface of the three-dimensional entity along with the change of the viewing angle, and to perform adaptive scaling according to the form of the three-dimensional entity at the changed display position, so that the user can still see the information of the virtual object after the viewing angle changes and the information interface remains attached to the surface of the three-dimensional entity.
7. The information interface presentation device of claim 6, wherein the model information comprises maximum profile information of the three-dimensional entity;
the generating an information interface including the information according to the model information includes:
and generating an information interface comprising the information according to the maximum outline information of the three-dimensional entity.
8. The information interface presentation device of claim 6, wherein said controlling said information interface to adaptively zoom according to said three-dimensional entity to conform to a surface of said three-dimensional entity comprises:
and analyzing the form information of the three-dimensional entity, and controlling the information interface to perform self-adaptive scaling according to the form information of the three-dimensional entity so as to attach to the surface of the three-dimensional entity.
9. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method for displaying an information interface in virtual reality according to any one of claims 1 to 5.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
the processor is configured to execute the information interface display method in the virtual reality according to any one of claims 1 to 5 through executing the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810299608.7A CN108499102B (en) | 2018-04-04 | 2018-04-04 | Information interface display method and device, storage medium and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810299608.7A CN108499102B (en) | 2018-04-04 | 2018-04-04 | Information interface display method and device, storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108499102A CN108499102A (en) | 2018-09-07 |
CN108499102B true CN108499102B (en) | 2021-04-23 |
Family
ID=63380724
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810299608.7A Active CN108499102B (en) | 2018-04-04 | 2018-04-04 | Information interface display method and device, storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108499102B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111182278B (en) * | 2018-11-09 | 2022-06-14 | 上海云绅智能科技有限公司 | Projection display management method and system |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101916175B (en) * | 2010-08-20 | 2012-05-02 | 浙江大学 | Intelligent projecting method capable of adapting to projection surface automatically |
US9319842B2 (en) * | 2011-06-27 | 2016-04-19 | At&T Intellectual Property I, L.P. | Mobile device configured point and shoot type weapon |
CN104915986B (en) * | 2015-06-26 | 2018-04-17 | 北京航空航天大学 | A kind of solid threedimensional model method for automatic modeling |
CN106924970B (en) * | 2017-03-08 | 2020-07-07 | 网易(杭州)网络有限公司 | Virtual reality system, information display method and device based on virtual reality |
CN107393017A (en) * | 2017-08-11 | 2017-11-24 | 北京铂石空间科技有限公司 | Image processing method, device, electronic equipment and storage medium |
CN107870672B (en) * | 2017-11-22 | 2021-01-08 | 腾讯科技(成都)有限公司 | Method and device for realizing menu panel in virtual reality scene and readable storage medium |
- 2018-04-04 CN CN201810299608.7A patent/CN108499102B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN108499102A (en) | 2018-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102435628B1 (en) | Gaze-based object placement within a virtual reality environment | |
KR101845217B1 (en) | User interface interaction for transparent head-mounted displays | |
CN108287657B (en) | Skill applying method and device, storage medium and electronic equipment | |
CN108776544B (en) | Interaction method and device in augmented reality, storage medium and electronic equipment | |
US11423518B2 (en) | Method and device of correcting image distortion, display device, computer readable medium, electronic device | |
US10488918B2 (en) | Analysis of user interface interactions within a virtual reality environment | |
CN107204044B (en) | Picture display method based on virtual reality and related equipment | |
WO2020131592A1 (en) | Mode-changeable augmented reality interface | |
CN108211350B (en) | Information processing method, electronic device, and storage medium | |
CN109189302B (en) | Control method and device of AR virtual model | |
US20180005440A1 (en) | Universal application programming interface for augmented reality | |
CN108355352B (en) | Virtual object control method and device, electronic device and storage medium | |
CN110502097B (en) | Motion control portal in virtual reality | |
CN111773709A (en) | Scene map generation method and device, computer storage medium and electronic equipment | |
EP3528094A1 (en) | Method and device for inputting password in virtual reality scene | |
CN111481923B (en) | Rocker display method and device, computer storage medium and electronic equipment | |
CN110286906B (en) | User interface display method and device, storage medium and mobile terminal | |
CN108499102B (en) | Information interface display method and device, storage medium and electronic equipment | |
CN112965773A (en) | Method, apparatus, device and storage medium for information display | |
US10416761B2 (en) | Zoom effect in gaze tracking interface | |
CN112987924A (en) | Method, apparatus, device and storage medium for device interaction | |
CN110908568B (en) | Control method and device for virtual object | |
US20230147561A1 (en) | Metaverse Content Modality Mapping | |
CN113559501B (en) | Virtual unit selection method and device in game, storage medium and electronic equipment | |
CN113457144B (en) | Virtual unit selection method and device in game, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||