CN110286906B - User interface display method and device, storage medium and mobile terminal - Google Patents

User interface display method and device, storage medium and mobile terminal

Info

Publication number
CN110286906B
CN110286906B · CN201910555316.XA · CN201910555316A
Authority
CN
China
Prior art keywords
plane
user interface
real
preset
target plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910555316.XA
Other languages
Chinese (zh)
Other versions
CN110286906A (en)
Inventor
林森
刘晚成
赵鸣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201910555316.XA priority Critical patent/CN110286906B/en
Publication of CN110286906A publication Critical patent/CN110286906A/en
Application granted granted Critical
Publication of CN110286906B publication Critical patent/CN110286906B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 - Arrangements for software engineering
    • G06F 8/30 - Creation or generation of source code
    • G06F 8/38 - Creation or generation of source code for implementing user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure provides a user interface display method, a user interface display device, a computer-readable storage medium and an augmented reality device, and belongs to the technical field of human-computer interaction. The method is applied to an augmented reality device and includes the following steps: identifying a real object plane in a real scene, and generating a reference plane from the real object plane; determining a reference plane that satisfies a preset condition as a target plane; acquiring a user interface to be displayed; and displaying the user interface in the target plane. The method and the device allow the user interface of an augmented reality application to be integrated into the real scene, improving the realism of the user interface display, increasing the user's sense of immersion and improving the user experience.

Description

User interface display method and device, storage medium and mobile terminal
Technical Field
The disclosure relates to the technical field of human-computer interaction, and in particular relates to a user interface display method, a user interface display device, a computer readable storage medium and electronic equipment.
Background
Augmented reality (AR) is a technology that computes the position and orientation of the camera image in real time and overlays corresponding images, videos and 3D models, so that the virtual world and the real world can be combined and presented to the user, providing a richer interactive experience.
At present, in an augmented reality application, a three-dimensional virtual model and a real scene can be combined into an augmented reality scene that is displayed on the screen, and a user interface (UI) is displayed on the uppermost layer of the screen so that the user can interact with it; after the user inputs or reads information, the UI disappears and the complete augmented reality scene is displayed again. However, the displayed user interface may obscure the augmented reality scene, disrupting the user's immersion.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present inventors have found that in the related art, in order to facilitate interaction with the user, a user interface is typically displayed over the augmented reality scene, blocking part of the scene and breaking the sense of immersion. Taking an AR application on a mobile phone as an example, a real scene is captured through the phone's camera, the virtual three-dimensional scene and the real scene are then combined and displayed on the phone's screen, and the user interface of the AR application is displayed on the uppermost layer. As shown in fig. 1, in the displayed screen, the dialog box 100 obscures, or is obscured by, the virtual scene.
In view of the above, the present disclosure provides a user interface display method, a user interface display device, a computer-readable storage medium, and a mobile terminal, which overcome, at least in part, the above-described problems in the related art.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to one aspect of the present disclosure, there is provided a user interface display method applied to an augmented reality device, the method including: identifying a real object plane in a real scene, and generating a reference plane according to the real object plane; determining the reference plane meeting preset conditions as a target plane; acquiring a user interface to be displayed; the user interface is displayed in the target plane.
In an exemplary embodiment of the present disclosure, the method further comprises: detecting movement of the augmented reality device; and when the moving distance of the augmented reality equipment is larger than a preset distance, the target plane is redetermined.
In an exemplary embodiment of the present disclosure, the method further comprises: detecting a rotation of the augmented reality device; and when the rotation angle of the augmented reality device is larger than a preset angle, the target plane is redetermined.
In an exemplary embodiment of the present disclosure, the displaying the user interface in the target plane includes: and adjusting the display size of the user interface according to the width or the height of the target plane so as to display the user interface on the target plane.
In an exemplary embodiment of the present disclosure, the adjusting the display size of the user interface according to the width or the height of the target plane includes: acquiring a preset width and a preset height of the user interface; determining a scaling of the user interface by the preset width and the width of the target plane, or determining a scaling of the user interface by the preset height and the height of the target plane; and determining the display size of the user interface according to the scaling.
In one exemplary embodiment of the present disclosure, determining the reference plane satisfying a preset condition as a target plane includes: selecting a base plane from the reference planes; and displaying a virtual scene according to the base plane, and determining a reference plane that satisfies a preset positional relationship with the virtual scene as the target plane.
In an exemplary embodiment of the present disclosure, a reference plane satisfying a preset positional relationship with the virtual scene includes: the reference plane is perpendicular to a horizontal plane of the virtual scene.
In an exemplary embodiment of the present disclosure, the determining the reference plane satisfying a preset condition as a target plane includes: calculating bounding boxes of a plurality of reference planes that satisfy a preset positional relationship with the virtual scene, and determining the reference plane with the largest bounding box as the target plane.
In an exemplary embodiment of the present disclosure, the reference plane satisfying a preset positional relationship with the virtual scene includes: and a reference plane in the positive direction of the virtual scene.
In an exemplary embodiment of the present disclosure, the method further comprises: and if the reference plane meeting the preset condition does not exist, displaying the user interface in a preset mode.
According to one aspect of the present disclosure, there is provided a user interface display method including: identifying a real object plane in a real scene, determining a target plane from the real object plane; acquiring a user interface to be displayed; the user interface is projected onto the target plane.
According to one aspect of the present disclosure, there is provided a user interface display apparatus applied to an augmented reality device, the apparatus comprising: the plane identification unit is used for identifying a real object plane in a real scene and generating a reference plane according to the real object plane; a plane screening unit, configured to determine the reference plane satisfying a preset condition as a target plane; the interface acquisition unit is used for acquiring a user interface to be displayed; and the interface display unit is used for displaying a user interface in the target plane.
In an exemplary embodiment of the present disclosure, the apparatus further comprises: a first movement detection unit for detecting movement of the augmented reality device; and the first plane determining unit is used for determining the target plane again when the moving distance of the augmented reality equipment is larger than a preset distance.
In one exemplary embodiment of the present disclosure, a second movement detection unit for detecting a rotation of the augmented reality device; and the second plane determining unit is used for determining the target plane again when the rotation angle of the augmented reality device is larger than a preset angle.
In one exemplary embodiment of the present disclosure, the interface display unit may include: and the first display unit is used for adjusting the display size of the user interface according to the width or the height of the target plane so as to display the user interface on the target plane.
In one exemplary embodiment of the present disclosure, the first display unit may be configured to: acquiring a preset width and a preset height of the user interface; determining a scaling of the user interface by the preset width and the width of the target plane, or determining a scaling of the user interface by the preset height and the height of the target plane; and determining the display size of the user interface according to the scaling.
In an exemplary embodiment of the present disclosure, the plane screening unit may further include: a third plane determining unit configured to select a base plane from the reference planes; and a scene display unit configured to display a virtual scene on the base plane and determine a reference plane satisfying a preset positional relationship with the virtual scene as the target plane.
In an exemplary embodiment of the present disclosure, the reference plane satisfying a preset positional relationship with the virtual scene includes the reference plane perpendicular to a horizontal plane of the virtual scene. In an exemplary embodiment of the present disclosure, the plane screening unit may be further configured to calculate bounding boxes of a plurality of reference planes that satisfy a preset positional relationship with the virtual scene, and determine a reference plane with a largest bounding box as the target plane.
In an exemplary embodiment of the present disclosure, the reference plane satisfying a preset positional relationship with the virtual scene includes a reference plane in a positive direction of the virtual scene. In an exemplary embodiment of the present disclosure, the user interface display apparatus may further include: and the default display unit is used for displaying the user interface in a preset mode if the fact that the reference plane meeting the preset condition does not exist is determined.
According to an aspect of the present disclosure, there is also provided a user interface display apparatus including: a plane acquisition unit configured to identify a real object plane in a real scene, from which a target plane is determined; the interface acquisition unit is used for acquiring a user interface to be displayed; and the interface projection unit is used for projecting the user interface on the target plane.
According to one aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the user interface display method described above.
According to one aspect of the present disclosure, there is provided an electronic device including: a processor; a memory for storing executable instructions of the processor; an image pickup device; wherein the processor is configured to perform the user interface display method described above via execution of the executable instructions.
Exemplary embodiments of the present disclosure have the following advantageous effects:
by generating reference planes from real object planes in a real scene, a target plane is determined from the reference planes, and the user interface is displayed based on the target plane. In one aspect, displaying the user interface based on the target plane integrates the user interface into the real scene, so that the user interface and the augmented reality scene are independent of each other; occlusion or cutting of the augmented reality scene by the user interface is avoided, improving the user's immersion and the user experience. In another aspect, the user interface can be displayed without considering how it overlays the augmented reality scene, which improves flexibility. In a further aspect, fusing the user interface into the real scene prevents the user interface from splitting the virtual scene, increasing the user's sense of realism.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1 illustrates a prior art user interface display effect;
FIG. 2 shows a flowchart of a user interface display method in the present exemplary embodiment;
FIG. 3 is a flowchart showing another user interface display method in the present exemplary embodiment;
FIG. 4 is a flowchart showing another user interface display method in the present exemplary embodiment;
FIG. 5 shows a flowchart of yet another user interface display method in the present exemplary embodiment;
fig. 6 shows a schematic diagram of a coordinate system of a virtual scene in the present exemplary embodiment;
fig. 7 shows a block diagram of a user interface display apparatus in the present exemplary embodiment;
fig. 8 shows a block diagram of another user interface display apparatus in the present exemplary embodiment;
fig. 9 shows a computer-readable storage medium for implementing the above-described method in the present exemplary embodiment;
fig. 10 shows an electronic device for implementing the above method in the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In this context, "first", "second", "third", etc. are used merely as labels for the respective objects and do not limit the number or order.
Exemplary embodiments of the present disclosure first provide a user interface display method, an execution subject of which may be an augmented reality device. The augmented reality device is a terminal device capable of implementing an augmented reality technology, such as a wearable device, a mobile phone, a computer, a tablet computer, and the like. Deploying the augmented reality application in the terminal device enables the device to implement the corresponding augmented reality function. Currently, many terminal devices support augmented reality applications, such as mobile AR map navigation, mobile AR games, and the like, and there are also many dedicated devices for AR, such as AR glasses, AR head-mounted devices, and the like, which are not limited by the present disclosure.
Fig. 2 shows a flow of the present exemplary embodiment, which may include the following steps:
step S210, identifying a real object plane in a real scene, and generating a reference plane according to the real object plane;
step S220, determining the reference plane meeting the preset condition as a target plane;
step S230, obtaining a user interface to be displayed;
Step S240, displaying the user interface in the target plane.
The real scene is the actual environment in which the user of the augmented reality device is located, and may include a plurality of real objects, such as a table and a chair, a wall, the ground, and the like. The real object plane may refer to a surface of a real object, such as a wall surface, a table top, a floor surface, etc. The augmented reality device can recognize real objects in the real scene through its camera so that a virtual scene is projected into the real scene, or it can combine the virtual three-dimensional model with models of the real objects acquired by the camera and display them together on the screen.
In the method, the image data or the video data of the real scene can be obtained through the camera, so that all real objects in the real scene are identified according to the image and the video data in the real scene, and the real object plane in the real scene is obtained. The real scene may include a plurality of real objects, so that a plurality of real object planes may be acquired, for example, a plane parallel to a horizontal plane (such as a ground, a desktop, etc.), a plane perpendicular to the horizontal plane (such as a wall, a window, a mirror, etc.), or other planes, such as an inclined plane, etc., which is not limited in this embodiment. The reference plane may refer to a virtual plane parallel to the real plane, which may be generated from the real object plane in the real scene in order to display the user interface. The reference plane may be generated by projection on or parallel to the real object plane. Alternatively, the reference plane is generated by modeling in the virtual scene.
The target plane may be a reference plane whose orientation satisfies a requirement. Specifically, a virtual scene and a virtual three-dimensional coordinate system may be established according to the real scene, so that the reference planes correspond to the three-dimensional coordinate system, and the target plane may be determined using this coordinate system. If the coordinate system is established with the ground as the x-y plane, the target plane may include a plane perpendicular to the x-y plane. In other words, the preset condition may require that the positional relationship of a plane satisfies a requirement for it to serve as the target plane. The preset condition may further require that the shape of the plane satisfies a requirement, such as a regularly shaped plane (rectangle, triangle, circle, or the like) or an irregular plane among the reference planes.
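A minimal sketch of this orientation check is given below (Python). It is not part of the original disclosure: the Plane type, the use of unit normal vectors, the choice of the z axis as the up direction, and the 5-degree tolerance are all assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple
import math

Vec3 = Tuple[float, float, float]

@dataclass
class Plane:
    center: Vec3   # a point on the reference plane, in virtual coordinates
    normal: Vec3   # unit normal of the reference plane

def is_vertical(plane: Plane, up: Vec3 = (0.0, 0.0, 1.0), tol_deg: float = 5.0) -> bool:
    """A reference plane is perpendicular to the ground (x-y) plane when its
    normal is (nearly) orthogonal to the up direction."""
    dot = sum(a * b for a, b in zip(plane.normal, up))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return abs(angle - 90.0) <= tol_deg

def filter_vertical_planes(planes: List[Plane]) -> List[Plane]:
    # keep only reference planes whose orientation satisfies the preset condition
    return [p for p in planes if is_vertical(p)]
```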
Further, the user interface can be displayed on the target plane based on the target plane. The user interface may be projected onto a target plane or embedded onto a target plane in a virtual scene. The user interface may include an interface that the augmented reality application presents when implementing the function. The user interface is an interface for implementing interaction with a user in an augmented reality application, for example, acquiring information input by the user or displaying the information to the user.
In an example embodiment of the disclosure, in one aspect, a target plane is determined from the reference planes generated from the real object planes in the real scene, and the user interface is displayed based on the target plane, so that the user interface and the augmented reality scene are independent of each other; occlusion or cutting of the augmented reality scene by the user interface is avoided, the user's immersion is improved, and the user experience is improved. In another aspect, the user interface can be displayed without considering how it overlays the augmented reality scene, which improves flexibility. In still another aspect, displaying the user interface in the real scene prevents the user interface from splitting the augmented reality scene, thereby increasing the user's sense of realism.
The embodiment further includes a virtual scene, which may be a scene presented to the user by the augmented reality application, such as a game scene, a test scene, a medical scene, or the like, or may be a scene of art design, such as an indoor scene, a landscape scene, a garden scene, or the like, or other scenes, and the virtual scene may be projected into the real scene, so that the user has an immersive experience. The augmented reality device of the present exemplary embodiment may thus enable the application of learning, entertainment, medicine, art, or other scenes. Thus, in the present embodiment, the step S220 may specifically include the steps of:
Step S301, selecting a base plane from the reference planes;
step S302, displaying a virtual scene according to the base plane, and determining a reference plane satisfying a preset positional relationship with the virtual scene as the target plane;
a reference plane is selected from the reference planes, upon which the virtual scene is displayed, the virtual scene may be projected onto the reference plane. After the virtual scene is projected on the reference plane, a reference plane satisfying a preset positional relationship with the virtual scene may be determined as a target plane, for example, a reference plane parallel or perpendicular to a plane of coordinates (x, 0, z) in the virtual scene may be determined as a target plane. In an exemplary embodiment, the reference plane satisfying the preset positional relationship with the virtual scene may include a reference plane perpendicular to a horizontal plane of the virtual scene. The horizontal plane of the virtual scene may include a virtual ground, a virtual desktop, etc. in the virtual scene, or may include a horizontal plane, such as a ground, etc. in the real scene, which is not particularly limited in this embodiment. The reference plane perpendicular to the horizontal plane of the virtual scene may be determined as a target plane, such as a wall surface, a counter surface, a window surface, a mirror-corresponding reference plane, etc.
The reference plane satisfying the preset positional relationship with the virtual scene may further include a reference plane in the positive direction of the virtual scene. The positive directions of the three dimensions of the virtual scene may be determined from the virtual coordinate system of the virtual scene, and a reference plane lying in a positive direction may be determined as the target plane. As shown in fig. 6, when the virtual scene is projected into the real scene, the virtual scene faces the user at a first-person viewing angle, and the direction facing the user may be taken as the positive direction of the virtual coordinate system. For example, if the plane on which the user stands is the x-z plane and is parallel to the user's line of sight, then points with z > 0 or x > 0 lie in the positive direction of the virtual coordinate system. Points with z > 0 in the virtual scene may be considered to be on the front side of the virtual scene, and points with z < 0 on the back side; the reference plane in the direction of z > 0, or in the direction of x > 0, is then taken as the target plane so as to better match the user's viewing angle.
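Continuing the previous sketch, the positive-direction check can be expressed as a simple sign test on the plane's center coordinate. Treating the plane center as representative of the plane and choosing z (or x) as the front-facing axis are assumptions for illustration only.

```python
def in_positive_direction(plane: Plane, axis: int = 2) -> bool:
    """Treat a reference plane as lying on the front side of the virtual scene when
    the chosen coordinate of its center is positive (z > 0 by default; axis=0 for x > 0)."""
    return plane.center[axis] > 0.0

def candidate_target_planes(planes: List[Plane]) -> List[Plane]:
    # candidate target planes: vertical planes on the front side of the virtual scene
    return [p for p in filter_vertical_planes(planes) if in_positive_direction(p)]
```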
There may be a plurality of reference planes satisfying the preset condition, and the target plane may be determined by calculating the area of each reference plane, for example by taking the reference plane with the largest area as the target plane. The target plane may also be selected from the reference planes in other ways, for example by selecting the reference plane with the largest length, or by selecting a plane at random.
In an exemplary embodiment, the target plane may be determined using bounding boxes for the respective reference planes. Specifically, bounding boxes of all the reference planes are calculated, and the reference plane with the largest bounding box is determined as the target plane. The bounding box may refer to an AABB bounding box, an OBB bounding box, or a bounding sphere, etc., to which the present exemplary embodiment does not particularly limit.
By calculating the bounding box of each reference plane, the plane with the largest bounding-box width and the plane with the largest bounding-box height can be determined, and the target plane can then be chosen according to how well the scaled user interface is displayed, as sketched below. Specifically, the plane with the largest bounding-box width is taken as a first plane and a first scaling factor of the user interface is calculated for it, and the plane with the largest bounding-box height is taken as a second plane and a second scaling factor is calculated for it. The user interface is scaled by the first scaling factor; if it cannot be completely displayed on the first plane, the user interface is scaled by the second scaling factor, and if it can then be completely displayed on the second plane, the second plane is used as the final target plane. If the user interface, scaled by the first and second scaling factors respectively, exceeds both the first plane and the second plane, the plane on which the largest area of the user interface can be displayed is taken as the target plane. In addition, the target plane may be determined from the multiple reference planes in other ways, for example by taking the plane with the largest bounding-box width as the target plane by default.
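The selection logic described above can be sketched roughly as follows. The sketch assumes the bounding-box width and height of each candidate plane and the preset UI size are already expressed in the same units; the PlaneBox type and the fallback of searching all candidates for the largest displayed area are illustrative assumptions, not the patented implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class PlaneBox:
    width: float   # bounding-box width of a candidate reference plane
    height: float  # bounding-box height of a candidate reference plane

def choose_target_plane(boxes: List[PlaneBox], ui_w: float, ui_h: float
                        ) -> Optional[Tuple[PlaneBox, float]]:
    """Try the widest plane first, then the tallest; otherwise fall back to the
    plane on which the largest area of the user interface can be displayed."""
    if not boxes:
        return None
    first = max(boxes, key=lambda b: b.width)    # plane with the largest bbox width
    second = max(boxes, key=lambda b: b.height)  # plane with the largest bbox height

    s1 = first.width / ui_w                      # first scaling factor (fit the width)
    if ui_h * s1 <= first.height:                # UI fits the first plane completely
        return first, s1
    s2 = second.height / ui_h                    # second scaling factor (fit the height)
    if ui_w * s2 <= second.width:                # UI fits the second plane completely
        return second, s2

    def shown_area(b: PlaneBox) -> float:        # area of the UI when fitted inside b
        s = min(b.width / ui_w, b.height / ui_h)
        return (ui_w * s) * (ui_h * s)
    best = max(boxes, key=shown_area)
    return best, min(best.width / ui_w, best.height / ui_h)
```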
After the target plane is obtained, the user interface may be displayed on the target plane. The user interface may include a plurality of preset display parameters, such as color, size, etc. of the user interface, or coordinates of the user interface in a virtual coordinate system. Coordinates of the projection of the user interface into the target plane, and coordinates of each component in the user interface are calculated according to the size of the target plane, thereby projecting the user interface into the target plane. In addition, in order to enable the user interface to be more attached to the real scene, the immersion of the user is increased, and the user interface can be scaled, so that the user interface can be better adapted to the size of the target plane. That is, the display size of the user interface may be adjusted according to the width or height of the target plane. For example, the display size of the user interface is adjusted so that the display width of the user interface is equal to the width of the target plane within a certain error range.
In an exemplary embodiment, the adjusting of the display size of the user interface may specifically include the steps of:
step S401, obtaining a preset width and a preset height of the user interface;
step S402, determining the scaling of the user interface through the preset width and the width of the target plane, or determining the scaling of the user interface through the preset height and the height of the target plane;
Step S403, determining a display size of the user interface according to the scaling.
The preset width and preset height of the user interface are typically set during development of the augmented reality application; the developer sets the various parameters of the user interface, i.e. its width and height, its color and layout, and the components added to it. For example, a width of 500 px and a height of 300 px may be set. The width and height of the target plane can be obtained through the camera: after the camera scans the real scene, the geometric data of the target plane can be determined. The geometric data may be the actual dimensions of the target plane, for example a width of 1 m and a height of 0.5 m, or the actual dimensions converted into virtual units; different augmented reality applications may use different engines, and the converted values may differ from engine to engine. If the target plane is an irregular geometric plane, its width and height may be taken as the width and height of the bounding box obtained when the bounding box is calculated.
For example, if the preset width of the user interface is actual_w and the width of the target plane is virtual_w, the scaling of the user interface can be obtained as virtual_w.ToEngineScale()/actual_w, where ToEngineScale() represents a function that converts a virtual unit into the corresponding engine unit. Similarly, the scaling of the user interface can be calculated from the height virtual_h of the target plane as virtual_h.ToEngineScale()/actual_h. The result of the calculation is the scaling of the user interface.
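The same calculation is shown below as a short sketch. to_engine_scale is a stand-in for the ToEngineScale() conversion mentioned above; its real behaviour is engine-specific, and the simple multiplication used here (with a 1-unit-per-metre default) is only an assumption.

```python
def to_engine_scale(virtual_value: float, units_per_meter: float = 1.0) -> float:
    """Stand-in for ToEngineScale(): converts a measured plane size (virtual units)
    into engine units; the conversion factor is engine-specific and assumed here."""
    return virtual_value * units_per_meter

def ui_scale_from_width(virtual_w: float, actual_w: float) -> float:
    # scaling = ToEngineScale(virtual_w) / actual_w
    return to_engine_scale(virtual_w) / actual_w

def ui_scale_from_height(virtual_h: float, actual_h: float) -> float:
    # scaling = ToEngineScale(virtual_h) / actual_h
    return to_engine_scale(virtual_h) / actual_h

# Example: a wall 1.0 m wide and a UI authored at 500 px wide, with 1 engine unit
# per metre, gives a scaling of ui_scale_from_width(1.0, 500) == 0.002 units/px.
```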
The user interface may be scaled according to the scale so as to be displayed on the target plane. The user interface and the virtual scene in the real scene are mutually independent, so that the problem that the user interface shields the virtual scene is avoided, and the user experience is smoother. Meanwhile, the display of the user interface is not limited by the virtual scene, so that the user can conveniently perform various interactive operations on the user interface, and the flexibility of interaction can be improved.
In an exemplary embodiment, as the camera of the augmented reality device moves, the identified real scene may change, and in response to this change the target plane is re-determined from the changed real scene. When the recognized real scene changes, the reference planes generated from the real object planes in the real scene also change accordingly, the target plane is re-determined from the changed reference planes, and the user interface is displayed on the re-determined target plane.
In particular, movement of the augmented reality device may be detected; when the moving distance of the augmented reality device is greater than the preset distance, the target plane may be redetermined. The preset distance may include 0.5 m, 1 m, etc., and may also include other distances, for example, 0.8 m, 2 m, etc., which is not limited in this embodiment. The movement of the augmented reality device may be detected by a distance sensor, such as an infrared sensor, etc., the distance detected by the distance sensor may be acquired at regular intervals, then a difference of the distances acquired by adjacent time points is calculated, and if the difference is greater than a preset distance, the target plane may be redetermined.
Alternatively, rotation of the augmented reality device may be detected; and when the rotation angle of the augmented reality device is larger than a preset angle, the target plane is redetermined. The preset angle may include 15 degrees, 30 degrees, etc., and may also include other angle values, for example, 10 degrees, 20 degrees, etc., which is not limited in this embodiment. The rotation angle of the augmented reality device can be detected through the direction sensor, if the direction angle of the augmented reality device is detected to change, the change amplitude, namely the rotation angle, is calculated, and if the rotation angle is larger than a preset angle, the target plane is determined again.
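Both triggers can be combined in a single check, as in the sketch below; the sensor-polling details are omitted, and the default thresholds (0.5 m and 15 degrees) simply echo the example values given above, so they are assumptions rather than fixed parameters of the method.

```python
import math
from typing import Sequence

def should_redetermine(prev_pos: Sequence[float], cur_pos: Sequence[float],
                       prev_yaw_deg: float, cur_yaw_deg: float,
                       max_dist: float = 0.5, max_angle_deg: float = 15.0) -> bool:
    """Re-determine the target plane when the device has moved farther than the
    preset distance or rotated by more than the preset angle since the last check."""
    moved = math.dist(prev_pos, cur_pos) > max_dist
    delta = abs((cur_yaw_deg - prev_yaw_deg + 180.0) % 360.0 - 180.0)  # wrapped angle
    return moved or delta > max_angle_deg
```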
In an exemplary embodiment, if there is no real object plane in the real scene that satisfies the preset condition, i.e. there is no target plane, the user interface is displayed in a preset manner. The preset manner may include displaying the user interface at n units from the virtual scene; for example, if the virtual scene is generated based on a base plane that coincides with the x-z plane of the virtual scene, the user interface may be displayed at n units from the maximum x coordinate of the virtual scene. Alternatively, the user interface is displayed at a random position n units away from the virtual scene.
Fig. 5 schematically shows another flow of the present exemplary embodiment. As shown in fig. 5, the present exemplary embodiment may include steps S501 to S507. For step S501, when the user uses a terminal device with the augmented reality application, the application needs to obtain permission to use the camera; after the permission is granted, the camera can be started to scan the real scene, and the scanned real object planes are converted into virtual reference planes by the corresponding engine. For step S502, it is determined whether there is a reference plane satisfying the preset condition. Specifically, the ground, or another plane parallel to the ground in the real scene, may be used as the base plane and the virtual scene is displayed on it; if a reference plane perpendicular to the base plane exists, it is determined that a reference plane satisfying the preset condition exists, and step S503 is executed; if there is no reference plane perpendicular to the base plane, step S507 is performed.
For step S503, an AABB bounding box is calculated from the geometric data of each reference plane satisfying the preset condition, thereby obtaining the width and height of each reference plane. After the width and height of each reference plane are obtained, the maximum width is selected from these data and the scaling of the user interface is calculated from it, or the scaling of the user interface is calculated from the maximum height, in step S504. The reference plane with the maximum width or the maximum height is determined as the target plane according to the scaling of the user interface. After the scaling of the user interface is determined, in step S505, the position and angle of the user interface are set according to the target plane: the position and angle at which the user interface is displayed are determined from the data of the target plane in the virtual coordinate system. Step S506 is then executed so that the user interface is displayed on the target plane.
For step S507, the user interface is set to a default display mode. Illustratively, as shown in fig. 6, when the base plane on which the virtual scene is displayed is the x-z plane, the default display angle of the user interface is perpendicular to the x-z plane, and the default display position of the user interface lies at z > 0 or x > 0. Alternatively, if the maximum value of the coordinates of the points of the virtual scene in the x direction is x1 and the maximum value in the z direction is z1, the default user interface display location may be a plane at z1 + n, or a plane at x1 + n, where n is a positive integer. After the display mode of the user interface is determined, step S506 is performed to display the user interface.
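The default placement rule (z1 + n or x1 + n) can be sketched as follows; representing the virtual scene by a list of its points and fixing a single axis are assumptions made for illustration.

```python
from typing import Iterable, Tuple

def default_ui_offset(scene_points: Iterable[Tuple[float, float, float]],
                      n: float = 1.0, axis: str = "z") -> float:
    """Fallback when no reference plane satisfies the preset condition: place the
    user interface n units beyond the scene's maximum x or z coordinate."""
    idx = {"x": 0, "z": 2}[axis]
    max_coord = max(p[idx] for p in scene_points)  # x1 or z1 in the description
    return max_coord + n
```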
In an exemplary embodiment, a real object plane in the real scene is identified and a target plane may be determined directly from the real object planes; that is, the target plane may itself be a real plane, and the user interface may then be projected directly onto it. The user interface is an interface for interaction with the user in the augmented reality application; the application may also provide a virtual scene, which can likewise be projected onto a real object plane in the real scene, in which case the target plane may include other real object planes perpendicular to the plane on which the virtual scene is projected.
Exemplary embodiments of the present disclosure also provide a user interface display apparatus that may be applied to an augmented reality device. As shown in fig. 7, the user interface display device 700 may include: a plane acquisition unit 710, configured to identify a real object plane in a real scene, and generate a reference plane according to the real object plane; a target plane determining unit 720, configured to determine the reference plane satisfying a preset condition as a target plane; an interface acquisition unit 730 for acquiring a user interface; an interface display unit 740 for displaying the user interface in the target plane.
In an exemplary embodiment of the present disclosure, the apparatus further comprises: a first movement detection unit for detecting movement of the augmented reality device; and the first plane determining unit is used for determining the target plane again when the moving distance of the augmented reality equipment is larger than a preset distance.
In one exemplary embodiment of the present disclosure, a second movement detection unit for detecting a rotation of the augmented reality device; and the second plane determining unit is used for determining the target plane again when the rotation angle of the augmented reality device is larger than a preset angle.
In an exemplary embodiment, the interface display unit 740 may include: a first display unit configured to adjust the display size of the user interface according to the width or the height of the target plane, so as to display the user interface on the target plane.
In an exemplary embodiment, the first display unit may be configured to acquire a preset width and a preset height of the user interface; determining a scaling of the user interface by the preset width and the width of the target plane, or determining a scaling of the user interface by the preset height and the height of the target plane; and determining the display size of the user interface according to the scaling.
In an exemplary embodiment, the plane screening unit may further include: a third plane determining unit configured to select a base plane from the reference planes; and a scene display unit configured to display a virtual scene on the base plane and determine a reference plane satisfying a preset positional relationship with the virtual scene as the target plane.
In an exemplary embodiment, the reference plane satisfying a preset positional relationship with the virtual scene includes the reference plane perpendicular to a horizontal plane of the virtual scene.
In an exemplary embodiment, the reference plane satisfying a preset positional relationship with the virtual scene includes a reference plane in a positive direction of the virtual scene.
In an exemplary embodiment, the plane screening unit may be further configured to calculate bounding boxes of a plurality of reference planes that satisfy a preset positional relationship with the virtual scene, and determine a reference plane with a largest bounding box as the target plane.
In an exemplary embodiment, the user interface display apparatus 700 may further include: and the default display unit is used for displaying the user interface in a preset mode if the fact that the reference plane meeting the preset condition does not exist is determined.
Exemplary embodiments of the present disclosure also provide another user interface display apparatus that may be applied to an augmented reality device. As shown in fig. 8, the user interface display apparatus 800 may include a plane acquisition unit 810, an interface acquisition unit 820, and an interface projection unit 830. Specifically:
the plane acquisition unit 810 is configured to identify a real object plane in a real scene, from which a target plane is determined; the interface acquisition unit 820 is used for acquiring a user interface to be displayed; the interface projection unit 830 is configured to project the user interface on the target plane.
The specific details of the modules/units in the above apparatus are already described in the embodiments of the method section, and thus are not repeated.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.) or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device.
Referring to fig. 9, a program product 900 for implementing the above-described method according to an exemplary embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
The exemplary embodiments of the present disclosure also provide an electronic device capable of implementing the above method. The electronic device includes a device capable of implementing augmented reality technology. An electronic device 1000 according to such an exemplary embodiment of the present disclosure is described below with reference to fig. 10. The electronic device 1000 shown in fig. 10 is merely an example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 10, the electronic device 1000 may be embodied in the form of a general purpose computing device. Components of electronic device 1000 may include, but are not limited to: the at least one processing unit 1010, the at least one memory unit 1020, a bus 1030 connecting the different system components (including the memory unit 1020 and the processing unit 1010), a display unit 1040, and an imaging unit 1070.
The image capturing unit 1070 is configured to obtain an image, video, etc. of a real scene, that is, the real scene described in the method and apparatus embodiments.
The memory unit 1020 stores program code that can be executed by the processing unit 1010, such that the processing unit 1010 performs steps according to various exemplary embodiments of the present disclosure described in the above "exemplary methods" section of the present specification. For example, the processing unit 1010 may perform the method steps shown in fig. 2 or fig. 3, etc.
The memory unit 1020 may include readable media in the form of volatile memory units such as Random Access Memory (RAM) 1021 and/or cache memory unit 1022, and may further include Read Only Memory (ROM) 1023.
Storage unit 1020 may also include a program/utility 1024 having a set (at least one) of program modules 1025, such program modules 1025 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 1030 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1000 can also communicate with one or more external devices 1100 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1000, and/or with any device (e.g., router, modem, etc.) that enables the electronic device 1000 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1050. Also, electronic device 1000 can communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 1060. As shown, the network adapter 1060 communicates with other modules of the electronic device 1000 over the bus 1030. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with the electronic device 1000, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solutions according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, including several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the exemplary embodiments of the present disclosure.
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with exemplary embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (15)

1. A user interface display method applied to an augmented reality device, the method comprising:
identifying a real object plane in a real scene, and generating a reference plane according to the real object plane; the real scene comprises a plurality of real objects, and the real object plane is the surface corresponding to each of the plurality of real objects;
determining the reference plane meeting preset conditions as a target plane;
Acquiring a user interface to be displayed;
the user interface is displayed in the target plane.
2. The method according to claim 1, wherein the method further comprises:
detecting movement of the augmented reality device;
and when the moving distance of the augmented reality equipment is larger than a preset distance, the target plane is redetermined.
3. The method according to claim 1, wherein the method further comprises:
detecting a rotation of the augmented reality device;
and when the rotation angle of the augmented reality device is larger than a preset angle, the target plane is redetermined.
4. The method of claim 1, wherein the displaying the user interface in the target plane comprises:
and adjusting the display size of the user interface according to the width or the height of the target plane so as to display the user interface on the target plane.
5. The method of claim 4, wherein the adjusting the display size of the user interface according to the width or height of the target plane comprises:
acquiring a preset width and a preset height of the user interface;
determining a scaling of the user interface by the preset width and the width of the target plane, or determining a scaling of the user interface by the preset height and the height of the target plane;
And determining the display size of the user interface according to the scaling.
6. The method of claim 1, wherein determining the reference plane that satisfies a preset condition as a target plane comprises:
selecting a base plane from the reference planes;
and displaying a virtual scene according to the base plane, and determining a reference plane satisfying a preset positional relationship with the virtual scene as the target plane.
7. The method of claim 6, wherein the reference plane satisfying a preset positional relationship with the virtual scene comprises:
the reference plane is perpendicular to a horizontal plane of the virtual scene.
8. The method of claim 6, wherein the determining the reference plane satisfying a preset condition as a target plane comprises:
calculating bounding boxes of a plurality of reference planes satisfying a preset positional relationship with the virtual scene, and determining the reference plane with the largest bounding box as the target plane.
9. The method of claim 6, wherein the reference plane satisfying a preset positional relationship with the virtual scene comprises:
the reference plane in the positive direction of the virtual scene.
10. The method according to claim 1, wherein the method further comprises:
and if the reference plane meeting the preset condition does not exist, displaying the user interface in a preset mode.
11. A method of displaying a user interface, comprising:
identifying a real object plane in a real scene, determining a target plane from the real object plane; the real scene comprises a plurality of real objects, and the real object plane is the surface corresponding to each of the plurality of real objects;
acquiring a user interface to be displayed;
and projecting the user interface onto the target plane.
12. A user interface display apparatus for use with an augmented reality device, the apparatus comprising:
a plane acquisition unit, configured to identify a real object plane in a real scene and generate a reference plane according to the real object plane; wherein the real scene comprises a plurality of real objects, and the real object plane comprises surfaces respectively corresponding to the plurality of real objects;
a target plane determining unit configured to determine the reference plane satisfying a preset condition as a target plane;
an interface acquisition unit, configured to acquire a user interface to be displayed;
and an interface display unit, configured to display the user interface in the target plane.
13. A user interface display apparatus for use with an augmented reality device, the apparatus comprising:
a plane acquisition unit, configured to identify a real object plane in a real scene and determine a target plane from the real object plane; wherein the real scene comprises a plurality of real objects, and the real object plane comprises surfaces respectively corresponding to the plurality of real objects;
an interface acquisition unit, configured to acquire a user interface to be displayed;
and an interface projection unit, configured to project the user interface onto the target plane.
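The apparatus claims split the same logic into units; a minimal class layout, with hypothetical names and trivial bodies and assuming simple dictionary-style inputs, might look as follows.

```python
class PlaneAcquisitionUnit:
    """Identifies real object planes / determines a target plane (claims 12-13 sketch)."""
    def acquire(self, real_scene):
        return getattr(real_scene, "planes", [])

class InterfaceAcquisitionUnit:
    """Acquires the user interface to be displayed."""
    def acquire(self, app_state: dict):
        return app_state.get("pending_ui")

class InterfaceDisplayUnit:
    """Displays (or projects) the user interface in the target plane."""
    def show(self, ui, target_plane):
        print(f"displaying {ui!r} on plane {target_plane!r}")
```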
14. A computer readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the user interface display method of any one of claims 1-11.
15. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor; and
an image pickup device;
wherein the processor is configured to perform the user interface display method of any one of claims 1-11 via execution of the executable instructions.
CN201910555316.XA 2019-06-25 2019-06-25 User interface display method and device, storage medium and mobile terminal Active CN110286906B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910555316.XA CN110286906B (en) 2019-06-25 2019-06-25 User interface display method and device, storage medium and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910555316.XA CN110286906B (en) 2019-06-25 2019-06-25 User interface display method and device, storage medium and mobile terminal

Publications (2)

Publication Number Publication Date
CN110286906A CN110286906A (en) 2019-09-27
CN110286906B (en) 2024-02-02

Family

ID=68005761

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910555316.XA Active CN110286906B (en) 2019-06-25 2019-06-25 User interface display method and device, storage medium and mobile terminal

Country Status (1)

Country Link
CN (1) CN110286906B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112862976B (en) * 2019-11-12 2023-09-08 北京超图软件股份有限公司 Data processing method and device and electronic equipment
CN111124112A (en) * 2019-12-10 2020-05-08 北京一数科技有限公司 Interactive display method and device for virtual interface and entity object
CN111736692B (en) * 2020-06-01 2023-01-31 Oppo广东移动通信有限公司 Display method, display device, storage medium and head-mounted device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622776A (en) * 2011-01-31 2012-08-01 微软公司 Three-dimensional environment reconstruction
CN104007552A (en) * 2014-05-30 2014-08-27 北京理工大学 Display system of light field helmet with true stereo feeling

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4774684A (en) * 1985-01-31 1988-09-27 Canon Kabushiki Kaisha Electronic apparatus with a display means
CN102646275B (en) * 2012-02-22 2016-01-20 西安华旅电子科技有限公司 The method of virtual three-dimensional superposition is realized by tracking and location algorithm
CN108958466A (en) * 2017-11-08 2018-12-07 北京市燃气集团有限责任公司 Excavation Training Methodology based on virtual reality technology

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622776A (en) * 2011-01-31 2012-08-01 微软公司 Three-dimensional environment reconstruction
CN104007552A (en) * 2014-05-30 2014-08-27 北京理工大学 Display system of light field helmet with true stereo feeling

Also Published As

Publication number Publication date
CN110286906A (en) 2019-09-27

Similar Documents

Publication Publication Date Title
AU2020202551B2 (en) Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor
US10460512B2 (en) 3D skeletonization using truncated epipolar lines
EP3769509B1 (en) Multi-endpoint mixed-reality meetings
EP3798801A1 (en) Image processing method and apparatus, storage medium, and computer device
KR101930657B1 (en) System and method for immersive and interactive multimedia generation
CN110473293B (en) Virtual object processing method and device, storage medium and electronic equipment
US20170186219A1 (en) Method for 360-degree panoramic display, display module and mobile terminal
CN110286906B (en) User interface display method and device, storage medium and mobile terminal
US11423518B2 (en) Method and device of correcting image distortion, display device, computer readable medium, electronic device
US20190355170A1 (en) Virtual reality content display method and apparatus
CN103157281B (en) Display method and display equipment of two-dimension game scene
JP2017505933A (en) Method and system for generating a virtual image fixed on a real object
CN116134405A (en) Private control interface for augmented reality
US11107184B2 (en) Virtual object translation
CN114341943A (en) Simple environment solver using plane extraction
US20220215607A1 (en) Method and apparatus for driving interactive object and devices and storage medium
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
US20200329227A1 (en) Information processing apparatus, information processing method and storage medium
US10789766B2 (en) Three-dimensional visual effect simulation method and apparatus, storage medium, and display device
CN114245908A (en) Fast 3D reconstruction with depth information
CN113920282B (en) Image processing method and device, computer readable storage medium, and electronic device
JP2024114712A (en) Imaging device, imaging method, and program
CN113559501B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN110192169A (en) Menu treating method, device and storage medium in virtual scene
CN112473138B (en) Game display control method and device, readable storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant