CN110286906A - Method for displaying user interface, device, storage medium and mobile terminal - Google Patents
- Publication number: CN110286906A
- Application number: CN201910555316.XA
- Authority
- CN
- China
- Prior art keywords
- user interface
- plane
- reference planes
- real
- objective plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
Abstract
The present disclosure provides a user interface display method, a user interface display apparatus, a computer-readable storage medium, and an augmented reality device, belonging to the field of human-computer interaction. The method is applied to an augmented reality device and includes: identifying real object planes in a real scene and generating reference planes from the real object planes; determining a reference plane satisfying a preset condition as the target plane; obtaining the user interface to be displayed; and displaying the user interface on the target plane. The disclosure allows the user interface of an augmented reality application to blend into the real scene, improving the realism of the displayed interface, increasing the user's sense of immersion, and improving the user experience.
Description
Technical field
The present disclosure relates to the field of human-computer interaction, and in particular to a user interface display method, a user interface display apparatus, a computer-readable storage medium, and an electronic device.
Background technique
Augmented reality (AR) is a technology that computes the position and angle of the camera image in real time and overlays corresponding images, video, or 3D models on it. It combines the virtual world with the real world and presents the result to the user, enabling richer interactive experiences.
Currently, an augmented reality application can combine a three-dimensional virtual model with the real scene to form an augmented reality scene and display it on screen, while the user interface (UI) is drawn on the top layer so that the user can interact with it. After the user has entered or read information through the UI, the UI disappears so that the full augmented reality scene is displayed again. However, the displayed user interface occludes the augmented reality scene and breaks the user's sense of immersion.
It should be noted that the information disclosed in the background section above is provided only to aid understanding of the background of the present disclosure, and may therefore include information that does not constitute prior art known to a person of ordinary skill in the art.
Summary of the invention
The inventors found that in the related art, for ease of interaction with the user, the user interface is typically displayed on top of the augmented reality scene, occluding it and reducing immersion. Taking an AR application on a mobile phone as an example: the real scene is captured by the phone camera, the virtual three-dimensional scene is combined with the real scene on the phone screen, and the user interface of the AR application is then displayed on the top layer. As shown in Fig. 1, in the displayed picture the dialog box 100 either covers the virtual scene or is covered by it.
In view of this, the present disclosure provides a user interface display method, a user interface display apparatus, a computer-readable storage medium, and a mobile terminal, thereby overcoming, at least to some extent, the above problems in the related art.
Other features and advantages of the disclosure will become apparent from the following detailed description, or may be learned in part by practice of the disclosure.
According to one aspect of the disclosure, a user interface display method applied to an augmented reality device is provided. The method includes: identifying real object planes in a real scene and generating reference planes from the real object planes; determining a reference plane satisfying a preset condition as the target plane; obtaining the user interface to be displayed; and displaying the user interface on the target plane.
In an exemplary embodiment of the disclosure, the method further includes: detecting movement of the augmented reality device, and redetermining the target plane when the movement distance of the augmented reality device exceeds a preset distance.
In an exemplary embodiment of the disclosure, the method further includes: detecting rotation of the augmented reality device, and redetermining the target plane when the rotation angle of the augmented reality device exceeds a preset angle.
In an exemplary embodiment of the disclosure, displaying the user interface on the target plane includes: adjusting the display size of the user interface according to the width or height of the target plane, and displaying the user interface on the target plane.
In an exemplary embodiment of the disclosure, adjusting the display size of the user interface according to the width or height of the target plane includes: obtaining the preset width and preset height of the user interface; determining the scaling ratio of the user interface from the preset width and the width of the target plane, or from the preset height and the height of the target plane; and determining the display size of the user interface according to the scaling ratio.
In an exemplary embodiment of the disclosure, determining a real object plane satisfying the preset condition as the target plane includes: determining a datum plane among the reference planes; displaying a virtual scene according to the datum plane; and determining the reference planes satisfying a preset positional relationship with the virtual scene as target planes.
In an exemplary embodiment of the disclosure, the reference planes satisfying the preset positional relationship with the virtual scene include reference planes perpendicular to the horizontal plane of the virtual scene.
In an exemplary embodiment of the disclosure, determining the reference plane satisfying the preset condition as the target plane includes: computing the bounding boxes of multiple reference planes satisfying the preset positional relationship with the virtual scene, and determining the reference plane with the largest bounding box as the target plane.
In an exemplary embodiment of the disclosure, the reference planes satisfying the preset positional relationship with the virtual scene include reference planes in the positive direction of the virtual scene.
In an exemplary embodiment of the disclosure, the method further includes: if it is determined that no reference plane satisfies the preset condition, displaying the user interface in a predetermined manner.
According to one aspect of the disclosure, a user interface display method is provided, including: identifying real object planes in a real scene and determining a target plane from the real object planes; obtaining the user interface to be displayed; and projecting the user interface onto the target plane.
According to one aspect of the disclosure, a user interface display apparatus applied to an augmented reality device is provided. The apparatus includes: a plane recognition unit for identifying real object planes in a real scene and generating reference planes from the real object planes; a plane screening unit for determining a reference plane satisfying a preset condition as the target plane; an interface obtaining unit for obtaining the user interface to be displayed; and an interface display unit for displaying the user interface on the target plane.
In an exemplary embodiment of the disclosure, the apparatus further includes: a first movement detection unit for detecting movement of the augmented reality device; and a first plane determination unit for redetermining the target plane when the movement distance of the augmented reality device exceeds the preset distance.
In an exemplary embodiment of the disclosure, a second movement detection unit detects rotation of the augmented reality device, and a second plane determination unit redetermines the target plane when the rotation angle of the augmented reality device exceeds the preset angle.
In an exemplary embodiment of the disclosure, the interface display unit may include a first display unit for adjusting the display size of the user interface according to the width or height of the target plane and displaying the user interface on the target plane.
In an exemplary embodiment of the disclosure, the first display unit may be used to: obtain the preset width and preset height of the user interface; determine the scaling ratio of the user interface from the preset width and the width of the target plane, or from the preset height and the height of the target plane; and determine the display size of the user interface according to the scaling ratio.
In an exemplary embodiment of the disclosure, the plane screening unit may further include: a third plane determination unit for determining a datum plane among the reference planes; and a scene display unit for displaying a virtual scene on the datum plane and determining the reference planes satisfying the preset positional relationship with the virtual scene as target planes.
In an exemplary embodiment of the disclosure, the reference planes satisfying the preset positional relationship with the virtual scene include reference planes perpendicular to the horizontal plane of the virtual scene. In an exemplary embodiment of the disclosure, the plane screening unit may also be used to compute the bounding boxes of multiple reference planes satisfying the preset positional relationship with the virtual scene, and determine the reference plane with the largest bounding box as the target plane.
In an exemplary embodiment of the disclosure, the reference planes satisfying the preset positional relationship with the virtual scene include reference planes in the positive direction of the virtual scene. In an exemplary embodiment of the disclosure, the user interface display apparatus may further include a default display unit for displaying the user interface in a predetermined manner if it is determined that no reference plane satisfies the preset condition.
According to one aspect of the disclosure, a user interface display apparatus is also provided, including: a plane obtaining unit for identifying real object planes in a real scene and determining a target plane from the real object planes; an interface obtaining unit for obtaining the user interface to be displayed; and an interface projection unit for projecting the user interface onto the target plane.
According to one aspect of the disclosure, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the computer program implements the above user interface display method.
According to one aspect of the disclosure, an electronic device is provided, including: a processor; a memory for storing instructions executable by the processor; and a camera; wherein the processor is configured to perform the above user interface display method by executing the executable instructions.
The exemplary embodiments of the disclosure have the following beneficial effects. Reference planes are generated from the real object planes in the real scene, a target plane is determined from the reference planes, and the user interface is displayed on that target plane. On the one hand, displaying the user interface on a target plane lets the interface blend into the real scene and remain independent of the augmented reality scene, preventing the user interface from covering or feeling detached from that scene, thereby improving the user's immersion and experience. On the other hand, the overlapping display effect with the augmented reality scene need not be considered when displaying the user interface, which improves flexibility. In yet another aspect, blending the user interface into the real scene prevents the interface from feeling detached from the virtual scene, increasing the user's sense of realism.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the disclosure.
Detailed description of the invention
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and, together with the description, serve to explain its principles. The drawings described below are obviously only some embodiments of the disclosure; for a person of ordinary skill in the art, other drawings can be derived from them without creative effort.
Fig. 1 shows a user interface display effect in the prior art;
Fig. 2 shows a flowchart of a user interface display method in this exemplary embodiment;
Fig. 3 shows a flowchart of another user interface display method in this exemplary embodiment;
Fig. 4 shows a flowchart of another user interface display method in this exemplary embodiment;
Fig. 5 shows a flowchart of another user interface display method in this exemplary embodiment;
Fig. 6 shows a schematic diagram of a virtual scene coordinate system in this exemplary embodiment;
Fig. 7 shows a block diagram of a user interface display apparatus in this exemplary embodiment;
Fig. 8 shows a block diagram of another user interface display apparatus in this exemplary embodiment;
Fig. 9 shows a computer-readable storage medium for implementing the above method in this exemplary embodiment;
Fig. 10 shows an electronic device for implementing the above method in this exemplary embodiment.
Specific embodiment
Example embodiments are now described more fully with reference to the drawings. However, example embodiments can be implemented in a variety of forms and should not be understood as limited to the examples set forth herein; rather, these embodiments are provided so that the disclosure is more thorough and complete and fully conveys the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In this context, "first", "second", "third", and the like are used only as labels for the corresponding objects and do not limit their number or order.
The exemplary embodiments of the disclosure first provide a user interface display method whose executing entity may be an augmented reality device, that is, a terminal device capable of implementing augmented reality, such as a wearable device, mobile phone, computer, or tablet. Deploying an augmented reality application on a terminal device enables the device to realize the corresponding augmented reality functions. Currently, many terminal devices support augmented reality applications, such as mobile phone AR map navigation and mobile phone AR games, and there are also many dedicated AR devices, such as AR glasses and AR helmets; the disclosure does not limit this.
Fig. 2 shows a flow of this exemplary embodiment, which may include the following steps:
Step S210: identify the real object planes in the real scene and generate reference planes from the real object planes;
Step S220: determine a reference plane satisfying a preset condition as the target plane;
Step S230: obtain the user interface to be displayed;
Step S240: display the user interface on the target plane.
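The steps above can be sketched in code. The following is a minimal illustration; the `Plane` dataclass and the orientation predicate are assumed stand-ins for whatever plane representation the AR engine actually provides, not structures defined by the disclosure:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class Plane:
    normal: Tuple[float, float, float]  # unit normal in the virtual frame
    width: float
    height: float

def pick_target(reference_planes: List[Plane],
                meets_condition: Callable[[Plane], bool]) -> Optional[Plane]:
    """Step S220: return the first reference plane satisfying the preset
    condition as the target plane; None means no plane qualifies, and the
    UI would fall back to a predetermined display mode."""
    for plane in reference_planes:
        if meets_condition(plane):
            return plane
    return None

# Usage: prefer vertical, wall-like planes (normal roughly horizontal).
planes = [Plane((0.0, 1.0, 0.0), 2.0, 1.0),   # floor-like
          Plane((0.0, 0.0, 1.0), 1.5, 1.2)]   # wall-like
target = pick_target(planes, lambda p: abs(p.normal[1]) < 0.1)
```

Steps S230 and S240 would then load the UI and render it onto `target`.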
Here, the real scene is the actual environment of the user using the augmented reality device; it may include multiple real objects, such as tables, chairs, walls, and the ground. A real object plane refers to the surface of a real object, such as a wall, a desktop, or the ground. The augmented reality device can identify the real objects in the real scene through its camera and project the virtual scene into the real scene, or combine the virtual three-dimensional model with the models of the real objects captured by the camera and display them together on the screen.
In the disclosure, image data or video data of the real scene can be obtained through the camera, each real object in the real scene is identified from that image or video data, and the real object planes in the real scene are obtained. The real scene may contain multiple real objects, yielding multiple real object planes, such as planes parallel to the horizontal plane (e.g. the ground or a desktop), planes perpendicular to the horizontal plane (e.g. a wall, window, or mirror surface), or other planes such as inclined surfaces; this embodiment does not limit this. A reference plane is a virtual plane parallel to a real plane; virtual reference planes can be generated from the real object planes in the real scene in order to display the user interface. Through projection, a reference plane can be generated on a real object plane or at a position parallel to it; alternatively, reference planes can be generated by building models in the virtual scene.
The target plane may be a reference plane whose angle satisfies the requirement. Specifically, a virtual scene and a virtual three-dimensional coordinate system can be established according to the real scene, so that the reference planes correspond to positions in that coordinate system, and the target plane can be determined using the coordinate system. If the coordinate system is established with the ground as the x-y plane, the target plane may be a plane perpendicular to x-y. In other words, the preset condition may be that a reference plane satisfies a required positional relationship. The preset condition may also require a reference plane whose shape satisfies a requirement, such as a regular shape (rectangle, triangle, circle, etc.) or an irregular plane.
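A plane perpendicular to the x-y ground plane can be recognized from its normal vector. The check below is a sketch under stated assumptions: the 10-degree tolerance is an assumed value, not one specified in the disclosure:

```python
import math

def is_perpendicular_to_ground(normal, ground_normal=(0.0, 1.0, 0.0),
                               tol_deg=10.0):
    """True when the plane's normal is (nearly) orthogonal to the ground
    normal, i.e. the plane stands vertically, like a wall or a mirror."""
    dot = abs(sum(a * b for a, b in zip(normal, ground_normal)))
    angle = math.degrees(math.acos(min(1.0, dot)))  # angle between normals
    return angle > 90.0 - tol_deg

# A wall facing the user is vertical; a desktop is not.
assert is_perpendicular_to_ground((0.0, 0.0, 1.0))
assert not is_perpendicular_to_ground((0.0, 1.0, 0.0))
```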
The user interface can then be displayed on the target plane: projected onto it, or embedded on the target plane within the virtual scene. The user interface here is the interface displayed when the augmented reality application realizes its functions, i.e., the interface through which the application interacts with the user, for example to obtain input from the user or to present information to the user.
In the example embodiments of the disclosure, on the one hand, reference planes are generated from the real planes in the real scene, a target plane is determined from the reference planes, and the user interface is displayed on that target plane, keeping the user interface independent of the augmented reality scene and preventing it from covering or feeling detached from that scene, thereby improving the user's immersion and experience. On the other hand, the overlapping display effect with the augmented reality scene need not be considered when displaying the user interface, which improves flexibility. In yet another aspect, displaying the user interface in the real scene prevents it from feeling detached from the augmented reality scene, increasing the user's sense of realism.
This embodiment also involves a virtual scene, which may be the scene that the augmented reality application presents to the user, such as a game, testing, or medical scene, an artistic design scene such as an interior, landscape, or garden scene, or any other scene; the virtual scene can be projected into the real scene so that the user has an immersive experience. The augmented reality device of this exemplary embodiment can therefore be applied to learning, entertainment, medicine, art, or other scenarios. Accordingly, in this embodiment, step S220 may specifically include the following steps:
Step S301: determine a datum plane among the reference planes;
Step S302: display a virtual scene according to the datum plane, and determine the reference planes satisfying a preset positional relationship with the virtual scene as the target plane.
A datum plane is selected among the reference planes, and the virtual scene is displayed relying on it; the virtual scene can be projected onto the datum plane. After the virtual scene is projected onto the datum plane, the reference planes satisfying a preset positional relationship with the virtual scene can be determined as target planes, for example the reference planes parallel or perpendicular to the plane with coordinates (x, 0, z) in the virtual scene. In an exemplary embodiment, the reference planes satisfying the preset positional relationship with the virtual scene may include reference planes perpendicular to the horizontal plane of the virtual scene. The horizontal plane of the virtual scene may include a virtual ground or virtual desktop in the virtual scene, and may also include a horizontal surface in the real scene, such as the ground; this embodiment makes no special limitation. Reference planes perpendicular to the horizontal plane of the virtual scene, such as those corresponding to walls, cabinet faces, windows, or mirrors, can be determined as target planes.
The reference planes satisfying the preset positional relationship with the virtual scene may also include reference planes in the positive direction of the virtual scene. The positive direction of the virtual scene in three-dimensional space can be determined from its virtual coordinate system, and the reference planes in the positive directions of the three dimensions can be determined as target planes. As shown in Fig. 6, when the virtual scene is projected into the real scene, it can face the user in the first-person perspective, and the positive direction of the virtual coordinate system is then the direction toward the user. For example, if the plane parallel to the user's line of sight is the x-z plane, then when the user looks straight ahead, the points with z > 0 or x > 0 are in the positive direction of the virtual coordinate system. Points with z > 0 can be regarded as in front of the virtual scene and points with z < 0 as behind it, so taking the reference planes in the z > 0 direction, or in the x > 0 direction, as target planes better matches the user's viewing angle.
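Filtering reference planes to those on the positive side of the virtual scene, as described above, might look like this; the scene origin and the choice of the z axis are assumptions matching the Fig. 6 discussion:

```python
def in_positive_direction(plane_center, scene_origin=(0.0, 0.0, 0.0)):
    """Keep only reference planes lying at z > 0 relative to the virtual
    scene origin, i.e. in front of the scene from the user's viewpoint."""
    return plane_center[2] > scene_origin[2]

# One plane in front of the scene, one behind it.
centers = [(0.0, 1.0, 2.5), (1.0, 0.5, -1.0)]
front = [c for c in centers if in_positive_direction(c)]
# only the first center survives the filter
```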
There may be multiple reference planes satisfying the preset condition. The target plane can be determined by computing the area of each reference plane, for example by taking the reference plane with the largest area as the target plane. The target plane can also be selected from the reference planes in other ways, such as taking the longest reference plane as the target plane, or randomly selecting one reference plane as the target plane.
In an exemplary embodiment, the target plane can be determined using the bounding box of each reference plane. Specifically, the bounding box of each reference plane is computed, and the reference plane with the largest bounding box is determined as the target plane. The bounding box may be an AABB bounding box, an OBB bounding box, a bounding sphere, or the like; this example embodiment makes no special limitation.
By computing the bounding boxes of the reference planes, the plane whose bounding box has the largest width and the plane whose bounding box has the largest height can be determined, and the target plane can be chosen according to the display effect of the user interface after scaling. Specifically, the plane with the widest bounding box can be taken as the first plane to compute a first zoom factor for the user interface, and the plane with the tallest bounding box as the second plane to compute a second zoom factor. If the user interface scaled by the first zoom factor cannot be fully displayed on the first plane, the user interface is scaled by the second zoom factor; if it can then be fully displayed on the second plane, the second plane is taken as the final target plane. If the user interface exceeds both the first plane and the second plane after being scaled by the first and second zoom factors respectively, the plane with the larger displayed user interface area is taken as the target plane. Furthermore, the target plane can also be determined from multiple reference planes in other ways, for example by defaulting to the plane with the widest bounding box as the target plane.
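The two-candidate selection above could be sketched as follows, with `(width, height)` tuples standing in for the bounding boxes of the two candidate planes; this is an illustrative reading of the procedure, not a definitive implementation:

```python
def pick_target_plane(ui_w, ui_h, widest, tallest):
    """widest / tallest: (width, height) bounding boxes of the plane with
    the widest box and the plane with the tallest box, respectively.
    Returns (chosen_plane, zoom_factor)."""
    s1 = widest[0] / ui_w                    # first zoom factor: fit width
    if ui_h * s1 <= widest[1]:               # scaled UI fits the first plane
        return widest, s1
    s2 = tallest[1] / ui_h                   # second zoom factor: fit height
    if ui_w * s2 <= tallest[0]:              # scaled UI fits the second plane
        return tallest, s2
    # Neither fits entirely: keep the plane showing the larger UI area.
    a1 = min(widest[0], ui_w * s1) * min(widest[1], ui_h * s1)
    a2 = min(tallest[0], ui_w * s2) * min(tallest[1], ui_h * s2)
    return (widest, s1) if a1 >= a2 else (tallest, s2)
```

For a 2x1 UI and a 4x3 widest plane, the first zoom factor of 2.0 already fits, so the widest plane wins without consulting the tallest one.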
After the target plane is obtained, the user interface can be displayed on it. The user interface may include multiple preset display parameters, such as its color and size, or its coordinates in the virtual coordinate system. The coordinates at which the user interface and each of its components are projected onto the target plane are computed from the size of the target plane, and the user interface is then projected onto the target plane. Moreover, in order to make the user interface fit the real scene more closely and increase the user's immersion, the user interface can also be scaled so that it better adapts to the size of the target plane. That is, the display size of the user interface can be adjusted according to the width or height of the target plane, for example so that the display width of the user interface equals the width of the target plane within a certain error range.
In an exemplary embodiment, adjusting the display size of the user interface may specifically include the following steps:
Step S401: obtain the preset width and preset height of the user interface;
Step S402: determine the scaling ratio of the user interface from the preset width and the width of the target plane, or from the preset height and the height of the target plane;
Step S403: determine the display size of the user interface according to the scaling ratio.
The predetermined width and preset height of the user interface are usually set during the development of the augmented reality application: the developer sets the parameters of the user interface, that is, its width, height, color and layout, or adds various components to it. For example, the width may be set to 500px and the height to 300px. The height or width of the objective plane can be obtained via the camera: after the camera scans the real scene, the geometric data of the objective plane can be determined. The geometric data may refer to the real-world dimensions of the objective plane, such as a width of 1m and a height of 0.5m, or to those dimensions converted into virtual units; the engines used by different augmented reality applications may differ, and different engines may convert the real-world data differently. Moreover, if the objective plane is not a regular geometric surface, its width and height can be taken as the width and height of the bounding box obtained when the bounding box is calculated.
For example, if the predetermined width of the user interface is actual_w and the width of the objective plane is virtual_w, the scaling of the user interface can be obtained as virtual_w*ToEngineScale()/actual_w, where ToEngineScale() denotes the function that converts virtual units into the corresponding engine units. Similarly, the scaling of the user interface calculated from the height virtual_h of the objective plane is virtual_h*ToEngineScale()/actual_h. The calculated result is the scaling of the user interface.
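The scaling formula above can be sketched as follows. ToEngineScale() is the unit-conversion function named in the text; its return value here (100 engine units per virtual unit) is purely an illustrative assumption, as is the px-based design width.

```python
# Sketch of the scaling computation: scale = virtual_w * ToEngineScale() / actual_w.
def to_engine_scale():
    # Hypothetical conversion factor: engine units per virtual unit.
    return 100.0

def ui_scale_from_width(virtual_w, actual_w):
    # virtual_w: objective-plane width in virtual units (e.g. metres)
    # actual_w: UI's predetermined width in design units (e.g. px)
    return virtual_w * to_engine_scale() / actual_w

def ui_scale_from_height(virtual_h, actual_h):
    return virtual_h * to_engine_scale() / actual_h

# A 500 px-wide UI fitted to a 1-unit-wide objective plane:
scale = ui_scale_from_width(1.0, 500.0)  # 0.2
```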
The user interface can then be scaled according to the scaling and displayed on the objective plane. The user interface is independent of the virtual scene in the real scene, which avoids the user interface occluding the virtual scene and makes the user experience smoother. At the same time, the display of the user interface is not constrained by the virtual scene, which makes it convenient for the user to perform various interactive operations on the user interface and improves the flexibility of interaction.
In an exemplary embodiment, as the camera of the augmented reality device moves, the recognized real scene may change; in response to the change, the objective plane is redetermined according to the changed real scene. When the recognized real scene changes, the reference planes generated from the real-world object planes in the real scene also change; the objective plane is redetermined from the changed reference planes, and the user interface is then displayed on the newly determined objective plane.
Specifically, movement of the augmented reality device can be detected; when the moving distance of the augmented reality device is greater than a preset distance, the objective plane can be redetermined. The preset distance may be, for example, 0.5 meters or 1 meter, or another distance such as 0.8 meters or 2 meters; this embodiment does not limit it. The movement of the augmented reality device can be detected by a distance sensor, such as an infrared sensor: at regular intervals the distance measured by the sensor is read, the difference between the distances obtained at adjacent time points is calculated, and if the difference is greater than the preset distance, the objective plane is redetermined.
Alternatively, rotation of the augmented reality device can be detected; when the rotation angle of the augmented reality device is greater than a preset angle, the objective plane is redetermined. The preset angle may be, for example, 15 degrees or 30 degrees, or another value such as 10 degrees or 20 degrees; this embodiment does not limit it. The rotation angle of the augmented reality device can be detected by an orientation sensor: if a change in the heading angle of the augmented reality device is detected, the magnitude of the change, i.e. the rotation angle, is calculated, and if the rotation angle is greater than the preset angle, the objective plane is redetermined.
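The movement and rotation triggers described above can be sketched as follows. The threshold values and the way sensor readings are sampled are illustrative assumptions; a real device would poll its distance and orientation sensors at regular intervals.

```python
# Sketch of the re-selection triggers: the objective plane is redetermined
# when the device moves more than a preset distance or rotates more than a
# preset angle between adjacent sampling points. Thresholds are assumptions.
PRESET_DISTANCE = 0.5   # metres (the text also allows 0.8 m, 1 m, 2 m, ...)
PRESET_ANGLE = 15.0     # degrees (the text also allows 10, 20, 30, ...)

def should_redetermine(prev_distance, cur_distance, prev_heading, cur_heading):
    moved = abs(cur_distance - prev_distance) > PRESET_DISTANCE
    rotated = abs(cur_heading - prev_heading) > PRESET_ANGLE
    return moved or rotated

# Samples taken at adjacent time points:
should_redetermine(1.0, 1.2, 0.0, 5.0)   # False: small move, small turn
should_redetermine(1.0, 2.0, 0.0, 5.0)   # True: moved 1 m > 0.5 m
```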
In an exemplary embodiment, if no real-world object plane in the real scene meets the preset condition, i.e. no objective plane exists, the user interface is displayed in a predetermined manner. The predetermined manner may include displaying the user interface n units away from the virtual scene. For example, a datum plane is set, a virtual scene is generated based on the datum plane, and the datum plane coincides with the x-z plane of the virtual scene; the user interface can then be displayed n units beyond the maximum x coordinate of the virtual scene. Alternatively, the user interface may be displayed at an arbitrary position n units away from the virtual scene.
Fig. 5 schematically illustrates another process of this example embodiment. As shown in Fig. 5, this example embodiment may include steps S501 to S507. In step S501, when the user uses a terminal device carrying the augmented reality application, the application needs to obtain access to the camera; after access to the camera is successfully obtained, the camera can be started to scan the real scene, and the scanned real-world object planes are converted into virtual reference planes by the corresponding engine. In step S502, it is judged whether a reference plane meeting the preset condition exists. Specifically, the ground in the real scene, or another plane parallel to the ground, can be taken as the datum plane, and the virtual scene is displayed on the datum plane; if a reference plane perpendicular to the datum plane exists, it is determined that a reference plane meeting the preset condition exists, and step S503 is executed; if no reference plane perpendicular to the datum plane exists, step S507 is executed.
In step S503, an AABB bounding box is calculated from the geometric data of the reference planes meeting the preset condition, yielding the maximum width and height of each reference plane. In step S504, the maximum widths and heights of the reference planes are obtained; the largest width is chosen among these values and used to calculate the scaling of the user interface, or the scaling is calculated from the largest height. According to the scaling of the user interface, the reference plane with the largest height or width is determined as the objective plane. After the scaling of the user interface is determined, in step S505 the position and angle of the user interface are set according to the objective plane: the display position and angle of the user interface are determined from the objective plane's coordinates in the virtual coordinate system. Step S506 is then executed, displaying the user interface on the objective plane.
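Steps S502 to S504 can be sketched as follows. The plane records, the perpendicularity test against the datum normal, and the "widest bounding box wins" rule are illustrative assumptions consistent with the description above, not the patent's actual data structures.

```python
# Sketch of S502-S504: keep reference planes perpendicular to the datum
# (ground) plane, then pick the one whose bounding box is widest.
def select_target_plane(planes, datum_normal=(0.0, 1.0, 0.0)):
    """planes: list of dicts {"normal": (x,y,z), "width": w, "height": h}."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    # A plane vertical to the ground has a normal perpendicular to the
    # ground normal (dot product ~ 0).
    vertical = [p for p in planes if abs(dot(p["normal"], datum_normal)) < 1e-6]
    if not vertical:
        return None  # step S507: fall back to the default display mode
    return max(vertical, key=lambda p: p["width"])  # S504: widest plane wins

planes = [
    {"normal": (0, 1, 0), "width": 5.0, "height": 5.0},  # floor: rejected
    {"normal": (1, 0, 0), "width": 1.0, "height": 2.0},  # wall A
    {"normal": (0, 0, 1), "width": 2.5, "height": 1.0},  # wall B: widest
]
target = select_target_plane(planes)  # wall B
```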
In step S507, the user interface is set to the default display mode. Illustratively, the user's viewing angle is as shown in Fig. 6, where the datum plane of the virtual scene is the x-z plane; the default display angle of the user interface is then perpendicular to the x-z plane, and the default display position is at z > 0 or x > 0. Alternatively, the maximum value x1 of the coordinates of the virtual scene in the x direction and the maximum value z1 in the z direction are determined; the default display position of the user interface may then be the plane at z1+n or the plane at x1+n, where n is a positive integer. After the display mode of the user interface is determined, step S506 is executed to display the user interface.
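The default placement of step S507 can be sketched as follows. The offset n and the point format are illustrative assumptions; the scene's maximum coordinates x1 and z1 come from the description above.

```python
# Sketch of the default display position: with the virtual scene on the
# x-z plane, the UI is placed at z = z1 + n or x = x1 + n, where x1/z1 are
# the scene's maximum x/z coordinates and n is a positive integer.
def default_ui_position(scene_points, n=1, axis="z"):
    """scene_points: iterable of (x, y, z) coordinates of the virtual scene."""
    if axis == "z":
        z1 = max(p[2] for p in scene_points)
        return ("z", z1 + n)   # UI plane at z = z1 + n
    x1 = max(p[0] for p in scene_points)
    return ("x", x1 + n)       # UI plane at x = x1 + n

pts = [(0, 0, 0), (2, 0, 3), (1, 0, 1)]
default_ui_position(pts)             # ("z", 4)
default_ui_position(pts, axis="x")   # ("x", 3)
```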
In an exemplary embodiment, the real-world object planes in the real scene are recognized, and an objective plane can be determined directly from the real-world object planes; that is, the objective plane can be a real plane, and the user interface can be projected directly onto it. Here, the user interface is the interface through which the user interacts with the application; the application may also provide a virtual scene, and that virtual scene may likewise be projected onto a real-world object plane in the real scene, in which case the objective plane may include other real-world object planes perpendicular to that plane.
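Projecting the user interface onto a real objective plane, as described above, can be sketched with a plane origin and two in-plane basis vectors. All names, the pixel-to-metre scale, and the corner convention are illustrative assumptions, not the patent's actual API.

```python
# Sketch: map a UI point given in design pixels to a 3D world point on the
# objective plane, using the plane's origin and its in-plane basis vectors.
def project_ui_point(plane_origin, right, up, u_px, v_px, px_to_m=0.002):
    """right/up: unit vectors spanning the plane; px_to_m: assumed px->metre
    scale (0.002 here is an arbitrary illustrative value)."""
    u, v = u_px * px_to_m, v_px * px_to_m
    return tuple(o + u * r + v * w
                 for o, r, w in zip(plane_origin, right, up))

# The UI corner (0, 0) lands on the plane origin; the (500, 300) px corner
# lands ~1.0 m right and ~0.6 m up from it:
project_ui_point((0, 1, 2), (1, 0, 0), (0, 1, 0), 500, 300)  # ~ (1.0, 1.6, 2.0)
```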
An exemplary embodiment of the disclosure further provides a user interface display apparatus, which can be applied to an augmented reality device. As shown in Fig. 7, the user interface display apparatus 700 may include: a plane acquiring unit 710 for recognizing real-world object planes in a real scene and generating reference planes from the real-world object planes; an objective plane determination unit 720 for determining a reference plane meeting a preset condition as the objective plane; an interface acquiring unit 730 for obtaining a user interface; and an interface display unit 740 for displaying the user interface on the objective plane.
In an exemplary embodiment of the disclosure, the apparatus further includes: a first movement detection unit for detecting movement of the augmented reality device; and a first plane determination unit for redetermining the objective plane when the moving distance of the augmented reality device is greater than a preset distance.
In an exemplary embodiment of the disclosure, the apparatus further includes: a second movement detection unit for detecting rotation of the augmented reality device; and a second plane determination unit for redetermining the objective plane when the rotation angle of the augmented reality device is greater than a preset angle.
In an exemplary embodiment, the interface display unit 740 may include: a first display unit for adjusting the display size of the user interface according to the width or height of the objective plane and displaying the user interface on the objective plane.
In an exemplary embodiment, the first display unit can be used to: obtain the predetermined width and preset height of the user interface; determine the scaling of the user interface from the predetermined width and the width of the objective plane, or determine the scaling of the user interface from the preset height and the height of the objective plane; and determine the display size of the user interface according to the scaling.
In an exemplary embodiment, the plane screening unit may further include: a third plane determination unit for determining a datum plane among the reference planes; and a scene display unit for displaying a virtual scene on the datum plane and determining the reference planes meeting a preset positional relationship with the virtual scene as the objective plane.
In an exemplary embodiment, the reference planes meeting the preset positional relationship with the virtual scene include reference planes perpendicular to the horizontal plane of the virtual scene.
In an exemplary embodiment, the reference planes meeting the preset positional relationship with the virtual scene include reference planes in the positive direction of the virtual scene.
In an exemplary embodiment, the plane screening unit can also be used to calculate the bounding boxes of multiple reference planes meeting the preset positional relationship with the virtual scene and determine the reference plane with the largest bounding box as the objective plane.
In an exemplary embodiment, the user interface display apparatus may further include: a default display unit for displaying the user interface in a predetermined manner if it is determined that no reference plane meets the preset condition.
An exemplary embodiment of the disclosure further provides another user interface display apparatus, which can be applied to an augmented reality device. As shown in Fig. 8, the user interface display apparatus 800 may include a plane acquiring unit 810, an interface acquiring unit 820 and an interface projecting unit 830. Specifically: the plane acquiring unit 810 is used for recognizing real-world object planes in a real scene and determining an objective plane from the real-world object planes; the interface acquiring unit 820 is used for obtaining a user interface to be displayed; and the interface projecting unit 830 is used for projecting the user interface onto the objective plane.
The details of each module/unit of the above apparatus have been described in detail in the method embodiments and are therefore not repeated here.
Those skilled in the art will appreciate that various aspects of the disclosure can be implemented as a system, a method or a program product. Accordingly, various aspects of the disclosure may be embodied in the following forms: a complete hardware implementation, a complete software implementation (including firmware, microcode, etc.), or an implementation combining hardware and software, which may collectively be referred to herein as a "circuit", "module" or "system".
An exemplary embodiment of the disclosure further provides a computer-readable storage medium on which a program product capable of implementing the above methods of this specification is stored. In some possible embodiments, various aspects of the disclosure can also be implemented in the form of a program product comprising program code; when the program product runs on a terminal device, the program code causes the terminal device to execute the steps of the various exemplary embodiments of the disclosure described in the "Exemplary Methods" section of this specification.
Referring to Fig. 9, a program product 900 for implementing the above methods according to an exemplary embodiment of the disclosure is described. It may take the form of a portable compact disc read-only memory (CD-ROM) including program code, and may run on a terminal device such as a personal computer. However, the program product of the disclosure is not limited to this; in this document, a readable storage medium may be any tangible medium containing or storing a program that can be used by, or in connection with, an instruction execution system, apparatus or device.
The program product may use any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying readable program code. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A readable signal medium may also be any readable medium other than a readable storage medium that can send, propagate or transmit a program for use by, or in connection with, an instruction execution system, apparatus or device.
The program code contained on a readable medium may be transmitted using any suitable medium, including but not limited to wireless, wireline, optical cable, RF, etc., or any suitable combination of the above.
Program code for carrying out operations of the disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server. Where a remote computing device is involved, it may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
An exemplary embodiment of the disclosure further provides an electronic device capable of implementing the above methods; the electronic device includes a device capable of implementing augmented reality. An electronic device 1000 according to this exemplary embodiment of the disclosure is described below with reference to Fig. 10. The electronic device 1000 shown in Fig. 10 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the disclosure.
As shown in Fig. 10, the electronic device 1000 may take the form of a general-purpose computing device. The components of the electronic device 1000 may include, but are not limited to: the at least one processing unit 1010 mentioned above, the at least one storage unit 1020 mentioned above, a bus 1030 connecting different system components (including the storage unit 1020 and the processing unit 1010), a display unit 1040 and a camera unit 1070. The camera unit 1070 is used to capture the real scene, e.g. images and video of the real scene, i.e. the real scene described in the method and apparatus embodiments.
The storage unit 1020 stores program code, which can be executed by the processing unit 1010 so that the processing unit 1010 performs the steps of the various exemplary embodiments of the disclosure described in the "Exemplary Methods" section of this specification. For example, the processing unit 1010 may execute the method steps shown in Fig. 2 or Fig. 3, etc.
The storage unit 1020 may include readable media in the form of volatile memory, such as a random access memory (RAM) 1021 and/or a cache memory 1022, and may further include a read-only memory (ROM) 1023. The storage unit 1020 may also include a program/utility 1024 having a set (at least one) of program modules 1025; such program modules 1025 include, but are not limited to: an operating system, one or more application programs, other program modules and program data, each or some combination of which may include an implementation of a network environment.
The bus 1030 may represent one or more of several types of bus structures, including a storage-unit bus or storage-unit controller, a peripheral bus, an accelerated graphics port, and a processing-unit or local bus using any of a variety of bus structures.
The electronic device 1000 may also communicate with one or more external devices 1100 (such as a keyboard, pointing device, Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1000, and/or with any device (such as a router, modem, etc.) that enables the electronic device 1000 to communicate with one or more other computing devices. Such communication may take place via an input/output (I/O) interface 1050. The electronic device 1000 may also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN) and/or a public network, such as the Internet) through a network adapter 1060. As shown, the network adapter 1060 communicates with the other modules of the electronic device 1000 via the bus 1030. It should be understood that, although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 1000, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described here may be implemented in software, or in software combined with the necessary hardware. Accordingly, the technical solution according to the embodiments of the disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, USB flash drive, removable hard disk, etc.) or on a network, and which includes instructions that cause a computing device (which may be a personal computer, server, terminal apparatus or network device, etc.) to execute the method according to the exemplary embodiments of the disclosure.
In addition, the above drawings are only schematic illustrations of the processing included in the method according to the exemplary embodiments of the disclosure, and are not intended to be limiting. It is easy to understand that the processing shown in the drawings does not indicate or limit the temporal order of these processes. It is also easy to understand that these processes may be executed, for example, synchronously or asynchronously in multiple modules.
It should be noted that, although several modules or units of the device for performing actions are mentioned in the above detailed description, this division is not mandatory. In fact, according to the exemplary embodiments of the disclosure, the features and functions of two or more of the modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided into multiple modules or units.
Those skilled in the art, having considered the specification and practiced the invention disclosed here, will readily conceive of other embodiments of the disclosure. This application is intended to cover any variations, uses or adaptations of the disclosure that follow its general principles and include common knowledge or conventional techniques in the art not disclosed by the disclosure. The specification and examples are to be considered exemplary only; the true scope and spirit of the disclosure are indicated by the claims. It should be understood that the disclosure is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the disclosure is limited only by the appended claims.
Claims (15)
1. A user interface display method, applied to an augmented reality device, characterized in that the method comprises:
identifying real-world object planes in a real scene, and generating reference planes from the real-world object planes;
determining a reference plane meeting a preset condition as an objective plane;
obtaining a user interface to be displayed;
displaying the user interface on the objective plane.
2. The method according to claim 1, characterized in that the method further comprises:
detecting movement of the augmented reality device;
when a moving distance of the augmented reality device is greater than a preset distance, redetermining the objective plane.
3. The method according to claim 1, characterized in that the method further comprises:
detecting rotation of the augmented reality device;
when a rotation angle of the augmented reality device is greater than a preset angle, redetermining the objective plane.
4. The method according to claim 1, characterized in that displaying the user interface on the objective plane comprises:
adjusting a display size of the user interface according to a width or a height of the objective plane, and displaying the user interface on the objective plane.
5. The method according to claim 4, characterized in that adjusting the display size of the user interface according to the width or height of the objective plane comprises:
obtaining a predetermined width and a preset height of the user interface;
determining a scaling of the user interface from the predetermined width and the width of the objective plane, or determining the scaling of the user interface from the preset height and the height of the objective plane;
determining the display size of the user interface according to the scaling.
6. The method according to claim 1, characterized in that determining the reference plane meeting the preset condition as the objective plane comprises:
determining a datum plane among the reference planes;
displaying a virtual scene according to the datum plane, and determining reference planes meeting a preset positional relationship with the virtual scene as the objective plane.
7. The method according to claim 6, wherein the reference planes meeting the preset positional relationship with the virtual scene comprise:
reference planes perpendicular to the horizontal plane of the virtual scene.
8. The method according to claim 6, characterized in that determining the reference planes meeting the preset condition as the objective plane comprises:
calculating bounding boxes of multiple reference planes meeting the preset positional relationship with the virtual scene, and determining the reference plane with the largest bounding box as the objective plane.
9. The method according to claim 6, characterized in that the reference planes meeting the preset positional relationship with the virtual scene comprise:
reference planes in the positive direction of the virtual scene.
10. The method according to claim 1, characterized in that the method further comprises:
if it is determined that no reference plane meets the preset condition, displaying the user interface in a predetermined manner.
11. A user interface display method, characterized by comprising:
identifying real-world object planes in a real scene, and determining an objective plane from the real-world object planes;
obtaining a user interface to be displayed;
projecting the user interface onto the objective plane.
12. A user interface display apparatus, applied to an augmented reality device, characterized in that the apparatus comprises:
a plane acquiring unit for identifying real-world object planes in a real scene and generating reference planes from the real-world object planes;
an objective plane determination unit for determining a reference plane meeting a preset condition as an objective plane;
an interface acquiring unit for obtaining a user interface;
an interface display unit for displaying the user interface on the objective plane.
13. A user interface display apparatus, applied to an augmented reality device, characterized in that the apparatus comprises:
a plane acquiring unit for identifying real-world object planes in a real scene and determining an objective plane from the real-world object planes;
an interface acquiring unit for obtaining a user interface to be displayed;
an interface projecting unit for projecting the user interface onto the objective plane.
14. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the user interface display method of any one of claims 1-11.
15. An electronic device, characterized by comprising:
a processor;
a memory for storing executable instructions of the processor; and
a camera device;
wherein the processor is configured to perform the user interface display method of any one of claims 1-11 by executing the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910555316.XA CN110286906B (en) | 2019-06-25 | 2019-06-25 | User interface display method and device, storage medium and mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110286906A true CN110286906A (en) | 2019-09-27 |
CN110286906B CN110286906B (en) | 2024-02-02 |
Family
ID=68005761
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910555316.XA Active CN110286906B (en) | 2019-06-25 | 2019-06-25 | User interface display method and device, storage medium and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110286906B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111124112A (en) * | 2019-12-10 | 2020-05-08 | 北京一数科技有限公司 | Interactive display method and device for virtual interface and entity object |
CN111736692A (en) * | 2020-06-01 | 2020-10-02 | Oppo广东移动通信有限公司 | Display method, display device, storage medium and head-mounted device |
CN112862976A (en) * | 2019-11-12 | 2021-05-28 | 北京超图软件股份有限公司 | Image generation method and device and electronic equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4774684A (en) * | 1985-01-31 | 1988-09-27 | Canon Kabushiki Kaisha | Electronic apparatus with a display means |
CN102622776A (en) * | 2011-01-31 | 2012-08-01 | 微软公司 | Three-dimensional environment reconstruction |
CN102646275A (en) * | 2012-02-22 | 2012-08-22 | 西安华旅电子科技有限公司 | Method for realizing virtual three-dimensional superposition through tracking and positioning algorithms |
CN104007552A (en) * | 2014-05-30 | 2014-08-27 | 北京理工大学 | Display system of light field helmet with true stereo feeling |
CN108958466A (en) * | 2017-11-08 | 2018-12-07 | 北京市燃气集团有限责任公司 | Excavation Training Methodology based on virtual reality technology |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112862976A (en) * | 2019-11-12 | 2021-05-28 | 北京超图软件股份有限公司 | Image generation method and device and electronic equipment |
CN112862976B (en) * | 2019-11-12 | 2023-09-08 | 北京超图软件股份有限公司 | Data processing method and device and electronic equipment |
CN111124112A (en) * | 2019-12-10 | 2020-05-08 | 北京一数科技有限公司 | Interactive display method and device for virtual interface and entity object |
CN111736692A (en) * | 2020-06-01 | 2020-10-02 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Display method, display device, storage medium and head-mounted device |
CN111736692B (en) * | 2020-06-01 | 2023-01-31 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Display method, display device, storage medium and head-mounted device |
Also Published As
Publication number | Publication date |
---|---|
CN110286906B (en) | 2024-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2020202551B2 (en) | Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor |
JP6643357B2 (en) | Full spherical capture method | |
KR101637990B1 (en) | Spatially correlated rendering of three-dimensional content on display components having arbitrary positions | |
US20170186219A1 (en) | Method for 360-degree panoramic display, display module and mobile terminal | |
US11120613B2 (en) | Image generating device and method of generating image | |
US11893702B2 (en) | Virtual object processing method and apparatus, and storage medium and electronic device | |
TWI701941B (en) | Method, apparatus and electronic device for image processing and storage medium thereof | |
US20200394841A1 (en) | Information processing apparatus, information processing method, and storage medium | |
US20160063764A1 (en) | Image processing apparatus, image processing method, and computer program product | |
KR20170134513A (en) | Method for displaying an object |
CN110286906A (en) | Method for displaying user interface, device, storage medium and mobile terminal | |
CN111373347B (en) | Apparatus, method and computer program for providing virtual reality content | |
US11430192B2 (en) | Placement and manipulation of objects in augmented reality environment | |
US11107184B2 (en) | Virtual object translation | |
CN109640070A (en) | A kind of stereo display method, device, equipment and storage medium | |
CN111681320B (en) | Model display method and device in three-dimensional house model | |
US11995776B2 (en) | Extended reality interaction in synchronous virtual spaces using heterogeneous devices | |
JP2020173529A (en) | Information processing device, information processing method, and program | |
US10296098B2 (en) | Input/output device, input/output program, and input/output method | |
CN110070617B (en) | Data synchronization method, device and hardware device | |
WO2021149509A1 (en) | Imaging device, imaging method, and program | |
JP2016139199A (en) | Image processing device, image processing method, and program | |
CN111651043B (en) | Augmented reality system supporting customized multi-channel interaction | |
CN112308981A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
KR101741149B1 (en) | Method and device for controlling a virtual camera's orientation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |