CN105807936B - Information processing method and electronic equipment


Info

Publication number
CN105807936B
Authority
CN
China
Prior art keywords
boundary
virtual
virtual scene
dimensional
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610203269.9A
Other languages
Chinese (zh)
Other versions
CN105807936A (en)
Inventor
许奔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201610203269.9A
Publication of CN105807936A
Application granted
Publication of CN105807936B
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses an information processing method and an electronic device, which present a three-dimensional virtual scene, determine a three-dimensional virtual scene boundary, and process a virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary. The information processing method and electronic device provided by the embodiment of the invention thus introduce the concept of a virtual scene boundary, so that a user can interact with the virtual scene based on the virtual scene boundary, which increases the diversity of interaction between the user and the virtual scene and improves the user experience.

Description

Information processing method and electronic equipment
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to an information processing method and an electronic device.
Background
With the continuous development of science and technology, virtual reality technology has attracted more and more attention: by simulating a virtual scene, it makes the user feel as if immersed in that scene and lets the user interact with it.
However, in the process of implementing the invention, the inventor found that the ways in which a user can currently interact with a virtual scene are limited, and the user experience is poor. For example, when a virtual scene is projected into real space by augmented reality technology, the scene is only passively projected: the projected area is usually a bearing surface such as a wall, and when the projection instead meets an open space with no bearing surface to block it, the straight-line propagation of light causes the projected image to become overly divergent.
Therefore, how to increase the ways in which a user can interact with a virtual scene is a problem that urgently needs to be solved.
Disclosure of Invention
The invention aims to provide an information processing method and an electronic device that increase the ways in which a user can interact with a virtual scene.
To achieve this purpose, the invention provides the following technical solutions:
an information processing method applied to an electronic device, the method comprising:
presenting a three-dimensional virtual scene;
determining a three-dimensional virtual scene boundary;
and processing the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary.
In the above method, preferably, the presenting a three-dimensional virtual scene includes: projecting a three-dimensional virtual scene in real space;
and the determining a three-dimensional virtual scene boundary includes: determining the boundary of a region to be projected in real space.
In the above method, preferably, the processing the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary, so that the virtual object in the virtual scene conforms to the determined virtual scene boundary includes:
projecting according to the determined boundary of the region to be projected, so that the virtual object in the projected three-dimensional virtual scene conforms to the determined boundary of the region to be projected.
In the above method, preferably, the determining the boundary of the region to be projected in real space includes:
collecting the motion trajectory of an operation body in real space;
and determining the boundary of the region to be projected in real space based on the motion trajectory of the operation body in real space.
In the above method, preferably, the presenting a three-dimensional virtual scene includes: projecting a three-dimensional virtual scene in real space, or displaying the three-dimensional virtual scene through a display unit;
and the determining a three-dimensional virtual scene boundary includes: determining a virtual boundary in the three-dimensional virtual scene.
In the above method, preferably, the determining a virtual boundary in the three-dimensional virtual scene includes:
determining the user-visible boundary of the three-dimensional virtual scene as a virtual boundary in the three-dimensional virtual scene.
In the above method, preferably, the determining a virtual boundary in the three-dimensional virtual scene includes:
collecting the motion trajectory of an operation body in real space;
converting the motion trajectory of the operation body in real space into a virtual motion trajectory in three-dimensional virtual space;
determining a virtual boundary in the three-dimensional virtual scene based on the virtual motion trajectory.
In the above method, preferably, the processing the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary, so that the virtual object in the virtual scene conforms to the determined virtual scene boundary includes:
when a virtual object reaches the virtual scene boundary, performing processing of a virtual effect corresponding to the virtual scene boundary on the virtual object.
In the above method, preferably, the performing processing of a virtual effect corresponding to the virtual scene boundary on the virtual object when the virtual object reaches the virtual scene boundary includes:
the virtual object crosses one virtual scene boundary to another virtual scene boundary, or,
the virtual object bounces back after reaching the virtual scene boundary, or,
the virtual object is cut into multiple copies after reaching the virtual scene boundary, or,
the virtual object is compressed or enlarged after reaching the virtual scene boundary.
An electronic device, comprising: a presentation component for presenting a three-dimensional virtual scene, a processor, and a memory coupled to the processor; wherein:
the processor is used for determining a three-dimensional virtual scene boundary, and processing a virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary.
In the above electronic device, preferably, the presentation component includes: a projection unit for projecting a three-dimensional virtual scene in real space;
in determining the boundary of the three-dimensional virtual scene, the processor is configured to determine the boundary of the region to be projected in the real space.
In the above electronic device, preferably, in terms of processing the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary, the processor is configured to:
project according to the determined boundary of the region to be projected, so that the virtual object in the projected three-dimensional virtual scene conforms to the determined boundary of the region to be projected.
In the above electronic device, preferably, in terms of determining the boundary of the region to be projected in real space, the processor is configured to:
collect the motion trajectory of an operation body in real space, and determine the boundary of the region to be projected in real space based on the motion trajectory of the operation body in real space.
In the above electronic device, preferably, the presentation component is a projection unit or a display unit; the projection unit is configured to project a three-dimensional virtual scene in real space, and the display unit is configured to display the three-dimensional virtual scene;
in determining the boundary of the three-dimensional virtual scene, the processor is configured to determine a virtual boundary in the three-dimensional virtual scene.
In the above electronic device, preferably, in terms of determining a virtual boundary in the three-dimensional virtual scene, the processor is configured to:
determine the user-visible boundary of the three-dimensional virtual scene as a virtual boundary in the three-dimensional virtual scene.
In the above electronic device, preferably, in terms of determining a virtual boundary in the three-dimensional virtual scene, the processor is configured to:
collect the motion trajectory of an operation body in real space; convert the motion trajectory of the operation body in real space into a virtual motion trajectory in virtual space; and determine a virtual boundary in the three-dimensional virtual scene based on the virtual motion trajectory.
In the above electronic device, preferably, in terms of processing the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary, the processor is configured to:
perform, when a virtual object reaches the virtual scene boundary, processing of a virtual effect corresponding to the virtual scene boundary on the virtual object.
In the above electronic device, preferably, in terms of performing processing of a virtual effect corresponding to the virtual scene boundary on the virtual object when the virtual object reaches the virtual scene boundary, the processor is configured such that:
the virtual object crosses one virtual scene boundary to another virtual scene boundary, or,
the virtual object bounces back after reaching the virtual scene boundary, or,
the virtual object is cut into multiple copies after reaching the virtual scene boundary, or,
the virtual object is compressed or enlarged after reaching the virtual scene boundary.
According to the above scheme, the information processing method and the electronic device provided by the present application present a three-dimensional virtual scene; determine a three-dimensional virtual scene boundary; and process the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary. The information processing method and electronic device provided by the embodiment of the invention thus introduce the concept of a virtual scene boundary, so that a user can interact with the virtual scene based on the virtual scene boundary, which increases the diversity of interaction between the user and the virtual scene and improves the user experience.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flowchart of an implementation of an information processing method according to an embodiment of the present invention;
Fig. 2 is a flowchart of an implementation of determining a boundary of a region to be projected in real space according to an embodiment of the present invention;
Fig. 3 is a flowchart of an implementation of determining a virtual boundary in a three-dimensional virtual scene according to an embodiment of the present invention;
Fig. 4a is an exemplary diagram of a virtual object (a cup) before reaching a virtual scene boundary A according to an embodiment of the present invention;
Fig. 4b is an exemplary diagram of a process of executing a virtual effect corresponding to a virtual scene boundary on a virtual object when the virtual object reaches the virtual scene boundary A according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be practiced otherwise than as specifically illustrated.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an implementation flow of an information processing method provided by an embodiment of the present invention, applied to an electronic device, includes:
step S11: presenting a three-dimensional virtual scene;
in the embodiment of the present invention, the manner of presenting the three-dimensional virtual scene may include, but is not limited to, the following: projecting a three-dimensional virtual scene in real space, for example, an augmented reality device projects a three-dimensional virtual scene in real space; the three-dimensional virtual scene is displayed by a display unit, for example, a virtual display device displays the generated three-dimensional virtual scene by the display unit.
Step S12: determining a three-dimensional virtual scene boundary;
the three-dimensional virtual scene boundary may be a boundary in a real space or a boundary in a three-dimensional virtual scene.
Optionally, the three-dimensional virtual scene boundary may be determined based on user-related information. For example, the three-dimensional virtual scene boundary may be determined based on a movement trajectory of the operation body in the real space, or may be determined based on a user view angle range.
Step S13: and processing the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary.
The information processing method provided by the embodiment of the invention presents a three-dimensional virtual scene; determines a three-dimensional virtual scene boundary; and processes the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary. The method thus introduces the concept of a virtual scene boundary, so that a user can interact with the virtual scene based on the virtual scene boundary, which increases the diversity of interaction between the user and the virtual scene and improves the user experience.
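For illustration only, the three steps above can be sketched in code. The following minimal Python sketch is not a structure defined by the patent: the Boundary class, the clamping rule, and every identifier are assumptions of ours, and the conformance processing is deliberately simplified (the embodiments below describe richer boundary effects).

```python
# Illustrative sketch of steps S11-S13; all names are assumptions.

class Boundary:
    """A three-dimensional virtual scene boundary (step S12)."""

    def __init__(self, min_xyz, max_xyz):
        self.min_xyz, self.max_xyz = min_xyz, max_xyz

    def conform(self, position):
        # One possible way to make an object conform to the boundary:
        # clamp each coordinate into the bounded range (step S13).
        return tuple(max(lo, min(hi, c))
                     for c, lo, hi in zip(position, self.min_xyz, self.max_xyz))

def process_scene(object_positions, boundary):
    """Step S13: process every virtual object against the determined boundary."""
    return [boundary.conform(p) for p in object_positions]

# Example: an object outside a unit-cube boundary is pulled back inside.
b = Boundary((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
print(process_scene([(1.5, 0.2, -0.3)], b))  # [(1.0, 0.2, 0.0)]
```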
Optionally, an implementation manner of presenting a three-dimensional virtual scene provided in the embodiment of the present invention may include: a three-dimensional virtual scene is projected in real space.
Correspondingly, an implementation manner of determining the boundary of the three-dimensional virtual scene provided by the embodiment of the present invention may include: determining the boundary of the region to be projected in real space.
In the embodiment of the invention, when the three-dimensional virtual scene is presented in real space by projection, the size of the projection area can be determined based on user operations, that is, the size of the projection area is set by the user. This adds a way of interacting with the user and improves the user experience.
Optionally, the processing the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary, so that the virtual object in the virtual scene conforms to the determined virtual scene boundary, provided by the embodiment of the present invention, includes:
projecting according to the determined boundary of the region to be projected, so that the virtual object in the projected three-dimensional virtual scene conforms to the determined boundary of the region to be projected.
That is to say, after the boundary of the region to be projected is determined, the boundary of the three-dimensional virtual scene is adjusted so that the three-dimensional virtual scene is projected within the determined boundary of the region to be projected, which prevents the projected image from becoming overly divergent.
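As a concrete illustration of this adjustment, the sketch below fits the scene's projection inside the determined region. It assumes, purely for illustration, that the scene footprint and the region are axis-aligned rectangles on the bearing surface and that a uniform scale is used; the patent itself does not specify the adjustment.

```python
# A sketch of fitting the projected scene inside the determined region.
# Assumptions (ours, not the patent's): the scene footprint and the region
# are axis-aligned rectangles on the bearing surface, and a uniform scale
# is applied to keep the scene's proportions.

def fit_scene_to_region(scene_w: float, scene_h: float,
                        region_w: float, region_h: float) -> float:
    """Largest uniform scale at which the projected scene still stays
    within the boundary of the region to be projected."""
    return min(region_w / scene_w, region_h / scene_h, 1.0)

# Example: a 4 x 3 scene projected into a 2 x 2 region is scaled by 0.5,
# so the projection conforms to the region instead of diverging past it.
print(fit_scene_to_region(4.0, 3.0, 2.0, 2.0))  # 0.5
```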
Optionally, an implementation flowchart for determining a boundary of an area to be projected in a real space according to an embodiment of the present invention is shown in fig. 2, and may include:
step S21: collecting the motion track of an operation body in a real space;
the operation body may be an operation handle of the electronic device, or may be the user itself, or may be a part of the user's body, such as a finger. The motion track of the operation body in the real space can be acquired through a sensing device in the operation body.
Step S22: determining the boundary of the region to be projected in real space based on the acquired motion trajectory of the operation body in real space.
Optionally, the plane coordinates of each trajectory point in the collected motion trajectory of the operation body in real space may be used to determine the boundary of the region to be projected on the bearing surface. Optionally, boundary points whose plane coordinates are the plane coordinates of the trajectory points may be determined on the bearing surface, and the determined boundary points may be connected one by one, in the acquisition order of the trajectory points, to form the boundary of the region to be projected in real space.
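A minimal sketch of steps S21-S22 under stated assumptions: the trajectory arrives as three-dimensional samples from the sensing device, the bearing surface is the z = 0 plane, and the boundary is the polygon obtained by connecting the plane coordinates in acquisition order. All identifiers are illustrative.

```python
# A sketch of steps S21-S22. Assumptions (ours): trajectory points arrive
# as 3D samples from the sensing device, the bearing surface is the z = 0
# plane, and the boundary is formed by connecting the points' plane
# coordinates in acquisition order.

from typing import List, Tuple

Point3D = Tuple[float, float, float]
Point2D = Tuple[float, float]

def boundary_from_trajectory(trajectory: List[Point3D]) -> List[Point2D]:
    """Project each trajectory point onto the bearing surface and connect
    the resulting boundary points in acquisition order, closing the loop."""
    boundary_points = [(x, y) for x, y, _z in trajectory]  # plane coordinates
    return boundary_points + boundary_points[:1]           # close the polygon

# Example: a roughly rectangular gesture yields a four-sided region boundary.
gesture = [(0.0, 0.0, 0.4), (2.0, 0.0, 0.5), (2.0, 1.0, 0.5), (0.0, 1.0, 0.4)]
print(boundary_from_trajectory(gesture))
```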
Optionally, an implementation manner of presenting a three-dimensional virtual scene provided in the embodiment of the present invention may be: the three-dimensional virtual scene is projected in a real space, or displayed through a display unit.
Correspondingly, an implementation manner for determining the boundary of the three-dimensional virtual scene provided by the embodiment of the present invention may be: determining a virtual boundary in the three-dimensional virtual scene.
Regardless of the manner in which the three-dimensional virtual scene is rendered, in embodiments of the present invention, a virtual boundary is determined in the three-dimensional virtual scene, i.e., the virtual boundary is part of the three-dimensional virtual scene.
The virtual boundary may be a two-dimensional plane boundary or a three-dimensional space boundary.
Optionally, an implementation manner of determining a virtual boundary in a three-dimensional virtual scene provided by the embodiment of the present invention may be:
determining the user-visible boundary of the three-dimensional virtual scene as a virtual boundary in the three-dimensional virtual scene.
Regardless of the manner in which the virtual scene is presented, when the electronic device is stationary, the boundary of the virtual scene it presents is also fixed; that is, when the electronic device is stationary, the virtual scene seen by the user has a certain range, and the user-visible boundary of the three-dimensional virtual scene can be determined as the virtual boundary in the three-dimensional virtual scene.
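A minimal sketch of this option, assuming a simple pinhole view model so that the visible extent at a given depth is a rectangle determined by the device's fields of view; the formula and identifiers are assumptions, not taken from the patent.

```python
# A sketch of taking the user-visible boundary as the virtual boundary.
# Assumption (ours): a simple pinhole view model, so the visible extent at
# a given depth is a rectangle determined by the device's fields of view.

import math

def visible_boundary(h_fov_deg: float, v_fov_deg: float, depth: float):
    """Return the corners (min, max) of the visible rectangle at `depth`."""
    half_w = depth * math.tan(math.radians(h_fov_deg) / 2.0)
    half_h = depth * math.tan(math.radians(v_fov_deg) / 2.0)
    return (-half_w, -half_h), (half_w, half_h)

# Example: with a 90° x 60° field of view, the scene visible 2 m ahead
# spans a 4.0 m x 2.31 m rectangle, which can serve as the virtual boundary.
print(visible_boundary(90.0, 60.0, 2.0))
```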
Optionally, an implementation flowchart for determining a virtual boundary in a three-dimensional virtual scene provided in an embodiment of the present invention is shown in fig. 3, and may include:
step S31: collecting the motion track of an operation body in a real space;
the operation body may be an operation handle of the electronic device, or may be the user itself, or may be a part of the user's body, such as a finger. The motion track of the operation body in the real space can be acquired through a sensing device in the operation body.
Step S32: converting the motion trajectory of the operation body in real space into a virtual motion trajectory in three-dimensional virtual space;
the motion trail of the operation body in the real space can be converted into a virtual motion trail in the three-dimensional virtual space according to the preset mapping relation between the real space coordinates and the three-dimensional virtual space coordinates.
Step S33: determining a virtual boundary in the three-dimensional virtual scene based on the virtual motion trajectory.
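A minimal sketch of steps S31-S33, assuming the preset mapping is a uniform scale plus an offset and that the virtual boundary is taken as the bounding box of the converted trajectory; both choices are illustrative assumptions.

```python
# A sketch of steps S31-S33. Assumptions (ours): the preset mapping between
# real-space and three-dimensional virtual-space coordinates is a uniform
# scale plus an offset, and the virtual boundary is taken as the bounding
# box of the converted trajectory.

from typing import List, Tuple

Point3D = Tuple[float, float, float]

def to_virtual(real_points: List[Point3D], scale: float,
               offset: Point3D) -> List[Point3D]:
    """Step S32: map each real-space point into virtual-space coordinates."""
    return [tuple(scale * c + o for c, o in zip(p, offset))
            for p in real_points]

def virtual_boundary(points: List[Point3D]):
    """Step S33: one possible boundary, the trajectory's bounding box."""
    mins = tuple(min(p[i] for p in points) for i in range(3))
    maxs = tuple(max(p[i] for p in points) for i in range(3))
    return mins, maxs

# Example: a hand stroke captured in metres becomes a box-shaped boundary
# in virtual units.
stroke = [(0.1, 0.0, 0.2), (0.4, 0.3, 0.2), (0.4, 0.0, 0.5)]
print(virtual_boundary(to_virtual(stroke, scale=10.0, offset=(0.0, 0.0, 0.0))))
```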
With the virtual boundaries in place, the manner of interaction with the user may be set based on the virtual boundaries.
Optionally, an implementation manner of processing a virtual object in a three-dimensional virtual scene based on the determined virtual scene boundary, so that the virtual object in the virtual scene conforms to the determined virtual scene boundary, provided by the embodiment of the present invention, may include:
when the virtual object reaches the virtual scene boundary, processing of a virtual effect corresponding to the virtual scene boundary is performed on the virtual object.
In the embodiment of the invention, the virtual effect processing operation on the virtual object is preset based on the virtual boundary; different virtual boundaries can be set with different virtual effect processing operations, which adds ways of interacting with the user.
Optionally, an implementation manner of performing, when the virtual object reaches the boundary of the virtual scene, processing of the virtual effect corresponding to the boundary of the virtual scene on the virtual object, provided by the embodiment of the present invention, may be:
the virtual object crosses one virtual scene boundary to another virtual scene boundary.
As shown in figs. 4a-4b, fig. 4a is an exemplary diagram, provided by an embodiment of the present invention, of a virtual object (a cup) before it reaches virtual scene boundary A; in fig. 4a, two virtual scene boundaries are preset, namely virtual scene boundary A and virtual scene boundary B. Fig. 4b is an exemplary diagram of performing, when the virtual object reaches virtual scene boundary A, processing of a virtual effect corresponding to the virtual scene boundary on the virtual object. In this example, when the virtual object reaches virtual scene boundary A, it appears on the virtual scene boundary B side. Before the virtual scene boundaries existed, if a user wanted to move the virtual object from virtual scene boundary A to virtual scene boundary B, the user could only drag it in the direction of the dotted arrow in fig. 4a; after the virtual scene boundaries are established, the user can move the virtual object to virtual scene boundary A, that is, move it in the direction of the solid arrow in fig. 4a, and the virtual object appears on the virtual scene boundary B side.
Obviously, after the virtual scene boundary exists, the user can move the virtual object more conveniently.
Another implementation of performing a process of a virtual effect corresponding to a virtual scene boundary on a virtual object when the virtual object reaches the virtual scene boundary may be:
the virtual object bounces back after reaching the virtual scene boundary. Such as a ball bouncing after rolling to the virtual scene boundary.
When the virtual object reaches the virtual scene boundary, another implementation manner of performing a process of a virtual effect corresponding to the virtual scene boundary on the virtual object may be:
after the virtual object reaches the boundary of the virtual scene, the virtual object is cut into multiple copies.
When the virtual object reaches the virtual scene boundary, another implementation manner of performing a process of a virtual effect corresponding to the virtual scene boundary on the virtual object may be:
when the virtual object reaches the virtual scene boundary, the virtual object is compressed or enlarged.
Corresponding to the method embodiment, an embodiment of the present invention further provides an electronic device. A schematic structural diagram of the electronic device provided by the embodiment of the present invention is shown in fig. 5, and the device may include:
a presentation component 51, a processor 52, and a memory 53 coupled to the processor 52; wherein:
a presentation component 51 for presenting a three-dimensional virtual scene;
in the embodiment of the present invention, the manner of presenting the three-dimensional virtual scene may include, but is not limited to, the following: projecting a three-dimensional virtual scene in real space, for example, an augmented reality device projects a three-dimensional virtual scene in real space; the three-dimensional virtual scene is displayed by a display unit, for example, a virtual display device displays the generated three-dimensional virtual scene by the display unit.
The processor 52 is configured to determine a boundary of a three-dimensional virtual scene, and process a virtual object in the three-dimensional virtual scene based on the determined boundary of the virtual scene, so that the virtual object in the virtual scene conforms to the determined boundary of the virtual scene.
The three-dimensional virtual scene boundary may be a boundary in a real space or a boundary in a three-dimensional virtual scene.
Optionally, the three-dimensional virtual scene boundary may be determined based on user-related information. For example, the three-dimensional virtual scene boundary may be determined based on a movement trajectory of the operation body in the real space, or may be determined based on a user view angle range.
The memory 53 is used to store programs and data generated during operation of the processor 52. The processor 52 realizes the above-described functions by executing the program stored in the memory 53.
The electronic device provided by the embodiment of the invention presents a three-dimensional virtual scene; determines a three-dimensional virtual scene boundary; and processes the virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary so that the virtual object in the virtual scene conforms to the determined virtual scene boundary. The electronic device thus introduces the concept of a virtual scene boundary, so that a user can interact with the virtual scene based on the virtual scene boundary, which increases the diversity of interaction between the user and the virtual scene and improves the user experience.
Optionally, the presentation component 51 may comprise a projection unit for projecting a three-dimensional virtual scene in real space;
in determining the boundaries of the three-dimensional virtual scene, the processor 52 may be configured to determine the boundaries of the area to be projected in real space.
In the embodiment of the invention, when the three-dimensional virtual scene is presented in real space by projection, the size of the projection area can be determined based on user operations, that is, the size of the projection area is set by the user. This adds a way of interacting with the user and improves the user experience.
Optionally, in aspects where the virtual objects in the three-dimensional virtual scene are processed based on the determined virtual scene boundaries such that the virtual objects in the virtual scene conform to the determined virtual scene boundaries, the processor 52 may be configured to,
project according to the determined boundary of the region to be projected, so that the virtual object in the projected three-dimensional virtual scene conforms to the determined boundary of the region to be projected.
That is to say, after the boundary of the region to be projected is determined, the boundary of the three-dimensional virtual scene is adjusted so that the three-dimensional virtual scene is projected within the determined boundary of the region to be projected, which prevents the projected image from becoming overly divergent.
Alternatively, in determining the boundaries of the area to be projected in real space, the processor 52 may be configured to,
collect the motion trajectory of an operation body in real space, and determine the boundary of the region to be projected in real space based on the motion trajectory of the operation body in real space.
The operation body may be an operation handle of the electronic device, the user, or a part of the user's body such as a finger. The motion trajectory of the operation body in real space can be acquired through a sensing device in the operation body.
Optionally, the plane coordinates of each trajectory point in the collected motion trajectory of the operation body in real space may be used to determine the boundary of the region to be projected on the bearing surface. Optionally, boundary points whose plane coordinates are the plane coordinates of the trajectory points may be determined on the bearing surface, and the determined boundary points may be connected one by one, in the acquisition order of the trajectory points, to form the boundary of the region to be projected in real space.
Optionally, the presentation component 51 may be a projection unit or a display unit: the projection unit is used for projecting a three-dimensional virtual scene in real space, and the display unit is used for displaying the three-dimensional virtual scene;
in determining the boundaries of the three-dimensional virtual scene, processor 52 may be configured to determine the virtual boundaries in the three-dimensional virtual scene.
Regardless of the manner in which the three-dimensional virtual scene is rendered, in embodiments of the present invention, a virtual boundary is determined in the three-dimensional virtual scene, i.e., the virtual boundary is part of the three-dimensional virtual scene.
The virtual boundary may be a two-dimensional plane boundary or a three-dimensional space boundary.
Optionally, in terms of determining a virtual boundary in the three-dimensional virtual scene, the processor 52 may be configured to,
determine the user-visible boundary of the three-dimensional virtual scene as a virtual boundary in the three-dimensional virtual scene.
Regardless of the manner in which the virtual scene is presented, when the electronic device is stationary, the boundary of the virtual scene it presents is also fixed; that is, when the electronic device is stationary, the virtual scene seen by the user has a certain range, and the user-visible boundary of the three-dimensional virtual scene can be determined as the virtual boundary in the three-dimensional virtual scene.
Optionally, in terms of determining a virtual boundary in the three-dimensional virtual scene, the processor 52 may be configured to,
collect the motion trajectory of an operation body in real space; convert the motion trajectory of the operation body in real space into a virtual motion trajectory in virtual space; and determine a virtual boundary in the three-dimensional virtual scene based on the virtual motion trajectory.
The operation body may be an operation handle of the electronic device, the user, or a part of the user's body such as a finger. The motion trajectory of the operation body in real space can be acquired through a sensing device in the operation body.
The motion trajectory of the operation body in real space can be converted into a virtual motion trajectory in three-dimensional virtual space according to the preset mapping relation between real-space coordinates and three-dimensional virtual-space coordinates.
With the virtual boundaries in place, the manner of interaction with the user may be set based on the virtual boundaries.
Optionally, in aspects where the virtual objects in the three-dimensional virtual scene are processed based on the determined virtual scene boundaries such that the virtual objects in the virtual scene conform to the determined virtual scene boundaries, the processor 52 may be configured to,
perform, when a virtual object reaches the virtual scene boundary, processing of a virtual effect corresponding to the virtual scene boundary on the virtual object.
In the embodiment of the invention, the virtual effect processing operation on the virtual object is preset based on the virtual boundary; different virtual boundaries can be set with different virtual effect processing operations, which adds ways of interacting with the user.
Alternatively, in aspects in which processing of virtual effects corresponding to virtual scene boundaries is performed on virtual objects when the virtual objects reach the virtual scene boundaries, processor 52 may be configured to,
virtual objects cross a virtual scene boundary to another virtual scene boundary, or,
the virtual object bounces off after reaching the virtual scene boundary, or,
the virtual object is cut into multiple copies after reaching the boundary of the virtual scene, or,
the virtual objects are compressed or enlarged after reaching the virtual scene boundary.
The electronic device in the embodiment of the invention may be an augmented reality device or a virtual reality device.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the electronic device described above may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (16)

1. An information processing method applied to an electronic device, the method comprising:
presenting a three-dimensional virtual scene;
determining a three-dimensional virtual scene boundary;
executing a virtual effect corresponding to the virtual scene boundary on a virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary, so that the virtual object in the virtual scene conforms to the determined virtual scene boundary, including:
when a virtual object reaches the virtual scene boundary, a virtual effect corresponding to the virtual scene boundary is executed on the virtual object.
2. The method of claim 1, wherein said presenting a three-dimensional virtual scene comprises: projecting a three-dimensional virtual scene in real space;
and said determining a three-dimensional virtual scene boundary comprises: determining the boundary of a region to be projected in real space.
3. The method of claim 2, wherein performing a virtual effect corresponding to the virtual scene boundary on a virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary to conform the virtual object in the virtual scene to the determined virtual scene boundary further comprises:
projecting according to the determined boundary of the region to be projected, so that the virtual object in the projected three-dimensional virtual scene conforms to the determined boundary of the region to be projected.
4. The method of claim 2, wherein the determining the boundary of the region to be projected in real space comprises:
collecting the motion trajectory of an operation body in real space;
and determining the boundary of the region to be projected in real space based on the motion trajectory of the operation body in real space.
5. The method of claim 1, wherein said presenting a three-dimensional virtual scene comprises: projecting a three-dimensional virtual scene in real space, or displaying the three-dimensional virtual scene through a display unit;
and said determining a three-dimensional virtual scene boundary comprises: determining a virtual boundary in the three-dimensional virtual scene.
6. The method of claim 5, wherein determining a virtual boundary in the three-dimensional virtual scene comprises:
determining the user-visible boundary of the three-dimensional virtual scene as a virtual boundary in the three-dimensional virtual scene.
7. The method of claim 5, wherein determining a virtual boundary in the three-dimensional virtual scene comprises:
collecting the motion trajectory of an operation body in real space;
converting the motion trajectory of the operation body in real space into a virtual motion trajectory in three-dimensional virtual space;
determining a virtual boundary in the three-dimensional virtual scene based on the virtual motion trajectory.
8. The method of claim 1, wherein performing a virtual effect on the virtual object corresponding to the virtual scene boundary when the virtual object reaches the virtual scene boundary comprises,
the virtual object crosses one virtual scene boundary to another virtual scene boundary, or,
the virtual object bounces back after reaching the virtual scene boundary, or,
the virtual object is cut into multiple copies after reaching the virtual scene boundary, or,
the virtual object is compressed or enlarged after reaching the virtual scene boundary.
9. An electronic device, comprising: a presentation component for presenting a three-dimensional virtual scene, a processor, and a memory coupled to the processor; wherein:
the processor is configured to determine a three-dimensional virtual scene boundary, perform a virtual effect corresponding to the virtual scene boundary on a virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary, and make the virtual object in the virtual scene conform to the determined virtual scene boundary, including:
when a virtual object reaches the virtual scene boundary, a virtual effect corresponding to the virtual scene boundary is executed on the virtual object.
10. The electronic device of claim 9, wherein the presentation component comprises: a projection unit for projecting a three-dimensional virtual scene in a real space;
in determining the boundary of the three-dimensional virtual scene, the processor is configured to determine the boundary of the region to be projected in the real space.
11. The electronic device of claim 10, wherein in terms of performing a virtual effect corresponding to the virtual scene boundary on a virtual object in the three-dimensional virtual scene based on the determined virtual scene boundary such that the virtual object in the virtual scene conforms to the determined virtual scene boundary, the processor is further configured to,
project according to the determined boundary of the region to be projected, so that the virtual object in the projected three-dimensional virtual scene conforms to the determined boundary of the region to be projected.
12. The electronic device of claim 10, wherein in determining the boundary of the area to be projected in real space, the processor is configured to,
collect the motion trajectory of an operation body in real space; and determine the boundary of the region to be projected in real space based on the motion trajectory of the operation body in real space.
13. The electronic device according to claim 9, wherein the presentation component is a projection unit for projecting a three-dimensional virtual scene in real space or a display unit for displaying a three-dimensional virtual scene;
in determining the boundary of the three-dimensional virtual scene, the processor is configured to determine a virtual boundary in the three-dimensional virtual scene.
14. The electronic device of claim 13, wherein, in determining a virtual boundary in the three-dimensional virtual scene, the processor is configured to,
determine the user-visible boundary of the three-dimensional virtual scene as a virtual boundary in the three-dimensional virtual scene.
15. The electronic device of claim 13, wherein, in determining a virtual boundary in the three-dimensional virtual scene, the processor is configured to,
collect the motion trajectory of an operation body in real space; convert the motion trajectory of the operation body in real space into a virtual motion trajectory in virtual space; and determine a virtual boundary in the three-dimensional virtual scene based on the virtual motion trajectory.
16. The electronic device of claim 9, wherein in terms of performing a virtual effect on a virtual object corresponding to the virtual scene boundary when the virtual object reaches the virtual scene boundary, the processor is configured to,
the virtual object crosses one virtual scene boundary to another virtual scene boundary, or,
the virtual object bounces back after reaching the virtual scene boundary, or,
the virtual object is cut into multiple copies after reaching the virtual scene boundary, or,
the virtual object is compressed or enlarged after reaching the virtual scene boundary.
CN201610203269.9A 2016-03-31 2016-03-31 Information processing method and electronic equipment Active CN105807936B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610203269.9A CN105807936B (en) 2016-03-31 2016-03-31 Information processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610203269.9A CN105807936B (en) 2016-03-31 2016-03-31 Information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN105807936A CN105807936A (en) 2016-07-27
CN105807936B 2021-02-19

Family

ID=56459469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610203269.9A Active CN105807936B (en) 2016-03-31 2016-03-31 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN105807936B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107393017A (en) * 2017-08-11 2017-11-24 北京铂石空间科技有限公司 Image processing method, device, electronic equipment and storage medium
US11695908B2 (en) * 2018-03-30 2023-07-04 Sony Corporation Information processing apparatus and information processing method
CN109254660B (en) * 2018-08-31 2020-11-17 歌尔光学科技有限公司 Content display method, device and equipment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102142055A (en) * 2011-04-07 2011-08-03 上海大学 True three-dimensional design method based on augmented reality interactive technology
CN103049266A (en) * 2012-12-17 2013-04-17 天津大学 Mouse operation method of Delta 3D (Three-Dimensional) scene navigation
CN105144030A (en) * 2013-02-27 2015-12-09 微软技术许可有限责任公司 Mixed reality augmentation
CN105074617A (en) * 2013-03-11 2015-11-18 日本电气方案创新株式会社 Three-dimensional user interface device and three-dimensional operation processing method
WO2014194066A1 (en) * 2013-05-30 2014-12-04 Charles Anthony Smith Hud object design and method

Also Published As

Publication number Publication date
CN105807936A (en) 2016-07-27

Similar Documents

Publication Publication Date Title
CN110073417B (en) Method and apparatus for placing virtual objects of augmented or mixed reality applications in a real world 3D environment
Frati et al. Using Kinect for hand tracking and rendering in wearable haptics
Azmandian et al. Physical Space Requirements for Redirected Walking: How Size and Shape Affect Performance.
JP5833750B2 (en) Gesture control technology that expands the range of dialogue in computer vision applications
MY192140A (en) Information processing method, terminal, and computer storage medium
WO2021062098A4 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
JP4234677B2 (en) Man-machine interface using deformable device
EP3446196B1 (en) Dynamic haptic retargeting
EP3275514A1 (en) Virtuality-and-reality-combined interactive method and system for merging real environment
EP3729237A1 (en) Augmented reality user interface control
US20170163958A1 (en) Method and device for image rendering processing
CN107583271A (en) The exchange method and device of selection target in gaming
JP2017529635A5 (en)
RU2017102195A (en) LOOKING BASED PLACEMENT OF OBJECTS IN A VIRTUAL REALITY
CN105807936B (en) Information processing method and electronic equipment
CN107479712B (en) Information processing method and device based on head-mounted display equipment
EP2051208A2 (en) Generating an asset for interactive entertainment using digital image capture
Mossel et al. Drillsample: precise selection in dense handheld augmented reality environments
EP2946274A1 (en) Methods and systems for creating swivel views from a handheld device
RU2667720C1 (en) Method of imitation modeling and controlling virtual sphere in mobile device
CN106611443B (en) Three-dimensional topographic point picking method and device
JP5876600B1 (en) Information processing program and information processing method
CN111803960A (en) Method and equipment for starting preset process
CN112613374A (en) Face visible region analyzing and segmenting method, face making-up method and mobile terminal
US10379639B2 (en) Single-hand, full-screen interaction on a mobile device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant