CN112274922A - Virtual subject position adjusting method and device, storage medium and electronic equipment - Google Patents

Virtual subject position adjusting method and device, storage medium and electronic equipment

Info

Publication number
CN112274922A
CN112274922A
Authority
CN
China
Prior art keywords
virtual
angle
prompt
adjusting
mixed reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011299888.5A
Other languages
Chinese (zh)
Inventor
刘海岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202011299888.5A
Publication of CN112274922A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5258 Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a method and an apparatus for adjusting a position of a virtual subject, a storage medium, and an electronic device. The method can comprise the following steps: presenting the virtual subject in a mixed reality scene presented by the mixed reality terminal; displaying a position prompt identifier on the virtual main body; acquiring an inclination angle of a target plane of the virtual main body relative to a coordinate plane in a virtual coordinate system in the mixed reality scene; and adjusting the display state of the position prompt mark according to the inclination angle of the target plane relative to the coordinate plane so as to prompt the relationship between the adjusted current inclination angle of the virtual main body and a preset angle. The method and the device can provide convenience for a user to adjust the position of the virtual main body in the three-dimensional space, reduce the difficulty of operation and improve the user experience.

Description

Virtual subject position adjusting method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a method and an apparatus for adjusting a position of a virtual subject, a storage medium, and an electronic device.
Background
Mixed Reality (MR) is a further development of virtual reality technology that presents virtual scene information in a real scene and constructs an interactive feedback information loop among the real world, the virtual world, and the user, so as to enhance the realism of the user experience.
In a mixed reality game or system interface, a virtual subject usually needs to be placed in real space, providing the user with an experience of interacting with the real environment without obstructing the user's view of that environment.
However, in the process of placing the virtual body, it is difficult for the user to grasp the placing angle of the virtual body, so that the placing difficulty is increased.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a method and an apparatus for adjusting a position of a virtual subject, a storage medium, and an electronic device, so as to overcome, at least to a certain extent, a problem that a user may have difficulty in grasping a placement angle of the virtual subject in a virtual subject placement process.
According to one aspect of the present disclosure, there is provided a virtual subject position adjustment method applied to a mixed reality terminal capable of presenting a virtual subject, the method including:
presenting the virtual subject in a mixed reality scene presented by the mixed reality terminal;
displaying a position prompt identifier on the virtual main body;
acquiring an inclination angle of a target plane of the virtual main body relative to a coordinate plane in a virtual coordinate system in the mixed reality scene;
and adjusting the display state of the position prompt mark according to the inclination angle of the target plane relative to the coordinate plane so as to prompt the relationship between the adjusted current angle of the virtual main body and a preset angle.
In an exemplary embodiment of the present disclosure, the display state includes one or more of a display position, a display color, or a display shape.
In an exemplary embodiment of the present disclosure, the adjusting the display state of the position prompt identifier according to an inclination angle of the target plane relative to the coordinate plane to prompt a relationship between an adjusted current angle of the virtual subject and a preset angle includes:
displaying a location reference identifier on the virtual subject;
and adjusting the relative position of the position prompt identifier and the position reference identifier according to the inclination angle of the target plane relative to the coordinate plane so as to prompt the relationship between the adjusted current angle of the virtual main body and a preset angle.
In an exemplary embodiment of the present disclosure, the adjusting the relative position of the position indication mark and the position reference mark according to the inclination angle of the target plane relative to the coordinate plane to indicate the relationship between the adjusted current angle of the virtual body and the preset angle includes:
and when the current angle is the same as the preset angle, adjusting the position prompt identifier to be positioned in the position reference identifier.
In an exemplary embodiment of the present disclosure, the adjusting the relative position of the position indication mark and the position reference mark according to the inclination angle of the target plane relative to the coordinate plane to indicate the relationship between the adjusted current angle of the virtual body and the preset angle further includes:
and when the current angle is different from the preset angle, adjusting the position prompt identifier to deviate from the position reference identifier.
In an exemplary embodiment of the present disclosure, the method further comprises:
and adjusting the distance of the position prompt identifier deviating from the position reference identifier according to the difference value between the current angle and the preset angle.
In an exemplary embodiment of the disclosure, the outline of the position cue marker and the outline of the position reference marker are at least partially identical.
According to an aspect of the present disclosure, there is provided a virtual subject position adjusting apparatus applied to a mixed reality terminal that can present a virtual subject, the virtual subject position adjusting apparatus including:
the first interaction module is used for presenting the virtual main body in the mixed reality scene presented by the mixed reality terminal;
the second interaction module is used for displaying a position prompt identifier on the virtual main body;
the acquisition module is used for acquiring the inclination angle of the target plane of the virtual main body relative to the coordinate plane in the virtual coordinate system in the mixed reality scene;
and the control module is used for adjusting the display state of the position prompt identifier according to the inclination angle of the target plane relative to the coordinate plane so as to prompt the adjusted relation between the current angle of the virtual main body and a preset angle.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the virtual subject position adjustment method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual subject position adjustment method of any one of the above via execution of the executable instructions.
The invention discloses a virtual subject position adjusting method and device, a storage medium and electronic equipment. The method includes: presenting the virtual subject in a mixed reality scene presented by the mixed reality terminal; displaying a position prompt identifier on the virtual main body; acquiring an inclination angle of a target plane of the virtual main body relative to a coordinate plane in a virtual coordinate system in the mixed reality scene; and adjusting the display state of the position prompt identifier according to the inclination angle of the target plane relative to the coordinate plane, so as to prompt the relationship between the current inclination angle of the virtual main body being adjusted and a preset angle. On one hand, a position prompt identifier is displayed on the virtual main body presented in the mixed reality scene, and the position prompt identifier can prompt the user about the relationship between the current position of the virtual main body and the position to be adjusted, which helps the user adjust the position of the virtual main body according to the display state of the position prompt identifier, makes it convenient to adjust the position of the virtual main body in three-dimensional space, reduces the difficulty of operation, and improves the user experience. On the other hand, the display state of the position prompt identifier is adjusted with reference to the inclination angle of the target plane of the virtual main body relative to the coordinate plane in the virtual coordinate system in the mixed reality scene; the position prompt identifier displays different states for different inclination angles, and the different display states can prompt the user about the difference between the inclination angle of the currently adjusted virtual main body position and the preset angle. Compared with directly displaying the angle as a reference prompt, this avoids making the adjustment process too sensitive and improves the stability and operability of position adjustment. In yet another aspect, using the display state of the position prompt identifier to prompt the relationship between the current inclination angle of the virtual main body and the preset angle increases the interest of the whole adjustment process and also improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 is a schematic diagram of a mixed reality scene presented by a mixed reality terminal according to the present disclosure;
FIG. 2 is a flow chart of a virtual subject position adjustment method of the present disclosure;
FIG. 3 is a schematic diagram illustrating a process of adjusting the position of a virtual subject according to the present disclosure;
FIG. 4 is a block diagram of a virtual subject position adjustment apparatus of the present disclosure;
FIG. 5 is a block diagram view of an electronic device in an exemplary embodiment of the disclosure;
FIG. 6 is a schematic diagram illustrating a program product in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in the form of software, or in one or more modules combining software and hardware, or in different networks and/or processor devices and/or microcontroller devices.
Mixed Reality (MR) is a further development of virtual reality technology, which builds an interactive feedback information loop among the virtual world, the real world and the user by introducing real scene information into the virtual environment, so as to enhance the realism of the user experience.
Referring to fig. 1, there is provided a schematic diagram of a mixed reality scene presented by a mixed reality terminal; as shown in fig. 1, in a mixed reality scene 101, for example, real scene information constructed by a table, a cup, etc., various virtual subjects 102 to be displayed, for example, a video playing interface, a calendar display interface, a music playing interface, etc., may be presented. In the process of placing the virtual main bodies 102, sometimes the user needs to adjust the placing position by himself or herself to place the virtual main bodies 102 at appropriate positions to meet the user's needs.
However, in the actual placing process, for example when the virtual main body 102 is to be placed at a preset position such as a horizontal or vertical position, it is difficult for the user to judge the placement angle by vision alone, which increases the placing difficulty and reduces the user experience.
In order to provide assistance to a user in the process of placing the virtual subject 102, referring to fig. 2, the present exemplary embodiment provides a method for adjusting a position of a virtual subject, which may specifically include the following steps:
step S210, presenting the virtual subject in a mixed reality scene presented by the mixed reality terminal;
step S220, displaying a position prompt mark on the virtual main body;
step S230, acquiring an inclination angle of a target plane of the virtual main body relative to a coordinate plane in a virtual coordinate system in the mixed reality scene;
step S240, adjusting a display state of the position prompt mark according to an inclination angle of the target plane relative to the coordinate plane, so as to prompt a relationship between the adjusted current inclination angle of the virtual body and a preset angle.
According to the virtual body position adjusting method in the exemplary embodiment, on one hand, a position prompt identifier is displayed on the virtual body presented in the mixed reality scene, and the position prompt identifier can prompt the user about the relationship between the current position of the virtual body and the position to be adjusted, which helps the user adjust the position of the virtual body according to the display state of the position prompt identifier, makes it convenient to adjust the position of the virtual body in three-dimensional space, reduces the difficulty of operation, and improves the user experience. On the other hand, the display state of the position prompt identifier is adjusted with reference to the inclination angle of the target plane of the virtual body relative to the coordinate plane in the virtual coordinate system in the mixed reality scene; the position prompt identifier displays different states for different inclination angles, and the different display states can prompt the user about the difference between the inclination angle of the currently adjusted virtual body position and the preset angle. Compared with directly displaying the angle as a reference prompt, this avoids making the adjustment process too sensitive and improves the stability and operability of position adjustment. In yet another aspect, using the display state of the position prompt identifier to prompt the relationship between the current inclination angle of the virtual body and the preset angle increases the interest of the whole adjustment process and also improves the user experience.
Next, a virtual subject position adjustment method in the present exemplary embodiment will be further described.
In step S210, the virtual subject is presented in the mixed reality scene presented by the mixed reality terminal.
In the present exemplary embodiment, the adjustment is mainly performed on the virtual body 102 presented in the mixed reality scene 101, that is, the position adjustment is performed on the virtual body 102 in the three-dimensional space. The user may adjust the position of the virtual subject 102 directly with a finger or may adjust the position of the virtual subject 102 via a control button on the mixed reality terminal.
In practical applications, a user first needs to select a virtual subject 102 to be presented through operations on a mixed reality terminal. The virtual main body 102 may be a video playing interface, a calendar display interface, a music playing interface, a virtual entity, and the like, which is not limited in this exemplary embodiment.
In practical applications, the structure and shape of the virtual body 102 may be preset on the mixed reality terminal, and this exemplary embodiment is not particularly limited in this respect. In the present exemplary embodiment, a square virtual subject is described as an example; other cases may be handled with reference to this example.
In step S220, a position prompt mark is displayed on the virtual subject.
Referring to fig. 3, which is a schematic diagram illustrating a position adjustment process of a virtual subject according to the present exemplary embodiment, as shown in fig. 3, a position prompt identifier 103 is disposed on the virtual subject 102, and is used for prompting a user about a relationship between a current position of the virtual subject 102 and a preset target position to help the user determine whether the virtual subject 102 has been adjusted in place. Wherein the position cue marker 103 is a distinguishably displayed part on the virtual body 102, e.g. having a different color, or being movable, etc.
In practical applications, the position prompt mark 103 may be disposed on the top surface or the side surface of the virtual body 102, or at any position of the virtual body 102 that is convenient to observe. In the present exemplary embodiment, as shown in fig. 3, where the target position of the virtual body 102 is set as a vertical position for example, position prompt marks 103 are provided on the top surface 106 and the side surface 107 of the square virtual body 102, respectively.
In the present exemplary embodiment, the shape of the position prompt mark 103 may be, for example, a circle, a semicircle, an ellipse, a square, or an irregular shape such as a cartoon pattern, and this exemplary embodiment is not limited in this respect.
In step S230, an inclination angle of the target plane of the virtual subject with respect to a coordinate plane in a virtual coordinate system in the mixed reality scene is obtained.
In the exemplary embodiment, with reference to the virtual coordinate system in the mixed reality scene, by obtaining the inclination angle of the target plane 104 of the virtual subject 102 with respect to the coordinate plane in the virtual coordinate system, the current position of the target plane 104 can be determined, and thus the difference between the current position and the preset target position can be determined.
The difference between the first inclination angle of the target plane 104 in the current position relative to the coordinate plane in the virtual coordinate system and the second inclination angle of the target plane 104 in the preset target position relative to the coordinate plane in the virtual coordinate system is the difference between the current position and the preset target position. The position of the virtual body 102 is adjusted to the target position only when the first inclination angle and the second inclination angle are equal.
In the present exemplary embodiment, the target plane 104 in the target position may be parallel to the coordinate plane or may be non-parallel. That is, the second inclination angle may be 0 or any other angle, which is not limited in this exemplary embodiment.
In practical applications, the target plane 104 may be a plane where the position prompt identifier 103 is located, may also be another reference plane of the virtual body 102, and may also be an assumed non-visible plane, as long as the target plane corresponds to the virtual body 102 and is convenient for positioning the position of the virtual body 102, which is not limited in this exemplary embodiment.
In the present exemplary embodiment, the top surface 106 of the square virtual body 102 is illustrated as the target plane 104.
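The disclosure does not prescribe a specific way of computing this inclination angle. As a minimal sketch, assuming the target plane 104 and the coordinate plane are each represented by a normal vector in the virtual coordinate system, the angle between the two planes can be taken as the angle between their normals; the function name and the default horizontal-plane normal below are illustrative assumptions, not part of the disclosure.

```python
import math

def plane_tilt_angle(target_normal, coord_plane_normal=(0.0, 1.0, 0.0)):
    """Return the tilt angle (in degrees) between two planes, given their normals.

    The angle between two planes equals the angle between their normals; taking
    the absolute value of the dot product folds the result into [0, 90] degrees.
    """
    dot = sum(a * b for a, b in zip(target_normal, coord_plane_normal))
    norm_t = math.sqrt(sum(a * a for a in target_normal))
    norm_c = math.sqrt(sum(b * b for b in coord_plane_normal))
    cos_angle = min(1.0, abs(dot) / (norm_t * norm_c))  # guard against rounding
    return math.degrees(math.acos(cos_angle))

# Example: a target plane tilted 30 degrees about the x-axis relative to the
# horizontal (x-z) coordinate plane of the virtual coordinate system.
normal = (0.0, math.cos(math.radians(30)), math.sin(math.radians(30)))
print(round(plane_tilt_angle(normal), 1))  # 30.0
```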
In step S240, a display state of the position prompt mark is adjusted according to an inclination angle of the target plane relative to the coordinate plane, so as to prompt a relationship between the adjusted current angle of the virtual body and a preset angle.
In the present exemplary embodiment, the display state of the position indication mark 103 is adjusted according to the inclination angle of the target plane relative to the coordinate plane, that is, the relationship between the current angle of the virtual body 102 and the preset angle is displayed through the position indication mark 103. Thus, a reference may be provided for the user to prompt the user for a difference between the current position of the virtual subject 102 and the preset target position to visualize the position adjustment of the virtual subject 102. The user may thus make corresponding adjustments to the virtual subject 102 based on the prompts.
In practical applications, the display state of the position prompt identifier 103 may be one or more of a display position, a display color, or a display shape. That is, the relationship between the current position of the virtual main body 102 and the preset target position can be prompted through different display positions; for example, as shown in fig. 3, the central position of the plane where the position prompt identifier 103 is located is taken as the target position, and the more the virtual body 102 deviates from the target position, the more the position prompt identifier 103 deviates from the central position. The specific deviation amount of the position prompt identifier 103 may be determined according to the difference between the first inclination angle and the second inclination angle.
Specifically, the ratio of the tilt angle to the maximum angle range of the target plane 104 may be calculated, where the maximum angle range is the maximum tilt angle of the target plane relative to the coordinate plane and may be, for example, 90 degrees or 180 degrees. With 90 degrees as the maximum angle range, a tilt angle of 30 degrees corresponds to a ratio of 1/3. Combining this with the maximum movable distance L of the position prompt identifier 103 on its plane, the ratio 1/3 is applied to L, i.e., an offset of L/3 gives the current position of the position prompt identifier 103. As shown in fig. 3, when the central position is set as the target position, the position prompt identifier 103 can move within half of the plane where it is located, and the farther the position prompt identifier 103 is from the central position, the more the current position of the virtual body 102 deviates from the target position.
In the exemplary embodiment, the positional relationship expressed by an angle is converted into one expressed by a distance, which increases the adjustable range of the prompt. Compared with directly displaying the angle as a reference prompt, this avoids the adjustment process being too sensitive and the accurate position being difficult to capture, and improves the stability and operability of position adjustment.
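A minimal sketch of this angle-to-distance conversion, assuming a 90-degree maximum angle range and a maximum movable distance L as in the example above; the function name and parameter defaults are illustrative assumptions, not part of the disclosure.

```python
def marker_offset(tilt_deg, max_angle_deg=90.0, max_distance=1.0):
    """Map a tilt angle to an offset of the position prompt identifier
    from the target (central) position on its plane.

    A 30-degree tilt with a 90-degree maximum range yields an offset of
    max_distance / 3, matching the 1/3 * L example in the description.
    """
    ratio = min(abs(tilt_deg), max_angle_deg) / max_angle_deg
    return ratio * max_distance

print(round(marker_offset(30.0, max_distance=0.6), 3))  # 0.2, i.e. one third of L = 0.6
```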
In practical applications, the difference between the current position and the target position can also be prompted by having the position prompt identifier 103 display different colors. For example, comparing fig. 3(a) and 3(b) for the position prompt identifier 103 on the side surface 107 (in this case, the side surface 107 is taken as the target plane 104, and the target position of the virtual body 102 is the position where the side surface 107 is parallel to the coordinate plane), the larger the difference between the current position of the virtual body 102 and the target position, the darker the color of the position prompt identifier 103 on the side surface 107. When the virtual body 102 is located at the position shown in fig. 3(b), the angle between the side surface 107 and the coordinate plane is small, the difference between the current position of the virtual body 102 and the target position is small, and the color of the position prompt identifier 103 on the side surface 107 is light; when the virtual body 102 is located at the position shown in fig. 3(a), the angle between the side surface 107 and the coordinate plane is large, the current position of the virtual body 102 differs greatly from the target position, and the position prompt identifier 103 on the side surface 107 is dark in color. The user can judge the difference between the currently adjusted position and the target position from the shade of the color of the position prompt identifier 103.
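A sketch of one possible shade mapping, assuming the darkness of the position prompt identifier 103 is linearly interpolated with the tilt angle; the RGB endpoints and the linear interpolation are illustrative assumptions, since the disclosure only requires that a larger angle difference correspond to a darker color.

```python
def prompt_shade(tilt_deg, max_angle_deg=90.0,
                 light=(220, 220, 220), dark=(40, 40, 40)):
    """Interpolate the color of the position prompt identifier: the larger the
    angle between the side surface and the coordinate plane, the darker the
    identifier is drawn."""
    t = min(abs(tilt_deg), max_angle_deg) / max_angle_deg
    return tuple(round(lo + (hi - lo) * t) for lo, hi in zip(light, dark))

print(prompt_shade(10))  # close to the light shade: small angle difference
print(prompt_shade(80))  # close to the dark shade: large angle difference
```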
In practical applications, the position prompt identifier 103 may also prompt through different colors; that is, different display colors, such as green, yellow, red, etc., may be set according to the difference between the current angle and the preset angle. This exemplary embodiment is not particularly limited in this respect.
In practical applications, the difference between the current position and the target position can also be prompted by having the position prompt identifier 103 display different shapes. For example, when the difference between the current angle and the preset angle is large, the position prompt identifier 103 may display a large shape; as the difference between the current angle and the preset angle gradually becomes smaller, the shape displayed by the position prompt identifier 103 may become gradually smaller, until, when the current angle equals the preset angle, the displayed shape reaches its minimum or even disappears. The specific shape and size may be set according to actual conditions, and this exemplary embodiment is not particularly limited in this respect.
In the present exemplary embodiment, in addition to using the display position of the position prompt identifier 103 to prompt the relationship between the current angle of the virtual body 102 and the preset angle, a position reference identifier 105 is also displayed on the virtual body 102. The relative position of the position prompt identifier 103 and the position reference identifier 105 can be adjusted according to the inclination angle of the target plane 104 relative to the coordinate plane, so as to prompt the relationship between the adjusted current angle of the virtual body 102 and a preset angle.
In the present exemplary embodiment, the position reference identifier 105 marks the position of the position prompt identifier 103 when the virtual body 102 is at the target position. When the current angle is the same as the preset angle, the position prompt identifier 103 may be adjusted to be located within the position reference identifier 105, so that the user can determine that the virtual body 102 has been adjusted to the target position. When the current angle is different from the preset angle, the position prompt identifier 103 can be adjusted to deviate from the position reference identifier 105, and the distance by which the position prompt identifier 103 deviates from the position reference identifier 105 is adjusted according to the difference between the current angle and the preset angle; that is, the deviation distance can be determined according to the difference between the first inclination angle and the second inclination angle, and the larger the difference, the larger the deviation distance.
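As a sketch of how the deviation of the position prompt identifier 103 from the position reference identifier 105 could be driven by this angle difference; the snapping tolerance and scale factor below are arbitrary illustrative choices, not values given in the disclosure.

```python
def prompt_offset_from_reference(current_deg, preset_deg,
                                 scale=0.01, tolerance=0.5):
    """Return the distance by which the position prompt identifier deviates
    from the position reference identifier.

    When the current angle matches the preset angle (within a tolerance),
    the prompt identifier sits inside the reference identifier (offset 0);
    otherwise the offset grows with the angle difference.
    """
    diff = abs(current_deg - preset_deg)
    if diff <= tolerance:
        return 0.0          # prompt identifier located within the reference identifier
    return diff * scale     # larger angle difference -> larger deviation

print(prompt_offset_from_reference(30.0, 0.0))  # 0.3
print(prompt_offset_from_reference(0.2, 0.0))   # 0.0 (treated as adjusted in place)
```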
In the method for adjusting the position of the virtual subject provided by the exemplary embodiment, the position reference identifier 105 is used as a reference for adjusting the position prompt identifier 103, and a user can intuitively judge the current position of the virtual subject 102 by comparing the position difference between the position reference identifier 105 and the position prompt identifier 103, so that intuitive auxiliary guidance can be provided for adjusting the position of the virtual subject 102, and the adjustment is more intuitive and convenient.
In the present exemplary embodiment, the position of the position reference mark 105 may be set on the plane where the position prompt mark 103 is located, for example, as shown in fig. 3, the position reference mark 105 may be set at the center of the plane where the position prompt mark 103 is located. In fig. 3, two position reference markers 105 are shown, one on the top surface 106 of the virtual body 102 and the other on the side surface 107 of the virtual body 102, but the position of the virtual body 102 can be adjusted by using either one of the two position reference markers 105 as a reference.
In practical applications, the position reference mark 105 and the position prompt mark 103 may be only disposed on the top surface 106 of the virtual body 102, the position reference mark 105 and the position prompt mark 103 may be only disposed on the side surface 107 of the virtual body 102, and the position reference mark 105 and the position prompt mark 103 may be simultaneously disposed on the top surface 106 and the side surface 107, respectively, which is not particularly limited in this exemplary embodiment.
In the present exemplary embodiment, in addition to using the combination of the position reference identifier 105 and the position prompt identifier 103 to prompt the relationship between the current tilt angle of the virtual body 102 and the preset angle, the position prompt identifier 103 may also be displayed in different colors as a prompt. For example, taking the case where the target plane coincides with the coordinate plane at the preset position, when the current angle is the same as the preset angle, the tilt angle of the target plane relative to the coordinate plane is 0 degrees, that is, the position prompt identifier 103 is located within the position reference identifier 105, and the position prompt identifier 103 may be displayed in green; when the tilt angle of the target plane relative to the coordinate plane is greater than 0 degrees and less than 60 degrees, the position prompt identifier 103 may be displayed in yellow; and when the tilt angle of the target plane relative to the coordinate plane is greater than 60 degrees, the position prompt identifier 103 may be displayed in red. In this way, the display prompts are diversified, the interest of the whole adjustment process is increased, and the user experience is improved.
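A sketch of this color-threshold variant; the color names and the 0-degree and 60-degree boundaries follow the example above, while the function itself and its handling of the unspecified 60-degree boundary are illustrative assumptions.

```python
def prompt_color(tilt_deg):
    """Choose a display color for the position prompt identifier from the
    tilt angle of the target plane relative to the coordinate plane."""
    if tilt_deg == 0:
        return "green"    # current angle equals the preset angle
    if tilt_deg < 60:
        return "yellow"   # greater than 0 and less than 60 degrees
    return "red"          # 60 degrees or more (the exact 60-degree case is not specified in the text)

print(prompt_color(0))    # green
print(prompt_color(25))   # yellow
print(prompt_color(75))   # red
```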
In the exemplary embodiment, the shape of the position reference mark 105 may be set according to actual conditions. In order to show the adjusted target position more intuitively, the outline of the position reference mark 105 may be at least partially identical to the outline of the position prompt mark 103 (that is, the two outlines may be completely identical or only partially identical), and when the identical portions of the outlines of the position prompt mark 103 and the position reference mark 105 coincide, this prompts that the adjustment to the target position is complete.
In practical applications, the color of the position reference mark 105 may be set to be transparent, and only the periphery of the position reference mark 105 is displayed by a darker color, for example, black, etc. When the position prompt mark 103 intersects with the position reference mark 105, the position prompt mark 103 does not cover the periphery of the position reference mark 105 when gradually covering the position reference mark 105, so that a visual reference is provided for the user to adjust the position of the virtual body 102.
Referring to fig. 3, only two boundaries of the position reference mark 105 are displayed in color, and the position prompt mark 103 is moved; as long as the position prompt mark 103 is located between the two color-displayed boundaries, i.e., its outline coincides with the outline of the position reference mark 105, this indicates that the position of the virtual body 102 has been adjusted to the target position.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
In an exemplary embodiment of the present disclosure, there is also provided a virtual subject position adjusting apparatus, which may be applied to a mixed reality terminal capable of presenting a virtual subject. As shown in fig. 4, the virtual subject position adjusting apparatus 400 may include: a first interaction module 410, a second interaction module 420, an acquisition module 430, and a control module 440, wherein:
a first interaction module 410 operable to present the virtual subject in a mixed reality scene presented by the mixed reality terminal;
a second interaction module 420, which may be configured to display a location hint identifier on the virtual subject;
an obtaining module 430, configured to obtain an inclination angle of a target plane of the virtual subject with respect to a coordinate plane in a virtual coordinate system in the mixed reality scene;
the control module 440 may be configured to adjust a display state of the position prompt identifier according to an inclination angle of the target plane relative to the coordinate plane, so as to prompt a relationship between the adjusted current angle of the virtual body and a preset angle.
The specific details of each virtual subject position adjusting device module are already described in detail in the corresponding virtual subject position adjusting method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the apparatus for performing are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 500 according to this embodiment of the invention is described below with reference to fig. 5. The electronic device 500 shown in fig. 5 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 5, the electronic device 500 is embodied in the form of a general purpose computing device. The components of the electronic device 500 may include, but are not limited to: the at least one processing unit 510, the at least one memory unit 520, a bus 530 connecting various system components (including the memory unit 520 and the processing unit 510), and a display unit 540.
Wherein the storage unit 520 stores program code that can be executed by the processing unit 510 to cause the processing unit 510 to perform the steps according to various exemplary embodiments of the present invention described in the above section "exemplary method" of the present specification. For example, the processing unit 510 may execute step S210 shown in fig. 2, presenting the virtual subject in a mixed reality scene presented by the mixed reality terminal; step S220, displaying a position prompt mark on the virtual main body; step S230, acquiring an inclination angle of a target plane of the virtual main body relative to a coordinate plane in a virtual coordinate system in the mixed reality scene; step S240, adjusting a display state of the position prompt mark according to an inclination angle of the target plane relative to the coordinate plane, so as to prompt a relationship between the adjusted current inclination angle of the virtual body and a preset angle.
The memory unit 520 may include a readable medium in the form of a volatile memory unit, such as a random access memory unit (RAM)5201 and/or a cache memory unit 5202, and may further include a read only memory unit (ROM) 5203.
Storage unit 520 may also include a program/utility 5204 having a set (at least one) of program modules 5205, such program modules 5205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 530 may be one or more of any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 500 may also communicate with one or more external devices 570 (e.g., keyboard, pointing device, Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 500, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 500 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 550. Also, the electronic device 500 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 560. As shown, the network adapter 560 communicates with the other modules of the electronic device 500 over the bus 530. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 500, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 6, a program product 600 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. A virtual subject position adjusting method is applied to a mixed reality terminal capable of presenting a virtual subject, and is characterized by comprising the following steps:
presenting the virtual subject in a mixed reality scene presented by the mixed reality terminal;
displaying a position prompt identifier on the virtual main body;
acquiring an inclination angle of a target plane of the virtual main body relative to a coordinate plane in a virtual coordinate system in the mixed reality scene;
and adjusting the display state of the position prompt mark according to the inclination angle of the target plane relative to the coordinate plane so as to prompt the relationship between the adjusted current angle of the virtual main body and a preset angle.
2. The virtual subject position adjustment method according to claim 1, wherein the display state includes one or more of a display position, a display color, or a display shape.
3. The method for adjusting the position of the virtual body according to claim 2, wherein the step of adjusting the display state of the position indication mark according to the inclination angle of the target plane relative to the coordinate plane to indicate the relationship between the adjusted current angle of the virtual body and a preset angle includes:
displaying a location reference identifier on the virtual subject;
and adjusting the relative position of the position prompt identifier and the position reference identifier according to the inclination angle of the target plane relative to the coordinate plane so as to prompt the relationship between the adjusted current angle of the virtual main body and a preset angle.
4. The virtual body position adjusting method according to claim 3, wherein the step of adjusting the relative position of the position indication mark and the position reference mark according to the inclination angle of the target plane relative to the coordinate plane to indicate the relationship between the adjusted current angle of the virtual body and a preset angle includes:
and when the current angle is the same as the preset angle, adjusting the position prompt identifier to be positioned in the position reference identifier.
5. The method according to claim 3, wherein the step of adjusting the relative position between the position indication mark and the position reference mark according to the inclination angle of the target plane relative to the coordinate plane to indicate the relationship between the adjusted current angle of the virtual body and a preset angle further comprises:
and when the current angle is different from the preset angle, adjusting the position prompt identifier to deviate from the position reference identifier.
6. The virtual subject position adjustment method according to claim 5, characterized in that the method further comprises:
and adjusting the distance of the position prompt identifier deviating from the position reference identifier according to the difference value between the current angle and the preset angle.
7. The virtual subject position adjustment method according to any one of claims 3 to 6, wherein the outline of the position cue marker and the outline of the position reference marker are at least partially identical.
8. A virtual subject position adjusting device is applied to a mixed reality terminal capable of presenting a virtual subject, and is characterized by comprising:
the first interaction module is used for presenting the virtual main body in the mixed reality scene presented by the mixed reality terminal;
the second interaction module is used for displaying a position prompt identifier on the virtual main body;
the acquisition module is used for acquiring the inclination angle of the target plane of the virtual main body relative to the coordinate plane in the virtual coordinate system in the mixed reality scene;
and the control module is used for adjusting the display state of the position prompt identifier according to the inclination angle of the target plane relative to the coordinate plane so as to prompt the adjusted relation between the current angle of the virtual main body and a preset angle.
9. A computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the virtual subject position adjustment method of any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual subject position adjustment method of any one of claims 1-7 via execution of the executable instructions.
CN202011299888.5A 2020-11-19 2020-11-19 Virtual subject position adjusting method and device, storage medium and electronic equipment Pending CN112274922A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011299888.5A CN112274922A (en) 2020-11-19 2020-11-19 Virtual subject position adjusting method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011299888.5A CN112274922A (en) 2020-11-19 2020-11-19 Virtual subject position adjusting method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN112274922A true CN112274922A (en) 2021-01-29

Family

ID=74399305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011299888.5A Pending CN112274922A (en) 2020-11-19 2020-11-19 Virtual subject position adjusting method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112274922A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838217A (en) * 2021-09-23 2021-12-24 北京百度网讯科技有限公司 Information display method and device, electronic equipment and readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006061717A (en) * 2002-11-20 2006-03-09 Sega Corp Game image display control program, game device, and storage medium
CN108600528A (en) * 2018-04-09 2018-09-28 网易(杭州)网络有限公司 Interaction control method and device, electronic equipment, storage medium
CN110287913A (en) * 2019-06-28 2019-09-27 京东数字科技控股有限公司 Image flame detection reminding method and device, user terminal and storage medium
CN110827412A (en) * 2018-08-09 2020-02-21 北京微播视界科技有限公司 Method, apparatus and computer-readable storage medium for adapting a plane
CN111127406A (en) * 2019-12-10 2020-05-08 创维集团智能装备有限公司 Back plate machining position adjusting method, terminal, system and storage medium
CN111124133A (en) * 2019-12-30 2020-05-08 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for danger prompt information in virtual scene
CN111882633A (en) * 2020-07-24 2020-11-03 上海米哈游天命科技有限公司 Picture rendering method, device, equipment and medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006061717A (en) * 2002-11-20 2006-03-09 Sega Corp Game image display control program, game device, and storage medium
CN108600528A (en) * 2018-04-09 2018-09-28 网易(杭州)网络有限公司 Interaction control method and device, electronic equipment, storage medium
CN110827412A (en) * 2018-08-09 2020-02-21 北京微播视界科技有限公司 Method, apparatus and computer-readable storage medium for adapting a plane
CN110287913A (en) * 2019-06-28 2019-09-27 京东数字科技控股有限公司 Image flame detection reminding method and device, user terminal and storage medium
CN111127406A (en) * 2019-12-10 2020-05-08 创维集团智能装备有限公司 Back plate machining position adjusting method, terminal, system and storage medium
CN111124133A (en) * 2019-12-30 2020-05-08 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for danger prompt information in virtual scene
CN111882633A (en) * 2020-07-24 2020-11-03 上海米哈游天命科技有限公司 Picture rendering method, device, equipment and medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838217A (en) * 2021-09-23 2021-12-24 北京百度网讯科技有限公司 Information display method and device, electronic equipment and readable storage medium
CN113838217B (en) * 2021-09-23 2023-09-12 北京百度网讯科技有限公司 Information display method and device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN107913520B (en) Information processing method, information processing device, electronic equipment and storage medium
JP6159323B2 (en) Information processing method and information processing apparatus
US9355574B2 (en) 3D virtual training system and method
US8806359B2 (en) Workflow driven display for medical procedures
CN108287657B (en) Skill applying method and device, storage medium and electronic equipment
CN107329690B (en) Virtual object control method and device, storage medium and electronic equipment
US20190073029A1 (en) System and method for receiving user commands via contactless user interface
CN111670018A (en) Guidance for positioning a patient and a surgical robot
US20210358093A1 (en) Method and device of correcting image distortion, display device, computer readable medium, electronic device
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
JPH08190640A (en) Information display method and information provision system
WO2009113026A2 (en) Apparatus to create, save and format text documents using gaze control and method associated based on the optimized positioning of cursor
CN110197461A (en) A kind of coordinate transformation relation determines method, apparatus, equipment and storage medium
CN112274922A (en) Virtual subject position adjusting method and device, storage medium and electronic equipment
JPH10283115A (en) Display input device
CN113849112B (en) Augmented reality interaction method, device and storage medium suitable for power grid regulation and control
CN113559501A (en) Method and device for selecting virtual units in game, storage medium and electronic equipment
CN102150116A (en) Remote-controlled pointing
US10073586B2 (en) Method and system for mouse pointer to automatically follow cursor
CN112274419B (en) Abdomen acupuncture point positioning system, method and device, control equipment and storage medium
CN112650391A (en) Human-computer interaction method, device and equipment based on virtual reality and storage medium
CN116954387A (en) Terminal keyboard input interaction method, device, terminal and medium
CN114743433B (en) Multi-channel alarm presenting method and device for simulating threats in flight training environment
CN115531875A (en) Virtual scene zooming method and device, storage medium and electronic equipment
CN113769403A (en) Virtual object moving method and device, readable storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination