CN108415570B - Control selection method and device based on augmented reality

Info

Publication number: CN108415570B (grant of application CN201810187008.1A; earlier publication CN108415570A)
Authority: CN (China)
Prior art keywords: target control, terminal, control, threshold, plane
Legal status: Active
Other languages: Chinese (zh)
Inventors: 吴志武, 雷月雯, 申文迪, 姜帆
Assignee: Netease Hangzhou Network Co Ltd (original and current assignee)
Application filed by Netease Hangzhou Network Co Ltd; priority to CN201810187008.1A

Classifications

    • G - Physics
    • G06 - Computing; Calculating or Counting
    • G06F - Electric Digital Data Processing
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces


Abstract

The invention discloses a control selection method and device based on augmented reality. The method includes the following steps: displaying a target control in a virtual scene; and, while the terminal moves toward the target control, controlling the target control to be in a selected state when a first predetermined condition is met. The invention solves the technical problem in the related art that AR applications place UI controls inside the terminal's display screen, leaving the AR scene incompletely displayed.

Description

Control selection method and device based on augmented reality
Technical Field
The invention relates to the field of augmented reality, in particular to a control selection method and device based on augmented reality.
Background
Augmented Reality (AR) is a technology that seamlessly integrates real-world information with virtual-world information. Entity information that would otherwise be difficult to experience within a given span of space and time in the real world (visual information, sound, taste, touch, and the like) is simulated by computer and other technologies and then superimposed, so that virtual information is applied to the real world and perceived by human senses, producing a sensory experience beyond reality. The real environment and virtual objects are superimposed in real time onto the same picture or space. AR has broad applications, but at present AR on a mobile terminal device (for example, a mobile phone) occupies the user's hand to hold the device, and the user must view the virtual image through the phone screen, so selection and interaction usually take place on the phone screen. This largely forfeits the spatial advantage of AR and confines the user's interaction to the phone screen, which not only leaves the AR scene incompletely displayed but also makes on-screen operation less efficient and more error-prone.
For the problem in the related art that AR applications place UI controls inside the terminal's display screen and thereby leave the AR scene incompletely displayed, no effective solution has yet been proposed.
Disclosure of Invention
The embodiments of the invention provide a control selection method and device based on augmented reality, so as to at least solve the technical problem that AR applications in the related art place UI controls inside the terminal's display screen, leaving the AR scene incompletely displayed.
According to one aspect of the embodiments of the invention, an augmented reality-based control selection method is provided, including: displaying a target control in a virtual scene; and, while the terminal moves toward the target control, controlling the target control to be in a selected state when a first predetermined condition is met.
Further, meeting the first predetermined condition includes: the terminal is located within a first predetermined angle range in front of the target control, the included angle between the normal vector of the plane of the terminal and the normal vector of the plane of the target control is within a second predetermined angle range, and the distance between the terminal and the target control is smaller than a first threshold, where the direction of the normal vector of the plane of the terminal is opposite to the direction the terminal's display screen faces, the direction of the normal vector of the plane of the target control is the direction the front face of the target control faces, and the front face of the target control is the face facing the terminal.
Further, when the result of the dot product of a relative position vector and the normal vector of the plane of the target control is greater than a second threshold, the terminal is determined to be located within the first predetermined angle range in front of the target control, where the relative position vector is the direction vector pointing from the center point of the target control to the position of the terminal; when the result of the dot product of the normal vector of the plane of the terminal and the normal vector of the plane of the target control is smaller than a third threshold, the included angle between the two normal vectors is determined to be within the second predetermined angle range; and when the length of the relative position vector is smaller than the first threshold, the distance between the terminal and the target control is determined to be smaller than the first threshold.
Further, when the target control is in the selected state, the method further comprises: marking the target control as selected, and determining the display parameters of the target control according to the distance between the terminal and the target control.
Further, when the target control is in the selected state, the method further comprises: controlling the target control to be in an activated state when a second predetermined condition is met.
Further, meeting the second predetermined condition includes: the distance between the terminal and the target control is smaller than a fourth threshold, where the fourth threshold is smaller than the first threshold.
Further, when the target control is in the activated state, the method further comprises: controlling the target control to be in a trigger state when a third predetermined condition is met.
Further, meeting the third predetermined condition includes: the included angle between the motion direction vector of the terminal and the normal vector of the plane of the target control is within a third predetermined angle range, and the distance between the terminal and the target control is smaller than a fifth threshold, where the motion direction vector of the terminal is the vector pointing from the terminal's position in the previous frame to its position in the current frame, and the fifth threshold is smaller than the fourth threshold.
Further, when the result of the dot product of the terminal's motion direction vector and the normal vector of the plane of the target control is smaller than a sixth threshold, the included angle between the motion direction vector and that normal vector is determined to be within the third predetermined angle range; and when the length of the relative position vector is smaller than the fifth threshold, the distance between the terminal and the target control is determined to be smaller than the fifth threshold, where the relative position vector is the direction vector pointing from the center point of the target control to the position of the terminal.
Further, when the target control is in the trigger state, the method further includes: prompting that the target control has been successfully triggered.
Further, when the target control is in the selected state, the method further comprises: controlling the target control to be in a trigger state when a fourth predetermined condition is met.
Further, meeting the fourth predetermined condition includes: the distance between the terminal and the target control is smaller than a seventh threshold, where the seventh threshold is smaller than the first threshold.
Further, when the target control is in the selected state, the method further comprises: binding the terminal with the target control; and controlling, by means of the terminal, the target control to move in the virtual scene.
Further, controlling the target control to move in the virtual scene by means of the terminal includes at least one of the following: the target control moves with the terminal along a predetermined track; the target control rotates with the terminal; and the target control swings in place following the terminal.
Further, in the process of controlling the target control to move in the virtual scene by using the terminal, the method further includes: acquiring a control operation executed on the terminal, wherein the control operation is used for indicating that the binding relationship between the terminal and the target control is released; responding to the control operation to remove the binding relationship between the terminal and the target control, and acquiring the current moving position of the target control; under the condition that the position to which the target control is currently moved is a first position, updating the position of the target control to the first position, wherein the first position is a position where the target control is allowed to be located; and under the condition that the position to which the target control is moved currently is a second position, restoring the position of the target control to the original position of the target control, wherein the second position is a position where the target control is not allowed to be located.
According to another aspect of the embodiments of the present invention, there is also provided an augmented reality-based control selection apparatus, including: the display unit is used for displaying the target control in the virtual scene; and the selection unit is used for controlling the target control to be in a selected state under the condition that a first preset condition is met in the process that the terminal moves towards the target control.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium, where the storage medium includes a stored program, and the program executes any one of the above methods for selecting an augmented reality-based control when running.
According to another aspect of the embodiments of the present invention, there is also provided a processor, configured to execute a program, where the program executes any one of the above augmented reality-based control selection methods.
According to another aspect of the embodiments of the present invention, there is also provided a terminal, including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above augmented reality-based control selection methods.
In the embodiments of the invention, a target control is displayed in a virtual scene, and while the terminal moves toward the target control, the target control is controlled to be in the selected state when a first predetermined condition indicating that selection of the target control is triggered is met. This achieves the aim of operating the target control without the terminal screen, solves the technical problem that AR applications in the related art place UI controls inside the terminal's display screen and thereby leave the AR scene incompletely displayed, ensures that the terminal displays the complete AR scene, and improves the efficiency of operating the target control.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a hardware environment for an augmented reality based control selection method according to an embodiment of the invention;
FIG. 2 is a flow diagram of an augmented reality based control selection method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a normal vector of a plane in which a target control is located according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a relative position vector according to an embodiment of the invention;
FIG. 5 is a diagram illustrating normal vectors of a plane in which a terminal is located according to an embodiment of the present invention;
FIG. 6 is a schematic illustration of a first predetermined angle according to an embodiment of the present invention;
FIG. 7 is a schematic illustration of a second predetermined angle according to an embodiment of the present invention;
FIG. 8 is a flow chart of a method of touch selection in a mobile device AR application in accordance with a preferred embodiment of the present invention; and
fig. 9 is a schematic diagram of an augmented reality based control selection apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, there is provided a method embodiment of an augmented reality based control selection method, it is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
Optionally, in this embodiment, the augmented reality-based control selection method may be applied in a hardware environment formed by the server 102 and the terminal 104 shown in fig. 1. As shown in fig. 1, the server 102 is connected to one or more terminals 104 (only one is shown in fig. 1) via a network, which includes but is not limited to a wide area network, a metropolitan area network, or a local area network. The terminal 104 is not limited to a PC, a mobile phone, a tablet computer, or the like, and may be any terminal with an AR function. The augmented reality-based control selection method of the embodiments of the invention may be executed by the terminal 104, or by a client installed on the terminal 104.
Fig. 2 is a flowchart of a method for selecting a control in augmented reality according to an embodiment of the present invention, and as shown in fig. 2, the method includes the following steps:
step S102, displaying a target control in a virtual scene;
and step S104, in the process that the terminal moves towards the target control, controlling the target control to be in a selected state under the condition that a first preset condition is met.
Through the above steps, while the terminal moves toward the target control, the target control is controlled to be in the selected state when the first predetermined condition indicating that selection of the target control is triggered is met. This achieves the aim of operating the target control without the terminal screen, solves the technical problem in the prior art that an AR application places UI controls inside the terminal's display screen and thereby leaves the AR scene incompletely displayed, ensures that the terminal displays the complete AR scene, and improves the efficiency of operating the target control.
In the solution provided in step S102, the target control may be a UI control through which the user performs relevant control over the augmented reality. The shape of the target control is not specifically limited in the embodiments of the invention; it may be rectangular, circular, and so on. The target control may include at least one target element, where each target element may correspond to a UI function. The target control can be displayed in a virtual scene, where the virtual scene is a three-dimensional virtual space. It should be noted that the target control is, in essence, a geometric body that can be interacted with, so it can be established in the three-dimensional virtual space; the technical means used to display the target control in the virtual scene are not specifically limited here. Because the target control is displayed in the virtual scene rather than on the terminal, the terminal's display screen can present the AR scene in full, improving the user experience.
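As a concrete illustration, such a control can be modeled as a small data structure holding its center point, plane normal, state, and target elements. The following Python sketch is a minimal illustration only; the class name, fields, and state values are assumptions rather than anything prescribed by the method:

```python
from dataclasses import dataclass, field
from typing import List

import numpy as np


@dataclass
class TargetControl:
    """An interactable geometric body placed in the three-dimensional virtual scene."""
    center: np.ndarray                # center point O of the control, in world coordinates
    normal: np.ndarray                # unit normal of the control's plane, facing the terminal
    state: str = "idle"               # "idle" | "selected" | "activated" | "triggered"
    elements: List[str] = field(default_factory=list)  # target elements, each mapped to a UI function
```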
In the scheme provided in step S104, the embodiment of the present invention may utilize the terminal to control the target control displayed in the virtual scene, and specifically, in the process that the terminal moves toward the target control, the embodiment of the present invention may determine whether a first predetermined condition is satisfied in real time, where the first predetermined condition may be used to indicate that the target control is triggered to be selected. It should be noted that the first predetermined condition may be set according to actual requirements.
Optionally, in the embodiments of the invention, the first predetermined condition may be determined to be satisfied only when the terminal is located within the first predetermined angle range in front of the target control, the included angle between the normal vector of the plane of the terminal and the normal vector of the plane of the target control is within the second predetermined angle range, and the distance between the terminal and the target control is smaller than the first threshold. Optionally, the normal vector of the plane of the terminal may be the normal vector passing through the terminal's center point, and its direction may be opposite to the direction the terminal's display screen faces. Optionally, the terminal may be rectangular, with its center point at the intersection of the rectangle's diagonals. Optionally, the normal vector of the plane of the target control may be the normal vector passing through the control's center point, and its direction may be the direction the front face of the target control faces, the front face being the face toward the terminal. Optionally, the target control may also be rectangular, in which case its center point is the intersection of its diagonals.
Optionally, when the result of the dot product of the relative position vector and the normal vector of the plane of the target control is greater than the second threshold, the terminal may be determined to be located within the first predetermined angle range in front of the target control, where the relative position vector may be the direction vector pointing from the center point of the target control to the position of the terminal. When the result of the dot product of the normal vector of the plane of the terminal and the normal vector of the plane of the target control is smaller than the third threshold, the included angle between the two normal vectors is determined to be within the second predetermined angle range. When the length of the relative position vector is smaller than the first threshold, the distance between the terminal and the target control is determined to be smaller than the first threshold.
That is to say, in the embodiments of the invention, whether the first predetermined condition is satisfied may be judged by whether the dot product of the relative position vector and the normal vector of the plane of the target control is greater than the second threshold, whether the dot product of the normal vector of the plane of the terminal and the normal vector of the plane of the target control is smaller than the third threshold, and whether the length of the relative position vector is smaller than the first threshold; the first predetermined condition is satisfied only when all three sub-conditions hold. It should be noted that the first, second, and third thresholds may be set or adjusted according to actual requirements and are not specifically limited here.
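A minimal Python sketch of these three checks, assuming the illustrative `TargetControl` structure above and unit-length normals; the function and parameter names are hypothetical:

```python
import numpy as np


def first_condition_met(control, terminal_pos, terminal_normal,
                        first_threshold, second_threshold, third_threshold):
    """True when all three sub-conditions of the first predetermined condition
    hold, i.e. the target control should enter the selected state."""
    rel = terminal_pos - control.center                                 # relative position vector OQ
    in_front = np.dot(rel, control.normal) > second_threshold           # within the frontal angle range
    facing = np.dot(terminal_normal, control.normal) < third_threshold  # planes oriented toward each other
    near = np.linalg.norm(rel) < first_threshold                        # distance below the first threshold
    return bool(in_front and facing and near)
```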
To make the first predetermined condition clear, each vector and each angle involved in it is described in detail below with reference to the drawings:
as shown in fig. 3, the intersection points of the diagonal lines of the target control 10 and the terminal 20 are a central point O and a central point Q, respectively, the normal vector of the plane where the target control is located is a normal vector passing through the central point of the target control 10, as shown by a vector OM in fig. 3, the direction of the normal vector of the plane where the target control is located is the direction facing the front surface of the target control 10, and the front surface of the target control 10 is the surface facing the terminal 20.
As shown in fig. 4, the intersection points of the diagonal lines of the target control 10 and the terminal 20 are a center point O and a center point Q, respectively, and the relative position vector is a direction vector pointing from the center point of the target control 10 to the position of the terminal 20, and optionally, the relative position vector is a direction vector pointing from the center point of the target control 10 to the center point of the terminal 20, such as a vector OQ shown in fig. 4.
As shown in fig. 5, the intersection points of the diagonals of the target control 10 and the terminal 20 are a center point O and a center point Q, respectively. The normal vector of the plane of the terminal is the normal vector passing through the center point of the terminal 20, as shown in fig. 5; its direction is the direction the front surface of the terminal 20 faces, the front surface of the terminal 20 being the surface facing the target control 10.
The terminal being located within the first predetermined angle in front of the target control should be understood as follows: the front of the target control is the side of the target control facing the terminal, and the first predetermined angle is the included angle between the relative position vector and the normal vector of the plane of the target control, shown in fig. 6 as the angle α between vector OQ and vector OM. As shown in fig. 6, the smaller α is, the more directly in front of the target control the terminal must be for the first predetermined condition to be satisfied.
The included angle between the normal vector of the plane of the terminal and the normal vector of the plane of the target control is within the second predetermined angle range; the second predetermined angle is shown in fig. 7 as the angle β between vector QN and vector OM. The larger β is, the more nearly parallel the plane of the terminal 20 and the plane of the target control 10 must be for this condition to be satisfied.
It should be noted that, in the embodiment of the present invention, the first predetermined angle and the second predetermined angle are not specifically limited, and may be set or adjusted according to actual requirements. For example, the first predetermined angle may be 0-90 degrees and the second predetermined angle may be 90-180 degrees.
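For intuition, and assuming unit normals while writing A1 for the second threshold and A2 for the third threshold (the names used in the preferred embodiment below), the dot-product tests are equivalent to angle bounds:

$$\vec{OQ}\cdot\vec{OM} > A_1 \;\Longleftrightarrow\; \cos\alpha > \frac{A_1}{\lVert\vec{OQ}\rVert}, \qquad \vec{QN}\cdot\vec{OM} < A_2 \;\Longleftrightarrow\; \cos\beta < A_2.$$

Since cosine decreases over 0 to 180 degrees, the first test bounds α from above (the terminal stays near the front of the control), and the second bounds β from below (for an A2 at or below zero, β exceeds 90 degrees, i.e. the two planes are closer to parallel with opposing normals).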
In the process that the terminal moves towards the target control, if the first preset condition is judged to be met, the embodiment of the invention can control the target control to be in a selected state; and if the first preset condition is judged not to be met, the target control is not triggered to be selected.
In the embodiments of the invention, the target control is displayed in the virtual scene and is selected by moving the terminal toward it, so the terminal can display the AR scene in full while the target control is triggered and selected simply by moving the terminal.
As an optional embodiment, when the target control is in the selected state, the embodiment may further move the terminal continuously toward the target control, and may determine in real time whether a second predetermined condition is met during the moving process, where the second predetermined condition may be used to indicate that the control target control is in the activated state. The second predetermined condition is not specifically limited in the embodiments of the present invention, and may be set or adjusted according to the requirements of the actual application scenario.
Optionally, the second predetermined condition may be determined to be satisfied only when the terminal is located within the first predetermined angle range in front of the target control, the included angle between the normal vector of the plane of the terminal and the normal vector of the plane of the target control is within the second predetermined angle range, and the distance between the terminal and the target control is smaller than a fourth threshold. When the result of the dot product of the relative position vector and the normal vector of the plane of the target control is greater than the second threshold, the terminal may be determined to be located within the first predetermined angle range in front of the target control; when the result of the dot product of the normal vector of the plane of the terminal and the normal vector of the plane of the target control is smaller than the third threshold, the included angle between the two normal vectors is determined to be within the second predetermined angle range; and when the length of the relative position vector is smaller than the fourth threshold, the distance between the terminal and the target control is determined to be smaller than the fourth threshold. The fourth threshold is smaller than the first threshold and may likewise be set or adjusted according to the requirements of the actual application scenario; it is not specifically limited here.
When the target control is in the selected state, if the second preset condition is judged to be met, the embodiment of the invention can control the target control to be in the activated state, so that the purpose of controlling the target control by using the terminal is realized.
As an optional embodiment, when the target control is controlled to be in the selected state, if it is determined that the terminal is located in a first predetermined angle range in front of the target control, an included angle between a normal vector of a plane where the terminal is located and a normal vector of a plane where the target control is located is in a second predetermined angle range, and a distance between the terminal and the target control is greater than a fourth threshold, the target control may be marked as the selected state, where a display parameter of the selected state is determined by the distance between the terminal and the target control.
It should be noted that, after determining that the first predetermined condition is satisfied, the target control is selected. The target control may be marked as the selected state, for example, the target control may become highlighted, have a larger shape, and the like, and the representation form of the selected state of the target control is not specifically limited here. After the target control is marked as the selected state, in the moving process of the terminal continuously moving towards the target control, if the distance between the terminal and the target control is not smaller than the fourth threshold, that is, before the second predetermined condition for activating the touch event of the target control is not met, the display parameter of the selected state of the target control may change correspondingly with the distance between the terminal and the target control, for example, as the distance between the terminal and the target control becomes closer, the target control may appear brighter and brighter, or the target control may appear larger and larger in shape. Accordingly, if the terminal is further away from the target control, the target control may appear darker and darker, or the target control may appear smaller and smaller in shape.
The selected state of the target control changes along with the distance between the terminal and the target control, so that a user can intuitively and clearly know how to activate the touch event of the target control, and the interaction experience between the user and the target control can be improved.
As an optional embodiment, when the control target control is in an activated state, the embodiment may further move the terminal continuously toward the target control, and may determine whether a third predetermined condition is met in real time during the moving process, where the third predetermined condition may be used to indicate that a target element in the target control is triggered, so as to achieve a purpose of triggering a certain UI function. The third predetermined condition is not specifically limited in the embodiments of the present invention, and may be set or adjusted according to the requirements of the actual application scenario.
Optionally, it may be determined that the third predetermined condition is satisfied only when the terminal is located in the first predetermined angle range in front of the target control, an included angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is in a third predetermined angle range, and a distance between the terminal and the target control is smaller than a fifth threshold. The motion direction vector of the terminal is a vector pointing from the position of the terminal of the previous frame to the position of the terminal of the current frame.
Optionally, when the result of the dot product of the relative position vector and the normal vector of the plane of the target control is greater than the second threshold, the terminal may be determined to be located within the first predetermined angle range in front of the target control; when the result of the dot product of the terminal's motion direction vector and the normal vector of the plane of the target control is smaller than a sixth threshold, the included angle between the motion direction vector and that normal vector is determined to be within the third predetermined angle range; and when the length of the relative position vector is smaller than a fifth threshold, the distance between the terminal and the target control may be determined to be smaller than the fifth threshold.
That is to say, in the embodiments of the invention, whether the third predetermined condition is satisfied may be judged by whether the dot product of the relative position vector and the normal vector of the plane of the target control is greater than the second threshold, whether the dot product of the terminal's motion direction vector and that normal vector is smaller than the sixth threshold, and whether the length of the relative position vector is smaller than the fifth threshold; the third predetermined condition is satisfied only when all three sub-conditions hold. It should be noted that the fifth threshold may be smaller than the fourth threshold, and the fifth and sixth thresholds may be set or adjusted according to actual requirements; they are not specifically limited here.
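A sketch of this third-condition check in the same illustrative style; the previous-frame position is assumed to be tracked by the caller, and all names are hypothetical:

```python
import numpy as np


def third_condition_met(control, terminal_pos, prev_terminal_pos,
                        second_threshold, fifth_threshold, sixth_threshold):
    """True when the activated target control should enter the trigger state."""
    rel = terminal_pos - control.center                        # relative position vector
    motion = terminal_pos - prev_terminal_pos                  # motion direction: previous frame -> current frame
    in_front = np.dot(rel, control.normal) > second_threshold  # still within the frontal angle range
    toward = np.dot(motion, control.normal) < sixth_threshold  # moving toward the control
    near = np.linalg.norm(rel) < fifth_threshold               # distance below the fifth threshold
    return bool(in_front and toward and near)
```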
When the control target control is in the activated state, in the process that the terminal continues to move towards the target control, if the third predetermined condition is judged to be met, the embodiment of the invention can control the target control to be in the trigger state so as to trigger the target element in the target control; and if the third preset condition is judged not to be met, the target element in the target control is not triggered.
As an optional embodiment, after the control target control is in the trigger state, this embodiment may further include: and prompting the target control which is successfully triggered.
It should be noted that if a target element of the target control in the virtual scene is successfully triggered, a prompt may be fed back to the user. The form of the prompt is not specifically limited in the embodiments of the invention; for example, the user may be notified that a target element has been successfully triggered through terminal vibration, a terminal sound effect, scaling of the target control, and the like. By feeding back a prompt after the target element is successfully triggered, the user knows that the trigger operation succeeded and that the terminal can be taken away, which improves the user experience.
As an optional embodiment, when the control target control is in the selected state, the embodiment may further move the terminal continuously toward the target control, and may determine whether a fourth predetermined condition is met in real time in the moving process, where the fourth predetermined condition may be used to indicate that a target element in the target control is triggered, so as to achieve a purpose of triggering a certain UI function. The fourth predetermined condition is not specifically limited in the embodiments of the present invention, and may be set or adjusted according to the requirements of the actual application scenario.
Optionally, it may be determined that the fourth predetermined condition is satisfied only when the terminal is located in the first predetermined angle range in front of the target control, an included angle between a normal vector of a plane in which the terminal is located and a normal vector of a plane in which the target control is located is within a second predetermined angle range, and a distance between the terminal and the target control is smaller than a seventh threshold. Wherein the seventh threshold is less than the first threshold.
When the target control is in the selected state and the terminal continues to move toward it, if the fourth predetermined condition is judged to be satisfied, the embodiments of the invention may directly control the target control to be in the trigger state, thereby triggering the target element in the target control. Compared with first activating the selected target control and only then triggering its target element, triggering the element directly after selection saves time and simplifies the operation.
As an optional embodiment, when the control target control is in the selected state, the embodiment may further include: binding the terminal with the target control; and controlling the target control to move in the virtual scene by using the terminal.
It should be noted that, after the target control is selected by using the terminal trigger, the embodiment of the present invention may establish a binding relationship between the terminal and the target control, that is, the terminal and the target control are in a parent-child relationship, and the target control may become a child object of the terminal. Optionally, after the target control is selected by using the terminal, the binding relationship between the terminal and the target control may be automatically established, or after the target control is selected by using the terminal, the user may implement establishment of the binding relationship between the terminal and the target control by performing a touch operation on the terminal, where the embodiment of the present invention does not specifically limit the type of the touch operation performed by the user on the terminal, such as a click, a long press, a gesture, etc., performed on a screen of the terminal, or an operation such as shaking the terminal.
After the terminal is bound with the target control, the terminal may be used to control the target control's movement in the virtual scene. Optionally, controlling the target control to move in the virtual scene by means of the terminal may include at least one of the following: the target control moves with the terminal along a predetermined track; the target control rotates with the terminal; the target control swings in place following the terminal. It should be noted that once the binding relationship is established, the target control moves along with the terminal. It should further be noted that the target control may be rotated in two ways: one is to rotate the terminal, and the target control rotates along with it; the other is to slide on the terminal screen, and the target control rotates around an axis that is parallel to the plane of the terminal screen, passes through the target control, and is perpendicular to the sliding trajectory of each frame, with the rotation direction determined by the sliding direction.
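One plausible realization of the binding is scene-graph parenting: the control's position is stored once in the terminal's local frame at bind time and re-derived every frame. The sketch below assumes the terminal's pose is given as a 3x3 rotation matrix and a position vector; it is an illustration, not the prescribed implementation:

```python
import numpy as np


class Binding:
    """Parent-child binding: the target control becomes a child object of the terminal."""

    def __init__(self, control, terminal_rotation, terminal_position):
        self.control = control
        # Express the control's position in the terminal's local frame, once, at bind time.
        self.local_offset = terminal_rotation.T @ (control.center - terminal_position)

    def follow(self, terminal_rotation, terminal_position):
        """Called every frame: the control moves (and turns) along with the terminal."""
        self.control.center = terminal_position + terminal_rotation @ self.local_offset
```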
According to the embodiment, the binding relationship between the terminal and the target control is established, and the terminal can be used for controlling the target control to move in the virtual scene, so that the aim of conveniently controlling the target control can be fulfilled, the operation is greatly facilitated, and the operation efficiency is improved.
As an optional embodiment, in the process of controlling the target control to move in the virtual scene by using the terminal, the optional embodiment may further include: acquiring a control operation executed on the terminal, wherein the control operation is used for indicating to release the binding relationship between the terminal and the target control; the binding relation between the terminal and the target control is released in response to the control operation, and the current moving position of the target control is obtained; under the condition that the position to which the target control moves currently is a first position, updating the position of the target control to be the first position, wherein the first position is a position where the target control is allowed to be located; and under the condition that the current position of the target control is the second position, restoring the position of the target control to the original position of the target control, wherein the second position is the position where the target control is not allowed to be located.
It should be noted that, in the process of controlling the target control to move in the virtual scene by using the terminal, the embodiment of the present invention may release the binding relationship between the terminal and the target control according to the actual requirement, and specifically may be implemented by a control operation executed on the terminal, where it should be noted that, this embodiment also does not specifically limit the control operation executed on the terminal, such as a click, a long press, a gesture, and the like executed on a terminal screen, or shaking the terminal. After the user executes the control operation on the terminal, the embodiment may respond to the control operation, remove the binding relationship between the terminal and the target control, and acquire the current position to which the target control moves. If the position to which the target control is currently moved is the first position, the embodiment may update the position of the target control to the first position, where the first position is a position where the target control is allowed to be located; if the current position to which the target control moves is the second position, the embodiment may restore the position of the target control to the original position of the target control, where the second position is a position where the target control is not allowed to be located. Alternatively, the embodiment may also mark the first and second positions with different colors, for example, the first position may be marked with green and the second position with red, or the target control may turn green when the target control moves to the first position and red when the target control moves to the second position.
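A sketch of this release logic, assuming a caller-supplied `is_allowed(position)` predicate that encodes which positions are legal for the control; all names are illustrative:

```python
def release_binding(binding, is_allowed, original_position):
    """Unbind the control from the terminal and validate where it ended up."""
    control = binding.control
    if is_allowed(control.center):
        # First position: the control is allowed here, so the move is kept.
        return control.center
    # Second position: not allowed, so the control is restored to its original position.
    control.center = original_position
    return control.center
```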
This embodiment enables the target control to be moved in the virtual scene by means of the terminal, and by judging whether the position to which the target control has moved is legal, it effectively manages where the target control may be placed, which greatly facilitates user operation and improves the user experience.
The present invention also provides a preferred embodiment which provides a method of touch selection in mobile device AR applications.
Fig. 8 is a flowchart of a method for touch selection in an AR application of a mobile device according to a preferred embodiment of the present invention, and as shown in fig. 8, the method may specifically include the following steps:
in step S302, a mobile device (e.g., a mobile phone, a tablet computer, etc.) starts approaching in front of the UI control.
Step S304, judging whether the mobile equipment is in the range of the angle threshold A right in front of the UI control, whether the included angle between the normal vector of the mobile equipment and the normal vector of the UI control is in the range of the threshold B, and whether the distance between the mobile equipment and the UI control is smaller than the threshold D1. And if the judgment result is yes, executing the step S306 and triggering to select the UI control. It should be noted here that the normal vector of the mobile device is the normal vector of the plane where the terminal is located, and the normal vector of the UI control is the normal vector of the plane where the target control is located.
And step S306, triggering and selecting the UI control.
Step S308, whether the mobile device is located in the range of the angle threshold A right in front of the UI control, whether the included angle between the normal vector of the mobile device and the normal vector of the UI control is in the range of the threshold B, and whether the distance between the mobile device and the UI control is smaller than the threshold D2 are judged. If the judgment result is yes, step S310 is executed to activate the touch UI event.
In step S310, the touch UI event is activated.
Step S312, whether the mobile device is located in the range of the angle threshold A right in front of the UI control, whether the included angle between the motion direction vector and the normal vector of the UI control is in the range of the threshold C, and whether the distance between the mobile device and the UI control is smaller than the threshold D3 are judged. If the judgment result is yes, step S314 is executed to trigger the UI function.
Step S314, the UI function is triggered.
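Steps S302 to S314 can be read as a per-frame state machine. The following Python sketch strings the checks together using the dot-product thresholds A1, A2, A3 and the distances D1, D2, D3 described in the paragraphs below (the motion test follows the "greater than A3" convention given there); the function shape and names are assumptions for illustration:

```python
import numpy as np


def update_state(state, control, pos, prev_pos, terminal_normal,
                 A1, A2, A3, D1, D2, D3):
    """Per-frame update mirroring steps S302-S314 of fig. 8.
    States advance: "idle" -> "selected" -> "activated" -> "triggered"."""
    rel = pos - control.center                              # relative position vector
    dist = np.linalg.norm(rel)
    in_front = np.dot(rel, control.normal) > A1             # device in the frontal region of the control
    facing = np.dot(control.normal, terminal_normal) < A2   # device roughly looking at the control

    if state == "idle" and in_front and facing and dist < D1:
        return "selected"                                   # S304 holds -> S306
    if state == "selected" and in_front and facing and dist < D2:
        return "activated"                                  # S308 holds -> S310
    if state == "activated":
        motion = pos - prev_pos                             # per-frame motion direction vector
        if in_front and np.dot(motion, control.normal) > A3 and dist < D3:
            return "triggered"                              # S312 holds -> S314
    return state
```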
For the above step S304, triggering the selected state requires the following three conditions to hold at the same time: the mobile device is located within a certain angle range directly in front of the UI control, the mobile device is looking toward the UI control, and the distance between the mobile device and the UI control is within a certain range. As the distance between the mobile device and the UI control decreases, the UI control presents the selected special effect, and the strength of the special effect varies linearly with the distance.
Optionally, judging whether to trigger the selected state requires detecting whether the following three conditions hold at the same time: the dot product of the relative position vector and the UI control's normal vector is greater than A1; the dot product of the UI control's normal vector and the mobile device's normal vector is less than A2; and the length of the relative position vector is less than D1.
The relative position vector is the vector from the center point of the UI control to the position of the mobile device: it starts at the control's center point, ends at the device's position, and points from the former to the latter. The UI control's normal vector is the normal vector passing through the control's center point, pointing in the same direction as the control's front face.
It should be noted that A1, A2, and D1 may be chosen according to different requirements. A1 constrains the region, relative to the UI control, in which the device can trigger the selected state: the larger A1 is, the more the selected state can only be triggered directly in front of the UI control; with a smaller A1, side positions can also trigger it. A2 constrains the direction the device must face when triggering the selected state: the smaller A2 is, the more directly the device must look at the UI control, since otherwise the control might not appear in the device's field of view at the moment of triggering. D1 constrains the distance at which the device can trigger the selected state: the smaller D1 is, the closer the device must be to the UI control.
For the above step S308, the touch UI event is activated. While the UI control is in the selected state, the mobile device continues to move toward it; when the device comes close enough to the UI control, the touch UI event can be activated.
Optionally, judging whether the touch UI event can be activated while the UI control is in the selected state requires detecting whether the following three conditions hold at the same time: the dot product of the relative position vector and the UI control's normal vector is greater than A1; the dot product of the UI control's normal vector and the mobile device's normal vector is less than A2; and the length of the relative position vector is less than D2.
It should be noted that D2 may be chosen according to different requirements. D2 constrains the distance from the UI control at which the device can activate the touch UI event: the smaller D2 is, the closer the device must be to the UI control before the event can be activated.
With the above step S312, the UI function is triggered. When the condition of activating the touch UI event is met, the mobile device continues to move towards the UI control to touch the UI element, and the function of the UI can be triggered once.
Optionally, the implementation manner when the mobile device triggers the scene UI function may be: when the mobile device meets the three conditions for activating the touch UI event, the program starts to detect the motion direction of the mobile device and the distance from the center point of the UI control. And when the dot product of the motion direction vector and the UI control normal vector is larger than A3 and the length of the relative position vector is smaller than D3, the UI function is considered to be triggered.
The motion direction vector refers to a device motion direction calculated by a program according to the position of each frame of the device.
It should be noted that A3 and D3 may be chosen according to different requirements. A3 constrains the direction of motion required for the mobile device to trigger a UI function: the larger A3 is, the more directly toward the UI control the device must move. D3 constrains the trigger distance: the UI function is triggered when the device is less than D3 from the UI control and all the above conditions are met.
Optionally, while the UI control selection condition is satisfied and the mobile device is between D1 and D2, the selection effect of the UI control changes linearly with the length of the relative position vector. For example, when the length of the relative position vector is D1, the UI control's glow effect is 50%; when it is D2, the glow effect is 100%. The glow effect is one optional selection effect of the UI control; it may be replaced with other presentation forms as required.
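A sketch of this linear effect, using the 50%-at-D1 and 100%-at-D2 example above; the function name is illustrative:

```python
def glow_intensity(dist, D1, D2):
    """Selection effect as a linear function of device-to-control distance:
    50% at dist == D1, rising to 100% at dist == D2 (where D2 < D1)."""
    t = (D1 - dist) / (D1 - D2)       # 0 at D1, 1 at D2
    t = max(0.0, min(1.0, t))         # clamp outside the [D2, D1] band
    return 0.5 + 0.5 * t              # glow in [0.5, 1.0]
```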
Optionally, if the selection condition has been satisfied but the touch UI event has not yet been activated and the mobile device moves away from the UI control, the selection effect of the UI control fades out linearly.
Optionally, the presentation logic when the UI-function trigger condition is satisfied is as follows: the UI control generates a trigger feedback prompt, such as phone vibration, a click sound effect, or scaling of the UI control, informing the user that the trigger operation succeeded and that the mobile device can be taken away.
After the UI function has been triggered, the mobile device can move away from the UI control in the opposite direction, completing one full interaction of touch selection in AR.
The invention can realize the following functions:
the UI control can be established in a three-dimensional virtual space, the three-dimensional scene UI control breaks through the limitation of the UI control in the screen of the mobile device, the whole screen of the mobile device can be used for presenting an AR scene, and the reduction of AR experience caused by the fact that more UI buttons are always shielded on the screen is avoided; the mobile equipment is contacted with the UI control in the three-dimensional virtual space to achieve a triggering effect, interaction with the UI control in the three-dimensional virtual space can be rapidly carried out without a screen of the mobile equipment, the operation efficiency is improved, the operation has better fault tolerance, and the operation burden of a user is reduced; through the identification of the spatial distance between the mobile device and the UI and the threshold judgment, the conflict with the response mode of the existing control can be avoided; through the identification of the angle between the mobile device and the UI screen and the threshold judgment, misoperation can be reduced.
According to the invention, the UI control is triggered directly by the mobile device, which shortens the operation path, reduces the operation burden, improves operation efficiency, and improves the user experience. In addition, displaying the UI control in three-dimensional virtual space reduces the number of buttons resident on the screen; because the control no longer needs to reside on the screen of the mobile device, the utilization of the mobile terminal's screen interface is improved.
According to an embodiment of the present invention, an embodiment of an augmented reality based control selection apparatus is further provided. It should be noted that this apparatus may be used to execute the augmented reality based control selection method of the embodiments of the present invention; that is, the method may be executed in the apparatus.
Fig. 9 is a schematic diagram of an augmented reality based control selection apparatus according to an embodiment of the present invention. As shown in fig. 9, the apparatus may include:
a display unit 12 for displaying the target control in the virtual scene; and a selection unit 14 for controlling the target control to be in a selected state when a first predetermined condition is met while the terminal moves towards the target control.
It should be noted that the display unit 12 in this embodiment may be used to execute step S102 in the embodiment of the present invention, and the selection unit 14 may be used to execute step S104. The above modules implement the same examples and application scenarios as their corresponding steps, but are not limited to the disclosure of the above embodiments.
Optionally, satisfying the first predetermined condition may include: the terminal is located within a first predetermined angle range in front of the target control; the included angle between the normal vector of the plane of the terminal and the normal vector of the plane of the target control is within a second predetermined angle range; and the distance between the terminal and the target control is smaller than a first threshold. Here, the normal vector of the plane of the terminal points opposite to the direction the terminal's display screen faces, the normal vector of the plane of the target control points in the direction the front face of the target control faces, and the front face of the target control is the face oriented toward the terminal.
Optionally, the terminal is determined to be within the first predetermined angle range in front of the target control when the dot product of the relative position vector and the normal vector of the plane of the target control is greater than a second threshold, where the relative position vector is the direction vector pointing from the center point of the target control to the position of the terminal. The included angle between the normal vector of the plane of the terminal and the normal vector of the plane of the target control is determined to be within the second predetermined angle range when the dot product of those two normal vectors is smaller than a third threshold. The distance between the terminal and the target control is determined to be smaller than the first threshold when the length of the relative position vector is smaller than the first threshold.
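For illustration, the three sub-checks can be combined as below. The threshold values are assumptions, and the relative position vector is normalized here so that the second-threshold comparison is a pure angular test (the text leaves the normalization unspecified); the third threshold is negative because the two normal vectors point roughly at each other when the screen faces the user and the control faces the terminal.

```python
import numpy as np

FIRST_T = 0.50    # assumed maximum selection distance
SECOND_T = 0.70   # assumed "in front of the control" dot-product bound
THIRD_T = -0.80   # assumed facing bound; opposed normals give a dot near -1

def first_condition_met(terminal_pos, terminal_normal, control_center, control_normal):
    relative = terminal_pos - control_center   # from control center to terminal
    dist = np.linalg.norm(relative)
    if dist == 0.0:
        return False                           # degenerate: terminal at the control center
    in_front = np.dot(relative / dist, control_normal) > SECOND_T
    facing = np.dot(terminal_normal, control_normal) < THIRD_T
    return in_front and facing and dist < FIRST_T
```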
As an alternative embodiment, the apparatus may further include: a marking unit for marking the target control as selected when the target control is in the selected state, and for determining display parameters of the target control according to the distance between the terminal and the target control.
As an alternative embodiment, the apparatus may further include: an activation unit for controlling the target control to be in an activated state when the target control is in the selected state and a second predetermined condition is met.
Optionally, satisfying the second predetermined condition includes: the distance between the terminal and the target control is smaller than a fourth threshold, where the fourth threshold is smaller than the first threshold.
As an alternative embodiment, the apparatus may further include: a first trigger unit for controlling the target control to be in a triggered state when the target control is in the activated state and a third predetermined condition is met.
Optionally, satisfying the third predetermined condition includes: the included angle between the motion direction vector of the terminal and the normal vector of the plane of the target control is within a third predetermined angle range, and the distance between the terminal and the target control is smaller than a fifth threshold, where the motion direction vector of the terminal is the vector pointing from the terminal's position in the previous frame to its position in the current frame, and the fifth threshold is smaller than the fourth threshold.
Optionally, the included angle between the motion direction vector of the terminal and the normal vector of the plane of the target control is determined to be within the third predetermined angle range when the dot product of the motion direction vector of the terminal and the normal vector of the plane of the target control is smaller than a sixth threshold; and the distance between the terminal and the target control is determined to be smaller than the fifth threshold when the length of the relative position vector is smaller than the fifth threshold.
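Taken together, the first, fourth, and fifth thresholds form a nested set of distances (first > fourth > fifth) that drives the selected, activated, and triggered states in sequence. A compressed sketch of that progression, with the angle checks folded into booleans and illustrative distances:

```python
FIRST_T, FOURTH_T, FIFTH_T = 0.50, 0.20, 0.05  # illustrative, in scene units

def control_state(distance, angle_conditions_met, moving_toward_control):
    """Coarse state progression implied by the nested thresholds."""
    if distance >= FIRST_T or not angle_conditions_met:
        return "unselected"
    if distance >= FOURTH_T:
        return "selected"
    if distance >= FIFTH_T or not moving_toward_control:
        return "activated"
    return "triggered"
```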
As an alternative embodiment, the apparatus may further include: a prompting unit for prompting that the target control has been successfully triggered when the target control is in the triggered state.
As an alternative embodiment, the apparatus may further include: a second trigger unit for controlling the target control to be in a triggered state when the target control is in the selected state and a fourth predetermined condition is met.
Optionally, satisfying the fourth predetermined condition includes: the distance between the terminal and the target control is smaller than a seventh threshold, where the seventh threshold is smaller than the first threshold.
As an alternative embodiment, the apparatus may further include: a binding unit for binding the terminal with the target control when the target control is in the selected state; and a motion unit for controlling, by the terminal, the target control to move in the virtual scene.
Optionally, the motion unit may include at least one of: a moving module for moving the target control along a predetermined track following the terminal; a rotating module for rotating the target control following the terminal; and a swinging module for swinging the target control, following the terminal, at the position of the target control.
As an alternative embodiment, the apparatus may further include: an acquisition unit for acquiring, while the terminal controls the target control to move in the virtual scene, a control operation executed on the terminal, where the control operation indicates that the binding relationship between the terminal and the target control should be released; a response unit for releasing the binding relationship between the terminal and the target control in response to the control operation, and acquiring the position to which the target control has currently moved; an updating unit for updating the position of the target control to a first position when the position to which the target control has currently moved is the first position, where the first position is a position the target control is allowed to occupy; and a restoring unit for restoring the target control to its original position when the position to which it has currently moved is a second position, where the second position is a position the target control is not allowed to occupy.
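A minimal sketch of the bind / move / release flow these units implement; the class name, the `allowed` predicate, and the position representation are illustrative assumptions, not the patent's API.

```python
class ControlBinding:
    """Binds a target control to the terminal so the control follows its movement."""

    def __init__(self, control, allowed):
        self.control = control             # object exposing a mutable .position
        self.original = control.position   # remembered for the restore case
        self.allowed = allowed             # predicate: may the control rest here?
        self.bound = True

    def follow(self, terminal_pos):
        # While bound, the control tracks the terminal (the move / rotate / sway
        # variants would transform the pose differently).
        if self.bound:
            self.control.position = terminal_pos

    def release(self):
        # On the release operation: keep the new position if it is allowed,
        # otherwise restore the control to its original position.
        self.bound = False
        if not self.allowed(self.control.position):
            self.control.position = self.original
        return self.control.position
```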
It should be noted that the above units and modules implement the same examples and application scenarios as their corresponding steps, but are not limited to the disclosure of the above embodiments.
Through this apparatus, the target control can be operated without using the terminal's screen, which solves the technical problem in the related art that arranging UI controls inside the terminal's display screen leaves the AR scene incompletely displayed. The terminal is thus guaranteed to display the complete AR scene, and the technical effect of improving the operation efficiency of the target control is achieved.
In order to achieve the above object, according to another aspect of the present invention, an embodiment of the present invention further provides a storage medium, where the storage medium includes a stored program, and when the program runs, the device on which the storage medium is located is controlled to execute the above augmented reality-based control selection method.
In order to achieve the above object, according to another aspect of the present invention, an embodiment of the present invention further provides a processor, where the processor is configured to run a program, and the program, when running, executes the above augmented reality-based control selection method.
In order to achieve the above object, according to another aspect of the present invention, an embodiment of the present invention further provides a terminal, including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform the augmented reality based control selection method described above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be regarded as within the protection scope of the present invention.

Claims (17)

1. A control selection method based on augmented reality is characterized by comprising the following steps:
displaying a target control in a virtual scene;
in the process that the terminal moves towards the target control, the target control is controlled to be in a selected state under the condition that a first preset condition is met;
when the target control is in the selected state, the method further comprises: marking the target control as selected, and determining display parameters of the target control according to the distance between the terminal and the target control;
the meeting of the first predetermined condition includes: the terminal is located in a first preset angle range in front of the target control, an included angle between a normal vector of a plane where the terminal is located and a normal vector of a plane where the target control is located is in a second preset angle range, and a distance between the terminal and the target control is smaller than a first threshold value, wherein the direction of the normal vector of the plane where the terminal is located is a reverse direction faced by a display screen of the terminal, the direction of the normal vector of the plane where the target control is located is a direction faced by a front face of the target control, and the front face of the target control is a face facing the terminal.
2. The method of claim 1,
determining that the terminal is located in the first predetermined angle range in front of the target control when the dot product of the relative position vector and the normal vector of the plane where the target control is located is greater than a second threshold, wherein the relative position vector is a direction vector pointing from the center point of the target control to the position of the terminal;
determining that the included angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is within the second predetermined angle range when the dot product of the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is smaller than a third threshold; and
determining that the distance between the terminal and the target control is smaller than the first threshold when the length of the relative position vector is smaller than the first threshold.
3. The method of claim 1, wherein in the selected state of the target control, the method further comprises:
and controlling the target control to be in an activated state under the condition that a second preset condition is met.
4. The method of claim 3, wherein the meeting a second predetermined condition comprises:
the distance between the terminal and the target control is smaller than a fourth threshold, wherein the fourth threshold is smaller than the first threshold.
5. The method of claim 4, wherein when the target control is in the activated state, the method further comprises:
and controlling the target control to be in a trigger state under the condition that a third preset condition is met.
6. The method of claim 5, wherein the satisfaction of a third predetermined condition comprises:
an included angle between a motion direction vector of the terminal and a normal vector of a plane where the target control is located is within a third preset angle range, and a distance between the terminal and the target control is smaller than a fifth threshold, wherein the motion direction vector of the terminal is a vector pointing to a position where the terminal is located in a current frame from a position where the terminal is located in a previous frame, and the fifth threshold is smaller than the fourth threshold.
7. The method of claim 6,
determining that the included angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is within the third predetermined angle range when the dot product of the motion direction vector of the terminal and the normal vector of the plane where the target control is located is smaller than a sixth threshold; and
determining that the distance between the terminal and the target control is smaller than the fifth threshold when the length of the relative position vector is smaller than the fifth threshold, wherein the relative position vector is a direction vector pointing from the center point of the target control to the position of the terminal.
8. The method of claim 5, wherein when the target control is in the triggered state, the method further comprises:
and prompting that the target control has been successfully triggered.
9. The method of claim 1, wherein while the target control is in the selected state, the method further comprises:
and controlling the target control to be in a trigger state under the condition that a fourth preset condition is met.
10. The method of claim 9, wherein the satisfaction of the fourth predetermined condition comprises:
the distance between the terminal and the target control is less than a seventh threshold, wherein the seventh threshold is less than the first threshold.
11. The method of any of claims 1-10, wherein in the selected state of the target control, the method further comprises:
binding the terminal with the target control;
and controlling the target control to move in the virtual scene by using the terminal.
12. The method according to claim 11, wherein the controlling the target control to move in the virtual scene by the terminal comprises at least one of:
the target control moves along a preset track along with the terminal;
the target control rotates along with the terminal;
and the target control swings along with the terminal at the position of the target control.
13. The method according to claim 11, wherein during the controlling, by the terminal, the target control to move in the virtual scene, the method further comprises:
acquiring a control operation executed on the terminal, wherein the control operation is used for indicating that the binding relationship between the terminal and the target control is released;
responding to the control operation to remove the binding relationship between the terminal and the target control, and acquiring the current moving position of the target control;
under the condition that the position to which the target control is currently moved is a first position, updating the position of the target control to the first position, wherein the first position is a position where the target control is allowed to be located;
and under the condition that the position to which the target control is moved currently is a second position, restoring the position of the target control to the original position of the target control, wherein the second position is a position where the target control is not allowed to be located.
14. An augmented reality based control selection apparatus, comprising:
the display unit is used for displaying the target control in the virtual scene;
a selection unit for controlling the target control to be in a selected state when a first predetermined condition is met in the process that the terminal moves towards the target control, wherein the first predetermined condition indicates that the target control is triggered to be selected;
wherein, when the target control is in the selected state, the apparatus is further configured to: mark the target control as selected, and determine display parameters of the target control according to the distance between the terminal and the target control;
the meeting of the first predetermined condition includes: the terminal is located in a first preset angle range in front of the target control, an included angle between a normal vector of a plane where the terminal is located and a normal vector of a plane where the target control is located is in a second preset angle range, and a distance between the terminal and the target control is smaller than a first threshold value, wherein the direction of the normal vector of the plane where the terminal is located is a reverse direction faced by a display screen of the terminal, the direction of the normal vector of the plane where the target control is located is a direction faced by a front face of the target control, and the front face of the target control is a face facing the terminal.
15. A storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the storage medium is located to perform the method of any one of claims 1 to 13.
16. A processor, configured to run a program, wherein the program when running performs the method of any one of claims 1 to 13.
17. A terminal comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is arranged to execute the computer program to perform the method of any of claims 1 to 13.
CN201810187008.1A 2018-03-07 2018-03-07 Control selection method and device based on augmented reality Active CN108415570B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810187008.1A CN108415570B (en) 2018-03-07 2018-03-07 Control selection method and device based on augmented reality

Publications (2)

Publication Number Publication Date
CN108415570A CN108415570A (en) 2018-08-17
CN108415570B true CN108415570B (en) 2021-08-24

Family

ID=63130472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810187008.1A Active CN108415570B (en) 2018-03-07 2018-03-07 Control selection method and device based on augmented reality

Country Status (1)

Country Link
CN (1) CN108415570B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110478901B (en) * 2019-08-19 2023-09-22 Oppo广东移动通信有限公司 Interaction method and system based on augmented reality equipment
CN112068703B (en) 2020-09-07 2021-11-16 北京字节跳动网络技术有限公司 Target object control method and device, electronic device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436663A (en) * 2010-08-12 2012-05-02 株式会社泛泰 User equipment, server, and method for selectively filtering augmented reality
EP2802962A1 (en) * 2012-01-12 2014-11-19 Qualcomm Incorporated Augmented reality with sound and geometric analysis
CN106843498A (en) * 2017-02-24 2017-06-13 网易(杭州)网络有限公司 Dynamic interface exchange method and device based on virtual reality
CN107092492A (en) * 2017-04-27 2017-08-25 广州四三九九信息科技有限公司 The control method and device of virtual objects
CN107430442A (en) * 2015-05-26 2017-12-01 谷歌公司 For entering and exiting the multi-dimensional graphic method of the application in immersion media and activity
KR101806864B1 (en) * 2016-10-05 2017-12-08 연세대학교 산학협력단 Apparatus for controlling 3d object in augmmented reality environment and method thereof

Also Published As

Publication number Publication date
CN108415570A (en) 2018-08-17

Similar Documents

Publication Publication Date Title
JP6529659B2 (en) Information processing method, terminal and computer storage medium
CA2985867C (en) Information processing method, terminal, and computer storage medium
CN108355354B (en) Information processing method, device, terminal and storage medium
US10500504B2 (en) Shooting game control method and apparatus, storage medium, processor, and terminal
CN109557998B (en) Information interaction method and device, storage medium and electronic device
CN107551537B (en) Method and device for controlling virtual character in game, storage medium and electronic equipment
KR102045665B1 (en) Icon display method and device
CN107132981B (en) Display control method and device, storage medium, the electronic equipment of game picture
CN108273265A (en) The display methods and device of virtual objects
CN112370780B (en) Display method and device of virtual control, storage medium and electronic equipment
CN106873886B (en) Control method and device for stereoscopic display and electronic equipment
CN109731329B (en) Method and device for determining placement position of virtual component in game
CN111701226A (en) Control method, device and equipment for control in graphical user interface and storage medium
CN108415570B (en) Control selection method and device based on augmented reality
CN112363658B (en) Interaction method and device for video call
CN108635850B (en) Information processing method, device and storage medium
CN113900570B (en) Game control method, device, equipment and storage medium
US20220152505A1 (en) Virtual object control method and apparatus, storage medium, and electronic device
CN113262476B (en) Position adjusting method and device of operation control, terminal and storage medium
CN111766989B (en) Interface switching method and device
CN112987924A (en) Method, apparatus, device and storage medium for device interaction
CN114053706A (en) Game interface visual angle processing method, device, equipment and storage medium
CN113680064A (en) Method, device, equipment and storage medium for controlling virtual character in game
CN108920051B (en) Virtual resource moving method and device
CN107861664B (en) Display control method and device, storage medium and processor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant