CN114564106B - Method and device for determining interaction indication line, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114564106B
CN114564106B (application CN202210176746.2A)
Authority
CN
China
Prior art keywords
target
determining
indication line
interaction
indicator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210176746.2A
Other languages
Chinese (zh)
Other versions
CN114564106A (en)
Inventor
张元煌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202210176746.2A priority Critical patent/CN114564106B/en
Publication of CN114564106A publication Critical patent/CN114564106A/en
Application granted granted Critical
Publication of CN114564106B publication Critical patent/CN114564106B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04802 3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user

Abstract

The embodiments of the disclosure provide a method, an apparatus, an electronic device and a storage medium for determining an interaction indication line, wherein the method includes: acquiring target pointing information of a direction indicator; determining, according to the target pointing information and viewpoint position information of a viewpoint controller, an interaction indication line to be corrected and an object to be controlled that intersects the interaction indication line to be corrected; and determining a target interaction indication line according to indicator position information of the direction indicator and the interaction indication line to be corrected, so as to control the object to be controlled based on the target interaction indication line. With the technical solution provided by the embodiments of the disclosure, a user can select and control, through the handle, any object observable within the sight range, which improves both the interest of the virtual scene and the user experience.

Description

Method and device for determining interaction indication line, electronic equipment and storage medium
Technical Field
The embodiment of the disclosure relates to the technical field of data processing, in particular to a method and a device for determining an interaction indication line, electronic equipment and a storage medium.
Background
With the development of virtual technology, Extended Reality (XR) devices enable a user to interact with corresponding virtual objects. To enable interaction with distant objects, an interaction indication line is required to determine the object the user intends to interact with. The interaction indication line functions similarly to a conventional mouse pointer.
In practice, a user wearing a head-mounted indicator can see objects within a certain visual range, yet some of them cannot be reached directly by rays emitted from the hand-held controller. The main reasons are that the head indicator itself cannot project a corresponding ray, and a ray emitted from the hand is easily blocked by other objects, so the target object the user wants to select cannot be reached by either a straight line or a curve.
Disclosure of Invention
The embodiment of the disclosure provides a method, a device, electronic equipment and a storage medium for determining an interaction indication line, which enable a user to select and control any object which can be observed in a sight range through a handle, and improve the interestingness of a virtual scene and the use experience of the user.
In a first aspect, an embodiment of the present disclosure provides a method for determining an interaction indicator, where the method includes:
Acquiring target pointing information of a direction indicator;
determining, according to the target pointing information and viewpoint position information of a viewpoint controller, an interaction indication line to be corrected and an object to be controlled that intersects the interaction indication line to be corrected;
and determining a target interaction indication line according to indicator position information of the direction indicator and the interaction indication line to be corrected, so as to control the object to be controlled based on the target interaction indication line.
In a second aspect, an embodiment of the present disclosure further provides a device for determining an interaction indication line, where the device includes:
the pointing information determining module is used for acquiring target pointing information of the direction indicator;
the to-be-corrected interaction indication line determining module is used for determining, according to the target pointing information and the viewpoint position information of the viewpoint controller, an interaction indication line to be corrected and an object to be controlled that intersects the interaction indication line to be corrected;
and the target interaction indication line determining module is used for determining a target interaction indication line according to the indicator position information of the direction indicator and the interaction indication line to be corrected, so as to control the object to be controlled based on the target interaction indication line.
In a third aspect, embodiments of the present disclosure further provide an electronic device, including:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of determining an interaction indication line as described in any of the embodiments of the present disclosure.
In a fourth aspect, the disclosed embodiments also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are for performing a method of determining an interaction indication line according to any of the disclosed embodiments.
According to the technical solution, the target pointing information of the direction indicator is acquired, that is, the pointing direction of the handle held in the user's hand is determined; according to the target pointing information and the viewpoint position information of the viewpoint controller, the interaction indication line to be corrected and the object to be controlled that intersects it are determined, that is, the user's line of sight and the object the user wishes to control are determined; further, the target interaction indication line is determined according to the indicator position information of the direction indicator and the interaction indication line to be corrected, so that the object to be controlled is controlled based on the target interaction indication line. In this way, the user can select and control, through the handle, any object observable within the sight range, the problem that some objects cannot be selected because they are occluded from the user is avoided, the interaction between the user and the various objects in the scene is enriched, and the interest of the virtual scene and the user experience are improved.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
Fig. 1 is a flowchart illustrating a method for determining an interaction indication line according to an embodiment of the disclosure;
FIG. 2 is a schematic diagram of selecting and controlling objects within a scene based on an XR device as provided in accordance with an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of determining a target connection point on an interaction indication line to be corrected according to one embodiment of the disclosure;
FIG. 4 is a schematic diagram of a Bezier curve according to one embodiment of the present disclosure;
fig. 5 is a block diagram of a device for determining an interaction indication line according to a second embodiment of the disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to a third embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that the modifiers "a", "an" and "a plurality of" in this disclosure are illustrative rather than restrictive, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
Example 1
Fig. 1 is a flowchart of a method for determining an interaction indication line according to an embodiment of the present disclosure. The method may be performed by a device for determining an interaction indication line, which may be implemented in software and/or hardware; the hardware may be an electronic device such as a mobile terminal, a PC or a server. The device enables a user to select an object observed in a scene. A scenario involving image presentation is usually implemented by cooperation of a client and a server, so the method provided by this embodiment may be executed by the server, by the client, or by the two in cooperation.
As shown in fig. 1, the method of the present embodiment includes:
s110, acquiring target pointing information of the direction indicator.
In this embodiment, a user may use an XR device to view, select and control a variety of objects within 3D space. Specifically, Extended Reality refers to an environment generated by computer technology and wearable devices that combines the real and the virtual and supports human-machine interaction; it is a general term covering multiple forms such as Augmented Reality (AR), Virtual Reality (VR) and Mixed Reality (MR). By fusing these three visual interaction technologies, a seamlessly transitioned "immersive" experience between the virtual world and the real world is achieved.
Correspondingly, an XR device is a device related to extended reality technology, such as a handle indicator. In particular, the handle indicator may be a hand-held device developed on the basis of wireless communication technology; once activated, it can be associated with a particular network and with other types of XR devices, and by pressing an associated button on the handle indicator, a signal for selecting or controlling an object in the extended reality scene can be generated. In practice, a user can generate a ray-shaped control signal for an object in the scene through the handle indicator; handle indicators are therefore further divided, by the type of ray generated, into straight-line handle indicators, which emit a straight ray, and Bezier-curve handle indicators, which emit a curved ray.
Because the ray emitted by the handle indicator has a direction, in this embodiment the handle held by the user serves as the direction indicator, and the target pointing information is the pointing information of the ray-emitting end of the handle. For example, when the user holds a straight-line handle indicator and aims it due north horizontally, the system can determine target pointing information corresponding to the horizontal due-north direction.
In the present embodiment, the target pointing information of the direction indicator may be determined based on a gyroscope provided in the direction indicator.
A gyroscope is an angular-motion detection device that uses the moment of momentum of a high-speed rotor, mounted so as to rotate about one or two axes orthogonal to the spin axis, to sense rotation relative to inertial space. It can be understood that, with a gyroscope built into the direction indicator, not only the orientation of the direction indicator but also the direction of its ray-emitting end can be accurately determined, so the system can acquire the target pointing information corresponding to the direction indicator.
Taking fig. 2 as an example, when the user points the held handle indicator at a certain object in the virtual scene, the direction of the arrow in the figure can be determined in real time, based on the gyroscope built into the handle indicator, as the handle's pointing direction, that is, the target pointing information of the handle at the current moment. It will be appreciated that integrating a gyroscope, whose directionality is unaffected by magnetic fields, into the direction indicator allows the target pointing information to be determined with greater accuracy and efficiency.
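By way of non-limiting illustration, the conversion from the controller's gyroscope-fused orientation to target pointing information can be sketched as follows. The disclosure does not specify a representation, so this sketch assumes the orientation is reported as a unit quaternion (w, x, y, z) and that the handle's ray-emitting end points along the local (0, 0, -1) axis:

```python
import math

def pointing_from_quaternion(qw, qx, qy, qz):
    """Rotate the assumed local forward axis (0, 0, -1) by the controller's
    unit orientation quaternion to obtain the world-space pointing direction
    (the handle's target pointing information)."""
    # Expansion of q * (0, 0, -1) * q_conjugate.
    x = -(2.0 * (qx * qz + qw * qy))
    y = -(2.0 * (qy * qz - qw * qx))
    z = -(1.0 - 2.0 * (qx * qx + qy * qy))
    norm = math.sqrt(x * x + y * y + z * z)
    return (x / norm, y / norm, z / norm)
```

With the identity quaternion the handle points straight ahead along (0, 0, -1); a 90-degree yaw about the vertical axis turns the pointing direction to (-1, 0, 0).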
S120, determining, according to the target pointing information and the viewpoint position information of the viewpoint controller, an interaction indication line to be corrected and an object to be controlled that intersects the interaction indication line to be corrected.
Many types of XR device include, in addition to the handle indicator, a head-mounted indicator. On this basis, in the present embodiment the viewpoint controller may be a head indicator associated with the direction indicator. In particular, when the head indicator is worn by the user, at least the objects within the extended reality scene can be observed.
Compared with conventional devices such as VR headsets and AR smart glasses, the viewpoint controller can determine the user's viewpoint position information while providing a certain visual range. The viewpoint position information refers to the user's observation point; from the current observation point, at least one object in the extended reality scene can be seen. As shown in fig. 2, after the user puts on and starts the head indicator, object A and object B in the extended reality scene can be observed from the current observation point, and the head indicator can determine the corresponding viewpoint position information according to the current posture of the user's head.
In this embodiment, after the target pointing information of the direction indicator and the viewpoint position information of the viewpoint controller are determined, the interaction indication line to be corrected may be determined. Specifically, the interaction indication line to be corrected is a line constructed by taking the viewpoint position as a starting point and extending along the direction corresponding to the target pointing information. It can thus be understood that constructing the interaction indication line to be corrected amounts to emitting a ray from the user's viewpoint along the pointing direction of the handle.
Further, when the interaction indication line to be corrected keeps extending along the direction corresponding to the target pointing information and intersects an object in the extended reality space, the intersected object may be determined as the object to be selected. It will be appreciated that the object to be selected is an object the user can see in the current field of view, with the line of sight between the user and the object aligned with the handle's pointing direction.
Since in this embodiment the user may select and control any object in the extended reality scene through the XR device, the object to be selected is also the object to be controlled. Specifically, the intersection point of the object and the interaction indication line to be corrected can be used as the control point of the object to be controlled, so that the user can control the object based on this control point.
Continuing with fig. 2, once the viewpoint position information and the handle pointing direction are determined, a straight line can be constructed starting from the head-indicator viewpoint and extending along the pointing direction of the handle. When the straight line intersects object A, object A is the object to be controlled, the intersection point (the hit point) is the control point, and the line segment between the viewpoint and the control point is the resulting interaction indication line to be corrected.
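Constructing the interaction indication line to be corrected is, in effect, a ray cast from the viewpoint along the handle direction. A minimal sketch, assuming for illustration that each scene object is approximated by a bounding sphere (the disclosure does not prescribe a particular intersection method):

```python
def ray_sphere_hit(origin, direction, center, radius):
    """Cast a ray from `origin` (the viewpoint) along the normalized
    `direction` (the handle pointing) and return the nearest intersection
    point with a sphere-approximated object, or None on a miss.
    The intersection point corresponds to the control point (hit point)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    # Quadratic t^2 + b*t + c = 0 for a unit direction vector.
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # the ray misses this object entirely
    t = (-b - disc ** 0.5) / 2.0
    if t < 0.0:
        return None  # the object lies behind the viewpoint
    return tuple(origin[i] + t * direction[i] for i in range(3))
```

The segment from the viewpoint to the returned hit point is then the interaction indication line to be corrected, and its length is the distance used in the following steps.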
S130, determining a target interaction indication line according to the indicator position information of the direction indicator and the interaction indication line to be corrected, so as to control the object to be controlled based on the target interaction indication line.
The target interaction indication line is the ray finally emitted by the direction indicator for controlling the object to be controlled; it is obtained by processing the interaction indication line to be corrected. The process of determining it is described in detail below with reference to figs. 3 and 4.
In this embodiment, the target interaction indication line contains only part of the interaction indication line to be corrected, while its other part is a connecting line between the ray-emitting end of the direction indicator and the interaction indication line to be corrected. In the process of determining the target interaction indication line, the candidate connection points on the interaction indication line to be corrected therefore need to be determined.
Specifically, a connection-point adjustment range is determined according to the length information of the interaction indication line to be corrected and a preset adjustment parameter; within this adjustment range, the candidate connection points are determined one by one at a preset step, starting from the viewpoint position.
The length information is the actual length of the interaction indication line to be corrected, which can be understood as the distance between the viewpoint of the viewpoint controller and the object to be controlled; the connection point is the point where the ray finally emitted by the direction indicator joins the interaction indication line to be corrected. Correspondingly, the adjustment parameter determines on which part of the interaction indication line to be corrected the connection point may lie.
Taking fig. 3 as an example, when the length information of the interaction indication line to be corrected between the viewpoint and object A is determined and the preset adjustment parameter is 0.5, the connection point between the ray finally emitted by the handle and the interaction indication line to be corrected lies in the first half of the line, that is, the part between the viewpoint and the midpoint. Further, with a step of 0.1, a number of candidate connection points can be determined one by one along the line, starting from the current viewpoint of the viewpoint controller. Those skilled in the art will appreciate that the adjustment parameter may be chosen from the range (0, 1) according to actual needs; the embodiments of the present disclosure place no specific limitation here. Likewise, the step may be set according to the actual length of the interaction indication line to be corrected (for example, when the line is 1 m long, the step may be set to 0.1 m); when the line is longer, the step may be increased adaptively, that is, fewer candidate connection points are determined, which reduces waste of system computing resources.
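The determination of candidate connection points described above can be sketched as follows; the function and parameter names are illustrative, and treating the step as a world-space distance follows the 0.5 / 0.1 m example:

```python
import math

def candidate_connection_points(viewpoint, hit_point, adjust_param=0.5, step=0.1):
    """List the candidate connection points on the interaction indication
    line to be corrected: starting at the viewpoint, spaced `step` world
    units apart, and confined to the first `adjust_param` fraction of the
    line's length (the connection-point adjustment range)."""
    length = math.dist(viewpoint, hit_point)
    direction = tuple((hit_point[i] - viewpoint[i]) / length for i in range(3))
    n = int(adjust_param * length / step + 1e-9)  # steps that fit in the range
    return [tuple(viewpoint[i] + k * step * direction[i] for i in range(3))
            for k in range(n + 1)]
```

For a 1 m line with adjustment parameter 0.5 and step 0.1 m this yields six candidates, from the viewpoint to the line's midpoint, matching the example above.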
Further, a target connection point is determined according to the indicator position information and the candidate connection points. The target connection point is a point chosen from the candidates; it is also the point where the ray finally emitted by the direction indicator joins the interaction indication line to be corrected.
Specifically, following the order of the candidate connection points, it is determined which connecting line between the indicator position and a candidate first intersects an obstacle; if such a candidate exists, the candidate immediately preceding it in the order is taken as the target connection point. An obstacle is an object between the user and the object to be controlled; because of its occlusion, the direction indicator in the user's hand cannot control the object to be controlled by emitting a straight ray.
Continuing with fig. 3, after the candidate connection points are determined on the interaction indication line to be corrected, the viewpoint (the first candidate) is connected to the ray-emitting end of the direction indicator, and it is judged whether this connecting line intersects object B, the obstacle blocking the object to be controlled. As the figure shows, the first connecting line does not intersect object B; the ray-emitting end of the direction indicator is then connected to the second candidate along the direction corresponding to the target pointing information, the same judgment is made, and so on. When the fifth connecting line is constructed, it is found to intersect object B; at this point, to construct the target interaction indication line, the candidate immediately preceding the one on that connecting line (that is, the fourth point from the viewpoint in fig. 3) is selected as the target connection point.
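This selection procedure can be sketched as a single pass over the candidates. Here `blocked` is an assumed stand-in for the scene's segment-occlusion query (the test against object B), which the disclosure leaves unspecified:

```python
def target_connection_point(indicator_pos, candidates, blocked):
    """Walk the candidate connection points in order from the viewpoint.
    Return the last candidate before the first one whose straight
    connection from the indicator intersects an obstacle; if no connection
    is blocked, the final candidate is used. `blocked(a, b)` must report
    whether the segment a-b intersects any obstacle."""
    previous = None
    for point in candidates:
        if blocked(indicator_pos, point):
            return previous  # None would mean even the first link is blocked
        previous = point
    return previous
```

In the fig. 3 scenario the fifth link is the first blocked one, so the fourth candidate is returned as the target connection point.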
In this embodiment, after the target connection point is determined, the target interaction indication line may be determined according to the indicator position information, the target connection point and the interaction indication line to be corrected. Specifically, a Bezier curve is determined according to the indicator position information and the target connection point; the target interaction indication line is then determined from the Bezier curve and the interaction indication line to be corrected.
A Bezier curve (also written Bézier curve) is a mathematical curve used in two-dimensional graphics applications. It consists of segments and nodes; because the nodes are draggable control points, Bezier curves can improve the precision of the curves generated in common vector-graphics or bitmap software.
In this embodiment, the Bezier curve may be determined based on the position of the ray-emitting end of the direction indicator and the target connection point determined on the interaction indication line to be corrected; the resulting Bezier curve is one part of the final target interaction indication line. It should be noted that the target connection point is also the junction between the Bezier curve and the segment of the interaction indication line to be corrected that lies beyond the target connection point.
Taking fig. 3 as an example, after the fourth candidate from the viewpoint is determined to be the target connection point, a Bezier curve can be constructed based on the current position of the direction indicator and the target connection point; the target connection point joins the Bezier curve (the guidance part) to the part of the interaction indication line to be corrected between the target connection point and the control point.
In practice, to improve the accuracy of the determined Bezier curve, the degree of bending of the curve in different directions needs to be determined. Specifically, a first intensity corresponding to the target pointing information and a second intensity in the direction perpendicular to the target pointing information are determined; the Bezier curve is then determined based on the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information and the perpendicular pointing information perpendicular to the target pointing information.
The first intensity reflects the degree of bending of the Bezier curve in the target pointing direction, and the second intensity reflects its degree of bending in the direction perpendicular to the target pointing direction. It can be understood that the first and second intensities also represent the weights with which the different directions influence the constructed Bezier curve. Taking fig. 4 as an example, the direction of the line between the handle controller and StartOutHandle is the direction perpendicular to the target pointing direction, that is, the direction of the out point at which the Bezier curve leaves the direction indicator; correspondingly, the direction of the line between the target connection point, ConnectPoint, and EndInHandle is the target pointing direction, that is, the direction of the in point at which the Bezier curve meets the to-be-corrected interaction indication line through the target connection point. Once the two directions are determined, the first and second intensities can be determined based on preset parameters. Specifically, the intensity of the out point at which the Bezier curve leaves the direction indicator is determined according to the curveBeginIntensity parameter, i.e., the degree of bending in the direction perpendicular to the target pointing direction, and the intensity of the in point at which the Bezier curve reaches the target connection point is determined according to the curveEndIntensity parameter, i.e., the degree of bending in the target pointing direction. It should be noted that, in practical applications, as long as the preset values of these two parameters do not exceed 1, the Bezier curve generally does not become distorted.
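The handle construction described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the use of NumPy, and the way the two intensities scale the handles against the endpoint gap are all assumptions.

```python
import numpy as np

def bezier_control_points(indicator_pos, target_point, target_dir, perp_dir,
                          curve_begin_intensity=0.8, curve_end_intensity=0.8):
    # Sketch: the out handle leaves the direction indicator along the
    # perpendicular direction, scaled by the begin intensity; the in handle
    # approaches the target connection point along the target pointing
    # direction, scaled by the end intensity. All names are illustrative.
    p0 = np.asarray(indicator_pos, dtype=float)   # out point (ray origin)
    p3 = np.asarray(target_point, dtype=float)    # in point (target connection point)
    span = np.linalg.norm(p3 - p0)                # scale handles to the endpoint gap
    p1 = p0 + curve_begin_intensity * span * np.asarray(perp_dir, dtype=float)
    p2 = p3 - curve_end_intensity * span * np.asarray(target_dir, dtype=float)
    return p0, p1, p2, p3
```

Keeping both intensity parameters at or below 1 bounds each handle by the distance between the endpoints, which matches the observation above that preset values not exceeding 1 generally avoid distortion.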
In practical applications, the Bezier curve is drawn based on a plurality of points. Therefore, optionally, the Bezier curve to be drawn is determined according to the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information, and the perpendicular pointing information; a plurality of discrete points corresponding to the Bezier curve to be drawn are determined according to preset drawing parameters; and the Bezier curve is determined based on the plurality of discrete points.
Continuing with fig. 4, once the degrees of bending of the Bezier curve in the two directions are determined based on the first intensity and the second intensity, respectively, and the target connection point is determined, the Bezier curve to be drawn can be determined. Further, a preset drawing parameter guidanceForwardOffset is obtained; it can be understood that this drawing parameter is used at least to determine a plurality of discrete points for drawing the Bezier curve in a two-dimensional plane. Meanwhile, the discrete points can be presented to the user via the viewpoint controller; after the user confirms that these discrete points are adopted, a confirmation instruction can be issued to the system, which connects the discrete points in sequence based on the instruction, thereby obtaining the Bezier curve between the ray segment emitted by the direction indicator and the target connection point. It should be noted that the Bezier curve constructed from the determined discrete points usually extends a certain distance beyond the to-be-corrected interaction indication line; this distance can serve as the penetration distance, avoiding errors in the ray judgment threshold that would arise from a penetration distance of zero.
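The discrete-point step can be illustrated with a plain cubic Bezier evaluation. The fixed point count here stands in for whatever count the preset drawing parameter implies; the function is an assumption for illustration, not the patent's code.

```python
import numpy as np

def sample_bezier(p0, p1, p2, p3, num_points=16):
    # Evaluate the cubic Bezier at evenly spaced parameter values t in [0, 1];
    # connecting the returned points in order draws the curve.
    t = np.linspace(0.0, 1.0, num_points)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)
```

The inputs are the four control points as arrays; the first and last returned points coincide with the out point and the target connection point.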
In this embodiment, in order to enable the user to control the object to be controlled in the augmented reality scene through the direction indicator, after the Bezier curve that forms part of the target interaction indication line is determined, the line between the target connection point on the to-be-corrected interaction indication line and the control point of the object to be controlled is taken as the interaction indication line to be spliced; the target interaction indication line is then determined based on the Bezier curve and the interaction indication line to be spliced.
Taking fig. 2 as an example, after the Bezier curve serving as the guidance portion is determined, the portion between HitPoint and the connection point can be connected, thereby obtaining the interaction indication line to be spliced. Further, after the ray emitted by the direction indicator travels along the Bezier curve to the target connection point, it continues along the interaction indication line to be spliced; when the ray intersects object A, i.e., the object to be controlled, the path of the ray is the target interaction indication line, which can be understood as the result of splicing the Bezier curve and the interaction indication line to be spliced. Meanwhile, the user can select and control object A based on the target interaction indication line. For example, after object A is selected, the system can display its information to the user through a display device associated with the viewpoint controller; after the user presses the corresponding button on the direction indicator, a grabbing signal corresponding to the ray on the target interaction indication line can be generated, thereby achieving the effect of grabbing object A in the augmented reality space.
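The splicing step above amounts to polyline concatenation. The sketch below is illustrative only; the function and argument names are assumptions.

```python
def splice_indication_line(bezier_points, target_point, control_point):
    # Concatenate the sampled Bezier (guidance part) with the straight
    # segment from the target connection point to the control point of the
    # object to be controlled, without duplicating the shared junction.
    path = [tuple(p) for p in bezier_points]
    if path and path[-1] == tuple(target_point):
        path.append(tuple(control_point))
    else:
        path.extend([tuple(target_point), tuple(control_point)])
    return path
```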
It should be noted that, during control of the movement of the object to be controlled, the step of updating the target interaction indication line according to the obtained target pointing information of the direction indicator and the viewpoint position information of the viewpoint controller is optionally performed repeatedly.
Specifically, since the hand in which the user holds the handle is in continuous motion, the system can, as the user moves, continuously acquire the target pointing information of the direction indicator according to the scheme of this embodiment and, combined with the viewpoint position information of the viewpoint controller, continuously determine the object the user wishes to control in the augmented reality scene. Further, for the determined object, the currently generated target interaction indication line is updated, and a control signal corresponding to the ray on the updated target interaction indication line is generated according to the user's trigger operation, thereby realizing continuous control of one object, or continuously switching between objects in the scene according to the user's actual situation.
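The repeated update can be sketched as a per-frame loop over stubbed inputs; all three callables below are hypothetical stand-ins for the tracking and line-building logic, which the patent does not specify.

```python
def update_indication_lines(get_pointing, get_viewpoint, build_line, frames):
    # Each frame, re-read the indicator's pointing and the viewpoint
    # position, then rebuild the target interaction indication line.
    lines = []
    for _ in range(frames):
        lines.append(build_line(get_pointing(), get_viewpoint()))
    return lines
```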
According to the technical solution of this embodiment, the target pointing information of the direction indicator is obtained, i.e., the pointing direction of the handle held in the user's hand is determined; a to-be-corrected interaction indication line and an object to be controlled intersecting it are determined according to the target pointing information and the viewpoint position information of the viewpoint controller, i.e., the user's line of sight and the object the user wishes to control are determined; further, a target interaction indication line is determined according to the indicator position information of the direction indicator and the to-be-corrected interaction indication line, so that the object to be controlled is controlled based on the target interaction indication line. The user can thus select and control, through the handle, any object observable within the line of sight, avoiding the problem that some objects cannot be selected due to occlusion, enriching the interaction between the user and the various objects in the scene, and improving both the interest of the virtual scene and the user experience.
Example two
Fig. 5 is a block diagram of a device for determining an interaction indication line according to a second embodiment of the present disclosure. The device can execute the method for determining an interaction indication line according to any embodiment of the present disclosure, and has the functional modules and beneficial effects corresponding to the method. As shown in fig. 5, the device specifically includes: a pointing information determining module 210, a to-be-corrected interaction indication line determining module 220, and a target interaction indication line determining module 230.
The pointing information determining module 210 is configured to obtain target pointing information of the direction indicator.
The to-be-corrected interaction indication line determining module 220 is configured to determine a to-be-corrected interaction indication line and an object to be controlled intersecting with the to-be-corrected interaction indication line according to the target pointing information and the viewpoint position information of the viewpoint controller.
The target interaction indication line determining module 230 is configured to determine a target interaction indication line according to the indicator position information of the direction indicator and the to-be-corrected interaction indication line, so as to control the object to be controlled based on the target interaction indication line.
Optionally, the pointing information determining module 210 is further configured to determine target pointing information of the direction indicator based on a gyroscope provided in the direction indicator.
Based on the above technical solutions, the to-be-corrected interaction indication line determining module 220 includes a to-be-corrected interaction indication line determining unit and an object-to-be-controlled determining unit.
The to-be-corrected interaction indication line determining unit is configured to determine, taking the viewpoint position information as the starting point, the to-be-corrected interaction indication line corresponding to the target pointing information.
The object-to-be-controlled determining unit is configured to take the to-be-selected object intersecting the to-be-corrected interaction indication line as the object to be controlled, and to take the intersection point as the control point of the object to be controlled, so as to control the object to be controlled based on the control point.
Based on the above technical solutions, the target interaction indication line determining module 230 includes a to-be-selected connection point determining unit, a target connection point determining unit, and a target interaction indication line determining unit.
The to-be-selected connection point determining unit is configured to determine each to-be-selected connection point on the to-be-corrected interaction indication line.
The target connection point determining unit is configured to determine a target connection point according to the indicator position information and each to-be-selected connection point.
The target interaction indication line determining unit is configured to determine the target interaction indication line according to the indicator position information, the target connection point, and the to-be-corrected interaction indication line.
Optionally, the to-be-selected connection point determining unit is further configured to determine a connection point adjustment range according to the length information of the to-be-corrected interaction indication line and a preset adjustment parameter, and, within the connection point adjustment range, to determine each to-be-selected connection point in sequence at a preset step length, taking the viewpoint position information as the starting point.
Optionally, the target connection point determining unit is further configured to determine, according to the order in which the to-be-selected connection points were determined, whether the line between the indicator position and the corresponding to-be-selected connection point intersects an obstacle for the first time; if so, to determine, according to the order information, the to-be-selected connection point immediately preceding the crossing of the obstacle, and to take that point as the target connection point.
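The candidate walk performed by these two units can be sketched as follows. `blocked` is an assumed occlusion test (e.g., a raycast against scene geometry), and the candidate ordering is taken as given; none of the names come from the patent.

```python
def find_target_connection_point(indicator_pos, candidates, blocked):
    # Walk the candidate connection points in the order they were generated
    # from the viewpoint; the first time the line from the indicator to a
    # candidate crosses an obstacle, return the previous candidate.
    previous = None
    for candidate in candidates:
        if blocked(indicator_pos, candidate):
            return previous   # candidate just before the obstacle
        previous = candidate
    return previous           # no obstacle crossed: keep the last candidate
```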
Optionally, the target interaction indication line determining unit is further configured to determine a Bezier curve according to the indicator position information and the target connection point, and to determine the target interaction indication line according to the Bezier curve and the to-be-corrected interaction indication line.
Optionally, the target interaction indication line determining unit is further configured to determine a first intensity corresponding to the target pointing information and a second intensity in the direction perpendicular to the target pointing information, the Bezier curve being determined based on the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information, and the perpendicular pointing information perpendicular to the target pointing information.
Optionally, the target interaction indication line determining unit is further configured to determine a Bezier curve to be drawn according to the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information, and the perpendicular pointing information; to determine a plurality of discrete points corresponding to the Bezier curve to be drawn according to preset drawing parameters; and to determine the Bezier curve based on the plurality of discrete points.
Optionally, the target interaction indication line determining unit is further configured to take the line between the target connection point on the to-be-corrected interaction indication line and the control point of the object to be controlled as the interaction indication line to be spliced, and to determine the target interaction indication line based on the Bezier curve and the interaction indication line to be spliced.
On the basis of the technical schemes, the device for determining the interactive indication line further comprises a target interactive indication line updating module.
The target interaction indication line updating module is configured to repeatedly update the target interaction indication line according to the acquired target pointing information of the direction indicator and the viewpoint position information of the viewpoint controller during control of the movement of the object to be controlled.
According to the technical solution provided by this embodiment, the target pointing information of the direction indicator is obtained, i.e., the pointing direction of the handle held in the user's hand is determined; a to-be-corrected interaction indication line and an object to be controlled intersecting it are determined according to the target pointing information and the viewpoint position information of the viewpoint controller, i.e., the user's line of sight and the object the user wishes to control are determined; further, a target interaction indication line is determined according to the indicator position information of the direction indicator and the to-be-corrected interaction indication line, so that the object to be controlled is controlled based on the target interaction indication line. The user can thus select and control, through the handle, any object observable within the line of sight, avoiding the problem that some objects cannot be selected due to occlusion, enriching the interaction between the user and the various objects in the scene, and improving both the interest of the virtual scene and the user experience.
The device for determining the interaction indication line provided by the embodiment of the disclosure can execute the method for determining the interaction indication line provided by any embodiment of the disclosure, and has the corresponding functional modules and beneficial effects of the execution method.
It should be noted that each unit and module included in the above apparatus are only divided according to the functional logic, but not limited to the above division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are also only for convenience of distinguishing from each other, and are not used to limit the protection scope of the embodiments of the present disclosure.
Example III
Fig. 6 is a schematic structural diagram of an electronic device according to a third embodiment of the disclosure. Referring now to fig. 6, a schematic diagram of an electronic device (e.g., a terminal device or server in fig. 6) 300 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 6, the electronic device 300 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 301 that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 302 or a program loaded from a storage device 308 into a random access memory (RAM) 303. The RAM 303 also stores various programs and data required for the operation of the electronic device 300. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
In general, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, and the like; output devices 307 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, a hard disk, and the like; and communication devices 309. The communication devices 309 may allow the electronic device 300 to communicate wirelessly or by wire with other devices to exchange data. While fig. 6 shows an electronic device 300 with various devices, it should be understood that not all of the illustrated devices are required to be implemented or provided; more or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication device 309, or installed from the storage device 308, or installed from the ROM 302. When the computer program is executed by the processing device 301, the above-described functions defined in the methods of the embodiments of the present disclosure are performed.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The electronic device provided by the embodiment of the present disclosure and the method for determining the interaction indication line provided by the foregoing embodiment belong to the same inventive concept, and technical details not described in detail in the present embodiment may be referred to the foregoing embodiment, and the present embodiment has the same beneficial effects as the foregoing embodiment.
Example IV
The embodiment of the present disclosure provides a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the method for determining an interaction indication line provided by the above embodiment.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to:
acquiring target pointing information of a direction indicator;
determining a to-be-corrected interaction indication line and an object to be controlled intersecting with the to-be-corrected interaction indication line according to the target pointing information and viewpoint position information of a viewpoint controller; and determining a target interaction indication line according to indicator position information of the direction indicator and the to-be-corrected interaction indication line, so as to control the object to be controlled based on the target interaction indication line.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The name of the unit does not in any way constitute a limitation of the unit itself, for example the first acquisition unit may also be described as "unit acquiring at least two internet protocol addresses".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to one or more embodiments of the present disclosure, there is provided a method of determining an interaction indication line, the method comprising:
acquiring target pointing information of a direction indicator;
determining a to-be-corrected interaction indication line and an object to be controlled intersecting with the to-be-corrected interaction indication line according to the target pointing information and viewpoint position information of a viewpoint controller;
and determining a target interaction indication line according to indicator position information of the direction indicator and the to-be-corrected interaction indication line, so as to control the object to be controlled based on the target interaction indication line.
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, the target pointing information of the direction indicator is determined based on a gyroscope provided in the direction indicator.
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, determining, taking the viewpoint position information as a starting point, a to-be-corrected interaction indication line corresponding to the target pointing information;
and taking the to-be-selected object intersecting with the to-be-corrected interaction indication line as the object to be controlled, and taking the intersection point as a control point of the object to be controlled, so as to control the object to be controlled based on the control point.
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, determining each to-be-selected connection point on the to-be-corrected interaction indication line;
determining a target connection point according to the indicator position information and each connection point to be selected;
and determining a target interaction indication line according to the indicator position information, the target connection point, and the to-be-corrected interaction indication line.
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, determining a connection point adjustment range according to length information of the to-be-corrected interaction indication line and a preset adjustment parameter;
and in the connection point adjustment range, sequentially determining each connection point to be selected according to a preset step length by taking the viewpoint position information as a starting point.
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, determining, according to the order in which the to-be-selected connection points were determined, whether the line between the indicator position information and the corresponding to-be-selected connection point intersects an obstacle for the first time;
if so, determining, according to the order information, the to-be-selected connection point immediately preceding the crossing of the obstacle, and taking that point as the target connection point.
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, determining a Bezier curve according to the indicator position information and the target connection point;
and determining the target interaction indication line according to the Bezier curve and the interaction indication line to be corrected.
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, determining a first intensity corresponding to the target pointing information and a second intensity in a direction perpendicular to the target pointing information;
and determining the Bezier curve based on the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information, and perpendicular pointing information perpendicular to the target pointing information.
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, determining a Bezier curve to be drawn according to the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information and the perpendicular pointing information;
determining a plurality of discrete points corresponding to the Bezier curve to be drawn according to a preset drawing parameter;
and determining the Bezier curve based on the plurality of discrete points.
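One plausible reading of this construction is a cubic Bezier curve sampled into discrete points. The placement of the two inner control points below — offset from the indicator position along the pointing direction by the first intensity, and from the target connection point along the perpendicular direction by the second intensity — is an assumption of this sketch; the disclosure fixes only the inputs:

```python
def bezier_discrete_points(indicator_pos, target_point, pointing, perpendicular,
                           first_intensity, second_intensity, samples):
    """Build a cubic Bezier from the embodiment's inputs and sample it.

    p0/p3 are the endpoints; p1 leaves the indicator along `pointing` scaled
    by the first intensity; p2 approaches the target connection point along
    `perpendicular` scaled by the second intensity (layout assumed)."""
    p0 = indicator_pos
    p1 = tuple(a + first_intensity * d for a, d in zip(indicator_pos, pointing))
    p2 = tuple(a + second_intensity * d for a, d in zip(target_point, perpendicular))
    p3 = target_point
    pts = []
    for k in range(samples + 1):          # `samples` plays the preset drawing parameter
        t = k / samples
        u = 1.0 - t
        pts.append(tuple(u**3 * a + 3*u*u*t * b + 3*u*t*t * c + t**3 * e
                         for a, b, c, e in zip(p0, p1, p2, p3)))
    return pts
```

Larger intensities pull the curve further along their respective directions before it bends toward the target connection point.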
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, taking a connection line between the target connection point on the interaction indication line to be corrected and the control point of the object to be controlled as an interaction indication line to be spliced;
and determining the target interaction indication line based on the Bezier curve and the interaction indication line to be spliced.
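The splicing can then be sketched as a simple concatenation: the sampled Bezier points (indicator to target connection point) followed by the straight remainder of the original line (target connection point to control point), dropping the duplicated joint point. Helper and parameter names are illustrative:

```python
def splice_indication_line(bezier_points, target_point, control_point, segment_samples):
    """Append the straight to-be-spliced segment after the Bezier part."""
    straight = [tuple(a + (k / segment_samples) * (b - a)
                      for a, b in zip(target_point, control_point))
                for k in range(1, segment_samples + 1)]  # k=0 skipped: joint already present
    return list(bezier_points) + straight
```

The result is a single polyline from the indicator to the control point, curved where it detours and straight where the original line was unobstructed.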
According to one or more embodiments of the present disclosure, there is provided a method for determining an interaction indication line, further comprising:
optionally, updating the target interaction indication line according to the obtained target pointing information of the direction indicator and the viewpoint position information of the viewpoint controller.
According to one or more embodiments of the present disclosure, there is provided an apparatus for determining an interaction indication line, including:
the pointing information determining module is used for acquiring target pointing information of the direction indicator;
the to-be-corrected interaction indication line determining module is used for determining an interaction indication line to be corrected and an object to be controlled that intersects the interaction indication line to be corrected according to the target pointing information and viewpoint position information of a viewpoint controller;
and the target interaction indication line determining module is used for determining a target interaction indication line according to the indicator position information of the direction indicator and the interaction indication line to be corrected, so as to control the object to be controlled based on the target interaction indication line.
The foregoing description is merely of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of the features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions formed by substituting the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (13)

1. A method for determining an interaction indication line, comprising:
acquiring target pointing information of a direction indicator;
determining an interaction indication line to be corrected, and an object to be controlled that intersects the interaction indication line to be corrected, according to the target pointing information and viewpoint position information of a viewpoint controller;
and determining a target interaction indication line according to indicator position information of the direction indicator and the interaction indication line to be corrected, so as to control the object to be controlled based on the target interaction indication line;
wherein the determining a target interaction indication line according to the indicator position information of the direction indicator and the interaction indication line to be corrected comprises the following steps:
determining a plurality of connection points to be selected on the interaction indication line to be corrected;
determining a target connection point according to the indicator position information and the plurality of connection points to be selected;
and determining the target interaction indication line according to the indicator position information, the target connection point and the interaction indication line to be corrected.
2. The method of claim 1, wherein the obtaining target pointing information for the direction indicator comprises:
target pointing information of the direction indicator is determined based on a gyroscope provided in the direction indicator.
3. The method according to claim 1, wherein the determining an interaction indication line to be corrected and an object to be controlled that intersects the interaction indication line to be corrected according to the target pointing information and the viewpoint position information of the viewpoint controller comprises:
determining the interaction indication line to be corrected corresponding to the target pointing information by taking the viewpoint position information as a starting point;
and taking an object to be selected that intersects the interaction indication line to be corrected as the object to be controlled, and taking the intersection point as a control point of the object to be controlled, so as to control the object to be controlled based on the control point.
4. The method of claim 1, wherein the determining a plurality of connection points to be selected on the interaction indication line to be corrected comprises:
determining a connection point adjustment range according to length information of the interaction indication line to be corrected and a preset adjustment parameter;
and within the connection point adjustment range, sequentially determining the plurality of connection points to be selected at a preset step length, taking the viewpoint position information as a starting point.
5. The method of claim 4, wherein the determining a target connection point according to the indicator position information and the plurality of connection points to be selected comprises:
determining, in the order in which the plurality of connection points to be selected were determined, whether a connection line between the indicator position and the corresponding connection point to be selected intersects an obstacle for the first time;
and if yes, determining, according to the order information, the connection point to be selected that immediately precedes the one at which the obstacle is crossed, and taking that preceding connection point as the target connection point.
6. The method of claim 1, wherein the determining a target interaction indication line according to the indicator position information, the target connection point, and the interaction indication line to be corrected comprises:
determining a Bezier curve according to the indicator position information and the target connection point;
and determining the target interaction indication line according to the Bezier curve and the interaction indication line to be corrected.
7. The method of claim 6, wherein the determining a Bezier curve according to the indicator position information and the target connection point comprises:
determining a first intensity corresponding to the target pointing information and a second intensity in a direction perpendicular to the target pointing information;
and determining the Bezier curve based on the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information, and perpendicular pointing information perpendicular to the target pointing information.
8. The method of claim 7, wherein the determining the Bezier curve based on the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information, and the perpendicular pointing information perpendicular to the target pointing information comprises:
determining a Bezier curve to be drawn according to the first intensity, the second intensity, the indicator position information, the target connection point, the target pointing information and the perpendicular pointing information;
determining a plurality of discrete points corresponding to the Bezier curve to be drawn according to a preset drawing parameter;
and determining the Bezier curve based on the plurality of discrete points.
9. The method of claim 6, wherein the determining the target interaction indication line according to the Bezier curve and the interaction indication line to be corrected comprises:
taking a connection line between the target connection point on the interaction indication line to be corrected and the control point of the object to be controlled as an interaction indication line to be spliced;
and determining the target interaction indication line based on the Bezier curve and the interaction indication line to be spliced.
10. The method of claim 1, further comprising, in a process of controlling the object to be controlled:
repeatedly updating the target interaction indication line according to newly acquired target pointing information of the direction indicator and viewpoint position information of the viewpoint controller.
11. A device for determining an interactive indicator line, comprising:
the pointing information determining module is used for acquiring target pointing information of the direction indicator;
the to-be-corrected interaction indication line determining module is used for determining an interaction indication line to be corrected and an object to be controlled that intersects the interaction indication line to be corrected according to the target pointing information and viewpoint position information of a viewpoint controller;
and the target interaction indication line determining module is used for determining a target interaction indication line according to indicator position information of the direction indicator and the interaction indication line to be corrected, so as to control the object to be controlled based on the target interaction indication line;
wherein the target interaction indication line determining module further includes:
a to-be-selected connection point determining unit, configured to determine each connection point to be selected on the interaction indication line to be corrected;
a target connection point determining unit, configured to determine a target connection point according to the indicator position information and each connection point to be selected;
and a target interaction indication line determining unit, configured to determine the target interaction indication line according to the indicator position information, the target connection point and the interaction indication line to be corrected.
12. An electronic device, the electronic device comprising:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method for determining an interaction indication line according to any one of claims 1-10.
13. A storage medium containing computer executable instructions for performing the method of determining an interaction indication line according to any of claims 1-10 when executed by a computer processor.
CN202210176746.2A 2022-02-25 2022-02-25 Method and device for determining interaction indication line, electronic equipment and storage medium Active CN114564106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210176746.2A CN114564106B (en) 2022-02-25 2022-02-25 Method and device for determining interaction indication line, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN114564106A CN114564106A (en) 2022-05-31
CN114564106B true CN114564106B (en) 2023-11-28

Family

ID=81714845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210176746.2A Active CN114564106B (en) 2022-02-25 2022-02-25 Method and device for determining interaction indication line, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114564106B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115237289B (en) * 2022-07-01 2024-02-23 杭州涂鸦信息技术有限公司 Hot zone range determining method, device, equipment and storage medium
CN116243795A (en) * 2023-02-20 2023-06-09 南方科技大学 Mixed reality-based object grabbing method and mixed reality equipment

Citations (16)

Publication number Priority date Publication date Assignee Title
JPH1049691A (en) * 1996-07-31 1998-02-20 Dainippon Screen Mfg Co Ltd Method and device for tracing dot sequence with bezier curve
JP2012256280A (en) * 2011-06-10 2012-12-27 Nippon Telegr & Teleph Corp <Ntt> Dynamic image processing method, dynamic image processor and dynamic image processing program
CN105912110A (en) * 2016-04-06 2016-08-31 北京锤子数码科技有限公司 Method, device and system for performing target selection in virtual reality space
CN106886038A (en) * 2015-12-15 2017-06-23 骑记(厦门)科技有限公司 The processing method and processing device of movement locus
CN107797752A (en) * 2017-11-07 2018-03-13 广州视睿电子科技有限公司 The rendering method and device of written handwriting, interactive intelligent tablet computer and storage medium
CN108388347A (en) * 2018-03-15 2018-08-10 网易(杭州)网络有限公司 Interaction control method and device in virtual reality and storage medium, terminal
CN108981722A (en) * 2017-05-31 2018-12-11 通用汽车环球科技运作有限责任公司 The trajectory planning device using Bezier for autonomous driving
CN111242966A (en) * 2020-01-15 2020-06-05 北京大豪科技股份有限公司 Image boundary correction method, device, electronic equipment and storage medium
CN111369619A (en) * 2020-02-28 2020-07-03 神华铁路装备有限责任公司 VR visual angle correction method, device, system and storage medium
CN111383296A (en) * 2018-12-28 2020-07-07 北京小米移动软件有限公司 Display method and device for drawing track and storage medium
CN112148125A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 AR interaction state control method, device, equipment and storage medium
CN112927119A (en) * 2019-12-06 2021-06-08 富士施乐实业发展(中国)有限公司 Anti-theft information embedding method and using method of TrueType font library
CN113041616A (en) * 2021-02-22 2021-06-29 网易(杭州)网络有限公司 Method and device for controlling jumping display in game, electronic equipment and storage medium
CN113538623A (en) * 2021-07-20 2021-10-22 北京京东振世信息技术有限公司 Method and device for determining target image, electronic equipment and storage medium
CN113655813A (en) * 2021-10-20 2021-11-16 北京微纳星空科技有限公司 Flight deviation correction control method and system, storage medium and electronic equipment
CN113760087A (en) * 2021-02-08 2021-12-07 北京沃东天骏信息技术有限公司 Method and device for determining hit point position information

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP5653141B2 (en) * 2010-09-01 2015-01-14 キヤノン株式会社 Image processing method, image processing apparatus, and program
JP6501727B2 (en) * 2016-06-02 2019-04-17 株式会社スペース・バイオ・ラボラトリーズ Walking motion assistance device


Also Published As

Publication number Publication date
CN114564106A (en) 2022-05-31

Similar Documents

Publication Publication Date Title
CN114564106B (en) Method and device for determining interaction indication line, electronic equipment and storage medium
US11506904B1 (en) Presentation content update method, head-mounted display, and computer-readable medium
CN114461064B (en) Virtual reality interaction method, device, equipment and storage medium
CN110794962A (en) Information fusion method, device, terminal and storage medium
US11610287B2 (en) Motion trail update method, head-mounted display device and computer-readable medium
WO2024066756A1 (en) Interaction method and apparatus, and display device
US11935176B2 (en) Face image displaying method and apparatus, electronic device, and storage medium
CN112925593A (en) Method and device for scaling and rotating target layer
CN113515201B (en) Cursor position updating method and device and electronic equipment
US20240126088A1 (en) Positioning method, apparatus and system of optical tracker
US20230409121A1 (en) Display control method, apparatus, electronic device, medium, and program product
CN117631810A (en) Operation processing method, device, equipment and medium based on virtual reality space
US20240028130A1 (en) Object movement control method, apparatus, and device
CN117641025A (en) Model display method, device, equipment and medium based on virtual reality space
CN117632063A (en) Display processing method, device, equipment and medium based on virtual reality space
CN117008709A (en) Control method and device based on VR equipment, electronic equipment and storage medium
CN115576419A (en) Interaction method, interaction device, terminal equipment and computer-readable storage medium
CN115619918A (en) Image rendering method, device and equipment and storage medium
CN117170491A (en) Method, device, equipment and medium for determining virtual cursor in virtual reality scene
CN117641026A (en) Model display method, device, equipment and medium based on virtual reality space
CN116431033A (en) Virtual interface display method, device, head-mounted display equipment and readable medium
CN117170487A (en) Interaction method, device, apparatus, storage medium and program product
CN117420907A (en) Interaction control method and device, electronic equipment and storage medium
CN115617164A (en) Interaction method and device and near-to-eye display equipment
CN116126141A (en) Pose data processing method, pose data processing system, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant