CN111831178A - Auxiliary target selection method and system in three-dimensional environment based on motion trend information - Google Patents

Auxiliary target selection method and system in three-dimensional environment based on motion trend information

Info

Publication number
CN111831178A
CN111831178A (application CN202010608808.3A)
Authority
CN
China
Prior art keywords
user
target
selection
track
distance
Prior art date
Legal status
Granted
Application number
CN202010608808.3A
Other languages
Chinese (zh)
Other versions
CN111831178B (en
Inventor
田丰
李念龙
黄进
王宏安
Current Assignee
Institute of Software of CAS
Original Assignee
Institute of Software of CAS
Priority date
Filing date
Publication date
Application filed by Institute of Software of CAS filed Critical Institute of Software of CAS
Priority to CN202010608808.3A priority Critical patent/CN111831178B/en
Publication of CN111831178A publication Critical patent/CN111831178A/en
Application granted granted Critical
Publication of CN111831178B publication Critical patent/CN111831178B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Abstract

The invention provides an auxiliary target selection method and system based on motion trend information in a three-dimensional space. The method comprises the following steps: establishing a coordinate system in a virtual space, and representing each selectable target in the interface by a target set; acquiring the user's actual selection track and, from it, predicting the user's selection track toward each selectable target, obtaining a predicted user selection track; calculating the similarity between the actual selection track and the predicted selection track for each selectable target to obtain a track similarity set; and, from the track similarity set, obtaining the predicted user selection track with the highest similarity and taking the corresponding selectable target as the activation target. Aimed at target selection tasks in three-dimensional space, the method uses the trend information of the user's selection track to predict the intended target while the user is still selecting, improving the efficiency of the interaction task.

Description

Auxiliary target selection method and system in three-dimensional environment based on motion trend information
Technical Field
The invention belongs to the field of computer graphic user interfaces, and particularly relates to an auxiliary target selection method and system based on motion trend information in a three-dimensional space.
Background
Target selection is a basic interaction task in human-computer interaction, and with the popularization of Virtual Reality (VR) technology, target selection scenes in three-dimensional space are more and more common. The two target selection approaches commonly used in three-dimensional space today are metaphors based on the virtual hand and the virtual ray, respectively (Argelaguet F, Andujar C. A survey of 3D object selection techniques for virtual environments [J]. Computers & Graphics, 2013, 37(3): 121-136.). However, factors such as objects in the three-dimensional space occluding one another and the lack of a supporting physical plane during operation pose great challenges to users trying to select a target accurately. With the development of graphical technology in recent years, interactive scenes have become more complex, and the appearance of dynamic interactive scenes (such as shooting games and control systems) makes it even more difficult for users to select targets.
To address this problem, researchers have devised a number of auxiliary target selection techniques. One common approach is to continually narrow down the target by step-wise refinement. For example, the SQUAD technique proposed by Kopper et al. presents the objects in a region to the user after the user selects that region, splits them into four groups, and continues subdividing after each selection until the user reaches the final object (reference: Kopper, R., Bacim, F., Bowman, D. A., 2011. Rapid and accurate 3D selection by progressive refinement. In: Proceedings of the IEEE Symposium on 3D User Interfaces (3DUI), pp. 67-74.). Another common approach is to increase the active area of the cursor, for example the three-dimensional bubble cursor, to which a semi-transparent sphere is attached whose radius is automatically enlarged to wrap the nearest object (Vanacken, L., Grossman, T., Coninx, K., 2007. Exploring the effects of environment density and target visibility on object selection in 3D virtual environments. In: Proceedings of the 2007 IEEE Symposium on 3D User Interfaces (3DUI), pp. 117-124.). Although these techniques can improve the user's selection efficiency, they also have many problems, such as increasing the user's operation steps or modifying the original interactive interface.
Disclosure of Invention
The invention aims to provide an auxiliary target selection method and an auxiliary target selection system based on motion trend information in a three-dimensional space.
In order to achieve the purpose, the invention adopts the following technical scheme:
a method for assisting three-dimensional space target selection based on user motion trend information comprises the following steps:
establishing a coordinate system in a virtual space, and representing each selectable target in the interface by using a target set;
acquiring an actual selection track of a user, predicting the selection track of the user for each selectable target according to the actual selection track of the user, and acquiring a predicted user selection track;
calculating the similarity between the actual selection track of the user and the predicted selection track of the user of each selectable target to obtain a track similarity set;
and obtaining the predicted user selection track with the highest similarity according to the track similarity set, and taking the corresponding selectable target as an activation target.
Further, establishing a coordinate system in the virtual space means establishing a rectangular or spherical coordinate system with the user as the origin: if the user uses a virtual-hand-based selection technique, a rectangular coordinate system is established with the user's orientation as the z-axis; if a virtual-ray-based selection technique is used, a spherical coordinate system is established. The interface contains the selectable targets, and each selectable target is represented by its center position, size, and moving speed.
Further, the target set refers to a set of coordinate points of each selectable target in the corresponding coordinate system.
Further, if the user uses a virtual-hand-based selection technique, the actual selection track of the user is the set of coordinate points, in the same rectangular coordinate system as the selectable targets, generated as the input device moves; it can be expressed as P = {p_1, p_2, ..., p_n}, where p_i = (x_i, y_i, z_i) is a coordinate point in the actual selection track and n is the number of coordinate points in the track. If the user uses a virtual-ray-based selection technique, the actual selection track is the set of intersections of the virtual ray with a sphere of radius r (r may take any value), in the same spherical coordinate system as the selectable targets; it can be expressed as P = {(θ_1, ψ_1), (θ_2, ψ_2), ..., (θ_n, ψ_n)}, where θ and ψ denote the polar angle and the azimuth angle in the spherical coordinate system, respectively.
Further, the predicted user selection track for each selectable target is in the same coordinate system as the actual selection track. In the rectangular coordinate system the predicted track is expressed as Q = {q_1, q_2, ..., q_n}, where q_i = (x_i, y_i, z_i) is a coordinate point in the predicted user selection track and n is the number of coordinate points in the track. In the spherical coordinate system the predicted track is expressed as Q = {(θ_1, ψ_1), (θ_2, ψ_2), ..., (θ_n, ψ_n)}, where θ and ψ denote the polar angle and the azimuth angle in the spherical coordinate system, respectively.
Further, the method for predicting the user's selection track toward each selectable target is as follows: take the point p_0 of the user's actual selection track at the initial time t_0 as the initial point q_0 of the predicted user selection track, and add p_0 and q_0 to the actual selection track set and the predicted selection track set, respectively. At each subsequent time t_i (i = 1, ..., n-1), take a point in the direction of the line from the end point q_{i-1} of the predicted track set to the position of the selectable target at the current time as the predicted track point q_i, such that the distance from q_i to q_{i-1} equals the distance from the actual track point p_i at the current time to the end point p_{i-1} of the actual track set. Add q_i and p_i to the predicted track set and the actual track set, respectively. If the selectable target is static, the predicted user selection track is a straight line; if the selectable target is moving, the predicted track consists of a series of line segments.
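As an illustrative sketch of the prediction step described above (the function names and the 3-D tuple representation of points are assumptions for illustration, not taken from the patent), the construction of the predicted track Q from the actual track P can be written as:

```python
import math

def predict_track(actual, target_positions):
    """Build the predicted selection track Q for one selectable target.

    actual           -- actual track points p_0..p_{n-1} as 3-D tuples
    target_positions -- target centre position at each time step (3-D tuples);
                        constant for a static target, varying for a moving one
    """
    q = [actual[0]]  # q_0 is taken as p_0
    for i in range(1, len(actual)):
        prev = q[-1]
        tgt = target_positions[i]
        d = math.dist(prev, tgt)
        if d == 0:  # already at the target: no direction defined
            q.append(prev)
            continue
        # unit vector from q_{i-1} toward the target's current position
        u = tuple((t - s) / d for s, t in zip(prev, tgt))
        # advance by the same distance the user actually moved: |p_i - p_{i-1}|
        step = math.dist(actual[i - 1], actual[i])
        q.append(tuple(s + step * c for s, c in zip(prev, u)))
    return q
```

For a static target all points of Q lie on one straight line toward it; for a moving target Q becomes a polyline, matching the description above.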
Further, the similarity D is calculated by fusing a distance similarity measure D_distance and a direction similarity measure D_direction, the fusion ratio being controlled by a constant λ:

D(P,Q) = λ·D_distance(P,Q) + (1-λ)·D_direction(P,Q), λ ∈ (0,1)

The parameter λ is a constant between 0 and 1, and its optimal value can be selected from selection-track data obtained by having the user repeatedly select targets in the interface. Because the virtual-hand and virtual-ray selection techniques use different coordinate systems, D_distance and D_direction are computed with different formulas.

For the virtual-hand-based selection technique, with actual selection track P = {p_1, ..., p_n} and predicted selection track Q = {q_1, ..., q_n}, D_distance uses the Euclidean distance and D_direction uses the vector distance:

[equations rendered as images in the source]

where d_euclidean denotes the Euclidean distance between two points.
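The exact formulas appear only as images in the source; the sketch below therefore implements one plausible realization of the fused measure for the virtual-hand case (averaged per-point Euclidean distances and averaged distances between successive displacement vectors — an assumption, not the patent's verbatim equations; under this realization a smaller D means more similar):

```python
import math

def d_distance(P, Q):
    # mean Euclidean distance between corresponding track points (assumed form)
    return sum(math.dist(p, q) for p, q in zip(P, Q)) / len(P)

def d_direction(P, Q):
    # mean distance between successive displacement vectors (assumed form)
    total = 0.0
    for i in range(1, len(P)):
        vp = tuple(b - a for a, b in zip(P[i - 1], P[i]))
        vq = tuple(b - a for a, b in zip(Q[i - 1], Q[i]))
        total += math.dist(vp, vq)
    return total / (len(P) - 1)

def fused(P, Q, lam=0.5):
    # D(P,Q) = λ·D_distance(P,Q) + (1-λ)·D_direction(P,Q), λ ∈ (0,1)
    return lam * d_distance(P, Q) + (1 - lam) * d_direction(P, Q)
```

Identical tracks give D = 0 under this realization; the λ weight shifts emphasis between positional closeness and agreement of movement direction.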
For the virtual-ray-based selection technique, with actual selection track P and predicted selection track Q in spherical coordinates, D_distance uses the great-circle distance and D_direction uses the vector distance:

[equations rendered as images in the source]

where d_circle denotes the great-circle distance between two points.
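For the virtual-ray case, the great-circle distance between two track points given by polar angle θ and azimuth ψ on the unit sphere can be computed with the standard spherical law of cosines. This is a sketch of the standard formula (the patent's own equation is an image in the source):

```python
import math

def great_circle(a, b):
    """Central angle between a = (theta_a, psi_a) and b = (theta_b, psi_b),
    where theta is the polar angle and psi the azimuth, in radians,
    on the unit sphere."""
    ta, pa = a
    tb, pb = b
    # spherical law of cosines, written for polar angles (colatitudes)
    c = (math.cos(ta) * math.cos(tb)
         + math.sin(ta) * math.sin(tb) * math.cos(pa - pb))
    # clamp against floating-point rounding before acos
    return math.acos(max(-1.0, min(1.0, c)))
```

Two points on the equator (θ = π/2) separated by 90° of azimuth are a quarter circle (π/2) apart, as expected.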
further, the fusion proportion parameter λ is a constant and is obtained by fitting empirical data, and the method includes the steps of:
matching the interface content with the target definition and the used selection technology, and designating the number of targets, the side length of the targets and the moving speed as experimental conditions;
presenting an initial display interface to a user, recording track data selected by the user for many times as experience data under the experimental condition, and repeatedly acquiring the experience data under the experimental conditions;
Take different values of λ at intervals within its range (0, 1), substitute each into the similarity calculation formula, and calculate the accuracy of predicting the user's intended target; the λ value corresponding to the highest accuracy is taken as the model parameter.
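The parameter fitting described above amounts to a one-dimensional grid search over λ. A minimal sketch, where the trial data layout and function names are assumptions for illustration:

```python
def fit_lambda(trials, similarity, grid_step=0.05):
    """Pick the λ in (0,1) that maximizes prediction accuracy.

    trials     -- list of (P, candidate_tracks, true_index): actual track,
                  predicted track per selectable target, and the target the
                  user really intended
    similarity -- similarity(P, Q, lam) -> score; highest score = predicted pick
    """
    best_lam, best_acc = None, -1.0
    lam = grid_step
    while lam < 1.0:
        hits = 0
        for P, candidates, true_index in trials:
            scores = [similarity(P, Q, lam) for Q in candidates]
            if scores.index(max(scores)) == true_index:
                hits += 1
        acc = hits / len(trials)
        if acc > best_acc:
            best_lam, best_acc = lam, acc
        lam = round(lam + grid_step, 10)  # keep the grid numerically exact
    return best_lam, best_acc
```

With the empirical data collected as described, each trial supplies one actual track, the candidate predicted tracks, and the ground-truth target.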
Further, the activation target may be presented to the user in a highlighted form, and the user presses a control button to select the activation target.
An assisted three-dimensional space object selection system based on user motion trend information, comprising a memory and a processor, the memory storing a computer program configured to be executed by the processor, the program comprising instructions for carrying out the steps of the above method.
Aiming at the target acquisition task in virtual reality, the invention establishes a user selection track prediction model based on motion trend information gathered while the user is selecting a target, thereby predicting the target the user intends to select and improving the efficiency of the interaction task. In an interactive system applying the method of the invention, the user does not need to point precisely at the target, as conventional target selection requires; instead, exploiting the track trend during selection, the user can press the button to select the target while the selection movement is still in progress.
Compared with the prior art, the invention has the following positive effects: the method advances the moment at which the system judges the user's intended target, making a prediction during the selection process rather than requiring the user to confirm the selection after pointing at the target; it does not take the distance between the cursor and the target as the sole metric, but combines the user's motion trend information and compares track similarities, improving the user's target selection efficiency in VR; and visual feedback cues reinforce the user's confirmation of the selected target.
Drawings
Fig. 1 is a flowchart of an auxiliary three-dimensional space target selection method based on user motion trend information according to the present invention.
Fig. 2A and 2B are schematic diagrams of coordinate systems where the target is located in the virtual hand selection technique and the virtual ray selection technique, respectively.
FIG. 3 is a diagram illustrating a method for predicting a user selection trajectory according to the present invention.
Detailed Description
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
The present embodiment provides an auxiliary three-dimensional space target selection method based on user motion trend information, as shown in fig. 1, the steps are as follows:
1) Display the interface content; the candidate targets contained in it are represented as the set T = {t_1, t_2, ..., t_n};
2) Detect the motion of the user input device and start recording its motion track, i.e., the user's actual selection track P = {p_1, p_2, ..., p_t};
3) While acquiring the actual selection track P, for the current time t calculate the predicted user selection track Q = {q_1, q_2, ..., q_t} for each target in the selectable target set T;
4) Calculate the similarity between each predicted user selection track Q and the actual selection track P, expressed as the set D = {D_t1, D_t2, ..., D_tn};
5) Find t* (t* ∈ T) such that D(t*) is maximal in D, and take t* as the final selected target.
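Steps 3) to 5) reduce to an argmax over the candidate targets. A minimal sketch, with the helper names (`predict`, `similarity`) assumed for illustration:

```python
def select_target(P, targets, predict, similarity):
    """Return the target whose predicted track best matches the actual track P.

    targets    -- iterable of selectable targets (the set T)
    predict    -- predict(P, t) -> predicted user selection track Q for target t
    similarity -- similarity(P, Q) -> score; the highest score wins (step 5)
    """
    scores = {t: similarity(P, predict(P, t)) for t in targets}
    return max(scores, key=scores.get)  # t* maximizing D(t*)
```

The chosen target is then presented as the activation target, which the user confirms with the control button.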
In step 1), the interface content is generally content displayed by a three-dimensional interactive system; the invention does not limit the specific content form. The objects contained in the interface are all three-dimensional objects and can be represented as t = (C, L, V), where C is the target center position, whose coordinate expressions in the rectangular and spherical coordinate systems are (C_x, C_y, C_z) and (C_r, C_θ, C_ψ), respectively, as shown in Fig. 2A and 2B; the object shape is a cube with side length L; and V is the object's moving speed.
In step 2), the present invention does not limit the specific form of detecting the motion of the user input device, and the event may be provided by any device and corresponding technology, or an activation button may be used as a mark for starting recording the motion track.
In step 3), after obtaining the motion track of the input device, i.e., the user's actual selection track P, the predicted track of each target, i.e., the predicted user selection track, is calculated as follows: take the point p_0 of the user's actual selection track at the initial time t_0 as the initial point q_0 of the predicted user selection track, and add p_0 and q_0 to the actual selection track set and the predicted selection track set, respectively. At each subsequent time t_i (i = 1, ..., n-1), take a point in the direction of the line from the end point q_{i-1} of the predicted track set to the position of the selectable target at the current time as the predicted track point q_i, such that the distance from q_i to q_{i-1} equals the distance from the actual track point p_i at the current time to the end point p_{i-1} of the actual track set. Add q_i and p_i to the predicted track set and the actual track set, respectively. If the selectable target is static, the predicted user selection track is a straight line; if the selectable target is moving, the predicted track consists of a series of line segments. FIG. 3 illustrates the method, where L_1 denotes the distance from q_1 to q_0, equal to the distance from p_1 to p_0, and L_2 denotes the distance from q_2 to q_1, equal to the distance from p_2 to p_1.
In step 4), after the predicted track Q of each target is obtained, the similarity D is computed from the distance similarity measure D_distance and the direction similarity measure D_direction, controlled by a parameter λ:

D(P,Q) = λ·D_distance(P,Q) + (1-λ)·D_direction(P,Q), λ ∈ (0,1)    (1)

If the user adopts the virtual-hand selection technique, D_distance is the Euclidean-distance measure and D_direction is the vector-distance measure; if the user adopts the virtual-ray selection technique, D_distance is the great-circle-distance measure and D_direction is the vector-distance measure (equations (2)-(5), rendered as images in the source).
the parameter lambda is a constant between 0 and 1, and the optimal value can be selected according to empirical data.
The empirical data refer to selection-track data obtained by users repeatedly selecting targets in the interface before the method is applied to any specific user display interface. The acquisition of empirical data is accomplished through user experiments that follow general human-computer interaction experiment procedures and guidelines, briefly described below for ease of understanding:
i. The interface content is matched to the target definition and the selection technique used.
ii. According to the interface content, specify the number of targets ρ^(i), the target side length L^(i), and the moving speed V^(i) as an experimental condition.
iii. Present the interface to N users in script form and ask them to repeatedly make M selections under that condition.
iv. Record the users' selection tracks in all M selection processes, together with the predicted track data obtained for each target.
v. Repeat steps ii to iv to obtain empirical data under the other experimental conditions.
The design and implementation process of the experiment need to consider the influence of factors such as sequence effect, learning effect, fatigue degree, user difference, sample number and the like, and the influence of the factors can be eliminated by following the general man-machine interaction experiment design principle.
For example, according to the density, size, and speed of the targets in a specific interface, 2 target numbers, 3 target side lengths, and 3 moving speeds may be selected, giving 2 × 3 × 3 = 18 experimental conditions; the number of users N may be 12 and the number of repeated selections M under the same condition may be 30, so that 12 × 30 = 360 user selection tracks are obtained under each condition, forming empirical data consisting of 18 × 360 = 6480 selection tracks.
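The counts in this example can be checked directly:

```python
conditions = 2 * 3 * 3          # target counts × side lengths × moving speeds
tracks_per_condition = 12 * 30  # N users × M repeated selections
total_tracks = conditions * tracks_per_condition
assert (conditions, tracks_per_condition, total_tracks) == (18, 360, 6480)
```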
The method for selecting the optimal value of the model parameter λ using the empirical data is as follows: take different values of λ at intervals within its interval (0, 1), substitute each into the similarity calculation formula, and calculate the accuracy of predicting the user's intended target; the λ value corresponding to the highest accuracy is the value of the model parameter.
The similarity calculation formula in the above embodiments of the present invention is only an example, and other measurement formulas may also be used in calculating the similarity.
Based on the same inventive concept, another embodiment of the present invention provides a VR target selection system (which may be a computer, a server, a smartphone, etc.) based on user motion trend information, comprising a memory and a processor, the memory storing a computer program configured to be executed by the processor, the computer program comprising instructions for performing the steps of the method of the invention.
Based on the same inventive concept, another embodiment of the present invention provides a computer-readable storage medium (e.g., ROM/RAM, magnetic disk, optical disk) storing a computer program, which when executed by a computer, performs the steps of the inventive method.
The method of the present invention has been described in detail by way of formal expressions and examples, but the specific forms of implementation of the present invention are not limited thereto. Various obvious changes and modifications can be made by one skilled in the art without departing from the spirit and principles of the method of the invention. The protection scope of the present invention shall be subject to the claims.

Claims (10)

1. A method for assisting three-dimensional space target selection based on user motion trend information is characterized by comprising the following steps:
establishing a coordinate system in a virtual space, and representing each selectable target in the interface by using a target set;
acquiring an actual selection track of a user, predicting the selection track of the user for each selectable target according to the actual selection track of the user, and acquiring a predicted user selection track;
calculating the similarity between the actual selection track of the user and the predicted selection track of the user of each selectable target to obtain a track similarity set;
and obtaining the predicted user selection track with the highest similarity according to the track similarity set, and taking the corresponding selectable target as an activation target.
2. The method of claim 1, wherein a rectangular coordinate system or a spherical coordinate system is established in the virtual space with the user as an origin, wherein the rectangular coordinate system is established if the user uses a selection technique based on a virtual hand, and wherein the spherical coordinate system is established if the selection technique based on a virtual ray is used; the selectable targets are included in the interface and are represented by target center positions, sizes and moving speeds.
3. The method of claim 1, wherein the actual selection track of the user refers to a set of coordinate points in the same coordinate system as the selectable target generated as the input device moves.
4. The method of claim 1, wherein predicting the user's selection track toward each selectable target comprises: taking the point p_0 of the user's actual selection track at the initial time t_0 as the initial point q_0 of the predicted user selection track, and adding p_0 and q_0 to the actual selection track set and the predicted selection track set, respectively; at each subsequent time t_i (i = 1, ..., n-1), taking a point in the direction of the line from the end point q_{i-1} of the predicted track set to the position of the selectable target at the current time as the predicted track point q_i, such that the distance from q_i to q_{i-1} equals the distance from the actual track point p_i at the current time to the end point p_{i-1} of the actual track set; adding q_i and p_i to the predicted track set and the actual track set, respectively; wherein, if the selectable target is a static target, the predicted user selection track is a straight line, and if the selectable target is a moving target, the predicted track consists of a series of line segments.
5. The method according to claim 1, wherein the similarity is calculated by fusing a distance similarity measure and a direction similarity measure, and controlling a fusion ratio through a fusion ratio parameter λ; the similarity calculation formula is as follows:
D(P,Q) = λ·D_distance(P,Q) + (1-λ)·D_direction(P,Q), λ ∈ (0,1)

wherein D(P, Q) represents the similarity between the user's actual selection track P and the predicted user selection track Q, D_distance represents the distance similarity measure, D_direction represents the direction similarity measure, and λ is the fusion scale parameter.
6. The method of claim 5, wherein the distance similarity measure is the Euclidean distance or the great-circle distance and the direction similarity measure is the vector distance, calculated as follows:

for virtual-hand-based selection techniques, D_distance uses the Euclidean distance and D_direction uses the vector distance (equations rendered as images in the source), where d_euclidean denotes the Euclidean distance between two points;

for virtual-ray-based selection techniques, D_distance uses the great-circle distance and D_direction uses the vector distance (equations rendered as images in the source), where d_circle denotes the great-circle distance between two points.
7. The method of claim 5, wherein the fusion scaling parameter λ is a constant and is fit from empirical data, and the step comprises:
matching the interface content with the target definition and the used selection technology, and designating the number of targets, the side length of the targets and the moving speed as experimental conditions;
presenting an initial display interface to a user, recording track data selected by the user for many times as experience data under the experimental condition, and repeatedly acquiring the experience data under the experimental conditions;
taking different values of λ at intervals within its range (0, 1), substituting each into the similarity calculation formula, and calculating the accuracy of predicting the user's intended target, wherein the λ value corresponding to the highest accuracy is taken as the value of the model parameter.
8. The method of claim 1, wherein the activation target is presented to the user in a highlighted form, and the user presses a control button to select the activation target.
9. A VR target selection system based on user movement trend information, comprising a memory and a processor, the memory storing a computer program configured to be executed by the processor, the program comprising instructions for performing the steps of the method of any of claims 1-8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a computer, implements the method of any one of claims 1 to 8.
CN202010608808.3A 2020-06-29 2020-06-29 Method and system for assisting target selection in three-dimensional environment based on motion trend information Active CN111831178B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010608808.3A CN111831178B (en) 2020-06-29 2020-06-29 Method and system for assisting target selection in three-dimensional environment based on motion trend information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010608808.3A CN111831178B (en) 2020-06-29 2020-06-29 Method and system for assisting target selection in three-dimensional environment based on motion trend information

Publications (2)

Publication Number Publication Date
CN111831178A true CN111831178A (en) 2020-10-27
CN111831178B CN111831178B (en) 2023-01-17

Family

ID=72899574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010608808.3A Active CN111831178B (en) 2020-06-29 2020-06-29 Method and system for assisting target selection in three-dimensional environment based on motion trend information

Country Status (1)

Country Link
CN (1) CN111831178B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106796657A (en) * 2014-11-14 2017-05-31 英特尔公司 For the automatic target selection of multiple target Object tracking
US20160140391A1 (en) * 2014-11-14 2016-05-19 Intel Corporation Automatic target selection for multi-target object tracking
US20180004283A1 (en) * 2016-06-29 2018-01-04 Cheyne Rory Quin Mathey-Owens Selection of objects in three-dimensional space
CN108829248A (en) * 2018-06-01 2018-11-16 中国科学院软件研究所 A kind of mobile target selecting method and system based on the correction of user's presentation model
CN110928914A (en) * 2018-08-30 2020-03-27 百度在线网络技术(北京)有限公司 Method and apparatus for outputting information
CN109782914A (en) * 2019-01-13 2019-05-21 吉林大学 The selection method of target in virtual three-dimensional scene based on pen device axial-rotation
CN109829405A (en) * 2019-01-22 2019-05-31 深圳大学 Data correlation method, device and the storage medium of video object
CN110135314A (en) * 2019-05-07 2019-08-16 电子科技大学 A kind of multi-object tracking method based on depth Trajectory prediction
CN110555061A (en) * 2019-09-06 2019-12-10 北京百度网讯科技有限公司 method and device for determining track similarity
CN110837326A (en) * 2019-10-24 2020-02-25 浙江大学 Three-dimensional target selection method based on object attribute progressive expression
CN110825833A (en) * 2019-11-11 2020-02-21 杭州数澜科技有限公司 Method for predicting user moving track point
CN111260950A (en) * 2020-01-17 2020-06-09 清华大学 Trajectory prediction-based trajectory tracking method, medium and vehicle-mounted equipment
CN111298443A (en) * 2020-01-21 2020-06-19 广州虎牙科技有限公司 Game object control method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI Xiang et al.: "Online simplification method for time-series trajectories of moving objects based on backtracking", Journal of Hunan University (Natural Sciences) *
XIE Bin et al.: "Trajectory prediction algorithm for moving targets based on trajectory similarity", Computer Engineering *

Also Published As

Publication number Publication date
CN111831178B (en) 2023-01-17

Similar Documents

Publication Publication Date Title
De Haan et al. IntenSelect: Using Dynamic Object Rating for Assisting 3D Object Selection.
Jankowski et al. Advances in interaction with 3D environments
Jankowski et al. A survey of interaction techniques for interactive 3D environments
US9684372B2 (en) System and method for human computer interaction
Coelho et al. Pointing task evaluation of leap motion controller in 3D virtual environment
US8169414B2 (en) Control of electronic games via finger angle using a high dimensional touchpad (HDTP) touch user interface
KR102110811B1 (en) System and method for human computer interaction
US10895950B2 (en) Method and system for generating a holographic image having simulated physical properties
CN109782914B (en) Method for selecting target in virtual three-dimensional scene based on axial rotation of pen-type device
CN110969687A (en) Collision detection method, device, equipment and medium
Adhikarla et al. Freehand interaction with large-scale 3D map data
Jörg et al. Virtual hands in VR: Motion capture, synthesis, and perception
Huang et al. Conceptual three-dimensional modeling using intuitive gesture-based midair three-dimensional sketching technique
CN107978018A (en) Construction method, device, electronic equipment and the storage medium of solid figure model
CN111831178B (en) Method and system for assisting target selection in three-dimensional environment based on motion trend information
US10379639B2 (en) Single-hand, full-screen interaction on a mobile device
JPH0269798A (en) Method of turning displayed object
Caputo et al. The smart pin: a novel object manipulation technique for immersive virtual environments
EP2779116B1 (en) Smooth manipulation of three-dimensional objects
CN112435316B (en) Method and device for preventing mold penetration in game, electronic equipment and storage medium
Liu et al. COMTIS: Customizable touchless interaction system for large screen visualization
Halim et al. Raycasting method using hand gesture for target selection on the occluded object in handheld augmented reality
CN114116109A (en) Equipment layout processing method, system, device and storage medium
Messaci et al. Zoom‐fwd: Efficient technique for 3D gestual interaction with distant and occluded objects in virtual reality
CN112596659A (en) Drawing method and device based on intelligent voice and image processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant