CN108595095B - Method and device for simulating movement locus of target body based on gesture control - Google Patents

Method and device for simulating movement locus of target body based on gesture control

Info

Publication number
CN108595095B
CN108595095B (application number CN201810332828.5A)
Authority
CN
China
Prior art keywords
gesture
target body
reference point
change
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810332828.5A
Other languages
Chinese (zh)
Other versions
CN108595095A (en)
Inventor
刘飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baidu Online Network Technology Beijing Co Ltd
Priority to CN201810332828.5A
Publication of CN108595095A
Application granted
Publication of CN108595095B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

The embodiments of the present application disclose a method and a device for controlling the motion trajectory of a simulated target body based on gestures. One specific implementation of the method comprises the following steps: in response to detecting a gesture operation of a user, determining a gesture offset of the user gesture relative to a reference point according to the gesture operation; mapping the gesture offset to a position change parameter of the simulated target body according to a predetermined mapping rule; acquiring current positioning information of the simulated target body; combining the current positioning information of the simulated target body with the position change parameter to determine the changed positioning data of the simulated target body; and updating the positioning of the simulated target body according to the changed positioning data so as to control the motion trajectory of the simulated target body. This embodiment can improve the effectiveness of checking the navigation performance of an electronic map.

Description

Method and device for simulating movement locus of target body based on gesture control
Technical Field
The embodiments of the present application relate to the field of computer technology, in particular to the field of electronic maps, and more particularly to a method and a device for controlling the motion trajectory of a simulated target body based on gestures.
Background
Navigation through an electronic map has become a common means of travel. The navigation performance of an electronic map directly affects the user experience and is often reflected in aspects such as the accuracy of route planning and the response after a user deviates from a route. In the prior art, the navigation performance of an electronic map is typically checked either through actual drive tests or through simulated positioning tools that replay a specific position or a section of track. However, drive tests require high labor and time costs, are inefficient, and can hardly cover remote or dangerous locations and complicated road traffic scenes; simulated positioning tools can only play back specific tracks or positions, require a large amount of track data to be constructed or collected in advance, can only replay positions along specific routes and at specific speeds, and differ noticeably from real position change scenarios such as driving and riding.
Disclosure of Invention
The embodiment of the application provides a method and a device for controlling a motion trail of a simulated target body based on gestures.
In a first aspect, an embodiment of the present application provides a method for controlling the motion trajectory of a simulated target body based on gestures, the method comprising: in response to detecting a gesture operation of a user, determining a gesture offset of the user gesture relative to a reference point according to the gesture operation; mapping the gesture offset to a position change parameter of the simulated target body according to a predetermined mapping rule; acquiring current positioning information of the simulated target body; combining the current positioning information of the simulated target body with the position change parameter to determine the changed positioning data of the simulated target body; and updating the positioning of the simulated target body according to the changed positioning data so as to control the motion trajectory of the simulated target body.
In some embodiments, the gesture offset includes the distance, angle, and change time of the user gesture relative to the reference point.
In some embodiments, the position change parameter includes a change direction, a change speed, and a change distance.
In some embodiments, the predetermined mapping rule comprises: determining the change direction in the position change parameter according to the angle between the user gesture and the reference point; determining the change distance in the position change parameter according to the distance between the user gesture and the reference point; and determining the change speed in the position change parameter according to the distance between the user gesture and the reference point and the change time.
In some embodiments, the positioning information comprises at least one of: coordinates, angle, speed.
In some embodiments, monitoring the motion trajectory of the user gesture relative to the reference point is achieved by at least one of: a detection handle, a mouse, and a gesture detection control.
In a second aspect, an embodiment of the present application further provides an apparatus for controlling the motion trajectory of a simulated target body based on gestures, the apparatus comprising: a determining module configured to, in response to detecting a gesture operation of a user, determine a gesture offset of the user gesture relative to a reference point according to the gesture operation; a mapping module configured to map the gesture offset to a position change parameter of the simulated target body according to a predetermined mapping rule; an acquisition module configured to acquire current positioning information of the simulated target body; a combination module configured to combine the current positioning information of the simulated target body with the position change parameter to determine the changed positioning data of the simulated target body; and an updating module configured to update the positioning of the simulated target body according to the changed positioning data so as to control the motion trajectory of the simulated target body.
In some embodiments, the gesture offset includes the distance, angle, and change time of the user gesture relative to the reference point.
In some embodiments, the position change parameter includes a change direction, a change speed, and a change distance.
In some embodiments, the predetermined mapping rule of the mapping module comprises: determining the change direction in the position change parameter according to the angle between the user gesture and the reference point; determining the change distance in the position change parameter according to the distance between the user gesture and the reference point; and determining the change speed in the position change parameter according to the distance between the user gesture and the reference point and the change time.
In some embodiments, the positioning information comprises at least one of: coordinates, angle, speed.
In some embodiments, monitoring the motion trajectory of the user gesture relative to the reference point is achieved by at least one of: a detection handle, a mouse, and a gesture detection control.
In a third aspect, the present application further provides a computing device, comprising: one or more processors; and storage means for storing one or more programs; where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
In a fourth aspect, the present application provides a computer-readable medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method as described in any implementation manner of the first aspect.
In the method and device for controlling the motion trajectory of a simulated target body based on gestures provided by the embodiments of the present application, in response to detecting a gesture operation of a user, the gesture offset of the user gesture relative to a reference point is determined according to the gesture operation; the gesture offset is then mapped to a position change parameter of the simulated target body according to a predetermined mapping rule; the current positioning information of the simulated target body is acquired; the current positioning information and the position change parameter are combined to determine the changed positioning data of the simulated target body; and the positioning of the simulated target body is updated according to the changed positioning data, so that its motion trajectory is controlled. Because no field drive test with real equipment is needed, the method for controlling the motion trajectory of a simulated target body based on gestures can flexibly simulate various scenarios, and therefore the effectiveness of checking the navigation performance of an electronic map can be improved.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
FIG. 1 illustrates an exemplary system architecture to which the method or apparatus for controlling a motion trajectory of a simulated target based on gestures according to embodiments of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for gesture-based control of a simulated target volume motion trajectory according to the present application;
FIGS. 3a and 3b are schematic diagrams of an application scenario of an embodiment of the method for controlling a motion trajectory of a simulated target body based on gestures according to the present application;
FIG. 4 is a schematic diagram illustrating an embodiment of an apparatus for gesture-based control of a motion profile of a simulated object according to the present application;
FIG. 5 is a schematic block diagram of a computer system suitable for use in implementing an electronic device of an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which the method or apparatus for controlling a motion trajectory of a simulated target based on gestures according to embodiments of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, a network 103, and a server 104. The network 103 serves as a medium for providing communication links between the terminal devices 101, 102 and the server 104. Network 103 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102 to interact with the server 104 over the network 103 to receive or transmit information (e.g., positioning, route navigation processing requests), etc. Various client applications, such as electronic map applications, search applications, communication client applications such as social platform software, and the like, may be installed on the terminal devices 101 and 102.
The terminal devices 101 and 102 may be hardware or software. When the terminal devices 101 and 102 are hardware, they may be various electronic devices having a display screen and/or supporting information transmission, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop computers, desktop computers, and the like. When the terminal devices 101 and 102 are software, they may be installed in the electronic devices listed above and implemented as multiple pieces of software or software modules (e.g., software or software modules used to provide distributed services) or as a single piece of software or software module, which is not specifically limited herein.
The server 104 may be a server that provides various services, such as a background information processing server that supports information such as positioning or route guidance requests sent by the terminal apparatuses 101 and 102. The background information processing server may analyze and process the received data such as the positioning or route navigation request, and feed back a processing result (e.g., a navigation route) to the terminal device.
The server 104 may be hardware or software. When the server 104 is hardware, it may be implemented as a distributed server cluster composed of multiple servers, or may be implemented as a single server. When the server 104 is software, it may be implemented as multiple pieces of software or software modules (e.g., software or software modules used to provide distributed services), or as a single piece of software or software module. And is not particularly limited herein.
It should be further noted that the method for controlling the motion trajectory of a simulated target body based on gestures provided by the embodiments of the present application may be executed by the server 104, by the terminal devices 101 and 102, or by a third-party execution subject that can be connected to the server 104 in a wired or wireless manner. When the method provided by the embodiments of the present application is executed by a third-party device that can be connected to the server 104, the third-party device may be hardware or software. The third-party device performs scenario simulation through the method provided by the embodiments of the present application, and can thereby assist in evaluating the performance of applications, such as electronic maps, supported by the server 104. Correspondingly, the apparatus for controlling the motion trajectory of a simulated target body based on gestures may be disposed in the server 104, in the terminal devices 101 and 102, or in a third-party device that can be connected to the server 104.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for an implementation. The system architecture may not include a network when the electronic device on which the gesture-based control method for simulating a target body movement trajectory operates does not need to perform data transmission with other electronic devices.
FIG. 2 illustrates a flow 200 of one embodiment of a method for controlling the motion trajectory of a simulated target body based on gestures according to the present application. The method is applicable to an execution subject with an image display function. The execution subject can use the simulated target body to simulate a moving object, such as a car, a bicycle, or a person, on the electronic map. The process 200 includes the following steps:
step 201, in response to the detection of the gesture operation of the user, determining a gesture offset of the user gesture relative to the reference point according to the gesture operation.
In this embodiment, an executing subject (for example, a desktop computer) running the method for controlling the motion trajectory of the simulated target body based on the gesture may first detect a gesture operation of a user, and when the gesture operation is detected, determine a gesture offset of the user gesture with respect to a reference point according to the detected gesture operation.
A gesture is a hand movement, which may be a movement of the entire hand or of a finger. In practice, the execution subject may detect the user gesture through at least one of a detection handle, a mouse, or a gesture detection control, but is not limited thereto. When the execution subject detects the user gesture through a detection handle, the detection handle may be a part of the execution subject, in which case the execution subject is hardware or a combination of hardware and software; the detection handle may also be an external device of the execution subject, in which case it may be connected directly to the execution subject if the execution subject is hardware, or to the hardware on which the execution subject runs if the execution subject is software. When the execution subject detects the user's gesture operation through a gesture detection control, it may detect the user's touch operation on the screen, or the movement of a mouse pointer over the on-screen gesture detection control as the mouse moves. Alternatively, the detection handle may be any physical component, such as a steering wheel or a game pad, that can be used to input a gesture change signal; this is not specifically limited in the present application.
The execution subject can measure the change of the user gesture through the gesture offset. The execution subject may preset a fixed reference point for acquiring the gesture offset (for example, the center point of the detection handle), or it may take the starting point of the user's gesture operation as the reference point. The present application is not limited in this respect.
In some optional implementations of this embodiment, the gesture offset includes the distance, angle, and change time of the user gesture relative to the reference point. Within one gesture operation, the execution subject may acquire the gesture offset once, or multiple times at a preset time interval. When the gesture offset is acquired multiple times at the preset time interval, the reference point may also change: the reference point of the first acquisition is the starting point of the gesture operation, and the reference point of each subsequent acquisition is the end point of the previous acquisition. The distance between the user gesture and the reference point may be the distance between the end point of the gesture operation and the reference point. The angle between the user gesture and the reference point indicates the moving direction of the user gesture and may be described in a coordinate system with the reference point as the origin (for example, a planar coordinate system on the gesture capture surface with the reference point as the origin and horizontal and vertical coordinate axes), such as an angle of 45 degrees from the horizontal axis. The change time of the user gesture, together with the distance between the user gesture and the reference point, can be used to measure the speed at which the user gesture changes: the greater the distance within the same change time, or the shorter the change time over the same distance, the greater the speed of change of the user gesture.
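As a non-limiting illustration of the gesture offset acquisition described above, the following Python sketch shows one possible way to compute the distance, angle, and change time of a user gesture relative to a moving reference point. The data structures and function names (GestureSample, GestureOffset, compute_offset, sample_offsets) are assumptions introduced only for this example and are not part of the claimed implementation.
```python
# Illustrative sketch only; the names and structures below are assumptions,
# not the patent's prescribed implementation.
import math
from dataclasses import dataclass

@dataclass
class GestureSample:
    x: float   # position on the gesture capture surface
    y: float
    t: float   # timestamp in seconds

@dataclass
class GestureOffset:
    distance: float   # distance of the user gesture from the reference point
    angle: float      # moving direction, in degrees from the horizontal axis
    duration: float   # change time between the reference point and the end point

def compute_offset(reference: GestureSample, end_point: GestureSample) -> GestureOffset:
    """Gesture offset of the gesture end point relative to the reference point."""
    dx = end_point.x - reference.x
    dy = end_point.y - reference.y
    return GestureOffset(
        distance=math.hypot(dx, dy),
        angle=math.degrees(math.atan2(dy, dx)),
        duration=end_point.t - reference.t,
    )

def sample_offsets(samples: list[GestureSample]) -> list[GestureOffset]:
    """Repeated acquisition at a preset interval: each reference point is the
    end point of the previous acquisition, as described above."""
    return [compute_offset(prev, cur) for prev, cur in zip(samples, samples[1:])]
```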
Step 202, mapping the gesture offset to a position change parameter of the simulation target body according to a preset mapping rule.
In this embodiment, the execution main body may store a predetermined mapping rule, and in this step, the execution main body may then map the gesture offset to the position change parameter of the simulation target body according to the predetermined mapping rule.
The position change parameter of the simulated target body may be a parameter expressing the change that the simulated target body is to make. For example, the position change parameter may include, but is not limited to, at least one of a change direction, a change speed, a change distance, and the like, where: the change direction may be either the direction of the change relative to the current direction of the simulated target body, or its new direction after the change; the change speed may be either the speed of the simulated target body after the change, or the rate at which its moving speed changes, that is, the acceleration; and the change distance may be the distance over which the simulated target body moves.
It can be understood that the gesture operation controls the motion trajectory of the simulated target body, but it does not directly reproduce the motion of the simulated target body in the coordinate system in which the simulation takes place. Therefore, after acquiring the gesture offset, the execution subject needs to convert it into the position change parameter of the simulated target body. In practice, the conversion may be performed according to a predetermined mapping rule, which may, for example, convert the gesture offset of the user gesture relative to the reference point into the position change parameter of the simulated target body equally or proportionally.
It should be noted that the position change of the simulated target body may be expressed relative to different reference points: the start point of the motion trajectory of the simulated target body, the origin of the coordinate system in which the motion trajectory lies, or the current position of the simulated target body. To simplify the calculation and reduce the data processing load, in some implementations the execution subject may use the current position of the simulated target body as the reference point for its position change.
In some optional implementations of this embodiment, the gesture offset includes a distance, an angle, and a change time between the user gesture and the reference point, and the position change parameter includes a change direction, a change speed, and a change distance, where the predetermined mapping rule may include: determining a change direction in the position change parameter according to an angle between the user gesture and the reference point, for example, the change direction in the position change parameter may be consistent with the angle between the user gesture and the reference point; determining a variation distance in the position variation parameter according to a distance between the user gesture and the reference point, for example, the variation distance in the position variation parameter may be positively correlated with the distance between the user gesture and the reference point; the speed of change in the location change parameter is determined from the distance and time of change of the gesture from the reference point, e.g., the speed of change in the location change parameter may be positively correlated to a quotient of the distance and time of change of the gesture from the reference point.
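Continuing the sketch above, one possible form of the predetermined mapping rule could be the proportional conversion below; the PositionChange structure and the scale factors are assumptions, since the description only requires the stated positive correlations.
```python
from dataclasses import dataclass

@dataclass
class PositionChange:
    direction: float   # change direction, kept consistent with the gesture angle (degrees)
    distance: float    # change distance, positively correlated with the gesture distance
    speed: float       # change speed, positively correlated with distance / change time

def map_offset(offset: GestureOffset,
               distance_scale: float = 0.5,   # assumed scale factor
               speed_scale: float = 0.5       # assumed scale factor
               ) -> PositionChange:
    """Map a gesture offset to a position change parameter of the simulated target body."""
    rate = offset.distance / offset.duration if offset.duration > 0 else 0.0
    return PositionChange(
        direction=offset.angle,
        distance=offset.distance * distance_scale,
        speed=rate * speed_scale,
    )
```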
Step 203, acquiring the current positioning information of the simulated target body.
In this embodiment, the execution subject may further obtain the current positioning information of the simulated target body. The current positioning information of the simulated target body may include, but is not limited to, at least one of: coordinates, angle, speed. The coordinates may be longitude and latitude values of the simulated target body in a coordinate system, such as the coordinate system of the electronic map; the angle may be the angle formed between a coordinate axis and the line connecting the simulated target body to the origin of the coordinate system; and the speed may be the moving speed of the simulated target body in the coordinate system, which may be expressed as a speed converted into real-world terms (for example, a vehicle speed of 45 kilometers per hour) or as an actual moving distance in the electronic map (for example, 5 pixels per second).
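Purely for illustration, the current positioning information could be represented by a structure such as the following; the field names and units are assumptions rather than part of the described implementation.
```python
from dataclasses import dataclass

@dataclass
class Positioning:
    x: float       # coordinate in the electronic map coordinate system (illustrative units)
    y: float
    angle: float   # heading, in degrees
    speed: float   # moving speed of the simulated target body
```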
Step 204, combining the current positioning information of the simulated target body with the position change parameter to determine the changed positioning data of the simulated target body.
In this embodiment, the executing body may further determine the positioning data after the change of the simulated object by combining the current positioning information of the simulated object and the position change parameter. It will be appreciated that the changed positioning data of the simulated object may correspond to the current positioning information of the simulated object in step 203, and may include parameters corresponding to the current positioning information to determine the changed movement status and/or position status of the simulated object.
In some implementations, if the change direction in the position change parameter is the direction of the simulated target body after the change, the execution subject may directly take it as the direction in the positioning data of the simulated target body. In other implementations, the change direction may be the direction of the change relative to the current direction, in which case the execution subject may superimpose it onto the angle in the current positioning information of the simulated target body to obtain the direction in the positioning data. The change speed can be handled in the same way, which is not repeated here. For the coordinates in the changed positioning data, the execution subject may decompose the change distance along the coordinate axes according to the change direction and superimpose the components onto the current coordinates of the simulated target body. It will be appreciated that the implementations listed here are examples; they are not exhaustive and do not cover all possible cases.
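The following sketch illustrates one of the implementations listed above, in which the change direction is treated as the new heading and the change distance is decomposed along the coordinate axes; it reuses the assumed Positioning and PositionChange structures and is only one possible reading of the combination step.
```python
import math

def apply_change(current: Positioning, change: PositionChange) -> Positioning:
    """Combine current positioning information with the position change parameter."""
    rad = math.radians(change.direction)
    return Positioning(
        # decompose the change distance along the coordinate axes and
        # superimpose the components onto the current coordinates
        x=current.x + change.distance * math.cos(rad),
        y=current.y + change.distance * math.sin(rad),
        angle=change.direction,   # here the change direction is taken as the new heading
        speed=change.speed,       # here the change speed is taken as the new speed
    )
```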
Step 205, updating the positioning of the simulated target body according to the changed positioning data so as to control the motion trajectory of the simulated target body.
In this embodiment, the execution subject may further update the positioning of the simulated target body according to the changed positioning data, so as to control its motion trajectory. As described above, the positioning may cover one or more of the speed, direction, and coordinates of the simulated target body. In some implementations, the execution subject may also display the updated positioning information and the updated motion trajectory of the simulated target body on a screen.
Referring to FIGS. 3a and 3b, FIGS. 3a and 3b show an application scenario of the method for controlling the motion trajectory of a simulated target body based on gestures according to an embodiment of the present application. In this application scenario, the electronic device executing the method comprises at least a touch screen and a processor: the processor can execute a program implementing the method for controlling the motion trajectory of a simulated target body based on gestures, and the touch screen can display the motion trajectory of the simulated target body in the electronic map and provide a human-computer interaction medium for acquiring the user gesture through the gesture detection control. As shown in FIG. 3a, the simulated target body 311 is located in the electronic map 310. FIG. 3b shows a gesture detection control 320 and a user gesture 322 on the touch screen. In this application scenario, the reference point for gesture acquisition is the center point 321 of the gesture detection control 320.
First, a user slides the surface of the touch screen with a hand, the gesture detection control 320 determines an initial point of the user contacting the touch screen as a central point 321 of the gesture detection control 320 and displays the initial point on the touch screen, the gesture detection control 320 automatically detects a user gesture operation, and the electronic device may determine a gesture offset (e.g., a distance, an angle, a change time, etc. of the user gesture 322 from the central point 321) according to the user gesture operation. Then, the electronic device may map the gesture offset to a position change parameter of the simulation object 311 in the electronic map 310 according to a predetermined mapping rule. Then, the electronic device may obtain the current positioning information (such as coordinates, angle, speed, etc.) of the simulation object 311 in the electronic map 310. Further, the electronic device may combine the current positioning information of the simulation object 311 and the position change parameter to determine the positioning data of the simulation object 311 after the change in the electronic map 310. Then, the electronic device may further update the positioning of the simulation object 311 according to the positioning data after the position change and display the updated positioning data in the electronic map 310 to control the motion trajectory of the simulation object 311.
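As a purely hypothetical end-to-end illustration of this scenario, the sketches above could be chained as follows to turn a short sequence of touch samples into an updated trajectory of the simulated target body; all sample values and the starting position are invented for this example.
```python
# Hypothetical usage of the sketches above; every value is invented for illustration.
samples = [
    GestureSample(x=0.0,  y=0.0,  t=0.0),   # center point of the gesture detection control
    GestureSample(x=10.0, y=10.0, t=0.5),
    GestureSample(x=25.0, y=12.0, t=1.0),
]

position = Positioning(x=100.0, y=200.0, angle=0.0, speed=0.0)  # assumed start position
trajectory = [position]

for offset in sample_offsets(samples):
    change = map_offset(offset)                # predetermined mapping rule
    position = apply_change(position, change)  # changed positioning data
    trajectory.append(position)                # drives the motion trajectory on the map

for p in trajectory:
    print(f"({p.x:.1f}, {p.y:.1f}) heading {p.angle:.1f} deg, speed {p.speed:.1f}")
```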
Therefore, the method for controlling the motion trajectory of a simulated target body based on gestures controls the simulated target body in the electronic map through gestures, and no field drive test with real equipment is needed, so the effectiveness of checking the navigation performance of the electronic map can be improved.
With further reference to fig. 4, as an implementation of the method for controlling a motion trajectory of a simulated object based on a gesture, the present application provides an embodiment of an apparatus for controlling a motion trajectory of a simulated object based on a gesture, where the embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2.
As shown in fig. 4, the apparatus 400 for controlling the motion trajectory of the simulated target based on the gesture of the present embodiment includes: a determination module 401, a mapping module 402, an acquisition module 403, a combination module 404, and an update module 405. The determining module 401 may be configured to, in response to detecting the gesture operation of the user, determine a gesture offset of the gesture of the user relative to the reference point according to the gesture operation; the mapping module 402 may be configured to map the gesture offset to a position change parameter of the simulated target volume according to a predetermined mapping rule; the obtaining module 403 may be configured to obtain current positioning information of the simulation target; the combining module 404 may be configured to combine the current positioning information of the simulated object with the position change parameter to determine the changed positioning data of the simulated object; the update module 405 may be configured to update the position of the simulated object according to the position-changed positioning data to control the motion trajectory of the simulated object.
In this embodiment, the determining module 401 may first determine, in response to detecting the gesture operation of the user, a gesture offset of the gesture of the user with respect to the reference point according to the gesture operation. The gesture offset may be used to measure the change in the user's gesture. In some implementations, the gesture offset may include one or more of a distance, an angle, a time of change of the user gesture from a reference point.
In this embodiment, the mapping module 402 may store a predetermined mapping rule thereon, and the mapping module 402 may then map the gesture offset to the position change parameter of the simulation target body according to the predetermined mapping rule. The position change parameter of the simulation target body may be a parameter for expressing a change to be made by the simulation target body. For example, the location change parameter may include, but is not limited to, at least one of a direction of change, a speed of change, a distance of change, and the like. In some optional implementations of this embodiment, the gesture offset includes a distance, an angle, and a change time between the user gesture and the reference point, and the position change parameter includes a change direction, a change speed, and a change distance, where the predetermined mapping rule may include: determining a change direction in the position change parameter according to the angle between the user gesture and the reference point; determining a variation distance in the position variation parameter according to the distance between the user gesture and the reference point; and determining the change speed in the position change parameters according to the distance between the gesture and the datum point and the change time.
In this embodiment, the obtaining module 403 may then obtain the current positioning information of the simulated target volume. Wherein the current positioning information of the simulated target volume may include, but is not limited to, at least one of: coordinates, angle, speed.
In this embodiment, the combining module 404 may then combine the current positioning information of the simulated object and the position change parameter to determine the changed positioning data of the simulated object. It is understood that the positioning data of the simulated object after being changed may correspond to the current positioning information of the simulated object obtained by the obtaining module 403, which may include parameters corresponding to the current positioning information for determining the changed motion state and/or position state of the simulated object.
In this embodiment, the updating module 405 may then update the position of the simulated object according to the positioning data after the position change, so as to control the motion trajectory of the simulated object. In some implementations, the gesture-based apparatus 400 for controlling the movement trace of the simulation target may further include a display module for displaying the updated positioning information and the movement trace of the simulation target through a screen.
It is noted that the modules described in the device 400 for controlling the motion trail of the simulated target body based on the gesture correspond to the steps of the method described with reference to fig. 2. Thus, the operations and features described above for the method are equally applicable to the apparatus 400 and the modules or units included therein, and are not described in detail here.
Those skilled in the art will appreciate that the apparatus 400 for controlling a trajectory of a simulated object based on gestures described above may also include some other well-known structures, such as a processor, memory, etc., which are not shown in fig. 4 in order to not unnecessarily obscure embodiments of the present disclosure.
Reference is now made to FIG. 5, which illustrates a block diagram of a computer system 500 suitable for implementing a terminal device/server of an embodiment of the present application. The terminal device/server shown in FIG. 5 is only an example and should not impose any limitation on the functions or the scope of use of the embodiments of the present application.
As shown in FIG. 5, the computer system 500 includes a Central Processing Unit (CPU) 501, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage section 508 into a Random Access Memory (RAM) 503. The RAM 503 also stores various programs and data necessary for the operation of the system 500. The CPU 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
The following components are connected to the I/O interface 505: an input portion 506 including a keyboard or a touch panel; an output portion 507 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD); a storage portion 508 including a hard disk and the like; and a communication portion 509 including a network interface card such as a LAN card or a modem. The communication portion 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as necessary. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as necessary, so that a computer program read therefrom can be installed into the storage portion 508 as needed.
In particular, the processes described above with reference to the flow diagrams may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 509, and/or installed from the removable medium 511. When executed by the Central Processing Unit (CPU) 501, the computer program performs the above-described functions defined in the method of the present application.
It should be noted that the computer readable medium referred to in this application may be a computer readable signal medium, a computer readable storage medium, or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium, by contrast, may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium, other than a computer readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present application may be implemented by software or hardware. The described modules may also be provided in a processor, which may be described as: a processor includes a determination module, a mapping module, an acquisition module, a combining module, and an update module. The names of these modules do not, in some cases, constitute a limitation on the modules themselves; for example, the acquisition module may also be described as a "module configured to acquire current positioning information of the simulated target body".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: in response to detecting the gesture operation of the user, determining the gesture offset of the gesture of the user relative to the reference point according to the gesture operation; mapping the gesture offset into a position change parameter of the simulation target body according to a preset mapping rule; acquiring current positioning information of a simulation target body; combining the current positioning information of the simulated target body with the position change parameters to determine the positioning data of the simulated target body after the change; and updating and positioning the simulated target body according to the positioning data after the position change so as to control the motion track of the simulated target body.
The foregoing description is only exemplary of the preferred embodiments of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (14)

1. A method for controlling the motion trajectory of a simulated target body based on gestures, comprising the following steps:
in response to detecting a gesture operation of a user, determining a gesture offset of the user gesture relative to a reference point according to the gesture operation, wherein the gesture offset represents multiple gesture offsets acquired at a preset time interval with different reference points; when the multiple gesture offsets are acquired at the preset time interval, the reference point of the first acquisition is the starting point of the gesture operation, and the reference point of each subsequent acquisition is the end point of the previous acquisition; the gesture offset at least comprises the distance between the user gesture and the reference point, and the distance between the user gesture and the reference point is the distance between the end point of the gesture operation and the reference point;
mapping the gesture offset to a position change parameter of the simulated target body according to a predetermined mapping rule, wherein the simulated target body is used for simulating a moving object on an electronic map in a map application;
acquiring current positioning information of the simulated target body;
combining the current positioning information of the simulated target body with the position change parameter to determine the changed positioning data of the simulated target body;
updating the positioning of the simulated target body according to the changed positioning data so as to control the motion trajectory of the simulated target body; and
performing performance evaluation on the map application based on the motion trajectory.
2. The method of claim 1, wherein the gesture offset comprises an angle and a change time of the user gesture relative to the reference point.
3. The method of claim 2, wherein the position change parameter comprises a change direction, a change speed, and a change distance.
4. The method of claim 3, wherein the predetermined mapping rule comprises:
determining the change direction in the position change parameter according to the angle between the user gesture and the reference point;
determining the change distance in the position change parameter according to the distance between the user gesture and the reference point; and
determining the change speed in the position change parameter according to the distance between the user gesture and the reference point and the change time.
5. The method of claim 1, wherein the positioning information comprises at least one of: coordinates, angle, speed.
6. The method of claim 1, wherein monitoring the motion trajectory of the user gesture relative to the reference point is achieved by at least one of:
a detection handle, a mouse, and a gesture detection control.
7. An apparatus for controlling the motion trajectory of a simulated target body based on gestures, comprising:
a determining module configured to, in response to detecting a gesture operation of a user, determine a gesture offset of the user gesture relative to a reference point according to the gesture operation, wherein the gesture offset represents multiple gesture offsets acquired at a preset time interval with different reference points; when the multiple gesture offsets are acquired at the preset time interval, the reference point of the first acquisition is the starting point of the gesture operation, and the reference point of each subsequent acquisition is the end point of the previous acquisition; the gesture offset at least comprises the distance between the user gesture and the reference point, and the distance between the user gesture and the reference point is the distance between the end point of the gesture operation and the reference point;
a mapping module configured to map the gesture offset to a position change parameter of the simulated target body according to a predetermined mapping rule, wherein the simulated target body is used for simulating a moving object on an electronic map in a map application;
an acquisition module configured to acquire current positioning information of the simulated target body;
a combination module configured to combine the current positioning information of the simulated target body with the position change parameter to determine the changed positioning data of the simulated target body;
an updating module configured to update the positioning of the simulated target body according to the changed positioning data so as to control the motion trajectory of the simulated target body; and
an evaluation module configured to perform performance evaluation on the map application based on the motion trajectory.
8. The apparatus of claim 7, wherein the gesture offset comprises an angle and a change time of the user gesture relative to the reference point.
9. The apparatus of claim 8, wherein the position change parameter comprises a change direction, a change speed, and a change distance.
10. The apparatus of claim 9, wherein the predetermined mapping rule of the mapping module comprises:
determining the change direction in the position change parameter according to the angle between the user gesture and the reference point;
determining the change distance in the position change parameter according to the distance between the user gesture and the reference point; and
determining the change speed in the position change parameter according to the distance between the user gesture and the reference point and the change time.
11. The apparatus of claim 7, wherein the positioning information comprises at least one of: coordinates, angle, speed.
12. The apparatus of claim 7, wherein monitoring the motion trajectory of the user gesture relative to the reference point is achieved by at least one of:
a detection handle, a mouse, and a gesture detection control.
13. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method of any one of claims 1-6.
14. A computer storage medium having a computer program stored thereon, wherein the program when executed by a processor implements the method of any one of claims 1-6.
CN201810332828.5A 2018-04-13 2018-04-13 Method and device for simulating movement locus of target body based on gesture control Active CN108595095B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810332828.5A CN108595095B (en) 2018-04-13 2018-04-13 Method and device for simulating movement locus of target body based on gesture control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810332828.5A CN108595095B (en) 2018-04-13 2018-04-13 Method and device for simulating movement locus of target body based on gesture control

Publications (2)

Publication Number Publication Date
CN108595095A CN108595095A (en) 2018-09-28
CN108595095B true CN108595095B (en) 2022-07-15

Family

ID=63622567

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810332828.5A Active CN108595095B (en) 2018-04-13 2018-04-13 Method and device for simulating movement locus of target body based on gesture control

Country Status (1)

Country Link
CN (1) CN108595095B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109445585A (en) * 2018-10-22 2019-03-08 广州星唯信息科技有限公司 One kind simulating true steering direction method based on gesture manipulation
CN109710066B (en) * 2018-12-19 2022-03-25 平安普惠企业管理有限公司 Interaction method and device based on gesture recognition, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103206952A (en) * 2012-01-16 2013-07-17 联想(北京)有限公司 Positioning method and positioning apparatus
CN106126073A (en) * 2016-06-12 2016-11-16 天脉聚源(北京)教育科技有限公司 A kind of method and device of mobile destination object
CN106178504A (en) * 2016-06-27 2016-12-07 网易(杭州)网络有限公司 Virtual objects motion control method and device
CN107861682A (en) * 2017-11-03 2018-03-30 网易(杭州)网络有限公司 The control method for movement and device of virtual objects

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10627904B2 (en) * 2014-02-07 2020-04-21 Ultrahaptics IP Two Limited Systems and methods of determining interaction intent in three-dimensional (3D) sensory space

Also Published As

Publication number Publication date
CN108595095A (en) 2018-09-28

Similar Documents

Publication Publication Date Title
CN109141464B (en) Navigation lane change prompting method and device
CN109215372B (en) Road network information updating method, device and equipment
US10077986B2 (en) Storing trajectory
US20180354512A1 (en) Driverless Vehicle Testing Method and Apparatus, Device and Storage Medium
US10650598B2 (en) Augmented reality-based information acquiring method and apparatus
US11073396B2 (en) Integrated positioning method and system
CN112798004B (en) Positioning method, device and equipment for vehicle and storage medium
CN115616937B (en) Automatic driving simulation test method, device, equipment and computer readable medium
CN110781263A (en) House resource information display method and device, electronic equipment and computer storage medium
CN109345015B (en) Method and device for selecting route
CN108595095B (en) Method and device for simulating movement locus of target body based on gesture control
CN116088538B (en) Vehicle track information generation method, device, equipment and computer readable medium
CN111382701B (en) Motion capture method, motion capture device, electronic equipment and computer readable storage medium
CN110321854B (en) Method and apparatus for detecting target object
CN110542425B (en) Navigation path selection method, navigation device, computer equipment and readable medium
CN111340880A (en) Method and apparatus for generating a predictive model
CN109827610A (en) Method and apparatus for check sensor fusion results
CN112651535A (en) Local path planning method and device, storage medium, electronic equipment and vehicle
CN113984109B (en) Track detection data correction method and device and electronic equipment
CN115372020A (en) Automatic driving vehicle test method, device, electronic equipment and medium
JP2019101622A (en) Information processing device, control method, and program
JP6383063B1 (en) Calculation device, calculation method, and calculation program
CN111609859A (en) Navigation information display method and device, storage medium and electronic equipment
CN111562749A (en) AI-based general intelligent household scheme automatic design method and device
CN111401229A (en) Visual small target automatic labeling method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant