CN115933877A - Virtual interaction method, system and readable storage medium based on haptic force feedback - Google Patents

Virtual interaction method, system and readable storage medium based on haptic force feedback

Info

Publication number
CN115933877A
Authority
CN
China
Prior art keywords
virtual
force
robot
user
virtual object
Prior art date
Legal status
Pending
Application number
CN202211544124.7A
Other languages
Chinese (zh)
Inventor
张明明
孙晨阳
刘昱东
Current Assignee
Southern University of Science and Technology
Original Assignee
Southern University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Southern University of Science and Technology filed Critical Southern University of Science and Technology
Priority to CN202211544124.7A priority Critical patent/CN115933877A/en
Publication of CN115933877A publication Critical patent/CN115933877A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

The application relates to the technical field of virtual interaction, and discloses a virtual interaction method, system and readable storage medium based on haptic force feedback. The method comprises the following steps: when a user interacts with a virtual object in a virtual environment through a robot, acquiring the virtual acting force of the virtual object in the virtual environment on the user; calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot; and controlling the robot to generate an interaction force acting on the user at the end according to the motor control force, and acquiring end motion information of the robot. The method can provide haptic feedback while the user interacts with the virtual environment, and improves the user experience through haptic interconnection between the virtual environment and other users.

Description

Virtual interaction method, system and readable storage medium based on haptic force feedback
Technical Field
The present application relates to the field of virtual interaction technology, and in particular, to a virtual interaction method, system and readable storage medium based on haptic force feedback.
Background
Virtual Reality (VR) interaction technology is increasingly widely applied in many fields. In existing VR interaction technology, however, the interaction is often unidirectional: a user can only perform one-way control operations in a virtual environment through corresponding virtual devices, for example in 3D games; or interaction is only realized among characters in the virtual environment, so that each user mainly sees, visually, the character he or she controls interacting with other characters, but cannot feel the real interaction force from other users and cannot perceive the force he or she is applying, which results in a poor user experience. In particular, when some tasks need to be completed over a long period with the help of virtual technology, such as certain game sports or rehabilitation training, the existing virtual technology can only realize one-way control, offers a single interaction mode, and cannot realize multi-person interaction; because the training period is long, users can hardly persist in the training or exercise.
Disclosure of Invention
In view of the above, to solve at least one of the above technical problems, embodiments of the present application provide a virtual interaction method, system and readable storage medium based on haptic force feedback.
In a first aspect, an embodiment of the present application provides a virtual interaction method based on haptic force feedback, including:
when a user interacts with a virtual object in a virtual environment through a robot, acquiring a virtual acting force of the virtual object in the virtual environment on the user;
calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot;
controlling the robot to generate an interaction force acting on the user at the end according to the motor control force, and acquiring end motion information of the robot;
controlling the virtual object to move in the virtual environment based on the end motion information.
In some embodiments, the virtual environment is built based on a model that enables conversion between force and displacement.
In some embodiments, the acquiring the virtual acting force of the virtual object on the user in the virtual environment comprises:
converting the terminal motion information of the robot into a first acting force, and calculating the virtual environment force of the virtual object currently in the virtual environment;
and calculating the motion information of the virtual object according to the virtual environment force and the first acting force, and further determining the virtual acting force of the virtual object on the user in the virtual environment based on a force interaction model corresponding to the virtual object according to the motion information of the virtual object.
In some embodiments, when the virtual object interacts only with the user, the virtual environment force is calculated from one or more combinations of the gravity, friction, and collision forces experienced by the virtual object in the virtual environment.
In some embodiments, when the user interacts with at least one other user through the virtual object, the virtual environment force is calculated according to one or more combinations of gravity, friction, and collision force to which the virtual object is subjected in the virtual environment, and a resultant vector force applied to the virtual object by the other user.
In some embodiments, the compensation amount includes a robot system resistance compensation amount, and if the robot system is based on open-loop control, the calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot includes:
and taking the sum of the virtual acting force and the system resistance compensation quantity of the robot as the motor control force of the robot.
In some embodiments, the calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot includes:
and taking the sum of the virtual acting force, the system resistance compensation quantity and the inertia compensation quantity of the robot as the motor control force of the robot.
In some embodiments, the end of the robot is provided with a force sensor; the compensation amount further comprises an inertia compensation amount, which is obtained by applying a preset compensation coefficient to the actual acting force of the user on the end, as fed back by the force sensor.
In some embodiments, the robot tip is provided with a force sensor for feeding back the actual force applied by the user to the tip; if the robot system is based on closed-loop control, calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot, wherein the calculation comprises the following steps:
outputting closed-loop control quantity after closed-loop calculation is carried out through a preset closed-loop controller according to the error between the virtual acting force and the counterforce of the actual acting force; and adding the closed-loop control quantity and the system resistance compensation quantity to obtain the motor control force of the robot.
In some embodiments, the force interaction model is constructed based on a virtual spring damping system.
In some embodiments, the haptic force feedback-based virtual interaction method further comprises:
and when the user does not interact with any virtual object in the virtual environment through the robot, calculating the motor control force of the robot based on the compensation amount of the robot.
In some embodiments, the haptic force feedback-based virtual interaction method further comprises:
and visually displaying the motion state of the virtual object in the virtual environment.
In a second aspect, the present application provides a virtual interaction system based on haptic force feedback, including: the robot system comprises at least one terminal device running a virtual environment program and at least one robot;
each terminal device is used for executing the virtual interaction method based on the tactile force feedback when a corresponding user acts on the robot terminal.
In a third aspect, the present application provides a readable storage medium storing a computer program, which when executed on a processor implements the above-mentioned virtual interaction method based on haptic force feedback.
The embodiment of the application has the following beneficial effects:
according to the virtual interaction method based on the tactile force feedback, the virtual environment and the robot control model are constructed in advance, the robot is used as a mediation, a user can control the virtual object to move and other interactive operations in the virtual environment through the robot, meanwhile, the virtual acting force of the virtual object in the virtual environment on the user is calculated, the force acting on the user is generated at the tail end of the robot, and therefore the user feels the matched virtual interaction force, and the tactile feedback in the real scene is felt in the interaction process; in addition, the method can also realize the touch interconnection operation among a plurality of users, namely, each user can feel the virtual force of other users to the same virtual object, and the like, thereby greatly improving the interestingness and the like during user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained from the drawings without inventive effort.
FIG. 1 is a block diagram illustrating a virtual interactive control system without inertia compensation based on open loop control according to an embodiment of the present application;
FIG. 2 is a flow chart diagram illustrating a haptic force feedback based virtual interaction method according to an embodiment of the present application;
FIG. 3 illustrates a schematic diagram of a force interaction model for a virtual object in a virtual environment in accordance with an embodiment of the present application;
FIG. 4 is a diagram illustrating a force interaction model for multiple users acting on the same rigid body virtual object in a virtual environment according to an embodiment of the present application;
FIG. 5 illustrates a schematic diagram of a force interaction model for multiple users acting on the same non-rigid body virtual object in a virtual environment according to an embodiment of the present application;
FIG. 6 is a block diagram of a virtual interactive system based on open-loop control and inertia compensation according to an embodiment of the present application;
FIG. 7 is a block diagram of a virtual interactive system based on closed-loop control according to an embodiment of the present application;
FIG. 8 illustrates a structural schematic diagram of a force interaction model of a virtual environment of an embodiment of the present application;
FIG. 9 is a schematic diagram of a first structure of a virtual interactive system based on haptic force feedback according to an embodiment of the present application;
FIG. 10 is a diagram illustrating a second structure of a virtual interactive system based on haptic force feedback according to an embodiment of the present application;
fig. 11 shows a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
Hereinafter, the terms "including", "having", and their derivatives, which may be used in various embodiments of the present application, are intended to indicate only the presence of specific features, numbers, steps, operations, elements, components, or combinations of the foregoing, and should not be construed as excluding the existence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations of the foregoing. Furthermore, the terms "first," "second," "third," and the like are used solely to distinguish one element from another, and are not to be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the various embodiments of the present application belong. The terms (such as those defined in commonly used dictionaries) should be interpreted as having a meaning that is consistent with their contextual meaning in the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein in various embodiments.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments and features of the embodiments described below can be combined with each other without conflict.
Referring to fig. 1, an embodiment of the present application provides a virtual interaction method based on haptic force feedback, which is applied to a virtual interaction system. The virtual interaction system includes a terminal device capable of running a virtual environment and a robot connected to the system, where the robot, acting as an intermediary, allows a user to control a virtual object in the virtual environment and lets the user feel the reaction force of the virtual environment through the robot, that is, it implements haptic feedback. It is understood that the application scenario of the virtual interaction system is not limited; it may be, for example, a pure entertainment scenario, a rehabilitation exercise scenario, or any other scenario in which corresponding operations are performed in a virtual environment in combination with a robot. Accordingly, the form of the robot is not limited: if applied to a pure entertainment scenario, the robot may be a gaming robot; if applied to a rehabilitation exercise scenario, it may be a rehabilitation robot for limb rehabilitation of a patient, and so on. Furthermore, when multiple users act on a virtual object in the virtual environment at the same time, each user can also feel the acting forces of the other users, that is, haptic interconnection is realized, which greatly increases the sense of realism during operation and further improves the enjoyment of the operation.
As shown in fig. 2, the virtual interaction method based on haptic force feedback exemplarily includes:
step S110, when the user interacts with the virtual object in the virtual environment through the robot, acquiring the virtual acting force of the virtual object in the virtual environment to the user.
The robot may include, but is not limited to, various robot structures linked with a virtual environment, such as a rehabilitation robot for upper limb rehabilitation training, an entertainment robot, and the like. For example, for a robot with a handle, the user may apply an external force to the handle by hand to move it along a sliding guide. When the user selects one of the virtual objects in the virtual environment, the motion state of that virtual object in the virtual environment can then be controlled through the motion of the handle. It can be understood that when the handle is displaced by the user, the displacement information is transmitted to the virtual interaction system, and the system controls the motion of the virtual object according to the displacement information and other information.
Exemplarily, when a user makes contact with the robot end, for example by holding it with a hand, the robot end undergoes a displacement change, which induces a displacement of the virtual object in the virtual environment; a corresponding virtual force signal is generated from that displacement, and the next control cycle is entered. It can be understood that the virtual environment is essentially a system that converts displacement into virtual acting force: the displacement of the robot end is input into the virtual environment, and the virtual environment calculates the virtual acting force of the virtual object on the user from information such as that end displacement. This virtual force is then transmitted to the robot end through the robot control, so that the user feels a matching virtual interaction force. The virtual acting force is calculated by a designed conversion model between force and displacement; in other words, the virtual environment model is constructed based on a conversion model between force and displacement, of which there are many possible implementations, not limited herein.
In the virtual environment, various virtual objects and the operation rules corresponding to them may be set. For example, the virtual environment may include modes operable by a single person, such as box carrying, or one or more combinations of cooperative modes for two, three, or more persons, such as tug-of-war, table tennis, or carrying a box together. In a two-person or multi-person operation mode, the users may choose to act on the same virtual object at the same time or on different virtual objects at the same time; when acting on different virtual objects, contact between the virtual objects can also bring the users a sense of haptic interaction.
Considering a real environment, when a user interacts with any object, the object is subjected not only to the user's acting force but also to environmental forces such as gravity, friction, collision force, elastic force, and the like. Accordingly, a virtual object in the virtual environment also possesses basic physical properties, such as mass and a coefficient of friction; in addition, the interaction between the virtual object and the virtual environment follows basic physical laws, such as Newton's laws and the momentum theorem. By taking these forces into account, a more accurate virtual environment force can be calculated, and in turn a more accurate virtual acting force of the virtual object on the user, so that the user experiences more realistic haptic feedback.
For example, when the virtual object interacts only with the current user, the virtual environment force may be calculated from, but is not limited to, a combination of one or more of the gravity, friction, collision force, etc. to which the virtual object is subjected in the virtual environment. Here, the collision force may be an instantaneous force exerted on the virtual object acted on by the current user, generated by other users controlling other virtual objects in the virtual environment. For example, if the current user pushes one box and another user pushes a second box that collides with it, the box acted on by the current user interacts with only one person, but it is still subjected to other forces in the virtual environment, such as the collision or impact force. Such delayed forces on the virtual object can therefore be taken into account so as to closely match the effect of force interaction in a real environment.
For another example, when the virtual object interacts with at least two users, for example two persons kicking a shuttlecock together or lifting a box together, the virtual environment force may be calculated from one or more combinations of the gravity, friction, collision force, etc. received by the virtual object in the virtual environment, together with the resultant vector force applied to the virtual object by the other users. It should be noted that if there is only one other user, the resultant vector force is simply that user's acting force; if there are several other users, the acting forces of all users other than the current user on the virtual object are vector-summed to obtain the resultant force.
It should be noted that the acting force of another user may be an immediate acting force generated by contact with the virtual object, or a non-immediate acting force; the two mainly differ in whether the force is synchronous with the acting force of the current user. For example, when two persons carry a box together, the acting force of the other user is an immediate acting force; when two people take turns kicking a shuttlecock, the shuttlecock is acted on by the other user alternately, so that user's acting force is a non-immediate acting force.
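As an illustrative sketch only, the following Python snippet shows one way such a virtual environment force could be composed from gravity, a simple Coulomb-style friction term, a collision force, and the vector sum of the other users' forces; the function name, the friction model, and all numerical values are assumptions made for the example and are not taken from the patent text.

```python
import numpy as np

def virtual_environment_force(mass, velocity, mu, collision_force,
                              other_user_forces, g=9.81):
    """Compose the virtual environment force acting on a virtual object (illustrative)."""
    gravity = np.array([0.0, -mass * g])                 # planar gravity (x, y)
    speed = np.linalg.norm(velocity)
    if speed > 1e-6:
        friction = -mu * mass * g * velocity / speed     # opposes the object's motion
    else:
        friction = np.zeros(2)
    # Vector sum of the forces applied by every user other than the current one
    resultant_others = np.sum(other_user_forces, axis=0) if other_user_forces else np.zeros(2)
    return gravity + friction + np.asarray(collision_force) + resultant_others

# Example: a 2 kg box pushed by the current user while one other user also pushes it
F_env = virtual_environment_force(mass=2.0, velocity=np.array([0.1, 0.0]), mu=0.3,
                                  collision_force=[0.0, 0.0],
                                  other_user_forces=[np.array([1.5, 0.0])])
```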
Accordingly, in order to calculate the virtual acting force of the virtual object on the user, a force and motion analysis needs to be performed based on the corresponding force interaction model. Because the robot system adopts an impedance control mode, the pose of the user can be made to coincide with the pose of the virtual object in the virtual scene, so the virtual coupling between the user and the virtual environment can be realized with a classical spring-damping system, which also ensures the stability of the interaction force felt by the user. It should be noted that a spring damping system may be used to construct the force interaction model for both rigid objects without deformation characteristics and non-rigid objects with deformation characteristics, but the corresponding parameters of the spring damping system must be adjusted according to the material properties of the virtual object; for example, the interaction model of a rigid object has a higher elastic coefficient and damping than that of a non-rigid object.
For example, as shown in fig. 3, for a rigid body virtual object without deformation characteristics, the coupling between the current user and the rigid body virtual object is equivalent to a virtual spring damping system. The control system calculates the virtual environment force F_i received by the rigid body virtual object, combines it with the first acting force obtained by converting the displacement information of the robot end to obtain a resultant force, and then calculates the displacement of the virtual object through a physical law (such as Newton's second law). This displacement is then converted, through the virtual coupling of the equivalent spring damping system, into the virtual force F_v that the virtual environment expects the user to apply to the virtual object; the reaction of this virtual force F_v is the sought virtual force -F_v exerted by the virtual object in the virtual environment on the user.
Optionally, if multiple users act on the same virtual object at the same time, force interaction may occur between the users. In this case, the coupling between each user and the virtual object is analyzed as an equivalent spring damping system (n users correspond to n virtual spring damping systems, as shown in fig. 4), where the specific connection manner is determined by the current task and different connection manners lead to different interconnection effects. Moreover, for a single user, the acting forces of the other users can be regarded as forces exerted on the virtual object by the virtual environment, i.e. as part of the virtual environment force. It can be understood that a rigid body virtual object can be simplified into a mass point for force analysis, whereas a non-rigid body virtual object, which may deform while being acted on, as shown in fig. 5, can be analyzed based on a Finite Element Model (FEM) or the like, which is not limited herein.
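A minimal sketch of the multi-user case described above, assuming each user is coupled to the same rigid virtual object through an individual spring-damper; the gains, the data layout of the user states, and the function name are assumptions made for illustration.

```python
import numpy as np

def total_user_force_on_object(obj_pos, users, k=300.0, b=4.0):
    """Sum the spring-damper coupling forces of n users on one rigid virtual object.

    users: list of dicts with each user's robot-end position 'x' and velocity 'v'
    (one equivalent spring-damper per user, as in fig. 4).
    """
    total = np.zeros_like(obj_pos)
    for u in users:
        total += k * (u["x"] - obj_pos) + b * u["v"]
    return total

users = [{"x": np.array([0.02, 0.0]), "v": np.array([0.1, 0.0])},
         {"x": np.array([-0.01, 0.0]), "v": np.array([0.0, 0.0])}]
F_users = total_user_force_on_object(np.zeros(2), users)
```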
For the above step S110, when the user acts on the end of the robot, the displacement information generated at the end is converted into a force (regarded as the force applied by the user on the virtual object); the resultant force on the virtual object is calculated by combining this force with the other external forces exerted on the virtual object in the virtual environment (which may include forces applied by other users and/or forces of the virtual environment itself); the displacement of the virtual object is then calculated from the resultant force through a physical law (such as Newton's second law) and converted, through the virtual coupling (i.e. based on the force interaction model corresponding to the virtual object), into F_v, the virtual force that the virtual environment expects the user to apply on the virtual object. The virtual force -F_v exerted by the virtual object in the virtual environment on the user is equal in magnitude and opposite in direction. It will be appreciated that this virtual force -F_v is the virtual interaction force the virtual environment expects the user to feel at the robot end.
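The following sketch strings the step S110 pipeline together for a single user and a point-mass rigid object, using explicit Euler integration; the spring-damper gains k and b, the time step, and the data layout are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def virtual_force_on_user(x_end, v_end, obj, F_env, k=300.0, b=4.0, dt=0.001):
    """One control cycle of step S110 (illustrative single-user case).

    x_end, v_end : displacement and velocity of the robot end
    obj          : dict with the virtual object's mass 'm', position 'x', velocity 'v'
    F_env        : virtual environment force already acting on the object
    Returns the reaction -F_v that the virtual object exerts on the user.
    """
    # 1) Convert the end motion into the first acting force via the virtual coupling
    F_user = k * (x_end - obj["x"]) + b * v_end
    # 2) Resultant force and Newton's second law give the object's new motion
    a = (F_user + F_env) / obj["m"]
    obj["v"] = obj["v"] + a * dt
    obj["x"] = obj["x"] + obj["v"] * dt
    # 3) Re-evaluate the coupling: F_v is the force the user is expected to apply
    F_v = k * (x_end - obj["x"]) + b * v_end
    return -F_v

obj = {"m": 1.0, "x": np.zeros(2), "v": np.zeros(2)}
reaction = virtual_force_on_user(np.array([0.01, 0.0]), np.array([0.1, 0.0]),
                                 obj, F_env=np.zeros(2))
```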
And step S120, calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot.
In this embodiment, the motor control force of the robot refers to the control force sent to the motor, which in practice may be a current value, since current and force are directly proportional. It can be understood that the motor control force is mainly realized as torque output by the motor: part of it compensates the resistance of the robot during operation, and part of it acts on the user so that the user feels the matching virtual interaction force.
In one embodiment, if the inertia of the mechanical structure of the robot is small, the system inertia compensation term may be neglected; in this case, the compensation amount of the robot includes only the robot system resistance compensation amount. Various compensation approaches may be used, including, but not limited to, compensation by modeling the system resistance, or numerical compensation, etc., without limitation.
For example, in one embodiment, the system resistance compensation amount may be calculated and output by a robot system resistance compensation model, for example:

F_f = b_c·ẋ + I_f·sign(ẋ)

where F_f is the output system resistance compensation amount; ẋ is the first derivative of the robot end position x, i.e. the end velocity; b_c and I_f are preset constants; and sign is the sign function. It will be appreciated that this resistance compensation model is only an example, and its parameters may be adjusted according to actual demand; besides the first derivative ẋ, the second derivative of x, i.e. the acceleration, may also be combined as a motion parameter in some cases to calculate the corresponding resistance compensation amount, which is not limited herein.
For another example, in some other embodiments, the system resistance compensation amount may be obtained empirically or from tests/experiments performed in advance, which are used to generate a resistance lookup table of the robot system in different motion states; the corresponding compensation amount is then obtained by looking up the table while the robot is running.
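Both compensation routes described above can be sketched as follows; the constants B_C and I_F, the table entries, and the interpolation choice are placeholder assumptions, not values from the patent.

```python
import numpy as np

B_C = 2.0   # viscous resistance coefficient (placeholder)
I_F = 0.5   # Coulomb-type resistance level (placeholder)

def resistance_compensation_model(v_end):
    """Model-based system resistance compensation F_f for end velocity v_end."""
    return B_C * v_end + I_F * np.sign(v_end)

# Lookup-table alternative: resistance measured in advance at a few speeds
SPEEDS = np.array([0.0, 0.05, 0.1, 0.2, 0.4])    # m/s
RESIST = np.array([0.0, 0.6, 0.7, 0.9, 1.3])     # N (placeholder measurements)

def resistance_compensation_table(v_end):
    """Table-based compensation obtained by interpolation at run time."""
    return np.sign(v_end) * np.interp(abs(v_end), SPEEDS, RESIST)
```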
If the robot system is based on open-loop control, in one embodiment, as shown in fig. 1, calculating the motor control force of the robot in step S120 includes: taking the sum of the virtual acting force -F_v calculated in step S110 and the system resistance compensation amount F_f of the robot as the motor control force F_c of the robot.
As an alternative, the end of the robot may be provided with a force sensor, which detects the actual force applied by the user to the end and feeds it back to the control system. If the inertia compensation amount needs to be considered, the compensation amount may include two parts: a system resistance compensation amount and an inertia compensation amount, where the inertia compensation amount is obtained mainly by applying a preset compensation coefficient to the actual acting force of the user on the end, as fed back by the force sensor.
Similarly, if the robot system is based on open-loop control, as shown in fig. 6, calculating the motor control force of the robot in step S120 includes: taking the sum of the virtual acting force -F_v calculated in step S110, the above system resistance compensation amount F_f of the robot, and the inertia compensation amount K·F_h as the motor control force F_c of the robot.
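The two open-loop cases (with and without inertia compensation) reduce to one line each; the function below is only an assumed arrangement of the terms -F_v, F_f and K·F_h used in the text.

```python
def motor_control_force_open_loop(F_v, F_f, F_h=0.0, K=0.0):
    """Open-loop motor control force F_c (illustrative).

    F_v : virtual force the user is expected to apply to the virtual object
          (its reaction -F_v is what the user should feel at the end)
    F_f : system resistance compensation amount
    F_h : actual force measured by the end force sensor, if fitted
    K   : preset inertia compensation coefficient (K = 0 drops the term)
    """
    return -F_v + F_f + K * F_h
```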
Additionally, as another alternative, the robot system may also be based on closed-loop control. For example, in an embodiment, calculating the motor control force of the robot in step S120 includes: feeding the error between the virtual acting force -F_v calculated in step S110 and the reaction force -F_h of the actual force F_h applied by the user to the end (fed back by the force sensor) into a preset closed-loop controller, which outputs a closed-loop control quantity after the closed-loop calculation; the closed-loop control quantity is then added to the system resistance compensation amount F_f to obtain the motor control force F_c of the robot. The preset closed-loop controller may be, but is not limited to, a PID controller (as shown in fig. 7), a PI controller, a PD controller, or another controller, which is not limited herein.
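One possible realization of the closed-loop variant, using a textbook discrete PID controller on the error between -F_v and -F_h as described above; the gains and the discrete-time form are assumptions made for the sketch.

```python
class ForcePID:
    """Simple discrete PID controller for the closed-loop force scheme (illustrative gains)."""

    def __init__(self, kp=1.2, ki=20.0, kd=0.0, dt=0.001):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def motor_control_force_closed_loop(F_v, F_h, F_f, pid):
    """F_c from the error between -F_v (desired) and -F_h (measured reaction), plus F_f."""
    error = (-F_v) - (-F_h)
    return pid.step(error) + F_f

pid = ForcePID()
F_c = motor_control_force_closed_loop(F_v=1.0, F_h=0.8, F_f=0.3, pid=pid)
```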
In addition, as an optional scheme, the method further includes:
when the user does not interact with any virtual object in the virtual environment through the robot, the motor control force of the robot is calculated based on the compensation amount of the robot.
When the user does not interact with any virtual object in the virtual environment through the robot, the robot system adopts a zero-force control mode. Zero-force control means that the handle of the robot is in a quasi-"weightless" state: under the traction of an external force, the robot only needs to overcome a small inertial force to move compliantly with that force. In this mode the user can move freely, mainly because the robot system inertia, resistance, etc. are compensated. For example, in one embodiment, the expression of the zero-force control mode is:
F_c = F_f + K·F_h
Similarly, after the motor control force F_c is calculated, it is sent to the joint motor, which generates torque to drive the end so that the end moves correspondingly. It is understood that the above expression of the zero-force control mode is constructed based on open-loop control and is only an example; if closed-loop or other control modes are used, the expression of the zero-force control mode is adjusted accordingly, which should also fall within the protection scope.
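The mode switch between haptic interaction and zero-force control can be sketched as below (open-loop form); the boolean flag and its handling are assumptions made for the example.

```python
def motor_control_force(interacting, F_f, F_h, K, F_v=0.0):
    """Select between interaction mode and zero-force mode (open-loop, illustrative)."""
    if interacting:
        return -F_v + F_f + K * F_h    # user coupled to a virtual object
    return F_f + K * F_h               # zero-force ("weightless") free motion
```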
And step S130, controlling the robot according to the motor control force to generate an interaction force acting on the user at the end, and acquiring the end motion information of the robot.
In this embodiment, when a user interacts with any virtual object in the virtual environment through the robot and a virtual force is generated in the virtual environment, the virtual environment outputs a control signal to the robot so that the robot motor outputs a corresponding torque, and the user feels a matching virtual interaction force at the end of the robot; that is, haptic force feedback is implemented. Meanwhile, the end of the robot is subjected to the driving force of the motor, so its motion state changes, and the end motion information, such as displacement, velocity and acceleration, can be detected and fed back by corresponding sensors arranged at the end of the robot. Accordingly, the end motion information in turn induces the motion of the virtual object in the virtual environment, i.e. the process enters a control loop.
It is understood that the motion of the robot tip is not limited to a single-directional motion in a horizontal or vertical direction on a two-dimensional plane, but may be a multi-dimensional motion in a three-dimensional space, or the like.
And step S140, controlling the virtual object to move in the virtual environment based on the end motion information.
It can be understood that the end motion information of the robot from the previous cycle is converted into the corresponding acting force; the resultant force on the virtual object is calculated by combining it with the other external forces applied to the virtual object, and the displacement information of the virtual object can then be calculated through a physical law (such as Newton's second law). The displacement of the virtual object is then converted, through the force interaction model of the virtual object in the virtual environment, into the acting force the user is expected to feel in the current cycle; this acting force is transmitted to the end of the robot by the robot controller, producing corresponding end motion information, which in turn feeds the next cycle.
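To make this cyclic coupling concrete, here is a toy one-dimensional loop covering steps S110 through S140 for a single user; the unit end mass, the constant hand force, and the gains are all illustrative assumptions, and the resistance and inertia compensation terms are omitted for brevity.

```python
def run_cycles(n_steps=1000, dt=0.001, k=200.0, b=10.0, m_obj=1.0):
    """Toy end-to-end loop: end motion -> virtual object motion -> reaction force."""
    x_end = v_end = 0.0     # robot end state (1-D for brevity)
    x_obj = v_obj = 0.0     # virtual object state
    F_hand = 2.0            # constant force applied by the user (assumption)
    for _ in range(n_steps):
        # S110: end motion -> force on the object -> object motion (Newton) -> F_v
        F_user = k * (x_end - x_obj) + b * v_end
        v_obj += (F_user / m_obj) * dt
        x_obj += v_obj * dt                  # S140: object moves in the scene
        F_v = k * (x_end - x_obj) + b * v_end
        # S120/S130: the motor drives the end with -F_v (compensation omitted here)
        v_end += (F_hand - F_v) * dt         # unit end mass assumed
        x_end += v_end * dt
    return x_end, x_obj

print(run_cycles())
```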
For example, in one embodiment, as shown in fig. 8, the virtual environment model may be designed using the following conversion model between force and displacement:

F_v = k_v·(x − x_v) + b·ẋ

where F_v is the virtual force that the virtual environment expects the user to apply on the virtual object, k_v and b are preset coefficients, x and ẋ are respectively the displacement and velocity of the robot end, and x_v is the displacement of the virtual object in the virtual environment.
It is to be understood that the conversion model between force and displacement is not limited to the above example; for instance, in addition to the displacement and velocity of the robot end, the deviations in velocity and acceleration between the virtual object and the robot end may be considered, or additional coefficients may be introduced, and the conversion model may be adapted to the actual application scenario.
It will be appreciated that when the user interacts with the virtual environment, the virtual environment may also be subject to forces from other users. If such forces act on the same virtual object, they will also influence the motion of the virtual object and thus its displacement x_v, and so on.
For step S140, taking a rehabilitation robot as an example, if the user controls the virtual object by pushing the robot handle horizontally, the displayed picture shows the virtual object in the virtual environment being displaced along with the pushing, and at the same time the user feels, on the handle, the corresponding resistance exerted by the virtual environment.
In the above virtual interaction method based on haptic force feedback, the virtual environment and the robot system model are constructed in advance and the robot serves as the medium: the user can control the motion of a virtual object in the virtual environment through the robot, while an acting force is generated at the end of the robot so that the user feels a matching virtual interaction force, and the haptic feedback during the interaction feels like that in a real scene. In particular, the method can realize haptic interconnection among multiple users and can be applied to scenarios such as rehabilitation exercise and leisure entertainment, which can greatly improve users' enthusiasm.
Referring to fig. 9 and fig. 10, based on the methods of the foregoing embodiments, the present embodiment provides a virtual interactive system based on haptic force feedback, exemplarily including: the system comprises at least one robot and at least one terminal device running the virtual environment program, wherein each terminal device is used for executing the virtual interaction method based on the tactile force feedback in the embodiment when a corresponding user acts on the tail end of the robot. For example, the terminal device may be a user's cell phone, tablet, notebook, etc.
In one embodiment, as shown in fig. 9, the system may be implemented by connecting one terminal device to multiple robots simultaneously, or by connecting each terminal device to one or more robots, with the terminal devices further connected to a background server through communication, as shown in fig. 10. The background server forwards information so that the virtual environment information selected by every terminal device that is online at the same time stays synchronized. For example, user A interacts with virtual object a in the virtual environment through a robot at place A, and user B chooses to act on the same virtual object a at place B; the background server of the system then forwards in real time the information of the virtual object controlled by the other party to ensure that the two users' views are synchronized.
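A schematic, in-memory stand-in for the forwarding role of the background server is sketched below; the class name, the JSON message format, and the polling style are assumptions for illustration, and no real network transport is shown.

```python
import json

class StateRelay:
    """In-memory stand-in for the background server: it keeps each terminal's
    latest virtual-object state so that all online terminals stay synchronized."""

    def __init__(self):
        self.latest = {}   # terminal_id -> last reported state (serialized)

    def publish(self, terminal_id, state):
        """A terminal reports the state of the virtual object it controls."""
        self.latest[terminal_id] = json.dumps(state)

    def snapshot_for(self, terminal_id):
        """Return the states of all *other* terminals for view synchronization."""
        return {tid: json.loads(s) for tid, s in self.latest.items()
                if tid != terminal_id}

relay = StateRelay()
relay.publish("terminal_A", {"object": "box_a", "x": [0.10, 0.00]})
relay.publish("terminal_B", {"object": "box_a", "x": [0.10, 0.00]})
print(relay.snapshot_for("terminal_A"))   # terminal A sees terminal B's latest state
```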
Fig. 11 is a schematic structural diagram of a terminal device according to an embodiment of the present application. The terminal device may be a terminal used by a user, such as a mobile phone or a tablet, or may be a server, and its existence form is not limited. In an embodiment, the terminal device at least includes a memory 11 and a processor 12, where the memory 11 stores a computer program, and the processor 12 can implement the haptic feedback, the haptic interaction, and the like when the user interacts with the virtual environment when executing the computer program to implement the virtual interaction method based on the haptic force feedback according to the embodiment of the present application. It is understood that the executing functions of the terminal device in the system of the present embodiment correspond to the corresponding steps in the method of the above embodiment, and the options in the above embodiment are also applicable to the present embodiment, so the description is not repeated here.
The Memory 11 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 11 is used for storing a computer program, and the processor 12 executes the computer program after receiving an execution instruction.
The processor 12 may be an integrated circuit chip having signal processing capabilities. The Processor 12 may be a general-purpose processor, including at least one of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU) and a Network Processor (NP), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like, that implements or executes the methods, steps and logic blocks disclosed in the embodiments of the present application.
The present application also provides a readable storage medium for storing the computer program used in the terminal device described above.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative and, for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, each functional module or unit in each embodiment of the present application may be integrated together to form an independent part, or each module may exist alone, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a smart phone, a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application.

Claims (10)

1. A virtual interaction method based on haptic force feedback, comprising:
when a user interacts with a virtual object in a virtual environment through a robot, acquiring a virtual acting force of the virtual object in the virtual environment on the user;
calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot;
controlling the robot to generate an interaction force acting on the user at the end according to the motor control force, and acquiring end motion information of the robot;
controlling the virtual object to move in the virtual environment based on the end motion information.
2. A haptic force feedback-based virtual interaction method according to claim 1, wherein said virtual environment is constructed based on a model of conversion between force and displacement; the acquiring the virtual acting force of the virtual object on the user in the virtual environment comprises:
converting the terminal motion information of the robot into a first acting force, and calculating the virtual environment force of the virtual object currently in the virtual environment;
and calculating the motion information of the virtual object according to the virtual environment force and the first acting force, and further determining the virtual acting force of the virtual object on the user in the virtual environment based on a force interaction model corresponding to the virtual object according to the motion information of the virtual object.
3. A haptic force feedback-based virtual interaction method according to claim 2, wherein when the virtual object only interacts with the user, the virtual environment force is calculated according to one or more combinations of gravity, friction, and collision forces to which the virtual object is subjected in the virtual environment;
when the user interacts with at least one other user through the virtual object, the virtual environment force is calculated according to one or more combinations of the gravity, friction force and collision force to which the virtual object is subjected in the virtual environment, and the resultant vector force applied to the virtual object by the other users.
4. A virtual interaction method based on haptic force feedback as claimed in claim 1, wherein the compensation amount comprises a robot system resistance compensation amount, and if the robot system is based on open loop control, the calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot comprises:
and taking the sum of the virtual acting force and the system resistance compensation quantity of the robot as the motor control force of the robot.
5. A haptic force feedback-based virtual interaction method as claimed in claim 1, wherein the compensation amount comprises a robot system resistance compensation amount and an inertia compensation amount, and if the robot system is based on open loop control, the calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot comprises:
and taking the sum of the virtual acting force and the system resistance compensation quantity and the inertia compensation quantity of the robot as the motor control force of the robot.
6. A virtual interaction method based on haptic force feedback as claimed in claim 5, wherein the robot tip is provided with a force sensor, and the inertial compensation amount is obtained by compensating the actual acting force of the user acting on the tip based on the force sensor by a preset compensation coefficient.
7. A virtual interaction method based on haptic force feedback as claimed in claim 4, characterized in that the robot tip is provided with a force sensor for feeding back the actual force applied by the user to the tip; if the robot system is based on closed-loop control, calculating the motor control force of the robot based on the virtual acting force and the compensation amount of the robot, wherein the calculation comprises the following steps:
outputting closed-loop control quantity after closed-loop calculation is carried out through a preset closed-loop controller according to the error between the virtual acting force and the counterforce of the actual acting force; and adding the closed-loop control quantity and the system resistance compensation quantity to obtain the motor control force of the robot.
8. A haptic force feedback-based virtual interaction method according to any one of claims 1 to 7, further comprising:
and when the user does not interact with any virtual object in the virtual environment through the robot, calculating the motor control force of the robot based on the compensation amount of the robot.
9. A virtual interactive system based on haptic force feedback, comprising: the robot system comprises at least one terminal device running a virtual environment program and at least one robot;
wherein each of the terminal devices is configured to perform the virtual interaction method based on haptic force feedback of any one of claims 1-8 when a respective user acts on the robot tip.
10. A readable storage medium, characterized in that it stores a computer program which, when executed on a processor, implements the haptic force feedback-based virtual interaction method according to any one of claims 1-8.
CN202211544124.7A 2022-11-30 2022-11-30 Virtual interaction method, system and readable storage medium based on haptic force feedback Pending CN115933877A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211544124.7A CN115933877A (en) 2022-11-30 2022-11-30 Virtual interaction method, system and readable storage medium based on haptic force feedback

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211544124.7A CN115933877A (en) 2022-11-30 2022-11-30 Virtual interaction method, system and readable storage medium based on haptic force feedback

Publications (1)

Publication Number Publication Date
CN115933877A true CN115933877A (en) 2023-04-07

Family

ID=86648687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211544124.7A Pending CN115933877A (en) 2022-11-30 2022-11-30 Virtual interaction method, system and readable storage medium based on haptic force feedback

Country Status (1)

Country Link
CN (1) CN115933877A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination