WO2021195916A1 - Dynamic hand simulation method, device and system - Google Patents
Dynamic hand simulation method, device and system
- Publication number
- WO2021195916A1 (PCT/CN2020/082271)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- hand
- scene
- target object
- model
- collision
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/455—Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
Definitions
- The invention relates to industrial automation, and in particular to a dynamic hand simulation method, device and system for a switch cabinet connector.
- Hand simulation for ergonomic analysis of product and production design is essential. Hand simulation includes collision detection and workload analysis, among others, and is therefore becoming increasingly popular in ergonomic analysis.
- One prior-art solution relies on engineers' experience and various action libraries. Specifically, some experienced engineers can evaluate the simulation scenario, select the most similar pose from a pose library, and adjust the hand joints to simulate a hand movement in less time. Other, inexperienced engineers spend more time selecting poses from the existing action library and adjusting the hand joints. If no existing action library provides hand posture templates, or the action library does not contain enough postures, a great deal of time is spent adjusting the hand posture to simulate hand movements. The drawback of this scheme is its reliance on the engineer's experience and judgment. In addition, an action library covers only a few types of actions for standard parts, while manually operated parts are not necessarily standard; once a part is replaced, the action library must be re-adjusted.
- Another prior-art solution is a motion acquisition device that simulates the gripping force of the hand and can speed up the adjustment of the hand joints.
- For example, a hand model is generated from the structural features of the hand and driven by data from a data glove (the motion acquisition device).
- For example, a linear spring provides force feedback at the fingertips of the hand to simulate a grasping action. This only simulates and calculates the grasping force; the action is limited to capturing the grasping action and how much force the hand needs or receives. It cannot make judgments about, for example, possible collisions, nor can it plan the best path and motion for the hand's operation.
- The first aspect of the present invention provides a dynamic hand simulation method, wherein the object of the simulation is an operator's hand performing an operation process in a scene, and the operator's hand wears a motion sensing device. The dynamic hand simulation method includes the following steps: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering of the hand model, the scene and the target object model; S2, mapping the hand model to the motion posture of the hand, and capturing the motion posture of the hand to generate motion posture data; when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, it is judged that a collision has occurred, and a collision prompt message is sent.
- Further, the hand model is indicated by fingertip points and joint points, where each fingertip point and joint point is represented by a data vector (x, y, z, Rx, Ry, Rz): x, y and z are the coordinates of the fingertip or joint point on the x-, y- and z-axes, and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes. The motion posture data is the collection of the data vectors of all fingertip and joint points of the hand model.
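This pose representation can be sketched in a few lines of Python; the type and helper names are illustrative, not from the patent, and the 20-point count follows the hand model described later in the document.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class JointVector:
    """One fingertip or joint point of the hand model."""
    x: float   # coordinate on the x-axis
    y: float   # coordinate on the y-axis
    z: float   # coordinate on the z-axis
    Rx: float  # Euler rotation angle about the x-axis
    Ry: float  # Euler rotation angle about the y-axis
    Rz: float  # Euler rotation angle about the z-axis

# Motion posture data: the collection of data vectors for all fingertip
# and joint points of the hand model (20 points: 4 per finger, 5 fingers).
Pose = List[JointVector]

def make_rest_pose(n_points: int = 20) -> Pose:
    """A hypothetical all-zero rest pose, one vector per point."""
    return [JointVector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0) for _ in range(n_points)]
```

A captured motion sequence is then simply a list of such poses, one per sample.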
- Further, the dynamic hand simulation method includes the following step: when it is detected that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, tactile feedback information is sent to the motion sensing device.
- Further, the motion sensing device is an exoskeleton glove; when it is detected that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, tactile feedback information is sent to the exoskeleton glove, and the tactile feedback information drives the vibration sensor on the exoskeleton glove to vibrate.
- Further, the motion sensing device is an exoskeleton glove; when it is detected that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, tactile feedback information is sent to the exoskeleton glove, and the tactile feedback information drives the motor and connecting rod on the exoskeleton glove to apply a reverse force to the hand.
- Further, the dynamic hand simulation method includes the following step: generating a report file that includes a set of motion posture data of the hand performing an operation process in a scene within a sampling time, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
- The second aspect of the present invention provides a dynamic hand simulation device, wherein the object of the simulation is an operator's hand performing an operation process in a scene, and the operator's hand wears a motion sensing device. The dynamic hand simulation device includes: a modeling device that constructs the hand model and the target object model in the scene; a 3D rendering device that performs 3D rendering of the hand model, the scene and the target object model; a motion capture device that maps the hand model to the motion posture of the hand and captures the hand's motion posture to generate motion posture data; and a collision calculation device that, when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, judges that a collision has occurred and sends a collision prompt message.
- Further, the hand model is indicated by fingertip points and joint points, where each fingertip point and joint point is represented by a data vector (x, y, z, Rx, Ry, Rz): x, y and z are the coordinates of the fingertip or joint point on the x-, y- and z-axes, and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes. The motion posture data is the collection of the data vectors of all fingertip and joint points of the hand model.
- Further, when the collision calculation device detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the motion sensing device.
- Further, the motion sensing device is an exoskeleton glove; when the collision calculation device detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the exoskeleton glove, and the tactile feedback information drives a vibration sensor on the exoskeleton glove to vibrate.
- Further, the motion sensing device is an exoskeleton glove; when the collision calculation device detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the exoskeleton glove, and the tactile feedback information drives the motor and connecting rod on the exoskeleton glove to apply a reverse force to the hand.
- Further, the collision calculation device is also used to generate a report file that includes a set of motion posture data of the hand performing an operation process in a scene within a sampling time, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
- The third aspect of the present invention provides a dynamic hand simulation system, including: a processor; and a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the electronic device to perform actions including: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering of the hand model, the scene and the target object model; S2, mapping the hand model to the motion posture of the hand, and capturing the motion posture of the hand to generate motion posture data; when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, it is judged that a collision has occurred, and a collision prompt message is sent.
- The fourth aspect of the present invention provides a computer program product, tangibly stored on a computer-readable medium and including computer-executable instructions that, when executed, cause at least one processor to perform the method according to the first aspect of the present invention.
- the fifth aspect of the present invention provides a computer-readable medium on which computer-executable instructions are stored, and when executed, the computer-executable instructions cause at least one processor to perform the method according to the first aspect of the present invention.
- The present invention provides a dynamic hand simulation mechanism that makes the hand posture more natural through motion capture and uses tactile feedback to reduce iterations of hand posture adjustment, so that the time for hand simulation is effectively reduced, especially for dynamic simulation analysis during continuous operation.
- Fig. 1 is a schematic structural diagram of a dynamic hand simulation device according to an embodiment of the present invention
- Figure 2 is a schematic diagram of a hand model according to an embodiment of the present invention.
- Fig. 3 is a schematic diagram of performing 3D rendering on the hand model, the scene, and the target object model according to an embodiment of the present invention
- Fig. 4 is a schematic diagram of a collision between the hand and at least one target object model in the scene in the 3D rendering according to an embodiment of the present invention.
- The present invention provides a dynamic hand simulation mechanism that makes the hand posture more natural through motion capture and uses tactile feedback to reduce iterations of hand posture adjustment, so that the time for hand simulation is effectively reduced, especially for dynamic simulation analysis during continuous operation.
- The first aspect of the present invention provides a dynamic hand simulation method, wherein the object of the simulation is the hand of an operator performing an operation process in a scene, and the operator's hand wears a motion sensing device. The dynamic hand simulation method includes the following steps.
- Fig. 1 is a schematic structural diagram of a dynamic hand simulation device according to an embodiment of the present invention.
- the dynamic hand simulation device 100 includes a motion acquisition device 110 and a simulation device 120.
- the motion acquisition device 110 further includes a motion capture device 111 and a tactile feedback device 112
- the simulation device 120 further includes a modeling device 121, a collision calculation device 122, and a 3D rendering device 123.
- The simulation device 120 may run on simulation software and therefore has an interface to the motion acquisition device 110.
- Step S1 is first performed, and the modeling device 121 constructs the hand model and the target object model in the scene, and performs 3D rendering on the hand model, the scene, and the target object model through the 3D rendering device.
- the goal of the dynamic hand simulation mechanism provided by the present invention is to simulate a series of hand movements in a scene.
- the modeling device 121 generates a hand model and a target model around the hand in the scene.
- The 3-DOF finger joint centers and global fingertip positions are used to drive the hand movements.
- The hand model 200 includes five fingers. The thumb is described by a carpometacarpal joint 214, a metacarpophalangeal joint 213, an interphalangeal joint 212 and a thumb tip position 211. The carpometacarpal joint position of the thumb is optional.
- The index finger is described by a metacarpophalangeal joint 224, a proximal interphalangeal joint 223, a distal interphalangeal joint 222 and an index fingertip position 221.
- Likewise, the middle finger is described by a metacarpophalangeal joint 234, a proximal interphalangeal joint 233, a distal interphalangeal joint 232 and a middle fingertip position 231.
- The ring finger is described by a metacarpophalangeal joint 244, a proximal interphalangeal joint 243, a distal interphalangeal joint 242 and a ring fingertip position 241.
- The little finger is described by a metacarpophalangeal joint 254, a proximal interphalangeal joint 253, a distal interphalangeal joint 252 and a little fingertip position 251.
- FIG. 3 shows the scene after 3D rendering: the hand 200 operates a robotic arm 300; specifically, the hand 200 approaches and operates a component 310 of the robotic arm 300.
- In step S2, the motion capture device 111 maps the hand model in the modeling device 121 to the motion posture of the hand, captures the motion posture of the hand to generate motion posture data, and sends the motion posture data to the modeling device 121.
- The motion capture device 111 may be based on an inertial measurement unit (IMU), on an exoskeleton, or on image processing or an optical reactor. Many mature techniques in the industry capture hand motion with good accuracy and high capture speed. In a preferred example of the present invention, the accuracy of the motion capture device 111 is better than 0.01 degrees, and at least 25 samples are captured per second.
- Specifically, the motion capture device 111 first maps the hand model shown in FIG. 2, formed by modeling, to the motion posture of the hand; that is, it connects the actual motion with the virtual hand model.
- Further, the hand model is indicated by fingertip points and joint points, where each fingertip point and joint point is represented by a data vector (x, y, z, Rx, Ry, Rz): x, y and z are the coordinates of the fingertip or joint point on the x-, y- and z-axes, and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes. The motion posture data is the collection of the data vectors of all fingertip and joint points of the hand model.
- Preferably, a certain time interval is used as the sampling period to sample the spatial position and posture data of all joint points of the hand; the data can be stored in a form such as the BVH format. In this example, the sampling interval is 0.017 seconds and a total of 3004 frames are sampled, with one line per sample. As shown in FIG. 2, the hand model 200 has a total of 20 joint points and fingertip points, so the motion posture data generated at each sample includes 20 data vectors (x, y, z, Rx, Ry, Rz).
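The sampling scheme above (one line per sample, 20 points of 6 values each) can be sketched as follows. The function names and the plain-text layout are illustrative assumptions; the document only says the storage "can be in the form of" the BVH format.

```python
SAMPLE_INTERVAL_S = 0.017  # sampling period from the example (about 59 Hz)
POINTS_PER_HAND = 20       # fingertip and joint points of the hand model

def pose_to_line(pose):
    """Flatten one sample (a list of 20 (x, y, z, Rx, Ry, Rz) tuples)
    into a single text line: one line per sample."""
    values = [v for point in pose for v in point]
    assert len(values) == POINTS_PER_HAND * 6
    return " ".join(f"{v:.4f}" for v in values)

def write_motion_file(path, frames):
    """Write a BVH-like motion section: frame count, frame time, then
    one flattened line per sampled pose."""
    with open(path, "w") as f:
        f.write(f"Frames: {len(frames)}\n")
        f.write(f"Frame Time: {SAMPLE_INTERVAL_S}\n")
        for pose in frames:
            f.write(pose_to_line(pose) + "\n")
```

At the stated interval, the 3004 frames of the example cover roughly 51 seconds of continuous operation.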
- The motion posture database 130 stores the hand motion posture data of continuous operation in a simulation scene, so that the hand simulation can use the hand motion posture data to compute collisions and reuse it in other simulation scenarios.
- Meanwhile, the modeling device 121 sends target object information to the collision calculation device 122, where the target object information includes the target transmission, the target shape, and the like.
- When it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, the collision calculation device 122 judges that a collision has occurred and sends a collision prompt message. As shown in FIG. 4, when the hand 200 and the component 310 collide, the two models overlap (not shown); once this overlap is detected, the collision prompt message is sent.
- The collision calculation device 122 detects collisions between the hand and other objects in a simulation scene. Specifically, the collision calculation device 122 may optionally receive hand motion data from the hand motion database 130 to drive the hand motion from the modeling device 121, and the other target objects in the simulation scene are also sent from the modeling device 121 to the collision calculation device 122.
- The collision calculation result is sent to the collision result database 124 and, at the same time, to the haptic feedback device 112 in the motion acquisition device 110 to drive the haptic feedback device 112.
- The 3D rendering device 123 displays the dynamic process and simulation results in the 3D environment based on the modeling device 121 and the collision calculation results of the collision calculation device 122; it also provides a screen capture function when a collision occurs.
- the hand motion posture data is obtained from the motion capture device 111, and the hand motion posture data is also sent to the collision calculation device 122 to perform collision detection.
- The collision result database 124 stores the collision detection results output by the collision calculation device 122, which are further applied in ergonomic analysis and product production optimization. The collision result database 124 is also used to filter the hand motion posture data by collision result, so that the engineer can choose to obtain the hand motion posture data with or without collisions.
- The collision calculation device 122 performs collision detection based on a collision algorithm, such as the construction and traversal of an object bounding box hierarchical tree, in which the minimum distance between the hand model and the other target models is calculated. If the hand penetrates another target in the simulation scene, the distance calculation reflects this, and the collision calculation device 122 sends the collision calculation result data to the haptic feedback device 112 and the 3D rendering device 123 and stores it in the collision result database 124.
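A minimal sketch of the primitive that a bounding-box hierarchy evaluates at each node: an overlap test (the collision judgment) and a minimum-distance computation on axis-aligned bounding boxes. The document does not fix the exact bounding-volume type, so AABBs here are an illustrative assumption.

```python
def aabb_overlap(a_min, a_max, b_min, b_max):
    """True if two axis-aligned bounding boxes overlap on every axis:
    the condition used to judge that a collision has occurred."""
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

def aabb_min_distance(a_min, a_max, b_min, b_max):
    """Minimum distance between two AABBs; 0.0 when they overlap."""
    sq = 0.0
    for i in range(3):
        # Per-axis gap between the boxes, clamped at zero when they overlap.
        gap = max(a_min[i] - b_max[i], b_min[i] - a_max[i], 0.0)
        sq += gap * gap
    return sq ** 0.5
```

In a full hierarchy, subtrees whose boxes are farther apart than the current best distance can be pruned, which is what makes the tree traversal cheaper than testing every primitive pair.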
- The stored collision data is also used to re-execute the simulation for further analysis. For example, it can generate a simulation in which no collision occurs, and capture all screenshots of the moments when collisions occur. It thus provides evidence for identifying what needs to be adjusted first, as well as for implementation methods and product design.
- Further, the dynamic hand simulation method includes the following step: when the collision calculation device 122 detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the motion sensing device.
- the tactile feedback device 112 provides tactile feedback, the tactile feedback includes vibration or feedback force, so that the person performing the simulation process can more directly feel the simulation result and adjust the hand posture and movement in time.
- The motion sensing device is an exoskeleton glove. When the collision calculation device 122 detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, the haptic feedback device 112 sends tactile feedback information to the exoskeleton glove, and the tactile feedback information drives the vibration sensor on the exoskeleton glove to vibrate.
- The tactile feedback may take the form of the pull of the exoskeleton or a restriction of its mechanical structure; for example, the exoskeleton glove contains a vibration sensor that gives some tactile feedback. The purpose of the tactile feedback is to reflect the simulation result on the hand in a timely manner. If a collision occurs, the degree of the collision is reflected on the exoskeleton glove: when the hand is merely close to the target object the glove vibrates slightly, and when they fully collide it vibrates at high frequency, so the degree of collision is distinguished by the vibration frequency.
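The frequency scheme might be sketched as below. The threshold and the two frequencies are illustrative values, since the document specifies only "slight" versus "high-frequency" vibration.

```python
SAFE_DISTANCE_M = 0.05   # assumed safe-distance threshold (metres)
FREQ_NEAR_HZ = 50        # assumed "slight" vibration when merely close
FREQ_COLLIDE_HZ = 200    # assumed high-frequency vibration on collision

def vibration_feedback(min_distance_m, overlapping):
    """Choose the vibration frequency sent to the exoskeleton glove:
    high frequency on collision, slight vibration inside the safe
    distance, and no vibration otherwise."""
    if overlapping:
        return FREQ_COLLIDE_HZ
    if min_distance_m < SAFE_DISTANCE_M:
        return FREQ_NEAR_HZ
    return 0
```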
- The tactile feedback device 112 sends tactile feedback information to the exoskeleton glove, and the tactile feedback information drives a motor and a connecting rod on the exoskeleton glove to apply a reverse force to the hand. The exoskeleton glove has a motor and a connecting rod, and the motor applies a reverse force to the hand through the connecting rod.
- The exoskeleton glove uses force to remind the designer that a part of the hand has been restricted by a certain part of the target object or has reached a certain boundary.
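One simple way to drive the motor is a spring-like counter-force proportional to how far a finger has crossed the boundary; the stiffness and force cap below are illustrative assumptions, not values from the document.

```python
MAX_FORCE_N = 10.0  # assumed cap on the force the glove motor may apply

def reverse_force_n(penetration_m, stiffness_n_per_m=500.0):
    """Counter-force applied through the connecting rod once a finger
    crosses a boundary of the target object (penetration in metres)."""
    if penetration_m <= 0.0:
        return 0.0  # no contact, no force
    return min(stiffness_n_per_m * penetration_m, MAX_FORCE_N)
```

Capping the force keeps the feedback a reminder rather than a hazard to the operator's hand.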
- the operator can wear exoskeleton gloves and perform a simulation of an operation process in a scene through the 3D rendered hands displayed on the screen and constantly adjust their actions in time, find the best operation process, and avoid collisions.
- Further, the dynamic hand simulation method includes the following step: generating a report file that includes a set of motion posture data of the hand performing an operation process in a scene within a sampling time, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
- the report file can clearly show which finger of the hand collided at a certain point in time and the distance to the target object during normal operation.
- the operator can analyze and compare the best operation process afterwards.
- Table 2 shows a report file list when no collision occurs, as follows:
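A minimal sketch of writing such a report file as CSV follows; the column names and the CSV choice are assumptions, since the document only lists the three kinds of content the report contains.

```python
import csv

def write_report(path, samples):
    """Write one row per sample: the sample time, the hand-to-target
    minimum distance, the collision judgment, and the flattened motion
    posture data for that sample."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "min_distance_m", "collided", "pose_data"])
        for time_s, dist_m, collided, pose_values in samples:
            writer.writerow(
                [time_s, dist_m, int(collided),
                 " ".join(str(v) for v in pose_values)])
```

A file in this shape makes it easy to see afterwards at which time points a collision occurred and how close the hand came during normal operation.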
- The second aspect of the present invention provides a dynamic hand simulation device, wherein the object of the simulation is an operator's hand performing an operation process in a scene, and the operator's hand wears a motion sensing device. The dynamic hand simulation device includes: a modeling device that constructs the hand model and the target object model in the scene; a 3D rendering device that performs 3D rendering of the hand model, the scene and the target object model; a motion capture device that maps the hand model to the motion posture of the hand and captures the hand's motion posture to generate motion posture data; and a collision calculation device that, when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, judges that a collision has occurred and sends a collision prompt message.
- Further, the hand model is indicated by fingertip points and joint points, where each fingertip point and joint point is represented by a data vector (x, y, z, Rx, Ry, Rz): x, y and z are the coordinates of the fingertip or joint point on the x-, y- and z-axes, and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes. The motion posture data is the collection of the data vectors of all fingertip and joint points of the hand model.
- Further, when the collision calculation device detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the motion sensing device.
- Further, the motion sensing device is an exoskeleton glove; when the collision calculation device detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the exoskeleton glove, and the tactile feedback information drives a vibration sensor on the exoskeleton glove to vibrate.
- Further, the motion sensing device is an exoskeleton glove; when the collision calculation device detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the exoskeleton glove, and the tactile feedback information drives the motor and connecting rod on the exoskeleton glove to apply a reverse force to the hand.
- Further, the collision calculation device is also used to generate a report file that includes a set of motion posture data of the hand performing an operation process in a scene within a sampling time, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
- The third aspect of the present invention provides a dynamic hand simulation system, including: a processor; and a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the electronic device to perform actions including: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering of the hand model, the scene and the target object model; S2, mapping the hand model to the motion posture of the hand, and capturing the motion posture of the hand to generate motion posture data; when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, it is judged that a collision has occurred, and a collision prompt message is sent.
- The fourth aspect of the present invention provides a computer program product, tangibly stored on a computer-readable medium and including computer-executable instructions that, when executed, cause at least one processor to perform the method according to the first aspect of the present invention.
- the fifth aspect of the present invention provides a computer-readable medium on which computer-executable instructions are stored, and when executed, the computer-executable instructions cause at least one processor to perform the method according to the first aspect of the present invention.
- The dynamic hand simulation mechanism provided by the present invention can greatly reduce the operation time; the hand posture is more natural and spontaneous, and iterations of hand posture adjustment are reduced. The collision result can be displayed directly in the 3D rendering and prompted directly to the hand through tactile feedback, so that the engineer can change the hand posture during execution of the simulation.
- the present invention is not only suitable for hand gripping operations, but also suitable for simulating all hand operations.
- the present invention can timely capture hand gestures and directly provide hand tactile feedback based on a safe distance.
- The collision result can be calculated in time to remind the engineer to adjust the hand gesture immediately.
- the database can provide a series of hand gestures to simulate a series of operations, not just one hand motion.
- the present invention is more efficient and convenient.
- the present invention can reduce the time of hand simulation and accelerate production planning and product optimization.
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Manipulator (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present invention provides a dynamic hand simulation method, device and system, wherein the object of the simulation is an operator's hand performing an operation process in a scene, and the operator's hand wears a motion sensing device. The dynamic hand simulation method includes the following steps: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering of the hand model, the scene and the target object model; S2, mapping the hand model to the motion posture of the hand, and capturing the motion posture of the hand to generate motion posture data; when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, it is judged that a collision has occurred, and a collision prompt message is sent. The present invention is more efficient and convenient, reduces the iterations of hand posture adjustment, and can simulate, in a timely manner, the series of posture movements of a hand performing an operation.
Description
The present invention relates to industrial automation, and in particular to a dynamic hand simulation method, device and system for a switch cabinet connector.
Although many jobs today have been taken over by robots, some manual work still cannot be replaced, such as cable installation and wiring. In such manual work, hand simulation for ergonomic analysis of product and production design is essential. Hand simulation includes collision detection and workload analysis, among others. Hand simulation is therefore becoming increasingly popular in ergonomic analysis.
However, as hand simulation grows, simulation speed must still improve to meet every customer's needs. Engineers usually spend most of their time on hand posture adjustment and distance testing. Especially in continuous execution, the iterations of hand posture adjustment increase greatly; a hand simulation of a scenario in operation takes much more time than a single posture.
To solve this problem, one prior-art solution relies on engineers' experience and various action libraries. Specifically, some experienced engineers can evaluate the simulation scenario, select the most similar pose from a pose library, and adjust the hand joints to simulate a hand movement in less time. Other, inexperienced engineers spend more time selecting poses from the existing action library and adjusting the hand joints. If no existing action library provides hand posture templates, or the action library does not contain enough postures, a great deal of time is spent adjusting the hand posture to simulate hand movements. The drawback of this scheme is its reliance on the engineer's experience and judgment. In addition, an action library covers only a few types of actions for standard parts, while manually operated parts are not necessarily standard; once a part is replaced, the action library must be re-adjusted.
Another prior-art solution is a motion acquisition device that simulates the gripping force of the hand and can speed up the adjustment of the hand joints. For example, a hand model is generated from the structural features of the hand and driven by data from a data glove (the motion acquisition device); for example, a linear spring provides force feedback at the fingertips of the hand to simulate a grasping action. This only simulates and calculates the grasping force; the action is limited to capturing the grasping action and how much force the hand needs or receives. It cannot make judgments about, for example, possible collisions, nor can it plan the best path and motion for the hand's operation.
Summary of the invention
The first aspect of the present invention provides a dynamic hand simulation method, wherein the object of the simulation is an operator's hand performing an operation process in a scene, and the operator's hand wears a motion sensing device. The dynamic hand simulation method includes the following steps: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering of the hand model, the scene and the target object model; S2, mapping the hand model to the motion posture of the hand, and capturing the motion posture of the hand to generate motion posture data; when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, it is judged that a collision has occurred, and a collision prompt message is sent.
Further, the hand model is indicated by fingertip points and joint points, where each fingertip point and joint point is represented by a data vector (x, y, z, Rx, Ry, Rz): x, y and z are the coordinates of the fingertip or joint point on the x-, y- and z-axes, and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes. The motion posture data is the collection of the data vectors of all fingertip and joint points of the hand model.
Further, the dynamic hand simulation method includes the following step: when it is detected that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, tactile feedback information is sent to the motion sensing device.
Further, the motion sensing device is an exoskeleton glove; when it is detected that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, tactile feedback information is sent to the exoskeleton glove, and the tactile feedback information drives the vibration sensor on the exoskeleton glove to vibrate.
Further, the motion sensing device is an exoskeleton glove; when it is detected that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, tactile feedback information is sent to the exoskeleton glove, and the tactile feedback information drives the motor and connecting rod on the exoskeleton glove to apply a reverse force to the hand.
Further, the dynamic hand simulation method includes the following step: generating a report file that includes a set of motion posture data of the hand performing an operation process in a scene within a sampling time, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
The second aspect of the present invention provides a dynamic hand simulation device, wherein the object of the simulation is an operator's hand performing an operation process in a scene, and the operator's hand wears a motion sensing device. The dynamic hand simulation device includes: a modeling device that constructs the hand model and the target object model in the scene; a 3D rendering device that performs 3D rendering of the hand model, the scene and the target object model; a motion capture device that maps the hand model to the motion posture of the hand and captures the hand's motion posture to generate motion posture data; and a collision calculation device that, when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, judges that a collision has occurred and sends a collision prompt message.
Further, the hand model is indicated by fingertip points and joint points, where each fingertip point and joint point is represented by a data vector (x, y, z, Rx, Ry, Rz): x, y and z are the coordinates of the fingertip or joint point on the x-, y- and z-axes, and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes. The motion posture data is the collection of the data vectors of all fingertip and joint points of the hand model.
Further, when the collision calculation device detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the motion sensing device.
Further, the motion sensing device is an exoskeleton glove; when it is detected that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, the collision calculation device sends tactile feedback information to the exoskeleton glove, and the tactile feedback information drives the vibration sensor on the exoskeleton glove to vibrate.
Further, the motion sensing device is an exoskeleton glove; when the collision calculation device detects that the distance between the hand and at least one target object model in the scene is less than a safe distance threshold, it sends tactile feedback information to the exoskeleton glove, and the tactile feedback information drives the motor and connecting rod on the exoskeleton glove to apply a reverse force to the hand.
Further, the collision calculation device is also used to generate a report file that includes a set of motion posture data of the hand performing an operation process in a scene within a sampling time, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
The third aspect of the present invention provides a dynamic hand simulation system, including: a processor; and a memory coupled to the processor, the memory storing instructions that, when executed by the processor, cause the electronic device to perform actions including: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering of the hand model, the scene and the target object model; S2, mapping the hand model to the motion posture of the hand, and capturing the motion posture of the hand to generate motion posture data; when it is detected in the 3D rendering that the hand overlaps with at least one target object model in the scene, it is judged that a collision has occurred, and a collision prompt message is sent.
The fourth aspect of the present invention provides a computer program product, tangibly stored on a computer-readable medium and including computer-executable instructions that, when executed, cause at least one processor to perform the method according to the first aspect of the present invention.
The fifth aspect of the present invention provides a computer-readable medium on which computer-executable instructions are stored; when executed, the computer-executable instructions cause at least one processor to perform the method according to the first aspect of the present invention.
The present invention provides a dynamic hand simulation mechanism that makes the hand posture more natural through motion capture and uses tactile feedback to reduce iterations of hand posture adjustment, so that the time for hand simulation is effectively reduced, especially for dynamic simulation analysis during continuous operation.
Fig. 1 is a schematic structural diagram of a dynamic hand simulation device according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a hand model according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of performing 3D rendering of the hand model, the scene and the target object model according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a collision between the hand and at least one target object model in the scene in the 3D rendering according to an embodiment of the present invention.
以下结合附图,对本发明的具体实施方式进行说明。
本发明提供了动态手部仿真机制,其通过动作捕捉使得手部姿态更加自然,并利用触觉反馈来减少手部姿态调整的迭代,使得手部仿真的时间有效减少,特别是在连续操作时的动态仿真分析。
本发明第一方面提供了一种动态手部仿真方法,其中,所述仿真的对象是操作者手部在一个场景中执行一个操作过程,所述操作者的手部穿戴了动作感应设备,所述动态手部仿真方法包括如下步骤。
图1是根据本发明一个实施例的动态手部仿真装置的结构示意图。如图1所示,动态手部仿真装置100包括动作获取装置110和仿真装置120。其中,所述动作获取装置110进一步地包括动作捕捉装置111和触感反馈装置112,所述仿真装置120进一步地包括建模装置121、碰撞计算装置122和3D渲染装置123。其中所述仿真装置120可以在一个仿真软件上执行,因此其具有和动作获取装置110的接口。
首先执行步骤S1,建模装置121构建所述手部模型和所述场景中的目标物体模型,并通过3D渲染装置对所述手部模型、所述场景和所述目标物体模型执行3D渲染。
其中,本发明提供的动态手部仿真机制的目标在于在的一个场景中仿真手部的一系列动作。所述建模装置121会产生手部模型和该情景中手部周围的目标模型。并且,3-DOF手指关节中心和指尖global位置会用于手部做动作。
如图2所示,手部模型200包括五个手指,其中,拇指用腕掌的关节(carpometacarpal)214,掌指的关节(metacarpophalangeal joint)213、指节间的关节(interphalangeal joint)212和拇指尖位置214来描述。拇指的腕掌的关节位置是可选地。其中,二指用掌指的关节224、近端指间的关节(proximal interphalangeal joint)223、远端指间的关节(distal interphalangeal joint)222和二指尖位置221来描述。同样,三指用掌指的关节234、近端指间的关节(proximal interphalangeal joint)233、远端指间的关节(distal interphalangeal joint)232和三指尖位置231来描述。四指用掌指的关节244、近端指间的关节(proximal interphalangeal joint)243、远端指间的关节(distal interphalangeal joint)242和四指尖位置241来描述。小指用掌指的关节254、近端指间的关节(proximal interphalangeal joint)253、远端指间的关节(distal interphalangeal joint)252和小指尖位置251来描述。
FIG. 3 shows the scene after 3D rendering, in which the hand 200 operates a robot arm 300 and approaches a component 310 of the robot arm 300.

Step S2 is then executed: the motion capture device 111 maps the hand model in the modeling device 121 to the motion poses of the hand, captures the motion poses of the hand to generate pose data, and sends the pose data to the modeling device 121.

Optionally, the motion capture device 111 may be based on an inertial measurement unit (IMU), on an exoskeleton, or on image processing or optical markers. Many mature techniques in the industry capture hand motion with good precision and high capture rates; in a preferred example of the present invention, the motion capture device 111 has an accuracy better than 0.01 degrees and captures at least 25 samples per second.

Specifically, the motion capture device 111 first maps the hand model constructed as shown in FIG. 2 to the motion poses of the hand, that is, it links the actual motions to the virtual hand model.
Further, the hand model is indicated by fingertip points and joint points, each fingertip or joint point being represented by a data vector (x, y, z, Rx, Ry, Rz), in which x, y and z are the coordinates of the point on the x-, y- and z-axes and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes. The pose data is the set of data vectors of all fingertip and joint points of the hand model.

Preferably, the spatial positions and pose data of all joint points of the hand are sampled at a fixed interval, and the data may be stored in a form such as the BVH format. In this example, the sampling interval is 0.017 seconds and 3004 frames are sampled in total, each line being the data of one sample. As shown in FIG. 2, the hand model 200 has 20 joint and fingertip points in total, so each sample of pose data comprises 20 data vectors (x, y, z, Rx, Ry, Rz).
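As a minimal sketch of the data layout just described (class and field names are our assumptions, not the patent's), one sampled frame can be modeled as 20 six-component vectors:

```python
# Hypothetical sketch: one motion-capture frame of the 20-point hand model.
from dataclasses import dataclass
from typing import List

@dataclass
class PointPose:
    x: float   # position on the x-axis
    y: float   # position on the y-axis
    z: float   # position on the z-axis
    rx: float  # Euler rotation about the x-axis
    ry: float  # Euler rotation about the y-axis
    rz: float  # Euler rotation about the z-axis

@dataclass
class HandFrame:
    time: float              # sample timestamp, e.g. multiples of 0.017 s
    points: List[PointPose]  # exactly 20 joint and fingertip points

    def flat(self) -> List[float]:
        """Flatten to the 120 numbers stored per sampled line (BVH-style)."""
        out: List[float] = []
        for p in self.points:
            out.extend([p.x, p.y, p.z, p.rx, p.ry, p.rz])
        return out

# A zeroed frame: 20 points x 6 components = 120 values per sample line.
frame = HandFrame(time=0.0, points=[PointPose(0, 0, 0, 0, 0, 0)] * 20)
```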
The pose database 130 stores the pose data of continuous hand operations in a simulation scene, so that the hand simulation can compute collisions from this pose data and reuse it in other simulation scenes.

Meanwhile, the modeling device 121 sends target object information, such as the target's position and shape, to the collision computation device 122.
When the collision computation device 122 detects in the 3D rendering that the hand overlaps at least one target object model in the scene, it judges that a collision has occurred and sends a collision prompt message. As shown in FIG. 4, when the hand 200 collides with the component 310, their models have an overlapping part (not shown); once this overlap is detected, the collision prompt message is sent.

The collision computation device 122 detects collisions between the hand and the other objects in a simulation scene. Specifically, the collision computation device 122 may optionally receive hand motion data from the pose database 130 to drive the hand motion of the modeling device 121, and the other target objects in the simulation scene are likewise sent from the modeling device 121 to the collision computation device 122.

Optionally, the collision computation results are sent to the collision result database 124 and, at the same time, to the haptic feedback device 112 in the motion acquisition device 110 to drive it.
Further, the 3D rendering device 123 displays the dynamic process and the simulation results in the 3D environment based on the modeling device 121 and the collision computation results of the collision computation device 122, and also provides a screenshot function for when a collision occurs.

The hand pose data is obtained from the motion capture device 111 and is also sent to the collision computation device 122 for collision detection.
The collision result database 124 stores the collision detection results output by the collision computation device 122 for further use in ergonomic analysis and product and production optimization. The collision result database 124 can also filter the hand pose data by collision result, so that engineers can retrieve the hand pose data with or without collisions. The collision computation device 122 performs collision detection based on a collision algorithm, for example the construction and traversal of an object bounding box hierarchical tree, in which the minimum distance between the hand model and the other target models is computed. If the hand penetrates another target in the simulation scene, the distance returns a negative value, and the collision computation device 122 sends the collision computation result data to the haptic feedback device 112 and the 3D rendering device 123 and stores it in the collision result database 124.
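The signed-distance test described above can be sketched as follows. This is a simplified single-box illustration, not the patent's hierarchical-tree implementation; the axis-aligned box format, function names and default threshold are our assumptions:

```python
# Hedged sketch: signed separation between two axis-aligned bounding boxes
# (AABBs). A negative value means the boxes overlap, i.e. the hand model
# has penetrated the target model.
def signed_aabb_distance(a_min, a_max, b_min, b_max):
    """Largest per-axis gap between two AABBs; negative when they overlap."""
    gaps = []
    for i in range(3):
        # Positive gap on axis i means the boxes are separated along it.
        gaps.append(max(a_min[i] - b_max[i], b_min[i] - a_max[i]))
    return max(gaps)

def detect_collision(hand_box, target_box, safety_threshold=0.5):
    """Classify the hand/target relation as described in the text."""
    d = signed_aabb_distance(*hand_box, *target_box)
    if d < 0:
        return "collision", d  # models overlap in the 3D rendering
    if d < safety_threshold:
        return "warning", d    # within the safety distance: haptic prompt
    return "normal", d
```

For example, `detect_collision(((0, 0, 0), (1, 1, 1)), ((0.5, 0.5, 0.5), (2, 2, 2)))` reports a collision with a negative distance, while well-separated boxes report `"normal"`.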
The collision data stored in the collision result database 124 can also be used to re-run the simulation and analyze it further. For example, a simulation without collisions can be produced, and all screenshots taken at the moments collisions occurred can be collected. This provides evidence for identifying what needs to be adjusted first, in the execution method, the product design, and so on.
Further, the dynamic hand simulation method comprises the following step: when the collision computation device 122 detects that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, haptic feedback information is sent to the motion-sensing device.

The haptic feedback device 112 provides haptic feedback, including vibration or a feedback force, so that the person performing the simulation can perceive the simulation result more directly and adjust hand pose and motion in time.
The motion-sensing device is an exoskeleton glove. When the collision computation device 122 detects that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, the haptic feedback device 112 sends haptic feedback information to the exoskeleton glove, and the haptic feedback information drives the vibration actuators on the exoskeleton glove to vibrate. The exoskeleton glove contains vibration actuators that give haptic prompts; the purpose of the haptic feedback is simply to reflect the simulation result on the hand in real time. If a collision occurs, its severity is reflected on the exoskeleton glove: the glove vibrates slightly when the hand approaches the target object and at high frequency when the hand fully collides with it, so that the vibration frequency distinguishes the degree of collision.
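A hedged sketch of the frequency-coded severity scheme described above; the threshold and frequency range are invented for illustration and are not specified by the patent:

```python
def vibration_frequency_hz(distance, safety_threshold=0.5,
                           f_min=20.0, f_max=250.0):
    """Map hand-to-target distance to a vibration frequency (made-up values).

    Beyond the safety threshold the glove stays silent; inside it the
    frequency rises linearly, saturating at f_max once the models collide
    (distance <= 0), so collision severity is encoded in the vibration rate.
    """
    if distance >= safety_threshold:
        return 0.0
    if distance <= 0.0:
        return f_max
    # Linear ramp: the closer to contact, the higher the frequency.
    ratio = 1.0 - distance / safety_threshold
    return f_min + ratio * (f_max - f_min)
```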
Optionally, when the collision computation device 122 detects that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, the haptic feedback device 112 sends haptic feedback information to the exoskeleton glove, and the haptic feedback information drives the motors and linkages on the exoskeleton glove to apply a counterforce to the hand. The exoskeleton glove contains motors and linkages; a motor applies a counterforce to the hand through its linkage, against the user's torque. By applying this force, the exoskeleton glove signals to the designer that part of the hand has been constrained by some part of the target object or has reached a boundary.

Thus, the operator can wear the exoskeleton glove and, watching on screen the 3D-rendered simulation of the hand performing an operation process in a scene, continually adjust their motions in real time to find the optimal operation process while avoiding collisions.
Preferably, the dynamic hand simulation method comprises the following step: generating a report file, the report file comprising: a set of pose data of the hand performing an operation process in a scene within a sampling period, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.

Table 1 Report file list when collisions occur
Time | Hand | Target object | Collision judgment | Distance |
---|---|---|---|---|
0.8 | hand.right finger 1 | object 1.position 1 | normal | 2.65 |
0.8 | hand.right finger 2 | object 1.position 1 | collision | |
0.8 | hand.right finger 3 | object 1.position 1 | normal | 0.03 |
0.9 | hand.right finger 1 | object 1.position 1 | collision | |
0.9 | hand.right finger 2 | object 1.position 1 | normal | 0.51 |
0.9 | hand.right finger 3 | object 1.position 1 | collision | |
0.9 | hand.right finger 1 | object 1.position 1 | collision | |
As shown in the table above, the report file clearly shows which finger of the hand collided at which moment, as well as the distance to the target object during normal operation, so that the operator can analyze the results afterwards and identify the optimal operation process. For example, Table 2 shows a report file list when no collision occurred:

Table 2 Report file list when no collision occurs
Time | Hand | Target object | Collision judgment | Distance |
---|---|---|---|---|
1.9 | hand.right finger 1 | object 1.position 2 | normal | 3.87 |
1.9 | hand.right finger 2 | object 1.position 2 | normal | 4.89 |
1.9 | hand.right finger 3 | object 1.position 1 | normal | 4.41 |
1.9 | hand.right finger 1 | object 1.position 3 | normal | 4.65 |
2.0 | hand.right finger 2 | object 1.position 1 | normal | 2.46 |
2.0 | hand.right finger 3 | object 1.position 3 | normal | 3.56 |
2.0 | hand.right finger 1 | object 1.position 2 | normal | 4.59 |
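As a hedged sketch of report generation (the field names are our English renderings of the table headers, and the CSV format is our choice; the patent does not prescribe a file format), a report like Tables 1 and 2 could be written as:

```python
import csv
import io

def write_report(rows):
    """Serialize report rows -- (time, hand part, target part, judgment,
    distance) -- as CSV text; distance is left blank for collision rows,
    mirroring the report tables."""
    buf = io.StringIO()
    w = csv.writer(buf)
    w.writerow(["time", "hand", "target", "judgment", "distance"])
    for t, hand, target, judgment, dist in rows:
        w.writerow([t, hand, target, judgment, "" if dist is None else dist])
    return buf.getvalue()

report = write_report([
    (0.8, "hand.right_finger1", "object1.pos1", "normal", 2.65),
    (0.8, "hand.right_finger2", "object1.pos1", "collision", None),
])
```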
A second aspect of the present invention provides a dynamic hand simulation apparatus, wherein the object of the simulation is an operator's hand performing an operation process in a scene, the operator's hand wearing a motion-sensing device, the dynamic hand simulation apparatus comprising: a modeling device that constructs the hand model and the target object model in the scene; a 3D rendering device that performs 3D rendering on the hand model, the scene and the target object model; a motion capture device that maps the hand model to the motion poses of the hand and captures the motion poses of the hand to generate pose data; and a collision computation device that judges that a collision has occurred when it is detected in the 3D rendering that the hand overlaps at least one target object model in the scene, and sends a collision prompt message.
Further, the hand model is indicated by fingertip points and joint points, each fingertip or joint point being represented by a data vector (x, y, z, Rx, Ry, Rz), in which x, y and z are the coordinates of the point on the x-, y- and z-axes and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes. The pose data is the set of data vectors of all fingertip and joint points of the hand model.
Further, when the collision computation device detects that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, it sends haptic feedback information to the motion-sensing device.

Further, the motion-sensing device is an exoskeleton glove; when it is detected that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, the collision computation device sends haptic feedback information to the exoskeleton glove, and the haptic feedback information drives the vibration actuators on the exoskeleton glove to vibrate.
Further, the motion-sensing device is an exoskeleton glove; when the collision computation device detects that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, it sends haptic feedback information to the exoskeleton glove, and the haptic feedback information drives the motors and linkages on the exoskeleton glove to apply a counterforce to the hand.

Further, the collision computation device is also configured to generate a report file, the report file comprising: a set of pose data of the hand performing an operation process in a scene within a sampling period, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
A third aspect of the present invention provides a dynamic hand simulation system, comprising: a processor; and a memory coupled to the processor, the memory having instructions stored therein which, when executed by the processor, cause the electronic device to perform actions comprising: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering on the hand model, the scene and the target object model; S2, mapping the hand model to the motion poses of the hand, capturing the motion poses of the hand to generate pose data, judging that a collision has occurred when it is detected in the 3D rendering that the hand overlaps at least one target object model in the scene, and sending a collision prompt message.

A fourth aspect of the present invention provides a computer program product, tangibly stored on a computer-readable medium and comprising computer-executable instructions which, when executed, cause at least one processor to perform the method according to the first aspect of the present invention.

A fifth aspect of the present invention provides a computer-readable medium having computer-executable instructions stored thereon which, when executed, cause at least one processor to perform the method according to the first aspect of the present invention.
The dynamic hand simulation mechanism provided by the present invention greatly reduces operation time; hand poses are more natural and spontaneous, and the iterations of hand pose adjustment are reduced. Collision results are displayed directly in the 3D rendering and signaled directly to the hand via haptic feedback, so that engineers can change hand poses while the simulation runs. Moreover, the present invention applies not only to hand grasping operations but to simulating all hand operation motions.

The present invention can capture hand poses and motions in real time and provide haptic feedback to the hand directly based on the safety distance. Collision results can be computed in real time, prompting engineers to adjust hand poses immediately. In addition, the database can provide a series of hand poses to simulate a series of operations, rather than simulating only a single hand motion.

Furthermore, the present invention is more efficient and more convenient. For end users, it reduces hand simulation time and accelerates production planning and product optimization.
Although the content of the present invention has been described in detail through the above preferred embodiments, it should be recognized that the above description should not be considered as limiting the present invention. Various modifications and alternatives to the present invention will be apparent to those skilled in the art after reading the above content. Therefore, the protection scope of the present invention should be defined by the appended claims. Furthermore, any reference sign in the claims should not be construed as limiting the claim concerned; the word "comprising" does not exclude devices or steps not listed in other claims or in the description; and words such as "first" and "second" are used only to denote names and do not indicate any particular order.
Claims (15)
- A dynamic hand simulation method, wherein the object of the simulation is an operator's hand performing an operation process in a scene, the operator's hand wearing a motion-sensing device, the dynamic hand simulation method comprising the following steps: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering on the hand model, the scene and the target object model; S2, mapping the hand model to the motion poses of the hand, capturing the motion poses of the hand to generate pose data, judging that a collision has occurred when it is detected in the 3D rendering that the hand overlaps at least one target object model in the scene, and sending a collision prompt message.
- The dynamic hand simulation method according to claim 1, wherein the hand model is indicated by fingertip points and joint points, each fingertip or joint point being represented by a data vector (x, y, z, Rx, Ry, Rz), in which x, y and z are the coordinates of the point on the x-, y- and z-axes and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes, and wherein the pose data is the set of data vectors of all fingertip and joint points of the hand model.
- The dynamic hand simulation method according to claim 1, further comprising the following step: when it is detected that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, sending haptic feedback information to the motion-sensing device.
- The dynamic hand simulation method according to claim 3, wherein the motion-sensing device is an exoskeleton glove, and when it is detected that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, haptic feedback information is sent to the exoskeleton glove, the haptic feedback information driving the vibration actuators on the exoskeleton glove to vibrate.
- The dynamic hand simulation method according to claim 3, wherein the motion-sensing device is an exoskeleton glove, and when it is detected that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, haptic feedback information is sent to the exoskeleton glove, the haptic feedback information driving the motors and linkages on the exoskeleton glove to apply a counterforce to the hand.
- The dynamic hand simulation method according to claim 1, further comprising the following step: generating a report file, the report file comprising: a set of pose data of the hand performing an operation process in a scene within a sampling period, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
- A dynamic hand simulation apparatus, wherein the object of the simulation is an operator's hand performing an operation process in a scene, the operator's hand wearing a motion-sensing device, the dynamic hand simulation apparatus comprising: a modeling device that constructs the hand model and the target object model in the scene; a 3D rendering device that performs 3D rendering on the hand model, the scene and the target object model; a motion capture device that maps the hand model to the motion poses of the hand and captures the motion poses of the hand to generate pose data; and a collision computation device that judges that a collision has occurred when it is detected in the 3D rendering that the hand overlaps at least one target object model in the scene, and sends a collision prompt message.
- The dynamic hand simulation apparatus according to claim 7, wherein the hand model is indicated by fingertip points and joint points, each fingertip or joint point being represented by a data vector (x, y, z, Rx, Ry, Rz), in which x, y and z are the coordinates of the point on the x-, y- and z-axes and Rx, Ry and Rz are its Euler rotation angles about the x-, y- and z-axes, and wherein the pose data is the set of data vectors of all fingertip and joint points of the hand model.
- The dynamic hand simulation apparatus according to claim 7, wherein when the collision computation device detects that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, it sends haptic feedback information to the motion-sensing device.
- The dynamic hand simulation apparatus according to claim 9, wherein the motion-sensing device is an exoskeleton glove, and when it is detected that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, the collision computation device sends haptic feedback information to the exoskeleton glove, the haptic feedback information driving the vibration actuators on the exoskeleton glove to vibrate.
- The dynamic hand simulation apparatus according to claim 9, wherein the motion-sensing device is an exoskeleton glove, and when the collision computation device detects that the distance between the hand and at least one target object model in the scene is smaller than a safety distance threshold, it sends haptic feedback information to the exoskeleton glove, the haptic feedback information driving the motors and linkages on the exoskeleton glove to apply a counterforce to the hand.
- The dynamic hand simulation apparatus according to claim 7, wherein the collision computation device is also configured to generate a report file, the report file comprising: a set of pose data of the hand performing an operation process in a scene within a sampling period, a set of distances between the hand and the target object in the scene, and collision judgment information between the hand and the target object in the scene.
- A dynamic hand simulation system, comprising: a processor; and a memory coupled to the processor, the memory having instructions stored therein which, when executed by the processor, cause the electronic device to perform actions comprising: S1, constructing the hand model and the target object model in the scene, and performing 3D rendering on the hand model, the scene and the target object model; S2, mapping the hand model to the motion poses of the hand, capturing the motion poses of the hand to generate pose data, judging that a collision has occurred when it is detected in the 3D rendering that the hand overlaps at least one target object model in the scene, and sending a collision prompt message.
- A computer program product, tangibly stored on a computer-readable medium and comprising computer-executable instructions which, when executed, cause at least one processor to perform the method according to any one of claims 1 to 6.
- A computer-readable medium having computer-executable instructions stored thereon which, when executed, cause at least one processor to perform the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/082271 WO2021195916A1 (zh) | 2020-03-31 | 2020-03-31 | Dynamic hand simulation method, apparatus and system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021195916A1 true WO2021195916A1 (zh) | 2021-10-07 |
Family
ID=77927721
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114750168A (zh) * | 2022-06-14 | 2022-07-15 | 苏州上舜精密工业科技有限公司 | Machine-vision-based manipulator control method and system |
CN116188704A (zh) * | 2023-05-04 | 2023-05-30 | 北京红棉小冰科技有限公司 | Hand image generation method and apparatus, electronic device and readable storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101739879A (zh) * | 2009-12-22 | 2010-06-16 | 天津市天堰医教科技开发有限公司 | Surgical simulation system and method based on real-time pose capture, and graphics rendering module |
CN102207997A (zh) * | 2011-06-07 | 2011-10-05 | 哈尔滨工业大学 | Force-feedback-based robotic minimally invasive surgery simulation system |
CN102663197A (zh) * | 2012-04-18 | 2012-09-12 | 天津大学 | Virtual hand grasping simulation method based on motion capture |
CN103955295A (zh) * | 2014-04-17 | 2014-07-30 | 北京航空航天大学 | Real-time grasping method for a virtual hand based on a data glove and a physics engine |
CN108983978A (zh) * | 2018-07-20 | 2018-12-11 | 北京理工大学 | Virtual hand control method and apparatus |
US10429923B1 (en) * | 2015-02-13 | 2019-10-01 | Ultrahaptics IP Two Limited | Interaction engine for creating a realistic experience in virtual reality/augmented reality environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20929564; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20929564; Country of ref document: EP; Kind code of ref document: A1 |