CN116901066A - Robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism


Info

Publication number
CN116901066A
Authority
CN
China
Prior art keywords
social
robot
group
space
cohesive force
Prior art date
Legal status
Pending
Application number
CN202310887775.4A
Other languages
Chinese (zh)
Inventor
刘仁楷
刘晓瑞
于金鹏
Current Assignee
Qingdao University
Original Assignee
Qingdao University
Priority date
Filing date
Publication date
Application filed by Qingdao University
Priority to CN202310887775.4A
Publication of CN116901066A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1605 Simulation of manipulator lay-out, design, modelling of manipulator
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/0005 Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • B25J11/001 Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means with emotions simulating means

Abstract

The invention discloses a robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism. The method is verified on the Fatty (Xiaopang) robot platform and performs well in coordinating the expression of social behaviors in multi-object scenes. Eye-gaze behavior is modulated by an optimal control algorithm based on minimal neural transmission noise, and the effectiveness and stability of the model and strategy are demonstrated by analyzing the dynamic characteristics of the robot under them.

Description

Robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism
Technical Field
The invention relates to the technical field of robot control, and in particular to a robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism.
Background
Robots are now widely used in a variety of scenarios in which they are expected to follow social protocols and exhibit natural behavior. However, although social interaction is ubiquitous in everyday life, current research offers little holistic analysis of human social responses. How to make robots understand social scenes and display natural, human-like behavior is therefore a problem of broad concern. Two problems currently await solutions: selection among different social objects, and coordination of the synchronized control of the limbs. These problems have been studied from two directions, data-driven methods and model-driven methods. Without prior knowledge of biological mechanisms, however, data-driven models have difficulty explaining human behavior on dynamic and long-term scales. Among model-driven studies, most focus on basic coordination patterns of human behavior. Against this background, the invention focuses on establishing a model that represents coordinated social behaviors, including body movement/orientation, head rotation, and eyeball movement, and proposes synchronized control based on social space and neuromodulation. The method is verified on the Fatty (Xiaopang) robot platform and shows good results in coordinating the expression of social behaviors in multi-object scenes.
Disclosure of Invention
To overcome the defects of the prior art, the invention provides a robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism. It solves the control problem of robots in multi-target social scenes and models coordinated behaviors including body movement and orientation, head rotation, and eyeball movement. Based on the given model, a synchronized control method driven by social-space theory and a neuromodulation mechanism is provided: the robot body is controlled according to the dynamic social space, and eye-head coordinated gaze behavior is adjusted based on the principle of minimal neural transmission noise.
The invention models the coordinated social behaviors contained in a multi-target scene and proposes a robot social behavior synchronization control strategy covering body movement/orientation, head rotation, and eyeball movement. The strategy collects the RGB-D images and sound field perceived in the scene and determines the movement and orientation of the body according to the social space and the relative positions of the robot and the people; an optimal control algorithm based on minimal neural transmission noise is employed to determine eye-head gaze behavior.
To achieve the above purpose, the robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism specifically comprises the following steps:
(1) Confirming the robot's social distance through the social space of the group
Social space is divided into two types, personal space and group space. For a social individual, the personal space is a self-centered region constructed through a Gaussian function f_i^p(x, y).
f_i^p(x, y) is calculated as shown in formulas (2a)-(2d), of which (2b) is
θ_i = atan2(y - y_i, x - x_i)   (2b)
For a group, the Gaussian functions of the social objects' personal spaces are accumulated to obtain the Gaussian equation f_i^g(x, y) of the group space.
Here (x, y) is the position of the robot in the planar coordinate system, (x_i, y_i) is the position of social object i of the group in the planar coordinate system, A is the amplitude, σ_x is the standard deviation in the horizontal direction, σ_y is the standard deviation in the front-rear direction, θ_i is the deflection angle between the bodies of the robot and the social object, and n is the number of social objects in the group.
The position of the robot is changed, and with it the robot's f_i^g(x, y) value, so that the robot is kept at a suitable distance from the group, i.e., a distance that satisfies Hall's social criteria;
(2) Confirming the robot's body rotation angle through the optimal social-interaction cohesion
In a multi-object scene, the social-interaction cohesion score is used to represent the interaction intent in the group; the cohesion scores are obtained from the following equations:
S_g = W_g · n   (6)
S_total = S_i + S_g + S_p   (8)
W_p + W_g + W_i = 1   (9)
where θ_ij is the angle between the body-direction vectors of social objects i and j of the group on the X-Y plane of the planar coordinate system; W_i is the weight of the social-interaction cohesion score in the total cohesion score; G denotes the group social space f_i^g(x, y); W_g is the weight of the group-scale cohesion score S_g in the total cohesion score S_total; W_p is the weight of the proximity cohesion score S_p in the total cohesion score S_total; dist(i, j) is the Euclidean distance between social objects i and j of the group; and (x_j, y_j) and (x_i, y_i) are the positions of social objects j and i of the group in the planar coordinate system.
The body rotation angle at which the social-interaction cohesion score is maximal is found, which gives the angular relationship between the robot's body angle and the body orientation of each social object under the optimal social-group cohesion condition;
(3) According to the sound-source information, the optimal movement time under the given boundary conditions is solved to control eye-head coordination.
Compared with the prior art, the invention has the following beneficial effects: a model is established from the perspectives of social cues and biology to represent coordinated social behaviors, and a synchronized control method based on social space and neuromodulation is provided. Eye-gaze behavior is modulated by an optimal control algorithm based on minimal neural transmission noise, and the effectiveness and stability of the model and strategy are demonstrated by analyzing the dynamic characteristics of the robot under them.
Drawings
Fig. 1 is a flowchart of the robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism.
Fig. 2 is a Gaussian function distribution diagram of the personal space.
Fig. 3 is a Gaussian function distribution diagram of the group space.
Fig. 4 is a schematic diagram of the Euclidean coordinate space of eye-head coordinated gaze behavior.
Fig. 5 is a schematic diagram of the operation of the robot test platform in Example 2.
Fig. 6 shows the practical application scenario of Example 2.
Detailed Description
For a clearer description, the present invention is further described below with reference to the accompanying drawings and specific examples:
example 1
In a multi-object scene, the behavior of the robot is expressed in three ways: first, keeping a social distance from the social objects; second, adjusting the orientation of the body; third, controlling eye-head coordination to form gaze behavior toward specific targets. These behaviors are driven by human stimuli and performed synchronously in real social scenes. Coordinating the three aspects yields the robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism, which specifically comprises the following steps:
1. Confirming the robot's social distance through the social space of the group
Social distance may be interpreted as the distance between two individuals, between an individual and a group, or between an inner and an outer group. Considering that the sense of space plays a dominant role in regulating social behavior, social space is divided into two categories: personal space and group space. For a social individual (robot or person), the personal space is a region centered on the individual.
As shown in Fig. 2, the gradient inside the personal space is denoted f_i^p(x, y). During an interaction, f_i^p(x, y) varies with the relative position of the social objects, which mimics spatial perception in a social scene. Here A, σ_x, and σ_y are the amplitude and the standard deviations in the horizontal and front-rear directions, respectively, and the size of the dynamic personal social space depends on A, σ_x, and σ_y. Note that interactions in front of and behind the body differ, so σ_y is split into σ_rear and σ_front, the standard deviations in the rear and front directions. The red line marks the minimum threshold of f_i^p(x, y) that is generally considered the boundary of acceptable social distance. The comfort space is typically constructed with a Gaussian function that places the person at its center and assigns to the surrounding space a value describing how acceptable each location is to that person. A two-dimensional asymmetric Gaussian function is therefore adopted to realize f_i^p(x, y).
f_i^p(x, y) = AsymGauss(x, y, x_i, y_i, θ_i, A, σ_x, σ_y)   (1)
σ_x is the standard deviation in the horizontal direction and σ_y is the standard deviation in the front-rear direction; (x, y) is the position of the robot in the planar coordinate system and (x_i, y_i) is the position of the social object in the planar coordinate system. Their values change frequently with the position, posture, and social cues of the person. As shown in Fig. 3, when several people stand in front of the robot, the group space is the combination of their personal spaces: its boundary encloses the personal spaces occupied by each of the three people. The green arrows indicate the directions in which the people and the robot face, and the angle θ_i formed by the dashed lines extending from the arrows is the deflection angle between the body directions of two social objects (the robot and a group member).
The calculation process of f_i^p(x, y) follows equations (2a)-(2d), of which (2b) is
θ_i = atan2(y - y_i, x - x_i)   (2b)
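The bodies of formulas (2a), (2c), and (2d) and of the group-space equation do not survive in this text. The following is a hedged reconstruction of a standard two-dimensional asymmetric Gaussian consistent with the parameters named above, not the patent's verbatim formulas; φ_i denotes the assumed body orientation of social object i:

```latex
% Hedged reconstruction; only the atan2 line (2b) is verbatim from the
% surviving text, and \varphi_i (the body orientation of social object i)
% is an assumed symbol. The source writes f_i^g for the group space.
\begin{aligned}
u &= \cos\varphi_i\,(x - x_i) + \sin\varphi_i\,(y - y_i), &
v &= -\sin\varphi_i\,(x - x_i) + \cos\varphi_i\,(y - y_i), \\
\theta_i &= \operatorname{atan2}(y - y_i,\; x - x_i), &
\sigma_y &= \begin{cases} \sigma_{\text{front}}, & u \ge 0, \\
                          \sigma_{\text{rear}},  & u < 0, \end{cases} \\
f_i^{p}(x, y) &= A \exp\!\left(-\frac{u^{2}}{2\sigma_y^{2}}
                               -\frac{v^{2}}{2\sigma_x^{2}}\right), &
f^{g}(x, y) &= \sum_{i=1}^{n} f_i^{p}(x, y).
\end{aligned}
```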
Meanwhile, the group space is constructed on the basis of meeting the requirements of the personal spaces: the Gaussian functions of the social objects' personal spaces are accumulated to obtain the Gaussian equation of the group space, where n is the number of social objects in the group.
A group social space combining the personal spaces is thus obtained. By changing the robot's position, and with it its f_i^g(x, y) value, the robot can be kept at a suitable distance from the group of social objects, i.e., a distance that satisfies Hall's social criteria.
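As an illustration of this construction, the following is a minimal sketch of the asymmetric Gaussian personal space and the accumulated group space. It is a sketch under stated assumptions, not the patent's reference implementation: the numeric values of A, σ_x, σ_front, and σ_rear are assumptions, and `phi_i` is the assumed body orientation of social object i.

```python
import numpy as np

def personal_space(x, y, xi, yi, phi_i, A=1.0,
                   sigma_x=0.45, sigma_front=0.60, sigma_rear=0.25):
    """Asymmetric 2-D Gaussian personal space f_i^p(x, y).

    phi_i is the body orientation of social object i; all numeric
    defaults are illustrative assumptions (the text gives no values).
    sigma_y switches between sigma_front and sigma_rear depending on
    whether (x, y) lies ahead of or behind the person."""
    dx, dy = x - xi, y - yi
    u = np.cos(phi_i) * dx + np.sin(phi_i) * dy     # forward offset
    v = -np.sin(phi_i) * dx + np.cos(phi_i) * dy    # lateral offset
    sigma_y = np.where(u >= 0.0, sigma_front, sigma_rear)
    return A * np.exp(-(u**2 / (2.0 * sigma_y**2) + v**2 / (2.0 * sigma_x**2)))

def group_space(x, y, people):
    """Group space: the accumulated personal Gaussians of all social
    objects. `people` is a list of (x_i, y_i, phi_i) triples."""
    return sum(personal_space(x, y, xi, yi, phi) for xi, yi, phi in people)
```

Evaluating `group_space` on a grid reproduces the kind of distribution shown in Fig. 3; the robot's position is then adjusted until its group-space value corresponds to an acceptable Hall distance.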
2. Confirming the robot's body rotation angle through optimal cohesion
To obtain a proper body orientation, intra-group cohesion is used; it measures the humans in the scene with multi-scale cohesion criteria.
Proximity cohesion score (S_p): S_p follows from the proximity principle, as defined by equations (4) and (5), where dist(i, j) is the Euclidean distance between social objects i and j of the group, W_p is the weight of the proximity cohesion score in the total cohesion score, and (x_j, y_j) and (x_i, y_i) are the positions of social objects j and i of the group in the planar coordinate system.
Group-scale cohesion score (S_g): for a given social group, cohesion should be related to the group size. In the present invention, group cohesion is taken to be proportional to group size, as defined by equation (6):
S_g = W_g · n   (6)
where W_g is the weight of the group-scale cohesion score in the total cohesion score.
Social-interaction cohesion score (S_i): in a multi-object scene, the spatial relationship between any two individuals must be considered. Since the interaction intent in the group applies to all pairings, the social-interaction cohesion score is used to represent it; its value is obtained from equation (7), where θ_ij is the angle between the body-direction vectors of social objects i and j of the group on the X-Y plane of the camera coordinate system, and W_i is the weight of the social-interaction cohesion score in the total cohesion score. Cohesion is assumed to contribute positively when two people face each other and negatively otherwise, so 1 + cos θ_ij is chosen as the main operator regulating the social-interaction cohesion score.
On the basis of the above multi-scale representation, S_total is defined as the sum of S_p, S_g, and S_i. By adjusting the normalized weights (W_p, W_g, W_i), a comprehensive description of the dynamic social scene is obtained:
S_total = S_i + S_g + S_p   (8)
W_p + W_g + W_i = 1   (9)
where S_total represents the total cohesion score.
To obtain the optimal total cohesion score S_total, the maximum social-interaction cohesion score S_i must be found. Because the robot's social-interaction cohesion differs at different body rotation angles, the body angle that maximizes the social-interaction cohesion score can be found by algorithmic traversal, which yields the angular relationship between the robot's body angle and the body orientation of each social object under the optimal social-group cohesion condition.
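A minimal sketch of this traversal follows. Since S_g and S_p do not depend on the robot's heading, only the S_i term is swept. The translated text leaves the pairing geometry of θ_ij ambiguous for robot-person pairs, so the sketch assumes θ_ij is the angle between the robot's body direction and the bearing from the robot to social object i, which makes 1 + cos θ_ij reward facing the group.

```python
import numpy as np

def best_body_angle(people, robot_xy, w_i=1.0 / 3.0, n_steps=360):
    """Traverse candidate robot body angles and return the one that
    maximizes the social-interaction term W_i * sum(1 + cos theta_ij)
    over robot-person pairings (eq. 7).

    ASSUMPTION: theta_ij for a robot-person pairing is taken as the
    angle between the robot's body direction and the bearing from the
    robot to social object i; `people` is a list of (x_i, y_i, phi_i)
    triples as in the personal-space sketch above."""
    rx, ry = robot_xy
    bearings = np.array([np.arctan2(py - ry, px - rx) for px, py, _ in people])
    angles = np.linspace(0.0, 2.0 * np.pi, n_steps, endpoint=False)
    scores = np.array([w_i * np.sum(1.0 + np.cos(a - bearings)) for a in angles])
    k = int(np.argmax(scores))
    return angles[k], scores[k]
```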
3. Solving the optimal movement time under given boundary conditions to control eye-head coordination
For a robot in a social scene, gaze behavior is the primary expression of its interaction with a specific person. When this interaction extends to multi-object conditions, large line-of-sight transfers from one object to another are inevitable. For example, while the robot interacts with one social object, another person may enter the robot's field of view with a strong social intent, and the robot needs to react appropriately, including shifting its attention and gaze. The natural gaze model proposed by the invention is shown in Fig. 4. This gaze-transfer behavior requires coordination between the eyes, the head, and even the body, and it must take into account the vertical rotation about the Y-axis and the horizontal rotation about the Z-axis produced by head torsion (θ_H and θ_V). With the head participating, the gaze-shift angle in a given direction is divided into the head torsion angles θ_H, θ_V and the focal-line deflection angles α_H, α_V; α_L and α_R are the horizontal angles of the left and right eyeballs in social gaze behavior. In Fig. 4, the coordinate systems XYZ and X_H Y_H Z_H represent the absolute coordinate system and the robot head coordinate system, respectively, and X'_H and Y'_H represent the projections of the X_H and Y_H axes onto the horizontal plane XOY.
Multi-object interaction typically involves social gaze-transfer behavior: moving from one line of sight to another, performed by a 2-DOF system involving eye movement and head torsion. The deviation of the line-of-sight offset is defined as the standard error of the noise, as shown in equation (10).
where σ_eye and σ_head represent the deviations of the eye and head movements, t_p is the duration of the gaze transfer, H_eye(t) is the impulse-response function of eye movement, and H_head(t) is the impulse-response function of head movement. E(t_p) is assumed to be proportional to the neural control signals of the head and the eyes, s_head(t) and s_eye(t), with coefficients A and B, respectively. The relationship between the torque T_eye applied by the eye muscles and the focal line y_eye is defined in equation (11a), where T_eye is the muscle torque of the eye movement; the muscle and the orbital tissue contribute two lag times with constants τ_1 and τ_2. T_eye is taken to be the neural control signal s_eye passed through a low-pass filter with time constant τ_e, which describes the delay between the muscle response and the control signal, as defined in equation (11b).
The state vector y_e indicates the acceleration, velocity, and position of the focal line. The dynamics of the eyeball can then be expressed in third-order form, as shown in equations (12) and (13).
As for the head, it is a rigid body controlled by the vestibular and neck reflexes, and it is defined as a second-order system in equation (14a).
The state vector y_h indicates the acceleration, velocity, and torsion of the head. The muscle torque T_head of the head movement is the low-pass-filtered control signal s_head (with time constant τ_h), as shown in equations (15) and (16).
where K, V, and S represent the inertia, viscosity, and stiffness of the head motion. From these, the impulse-response functions H_eye(t) and H_head(t) can be found, with t the elapsed time.
where a_3 = (2K − V·τ_h)/(2K·τ_h). To solve for the optimized control signal with minimum neural-noise variance, the invention defines the standard error of the neural noise in equation (19).
By solving the corresponding Hamiltonian h_eye(y_e, s_eye, t), the optimal control signal over a given duration [t_0, t_f] can be derived. Taking a unidirectional eye-dynamics model as an example, the loss function can be converted accordingly, and the optimal signal s_eye is solved semi-analytically by taking the derivative of the Hamiltonian with respect to s_eye and setting it to zero, as shown in formula (21). The initial and final conditions then give the velocity profile of y_eye and its relationship to t_f, and the numerical method ensures that t_f is the optimal control time; the solving formulas are shown in (20)-(22).
λ_eye is a vector of Lagrange multipliers, and the optimal-solution procedure above can be applied separately to the horizontal and vertical motion of the focal-line deflection. The same solution holds for the head, as shown in equations (23)-(25).
where λ_head is a vector of Lagrange multipliers. From the above equations, the velocity profile of y_head and its relationship to t_f can be obtained. In this way, the optimal movement times of the eye model and the head model under a given rotation angle can be solved for the case in which each runs independently. By linearly fitting the two correspondences, the movement time t_f is obtained; t_f then serves as a known condition supporting the solution of coordinated eye-head control.
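The semi-analytical Hamiltonian solution of (20)-(22) cannot be reproduced here because the equation bodies do not survive in this text. The sketch below is a numerical stand-in under stated assumptions: a third-order eye plant of the form τ_1τ_2·ÿ + (τ_1+τ_2)·ẏ + y = T_eye combined with the low-pass filter of (11b), a pulse-step command, and numeric constants chosen only for illustration. It sweeps the pulse width and returns the shortest no-overshoot arrival time, giving the same kind of t_f-versus-amplitude relation that the patent obtains by linear fitting.

```python
import numpy as np

# All numeric constants are assumptions for illustration only.
TAU1, TAU2 = 0.15, 0.012   # assumed muscle / orbital-tissue lags tau_1, tau_2 [s]
TAU_E = 0.004              # assumed low-pass time constant tau_e of (11b) [s]
DT = 1e-4                  # Euler integration step [s]

def simulate_eye(s_fn, t_end):
    """Euler-integrate the assumed third-order eye plant
         tau1*tau2*y'' + (tau1 + tau2)*y' + y = T_eye   (assumed form of 11a)
         tau_e*T_eye'  = s(t) - T_eye                   (eq. 11b)
       and return the focal-line trajectory y_eye(t)."""
    n = int(t_end / DT)
    y = yd = T = 0.0
    traj = np.empty(n)
    for k in range(n):
        T += DT * (s_fn(k * DT) - T) / TAU_E
        ydd = (T - y - (TAU1 + TAU2) * yd) / (TAU1 * TAU2)
        yd += DT * ydd
        y += DT * yd
        traj[k] = y
    return traj

def arrival_time(pulse, target, s_max=80.0, tol=0.5):
    """Arrival time of a pulse-step command (s_max for `pulse` seconds,
    then hold at `target`); overshooting commands are rejected."""
    y = simulate_eye(lambda t: s_max if t < pulse else target, 0.8)
    if y.max() > target + tol:
        return np.inf                      # overshoot -> not acceptable
    hit = np.nonzero(y >= target - tol)[0]
    return hit[0] * DT if hit.size else np.inf

def optimal_duration(target_deg):
    """Sweep pulse widths and return (t_f, pulse) with the shortest
    no-overshoot arrival time for the requested gaze amplitude."""
    pulses = np.arange(0.0, 0.20, 0.002)
    times = np.array([arrival_time(p, target_deg) for p in pulses])
    k = int(np.argmin(times))
    return times[k], pulses[k]
```

Repeating `optimal_duration` over a range of amplitudes yields a t_f-versus-angle relation of the kind the patent fits linearly; the same treatment would apply to the second-order head plant of (14a).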
Example 2
1. Integration with a robot platform
The Fatty (Xiaopang) robot is used as the test platform. Fatty is a social robot for educational and recreational applications, equipped with multiple articulated motors that drive body movement, body rotation, head rotation, and the like. A screen mounted on the robot's face supports Android-based animation, displaying a three-dimensional facial surface with eye movements in the horizontal and vertical directions. For sensing, an RGB-D camera (RealSense D435) is mounted on the robot head and connected to an internal NUC computer. During interaction, the NUC computer acquires data streams from the camera and a microphone array; in a social scene, the three-dimensional coordinates of the social objects' faces and the position of the sound source are used for social-object selection and behavior expression in the robot coordinate system, and spatial perception is completed with an InsightFace-based algorithm to confirm the positional relationship between the robot and the social objects and the choice of the social object. Based on these selections, the robot drives the body, the neck, and the facial animation to produce gaze behavior toward the selected person. Meanwhile, the robot's position information is recorded so that its spatial position is available for the practical application scenario.
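To make the data flow concrete, the following sketch chains the three sub-methods of Example 1 into one decision step, reusing `group_space`, `best_body_angle`, and `optimal_duration` from the sketches above. `F_COMFORT`, the gradient-descent retreat, and all parameter values are assumptions; the platform's actual actuator and SDK calls are not modeled.

```python
F_COMFORT = 0.4   # assumed comfort threshold on the group space f^g

def retreat_step(robot_xy, people, lr=0.10, eps=1e-3):
    """One numerical-gradient step that lowers f^g at the robot's position."""
    x, y = robot_xy
    gx = (group_space(x + eps, y, people) - group_space(x - eps, y, people)) / (2 * eps)
    gy = (group_space(x, y + eps, people) - group_space(x, y - eps, people)) / (2 * eps)
    return (x - lr * gx, y - lr * gy)

def social_behavior_step(people, robot_xy, gaze_amplitude_deg):
    """One synchronized decision step: social distance, body angle, gaze
    timing. `people` is a list of (x_i, y_i, phi_i) triples."""
    # (1) social distance: back out of the group space while f^g is too high
    for _ in range(200):
        if group_space(robot_xy[0], robot_xy[1], people) <= F_COMFORT:
            break
        robot_xy = retreat_step(robot_xy, people)
    # (2) body orientation from cohesion maximization
    body_angle, _ = best_body_angle(people, robot_xy)
    # (3) gaze-transfer timing from the minimal-noise eye model
    t_f, _ = optimal_duration(gaze_amplitude_deg)
    return robot_xy, body_angle, t_f
```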
2. Practical application scenario
In practical social scenarios, robot social interaction mostly occurs between several people and the robot. In this case, the robot first adjusts its distance to the people and its own body angle to a suitable state, as shown in Fig. 6a. On the basis of the maintained social distance and body orientation, gaze behavior is then considered, as shown in Fig. 6b. Initially, the robot keeps gazing at People-1 by controlling head movements and the animated face on the screen. People-2 then emits a sound that the robot can measure, causing a shift of the robot's social attention. With this shift, the robot changes its gaze from People-1 to People-2; one can see that the robot's gaze behavior toward a social object is completed by eye saccades and head torsion, achieving the robot's goal of gaze transfer in a scene with multiple social objects.

Claims (1)

1. A robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism, characterized by comprising the following steps:
(1) Confirming the robot's social distance through the social space of the group
Social space is divided into two types, personal space and group space. For a social individual, the personal space is a self-centered region constructed through a Gaussian function f_i^p(x, y).
f_i^p(x, y) is calculated as shown in formulas (2a)-(2d), of which (2b) is
θ_i = atan2(y - y_i, x - x_i)   (2b)
For a group, the Gaussian functions of the social objects' personal spaces are accumulated to obtain the Gaussian equation f_i^g(x, y) of the group space.
Here (x, y) is the position of the robot in the planar coordinate system, (x_i, y_i) is the position of social object i of the group in the planar coordinate system, A is the amplitude, σ_x is the standard deviation in the horizontal direction, σ_y is the standard deviation in the front-rear direction, θ_i is the deflection angle between the bodies of the robot and the social object, and n is the number of social objects in the group.
The position of the robot is changed, and with it the robot's f_i^g(x, y) value, so that the robot is kept at a suitable distance from the group, i.e., a distance that satisfies Hall's social criteria;
(2) Confirming the robot's body rotation angle through the optimal social-interaction cohesion
In a multi-object scene, the social-interaction cohesion score is used to represent the interaction intent in the group; the cohesion scores are obtained from the following equations:
S_g = W_g · n   (6)
S_total = S_i + S_g + S_p   (8)
W_p + W_g + W_i = 1   (9)
where θ_ij is the angle between the body-direction vectors of social objects i and j of the group on the X-Y plane of the planar coordinate system; W_i is the weight of the social-interaction cohesion score in the total cohesion score; G denotes the group social space f_i^g(x, y); W_g is the weight of the group-scale cohesion score S_g in the total cohesion score S_total; W_p is the weight of the proximity cohesion score S_p in the total cohesion score S_total; dist(i, j) is the Euclidean distance between social objects i and j of the group; and (x_j, y_j) and (x_i, y_i) are the positions of social objects j and i of the group in the planar coordinate system.
The body rotation angle at which the social-interaction cohesion score is maximal is found, which gives the angular relationship between the robot's body angle and the body orientation of each social object under the optimal social-group cohesion condition;
(3) According to the sound-source information, the optimal movement time under the given boundary conditions is solved to control eye-head coordination.
CN202310887775.4A 2023-07-19 2023-07-19 Robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism Pending CN116901066A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310887775.4A 2023-07-19 2023-07-19 Robot social behavior synchronization control method driven by scene information and a neuromodulation mechanism

Publications (1)

Publication Number Publication Date
CN116901066A 2023-10-20

Family

ID=88352650


Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination