CN109254649A - An efficient interactive system based on a closed cockpit - Google Patents

An efficient interactive system based on a closed cockpit

Info

Publication number
CN109254649A
CN109254649A
Authority
CN
China
Prior art keywords
max
seat
user
row
col
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810869753.4A
Other languages
Chinese (zh)
Inventor
俞峰
汤勇明
郑姚生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201810869753.4A priority Critical patent/CN109254649A/en
Publication of CN109254649A publication Critical patent/CN109254649A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01: Indexing scheme relating to G06F3/01
    • G06F2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an efficient interactive system based on a closed cockpit. The system comprises a closed cockpit with one camera installed on each of its four outer sides (front, rear, left, right), each camera mounted on a servo gimbal. One rear projection is installed on each of the four inner sides, and each rear projection is connected through a data transmission line to the camera on the same side. A fixed rotating seat is installed at the center of the cockpit; pressure sensor arrays are arranged on its seat back and seat cushion, and gyroscopes are installed on the user's head and torso. A touch screen is placed directly in front of the fixed rotating seat, with a gesture recognition device below it. By combining a somatosensory interaction mode based on sitting-posture correction with a touch interaction mode combined with gesture recognition, the invention solves the prior-art problems of poorly integrated interaction means, low interaction efficiency, insufficiently intelligent interaction modes, and a weak immersive experience.

Description

An efficient interactive system based on a closed cockpit
Technical field
The invention belongs to the field of display interaction technology, and in particular relates to an efficient interactive system based on a closed cockpit.
Background technique
Cockpit displays are widely used in aviation. Their interfaces present numerous parameters and flight-state information, they are operated mostly through traditional control sticks and keys, and the overall system must respond smoothly, render clearly, and interact in real time. Display interaction technology is also widely used in command and control systems, such as aerospace and naval combat systems, and is regarded as the "bridge between human and machine". Concretely, the machine presents large amounts of useful information and prompts to the human through display devices, and the human sends command information back to the machine through input devices; this is the process by which human and machine interact. The field is closely connected to cognitive science, ergonomics, psychology, and related disciplines.
Compared with the demands of its application scenarios, current interaction technology falls short in three respects:
1. The integration of display interaction means is low. Most systems still rely on traditional display interaction, with only simple integration into control equipment; they neither incorporate more advanced, efficient interaction means nor adequately consider interaction efficiency, operating convenience and comfort, and their effects on user experience.
2. Interaction modes are insufficiently optimized. Even when multiple interaction modes are integrated, they are merely stacked together without deep optimization, which degrades the user's decision-making efficiency and leaves the modes poorly coordinated.
3. Assistive interaction is not intelligent enough. Interaction modes show too little initiative during the interactive process, and no adaptive human-computer interaction system has been built that can accommodate different application scenarios and different operator roles.
Summary of the invention
To solve the technical problems raised in the background above, the present invention provides an efficient interactive system based on a closed cockpit, which realizes efficient interaction in closed-cockpit application scenarios and enhances the user's immersive experience.
To achieve the above technical purpose, the technical solution of the present invention is as follows:
An efficient interactive system based on a closed cockpit comprises a closed cockpit. One camera is installed on each of the four outer sides (front, rear, left, right) of the cockpit, and each camera is mounted on a servo gimbal. One rear projection is installed on each of the four inner sides of the cockpit, and each rear projection is connected through a data transmission line to the camera on the same side, so as to display the image that camera captures. A fixed rotating seat is installed at the center of the cockpit; pressure sensor arrays are arranged on its seat back and seat cushion, the user sits on the seat, and gyroscopes are installed on the user's head and torso. A touch screen is arranged directly in front of the seat, and a gesture recognition device is arranged below the touch screen. The focus control terminal of each camera, the control terminal of each servo gimbal, and the outputs of the pressure sensor arrays, the gyroscopes, the touch screen, and the gesture recognition device are each connected to the system master controller.
Further, the system adopts a somatosensory interaction mode based on sitting-posture correction and a touch interaction mode combined with gesture recognition.
Further, in the somatosensory interaction mode based on sitting-posture correction, the data collected by the pressure sensor arrays on the seat back and seat cushion of the fixed rotating seat and by the gyroscopes on the user's head and torso are transmitted in real time to the system master controller. The master controller derives the user's current sitting posture from the pressure-array data and the user's current somatosensory posture from the gyroscope data, and uses the current sitting-posture information to correct the trigger thresholds of the responses associated with the different somatosensory postures, yielding the final current somatosensory posture. The master controller then controls the focus point of each camera and the deflection angle of each servo gimbal according to that posture, completing the somatosensory interaction.
Further, suppose a 3 × 3 pressure sensor array is arranged on both the seat back and the seat cushion of the fixed rotating seat. The signals collected by the sensors are processed as follows:
MAX_row = max{row_F_1, row_F_2, row_F_3}
MAX_col = max{col_F_1, col_F_2, col_F_3}
location(x, y) = (row_max, col_max)
In the above formulas, F(i, j) is the signal collected by the pressure sensor in row i, column j of the 3 × 3 array, and row_F_i and col_F_j are the aggregate readings of row i and column j. location(x, y) is the coordinate of the most heavily loaded point of the array, where row_max and col_max are the row and column indices corresponding to MAX_row and MAX_col, and {(x, y) | (x, y) ∈ (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2), (3,3)};
The user's current sitting posture is then solved from location(x, y) and the three-axis gyroscope signal gyroscope(x, y, z):
state = function_correction[MAX_row-back, MAX_col-back, MAX_row-seat, MAX_col-seat, location(x, y)_back, location(x, y)_seat, gyroscope(x, y, z)]
In the above formula, MAX_row-back, MAX_col-back, and location(x, y)_back are the MAX_row, MAX_col, and location(x, y) values obtained from the seat-back sensor array, and MAX_row-seat, MAX_col-seat, and location(x, y)_seat are those obtained from the seat-cushion array. function_correction[·] is the sitting-posture recognition function, with state ∈ {front, behind, left, right, idle}; that is, the user's sitting posture is one of five states (front, rear, left, right, centered), whose corresponding somatosensory postures are leaning forward, leaning back, leaning left, leaning right, and centered, respectively;
When the user remains in one sitting posture for a long time, that posture is regarded as the user's habitual sitting posture, and the system master controller raises the trigger threshold of the somatosensory posture corresponding to it.
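The habitual-posture correction above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name, the window length, and the threshold values are all assumed, and the real system would apply the boost per posture using its own tuned constants.

```python
from collections import Counter

class ThresholdCorrector:
    """Raise the trigger threshold of a posture the user habitually holds."""

    def __init__(self, base_threshold=0.5, habit_window=10, habit_ratio=0.8, boost=0.2):
        self.base = base_threshold    # default response trigger threshold
        self.habit_window = habit_window
        self.habit_ratio = habit_ratio
        self.boost = boost            # extra threshold for a habitual posture
        self.window = []              # most recent classified posture states

    def observe(self, state):
        # Record the latest classified sitting posture.
        self.window.append(state)
        if len(self.window) > self.habit_window:
            self.window.pop(0)

    def threshold(self, state):
        # If this posture dominates the recent window, treat it as habitual
        # and require a stronger signal before triggering its response.
        if len(self.window) == self.habit_window:
            freq = Counter(self.window)[state] / len(self.window)
            if freq >= self.habit_ratio:
                return self.base + self.boost
        return self.base
```

A posture that fills most of the recent observation window thus needs a correspondingly stronger signal to trigger, which is the correction the text describes.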
Further, in the touch interaction mode combined with gesture recognition, the user inputs instructions by tapping and sliding a finger on the touch screen while the gesture recognition device identifies the user's gesture instructions, and the system master controller executes gesture instructions at a higher priority than instructions collected from the touch screen.
Further, the gesture recognition device is mounted on a one-dimensional servo gimbal whose deflection angle is controlled by the system master controller. From the input information received by the touch screen, the master controller infers where the user's hand currently is and deflects the one-dimensional gimbal accordingly, thereby adjusting the detection range of the gesture recognition device.
The above technical scheme brings the following beneficial effects:
The present invention combines two interaction modes: somatosensory interaction based on sitting-posture correction, and touch interaction combined with gesture recognition. The two modes serve different purposes. The somatosensory mode controls the rotation of the gimbals beneath the external cameras and the cameras' focus points, adjusting the displayed picture of the outside environment; the picture-switching speed follows the user's settings, so neither too-fast nor too-slow switching degrades the experience. The touch mode with gesture recognition handles the working state of the closed cockpit itself: the relevant parameters are shown to the user on the touch screen inside the cockpit, through which the user can also issue commands to the cockpit, while gesture recognition offers a quicker, easier interaction channel on top of it. With the two modes combined, the user can interact well with the outside world from inside the closed cockpit and, on that basis, experience a strong immersive effect.
Detailed description of the invention
Fig. 1 is a schematic diagram of the closed cockpit of the present invention;
Fig. 2 is a schematic diagram of the sensor arrangement of the present invention;
Fig. 3 is a flow chart of the somatosensory interaction mode based on sitting-posture correction of the present invention;
Fig. 4 is a schematic diagram of the pressure sensor array arrangement of the present invention;
Fig. 5 is a schematic diagram of the touch interaction system combined with gesture recognition of the present invention;
Fig. 6 is a schematic diagram of the position of each component of the touch interaction system of the present invention;
Fig. 7 is a flow chart of the touch interaction mode combined with gesture recognition of the present invention;
Fig. 8 is a flow chart of the implementation of the whole system of the present invention.
Specific embodiment
The technical solution of the present invention is described in detail below with reference to the accompanying drawings.
The present invention devises an efficient interactive system based on a closed cockpit. As shown in Fig. 1, a variable-focus camera is placed at each of the front, rear, left, and right ends of the closed cockpit (labels 1, 2, 3, and 4 in Fig. 1). Each camera is mounted on a servo gimbal that can be steered by control signals, driving the camera to rotate in different directions to present different scenes. Four rear projections are set inside the cockpit (labels 5, 6, 7, and 8 in Fig. 1); the image information acquired by the four cameras is transmitted through data lines to these rear projections, conveying external imagery to the occupant of the closed cockpit. A fixed rotating seat (only its base is fixed) is placed at the center of the cockpit (label 9 in Fig. 1). The invention adopts a somatosensory interaction mode based on sitting-posture correction and a touch interaction mode combined with gesture recognition.
For the somatosensory interaction mode based on sitting-posture correction, a sitting-posture model based on the user's pressure distribution is first established: the pressure data from the sensors arranged across the seat are fed into the system master controller, which computes the user's current sitting posture. Next, the user's current somatosensory information is acquired from the gyroscopes distributed over key positions of the user's body. To keep external irregular jitter from corrupting the gyroscope values, the gyroscope data should be Kalman filtered. As shown in Fig. 2, labels 10 and 12 are the pressure sensors on the seat cushion and seat back respectively, and labels 11 and 13 are the gyroscopes on the user's head and torso. The master controller combines the acquired somatosensory information with the user's current sitting posture, and adjusts the response threshold of each somatosensory action according to the current sitting-posture information. For example, if the pressure sensors find the user's pressure concentrated mainly on the front part of the seat (meaning the user habitually sits leaning forward), the controller raises the trigger threshold of the lean-forward posture when judging the user's somatosensory posture. This method avoids the inaccurate posture responses that differing sitting habits would otherwise cause. Finally, the controller outputs the user's current somatosensory state for interaction; the process is shown in Fig. 3.
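The Kalman filtering step mentioned above can be sketched as a minimal one-dimensional filter applied to a single gyroscope axis. The noise variances q and r below are illustrative assumptions, not values from the patent, and a real implementation would filter all three axes with tuned parameters.

```python
class Kalman1D:
    """Minimal 1-D Kalman filter for smoothing one gyroscope axis."""

    def __init__(self, q=1e-3, r=1e-1, x0=0.0, p0=1.0):
        self.q, self.r = q, r    # process and measurement noise variances
        self.x, self.p = x0, p0  # state estimate and its variance

    def update(self, z):
        # Predict: constant-state model, uncertainty grows by q.
        self.p += self.q
        # Correct: blend the prediction with the new measurement z.
        k = self.p / (self.p + self.r)       # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Fed a jittery but roughly constant angle reading, the estimate converges toward the true value while damping the irregular spikes the text warns about.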
In the somatosensory interaction mode based on sitting-posture correction, the current somatosensory posture output by the system master controller is used to adjust the focus points of the external cameras and the deflection angles of the servo gimbals, an embodiment that mimics how people interact with the outside world. For example, when the body leans toward a particular direction, the gaze focuses on a specific object, which is then presented more clearly to the eye. Mapping this scene into the closed cockpit: when the user's posture leans forward, the gimbal at the camera base points the camera straight ahead, and the camera focuses in proportion to the lean-forward angle, realizing observation of the object in front.
In the present invention, the user's sitting-posture model is built from the pressure sensor values across the seat as follows: nine pressure sensors are arranged on the seat cushion and nine on the seat back, each set forming a 3 × 3 array, as shown by labels 14 and 15 in Fig. 4. By establishing a coordinate system and computing the pressure value at each coordinate point, the user's sitting posture can be calculated: the proportion of the total pressure carried by each row and each column of the coordinate system is computed, and comparing these proportions yields the user's sitting-posture state. The calculation formulas are as follows:
MAX_row = max{row_F_1, row_F_2, row_F_3}
MAX_col = max{col_F_1, col_F_2, col_F_3}
location(x, y) = (row_max, col_max)
In the above formulas, F(i, j) is the signal collected by the pressure sensor in row i, column j of the 3 × 3 array, and row_F_i and col_F_j are the aggregate readings of row i and column j. location(x, y) is the coordinate of the most heavily loaded point of the array, where row_max and col_max are the row and column indices corresponding to MAX_row and MAX_col, and {(x, y) | (x, y) ∈ (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2), (3,3)}.
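The row/column computation above can be sketched directly. One assumption: the text names the aggregates row_F_i and col_F_j without defining them, so this sketch takes them as the summed readings of each row and column.

```python
def locate_pressure(F):
    """F is a 3x3 list of sensor readings F[i][j] (0-based indices).

    Returns MAX_row, MAX_col, and location(x, y) as defined in the text,
    with location given as 1-based (row, column) coordinates.
    """
    row_F = [sum(F[i][j] for j in range(3)) for i in range(3)]  # row aggregates
    col_F = [sum(F[i][j] for i in range(3)) for j in range(3)]  # column aggregates
    MAX_row, MAX_col = max(row_F), max(col_F)
    # Most heavily loaded row and column, converted to 1-based coordinates.
    row_max = row_F.index(MAX_row) + 1
    col_max = col_F.index(MAX_col) + 1
    return MAX_row, MAX_col, (row_max, col_max)
```

For instance, a reading concentrated in the middle of the array yields location (2, 2), the centered case.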
The user's current sitting posture is then solved from location(x, y) and the three-axis gyroscope signal gyroscope(x, y, z):
state = function_correction[MAX_row-back, MAX_col-back, MAX_row-seat, MAX_col-seat, location(x, y)_back, location(x, y)_seat, gyroscope(x, y, z)]
In the above formula, MAX_row-back, MAX_col-back, and location(x, y)_back are the MAX_row, MAX_col, and location(x, y) values obtained from the seat-back sensor array, and MAX_row-seat, MAX_col-seat, and location(x, y)_seat are those obtained from the seat-cushion array. function_correction[·] is the sitting-posture recognition function, with state ∈ {front, behind, left, right, idle}; that is, the user's sitting posture is one of five states: front, rear, left, right, and centered.
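A heavily simplified sketch of a function_correction-style classifier follows. The patent's actual function also fuses the seat-back array and the gyroscope signals; this illustrative version, with assumed decision rules, maps only the peak seat-cushion cell to one of the five states.

```python
def classify_posture(seat_location):
    """seat_location: 1-based (row, col) of the peak seat-cushion pressure.

    Returns one of the five states named in the text. Row 1 is taken as the
    front edge of the cushion (an assumed orientation).
    """
    row, col = seat_location
    if row == 1:
        return "front"   # weight shifted forward: leaning forward
    if row == 3:
        return "behind"  # weight shifted backward: leaning back
    if col == 1:
        return "left"    # weight shifted left: leaning left
    if col == 3:
        return "right"   # weight shifted right: leaning right
    return "idle"        # weight centered
```

In the full system the returned state would then be checked against the corrected trigger thresholds before any camera or gimbal response fires.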
For the touch interaction mode combined with gesture recognition, a touch screen facing the seat receives the user's touch input, while a Leap Motion below the touch screen receives specific gesture information from the user. In this embodiment the touch screen's position is relatively fixed, but the screen may be relatively large, so the hand easily moves beyond the Leap Motion's recognition range; since user gestures must fall within the Leap Motion's pyramid-shaped detection volume to be recognized, a matched one-dimensional servo gimbal is designed to dynamically track the user's gesture and solve this problem. The specific embodiment of the one-dimensional gimbal is a servo gimbal with one degree of freedom installed at the Leap Motion's base, expanding the gesture detection range. The implementation model is shown in Fig. 5, where (a) shows a user gesture outside the Leap Motion detection range and (b) shows the gesture brought inside the detection range by a multi-step gimbal adjustment; label 16 is the Leap Motion's effective gesture recognition range, 17 is the touch screen, 18 is the Leap Motion, and 19 is the one-dimensional servo gimbal. The gimbal's rotation angle is supplied by the system master controller, which infers the position of the user's hand from the touch information received by the touch screen and, after calculation, outputs the corresponding rotation angle to the gimbal, realizing the Leap Motion's dynamic tracking of the user's gesture. Fig. 6 shows the installation positions of the components of the touch interaction system: label 20 is the one-dimensional servo gimbal, 21 is the Leap Motion mounted on the gimbal, 22 is the touch screen, and 23 is the Leap Motion's effective range.
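One plausible form of the "calculation" that turns a touch position into a gimbal rotation angle is a linear mapping from the horizontal touch coordinate to a yaw angle. The screen width in pixels and the 45-degree travel limit below are assumed values for illustration, not figures from the patent.

```python
def gimbal_angle(touch_x, screen_width=1920, max_angle=45.0):
    """Map a touch x-position to a servo yaw angle in [-max_angle, +max_angle].

    touch_x: horizontal touch coordinate in pixels (0 = left screen edge).
    """
    # Normalise to [-1, 1] around the screen centre, then scale to the travel.
    offset = (touch_x - screen_width / 2) / (screen_width / 2)
    offset = max(-1.0, min(1.0, offset))
    return offset * max_angle
```

A touch at the screen centre leaves the gimbal at zero yaw, while touches near either edge swing the gesture sensor's detection cone toward the hand.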
In the touch interaction mode combined with gesture recognition, gesture recognition has higher priority than touch recognition, and the system master controller considers the gesture-recognition result first. This design mainly addresses emergencies: when the user has no time to perform a series of touch operations to handle an urgent event, issuing a gesture over the touch interface improves interaction efficiency. The program's execution flow is shown in Fig. 7.
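The arbitration rule above reduces to a simple per-cycle preference, sketched below with hypothetical command names (the patent does not specify the controller's command format):

```python
def arbitrate(gesture_cmd, touch_cmd):
    """Return the command the master controller should execute this cycle.

    gesture_cmd / touch_cmd: the pending command from each channel,
    or None if that channel produced nothing this cycle.
    """
    if gesture_cmd is not None:
        return gesture_cmd  # gestures outrank touch input (emergency path)
    return touch_cmd
```

When both channels fire in the same cycle, the gesture command is executed and the touch command is ignored, matching the priority the text describes.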
In the present invention, the system master controller is a microprocessor of generic architecture. Buffer elements such as springs or rubber should be arranged where the base of the fixed rotating seat contacts the cockpit, to isolate external disturbances. Fig. 8 shows the complete implementation flow of the two interaction modes.
The embodiment merely illustrates the technical idea of the present invention and does not limit its scope of protection; any change made on the basis of this technical scheme in accordance with the technical idea proposed by the present invention falls within the scope of the present invention.

Claims (6)

1. An efficient interactive system based on a closed cockpit, characterized in that: the system comprises a closed cockpit; one camera is installed on each of the front, rear, left, and right outer sides of the closed cockpit, and each camera is mounted on a servo gimbal; one rear projection is installed on each of the front, rear, left, and right inner sides of the closed cockpit, and each rear projection is connected through a data transmission line to the camera on the same side, so as to display the image captured by that camera; a fixed rotating seat is installed at the center of the closed cockpit, pressure sensor arrays are arranged on the seat back and seat cushion of the fixed rotating seat, the user sits on the fixed rotating seat, and gyroscopes are installed on the user's head and torso; a touch screen is arranged directly in front of the fixed rotating seat, and a gesture recognition device is arranged below the touch screen; the focus control terminal of each camera, the control terminal of each servo gimbal, the output of each pressure sensor array, the output of each gyroscope, the output of the touch screen, and the output of the gesture recognition device are respectively connected to the system master controller.

2. The efficient interactive system based on a closed cockpit according to claim 1, characterized in that: the system adopts a somatosensory interaction mode based on sitting-posture correction and a touch interaction mode combined with gesture recognition.

3. The efficient interactive system based on a closed cockpit according to claim 2, characterized in that: in the somatosensory interaction mode based on sitting-posture correction, the data collected by the pressure sensor arrays on the seat back and seat cushion of the fixed rotating seat and by the gyroscopes on the user's head and torso are transmitted in real time to the system master controller; the master controller derives the user's current sitting posture from the pressure-array data and the user's current somatosensory posture from the gyroscope data, and corrects the trigger thresholds of the responses associated with the different somatosensory postures using the current sitting-posture information, thereby obtaining the final current somatosensory posture; the master controller then controls the focus point of each camera and the deflection angle of each servo gimbal according to the user's current somatosensory posture, completing the somatosensory interaction.

4. The efficient interactive system based on a closed cockpit according to claim 3, characterized in that: a 3 × 3 pressure sensor array is arranged on both the seat back and the seat cushion of the fixed rotating seat, and the signals collected by the sensors are processed as follows:

MAX_row = max{row_F_1, row_F_2, row_F_3}
MAX_col = max{col_F_1, col_F_2, col_F_3}
location(x, y) = (row_max, col_max)

where F(i, j) is the signal collected by the pressure sensor in row i, column j of the 3 × 3 array; location(x, y) is the coordinate of the most heavily loaded point of the array; row_max and col_max are the row and column indices corresponding to MAX_row and MAX_col; and {(x, y) | (x, y) ∈ (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2), (3,3)};

the user's current sitting posture is solved from location(x, y) and the three-axis gyroscope signal gyroscope(x, y, z):

state = function_correction[MAX_row-back, MAX_col-back, MAX_row-seat, MAX_col-seat, location(x, y)_back, location(x, y)_seat, gyroscope(x, y, z)]

where MAX_row-back, MAX_col-back, and location(x, y)_back are the MAX_row, MAX_col, and location(x, y) values obtained from the seat-back array, and MAX_row-seat, MAX_col-seat, and location(x, y)_seat are those obtained from the seat-cushion array; function_correction[·] is the sitting-posture recognition function, with state ∈ {front, behind, left, right, idle}, i.e. the user's sitting posture is one of five states (front, rear, left, right, centered), whose corresponding somatosensory postures are leaning forward, leaning back, leaning left, leaning right, and centered;

when the user remains in one sitting posture for a long time, that posture is regarded as the user's habitual sitting posture, and the system master controller raises the trigger threshold of the somatosensory posture corresponding to it.

5. The efficient interactive system based on a closed cockpit according to claim 2, characterized in that: in the touch interaction mode combined with gesture recognition, the user inputs instructions by tapping and sliding a finger on the touch screen while the gesture recognition device identifies the user's gesture instructions, and the system master controller executes gesture instructions at a higher priority than instructions collected from the touch screen.

6. The efficient interactive system based on a closed cockpit according to claim 5, characterized in that: the gesture recognition device is mounted on a one-dimensional servo gimbal whose deflection angle is controlled by the system master controller; the master controller infers the current position of the user's hand from the input information received by the touch screen and deflects the one-dimensional gimbal accordingly, thereby adjusting the detection range of the gesture recognition device.
CN201810869753.4A 2018-08-02 2018-08-02 A kind of high efficiency interactive system based on closed cockpit Pending CN109254649A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810869753.4A CN109254649A (en) 2018-08-02 2018-08-02 A kind of high efficiency interactive system based on closed cockpit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810869753.4A CN109254649A (en) 2018-08-02 2018-08-02 A kind of high efficiency interactive system based on closed cockpit

Publications (1)

Publication Number Publication Date
CN109254649A true CN109254649A (en) 2019-01-22

Family

ID=65049031

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810869753.4A Pending CN109254649A (en) 2018-08-02 2018-08-02 A kind of high efficiency interactive system based on closed cockpit

Country Status (1)

Country Link
CN (1) CN109254649A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130271618A1 (en) * 2012-04-13 2013-10-17 Samsung Electronics Co., Ltd. Camera apparatus and control method thereof
CN103530501A (en) * 2013-09-12 2014-01-22 西安交通大学 Stress aid decision making experimental device and method based on interaction of multiple sensing channels
US20140118270A1 (en) * 2012-10-26 2014-05-01 Qualcomm Incorporated System and method for providing infrared gesture interaction on a display
CN104486543A (en) * 2014-12-09 2015-04-01 北京时代沃林科技发展有限公司 Equipment and method for controlling cloud deck camera by intelligent terminal in touch manner
CN104740869A (en) * 2015-03-26 2015-07-01 北京小小牛创意科技有限公司 True environment integrated and virtuality and reality combined interaction method and system
CN106200952A (en) * 2016-07-04 2016-12-07 歌尔股份有限公司 A kind of method monitoring user behavior data and wearable device
CN106325487A (en) * 2015-07-03 2017-01-11 中兴通讯股份有限公司 Method for achieving sensomotor function, and terminal
US20170024587A1 (en) * 2015-07-24 2017-01-26 Kyocera Corporation Electronic device
CN107232822A (en) * 2016-08-31 2017-10-10 浙江玛拉蒂智能家具科技有限公司 A kind of Intelligent seat based on gesture recognition, intelligent interactive system and method
CN206711354U (en) * 2017-02-27 2017-12-05 安徽大学 Sitting posture correction and eye protection integrated device


Similar Documents

Publication Publication Date Title
WO2020221311A1 (en) Wearable device-based mobile robot control system and control method
CN107221223B (en) A virtual reality aircraft cockpit system with force/haptic feedback
CN108170279B (en) Eye movement and head movement interaction method of head display equipment
CN106527722B (en) Exchange method, system and terminal device in virtual reality
CN102541260B (en) Human-machine interaction control method and application thereof
CN1304931C (en) Head carried stereo vision hand gesture identifying device
WO2015180497A1 (en) Motion collection and feedback method and system based on stereoscopic vision
CN107656613A (en) A kind of man-machine interactive system and its method of work based on the dynamic tracking of eye
CN110039545A (en) A kind of robot remote control system and control method based on wearable device
CN109044651A (en) Method for controlling intelligent wheelchair and system based on natural gesture instruction in circumstances not known
CN110850987A (en) Specific identification control method and device based on two-dimensional intention expressed by human body
CN109976338A (en) A kind of multi-modal quadruped robot man-machine interactive system and method
CN103977539A (en) Cervical vertebra rehabilitation and health care training aiding system
CN113505694A (en) Human-computer interaction method and device based on sight tracking and computer equipment
CN110658742A (en) Wheelchair control system and method for multi-modal cooperative manipulation
CN103207667A (en) Man-machine interaction control method and application thereof
CN107703950A (en) A kind of underwater robot and control method using motion sensing control
CN113160260B (en) Head-eye double-channel intelligent man-machine interaction system and operation method
JP2008018529A (en) Communication robot
Wu et al. Omnidirectional mobile robot control based on mixed reality and semg signals
CN113253851B (en) Immersive flow field visualization man-machine interaction method based on eye movement tracking
CN105975057A (en) Multi-interface interaction method and device
CN203950270U (en) Body sense recognition device and by the man-machine interactive system of its mouse beacon keyboard operation
CN109254649A (en) A kind of high efficiency interactive system based on closed cockpit
CN108062102A (en) A kind of gesture control has the function of the Mobile Robot Teleoperation System Based of obstacle avoidance aiding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190122