CN109254649A - An efficient interaction system based on a closed cockpit - Google Patents

An efficient interaction system based on a closed cockpit

Info

Publication number
CN109254649A
CN109254649A (application CN201810869753.4A)
Authority
CN
China
Prior art keywords
max
user
seat
row
col
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810869753.4A
Other languages
Chinese (zh)
Inventor
俞峰 (Yu Feng)
汤勇明 (Tang Yongming)
郑姚生 (Zheng Yaosheng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University
Priority to CN201810869753.4A
Publication of CN109254649A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01: Indexing scheme relating to G06F3/01
    • G06F 2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an efficient interaction system based on a closed cockpit. The system comprises a closed cockpit with one camera installed on each of the four outer sides (front, rear, left, right), each camera mounted on a servo gimbal; one rear projection on each of the four inner sides, each connected by a data line to the camera facing the same direction; a fixed swivel seat installed at the center of the cockpit interior, with pressure sensor arrays arranged in the seat back and seat cushion and gyroscopes installed on the user's head and trunk; and a touch screen arranged in front of the seat, with a gesture recognition device below it. By integrating a somatosensory interaction mode based on sitting-posture correction with a touch interaction mode combined with gesture recognition, the invention solves the prior-art problems of poorly integrated interactive means, low interaction efficiency, insufficiently intelligent interaction modes, and a lacking sense of immersion.

Description

An efficient interaction system based on a closed cockpit
Technical field
The invention belongs to the field of display and interaction technology, and in particular relates to an efficient interaction system based on a closed cockpit.
Background technique
Cockpit displays are widely used in aviation. The display interface presents many parameters and much flight-state information, while the operating means remain mostly traditional control-stick and push-button interaction; the overall system requires smooth operation, a clear picture, and timely interaction. Display and interaction technology is also widely used in command and control systems, such as aerospace and shipboard command, and is regarded as the "bridge between human and machine". Concretely, the machine provides a large amount of useful information and prompts to the human through display devices, and the human transmits command information to the machine through input devices, thereby realizing human-machine interaction. The field is closely connected with cognitive science, ergonomics, and psychology.
Compared with the demands of current application scenarios, existing interaction technology falls short in three respects:
1. The integration of display and interaction means is low. Most interaction technology still uses traditional display and interaction means, simply integrated into control equipment, without incorporating more advanced and efficient display and interaction means, and without sufficiently considering factors such as interaction efficiency and the convenience and comfort of manipulation that affect user experience and actual effectiveness.
2. Interaction is insufficiently optimized across multiple modes. Even where several interaction modes are integrated, they are merely piled together without deep optimization, so the user's decision-making efficiency declines and the modes cooperate poorly.
3. Auxiliary interaction modes lack intelligence. The interaction modes show too little initiative during the interaction process, and no adaptive human-computer interaction system has been built that can accommodate different application scenarios and different operator roles.
Summary of the invention
To solve the technical problems raised in the above background, the present invention aims to provide an efficient interaction system based on a closed cockpit that realizes efficient interaction in closed-cockpit application scenarios and increases the user's sense of immersion.
To achieve the above technical purpose, the technical solution of the present invention is as follows:
An efficient interaction system based on a closed cockpit comprises a closed cockpit. One camera is installed on each of the four directions (front, rear, left, right) of the outer side of the closed cockpit, and each camera is mounted on a servo gimbal. One rear projection is installed on each of the four directions of the inner side of the closed cockpit; each rear projection is connected by a data line to the camera facing the same direction, so as to display the image acquired by the corresponding camera. A fixed swivel seat is installed at the center of the cockpit interior; pressure sensor arrays are arranged in its seat back and seat cushion, the user sits on the seat, and gyroscopes are installed on the user's head and trunk. A touch screen is arranged in front of the fixed swivel seat, and a gesture recognition device is arranged below the touch screen. The focusing control terminals of the cameras, the control terminals of the servo gimbals, the outputs of the pressure sensor arrays, the outputs of the gyroscopes, the output of the touch screen, and the output of the gesture recognition device are each connected with a system master controller.
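For illustration, the signal flow just described can be summarized as plain data structures handled by the system master controller. This is a minimal sketch; all type and field names are hypothetical and do not come from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class SensorInputs:
    """Signals arriving at the system master controller each cycle."""
    back_pressure: List[List[float]]             # 3 x 3 pressure array in the seat back
    seat_pressure: List[List[float]]             # 3 x 3 pressure array in the seat cushion
    head_gyro: Tuple[float, float, float]        # gyroscope(x, y, z) on the head
    trunk_gyro: Tuple[float, float, float]       # gyroscope(x, y, z) on the trunk
    touch_point: Optional[Tuple[float, float]]   # last tap/slide position, if any
    gesture_command: Optional[str]               # instruction decoded by the gesture device

@dataclass
class ActuatorOutputs:
    """Commands the master controller sends back out."""
    gimbal_angles: List[float]    # deflection per servo gimbal: front, rear, left, right
    focus_points: List[float]     # focus setpoint per camera
```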
Further, the interaction system adopts a somatosensory interaction mode based on sitting-posture correction and a touch interaction mode combined with gesture recognition.
Further, in the somatosensory interaction mode based on sitting-posture correction, the data acquired by the pressure sensor arrays in the seat back and seat cushion of the fixed swivel seat and by the gyroscopes on the user's head and trunk are transmitted in real time to the system master controller. The master controller solves the user's current sitting-posture information from the pressure-array data and the user's current somatosensory posture information from the gyroscope data, and corrects the response threshold triggered by each somatosensory posture according to the current sitting-posture information, thereby obtaining the user's final current somatosensory posture. The master controller then controls the focus point of each camera and the deflection angle of each servo gimbal according to the user's current somatosensory posture, completing the somatosensory interaction.
Further, 3 × 3 pressure sensor arrays are arranged on the seat back and the seat cushion of the fixed swivel seat, and the signal acquired by each pressure sensor is calculated according to the following formulas:
MAX_row = max{row_F_1, row_F_2, row_F_3}
MAX_col = max{col_F_1, col_F_2, col_F_3}
location(x, y) = (row_max, col_max)
In the above formulas, F(i, j) is the signal acquired by the pressure sensor in the i-th row and j-th column of the 3 × 3 array, and row_F_i and col_F_j denote the aggregate pressure of the i-th row and j-th column; location(x, y) is the coordinate of the point of maximum stress on the array, and row_max and col_max are the row and column indices corresponding to MAX_row and MAX_col, with (x, y) ∈ {(1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2), (3,3)};
The user's current sitting posture is solved from location(x, y) and the three-axis gyroscope signal gyroscope(x, y, z):
state = function_correction[MAX_row-back, MAX_col-back, MAX_row-seat, MAX_col-seat, location(x, y)_back, location(x, y)_seat, gyroscope(x, y, z)]

In the above formula, MAX_row-back, MAX_col-back, and location(x, y)_back are the MAX_row, MAX_col, and location(x, y) values obtained from the seat-back pressure sensor array, and MAX_row-seat, MAX_col-seat, and location(x, y)_seat are the corresponding values obtained from the seat-cushion pressure sensor array; function_correction[·] is the sitting-posture recognition function, and state ∈ {front, behind, left, right, idle}, i.e. the user's sitting posture is one of the five states forward, backward, left, right, and centered, whose corresponding somatosensory postures are, in order, leaning forward, leaning back, leaning left, leaning right, and centered;
When the user remains in a certain sitting posture for a long time, that sitting posture is regarded as the user's habitual sitting posture, and the system master controller raises the response trigger threshold of the somatosensory posture corresponding to that sitting posture.
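For illustration, the following sketch computes MAX_row, MAX_col, and location(x, y) from a 3 × 3 pressure array and applies the habit-based threshold correction described above. It assumes row_F_i and col_F_j are row and column pressure sums (consistent with the row/column pressure ratios discussed in the embodiment, though the patent does not define them explicitly); the threshold rule and the boost factor are invented stand-ins for the unpublished function_correction.

```python
import numpy as np

def array_features(F: np.ndarray):
    """Compute MAX_row, MAX_col and location(x, y) for one 3x3 pressure array F.

    Assumption: row_F_i / col_F_j are the row / column pressure sums."""
    row_F = F.sum(axis=1)                    # row_F_1 .. row_F_3
    col_F = F.sum(axis=0)                    # col_F_1 .. col_F_3
    max_row, max_col = row_F.max(), col_F.max()
    location = (int(row_F.argmax()) + 1,     # row_max, 1-indexed as in the patent
                int(col_F.argmax()) + 1)     # col_max
    return max_row, max_col, location

def correct_thresholds(thresholds: dict, habit_state: str, boost: float = 1.2) -> dict:
    """Raise the trigger threshold of the posture the user habitually holds
    (the correction idea of the patent; the boost factor is illustrative)."""
    corrected = dict(thresholds)
    if habit_state in corrected:
        corrected[habit_state] *= boost
    return corrected

def classify_state(gyro_xyz, thresholds) -> str:
    """Toy stand-in for function_correction[...]: pick one of the five states
    from the gyroscope pitch/roll against the (corrected) thresholds."""
    pitch, roll, _ = gyro_xyz
    if pitch > thresholds["front"]:
        return "front"       # leaning forward
    if pitch < -thresholds["behind"]:
        return "behind"      # leaning back
    if roll < -thresholds["left"]:
        return "left"        # leaning left
    if roll > thresholds["right"]:
        return "right"       # leaning right
    return "idle"            # centered
```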
Further, in the touch interaction mode combined with gesture recognition, the user inputs instructions by tapping and sliding fingers on the touch screen while the gesture recognition device simultaneously recognizes the user's gesture instructions, and the system master controller executes gesture instructions at a higher priority than the instructions acquired by the touch screen.
Further, the gesture recognition device is mounted on a one-dimensional servo gimbal whose deflection angle is controlled by the system master controller. The master controller judges the position of the user's hand from the input information received by the touch screen and deflects the one-dimensional gimbal accordingly, thereby adjusting the detection range of the gesture recognition device.
The above technical scheme brings the following beneficial effects:
The present invention combines two interaction modes: the somatosensory interaction mode based on sitting-posture correction and the touch interaction mode combined with gesture recognition. The two modes address different concerns. The somatosensory mode controls the rotation of the servo gimbals under the external cameras and the focus points of the cameras, adjusting the displayed picture of the external environment; the switching speed of the picture follows the user's settings, avoiding changes so fast or so slow that they degrade the experience. The touch mode combined with gesture recognition handles the working state of the closed cockpit's own equipment: the relevant parameters are displayed to the user on the touch screen inside the cockpit, through which the user can also issue commands to the cockpit, with gesture recognition offering a faster and easier interaction channel on top of this. By integrating the two modes, the user can interact well with the outside world from inside the closed cockpit and, on this basis, experience good immersion.
Detailed description of the invention
Fig. 1 is a schematic diagram of the closed cockpit of the present invention;
Fig. 2 is a schematic diagram of the sensor arrangement of the present invention;
Fig. 3 is a flow chart of the somatosensory interaction mode based on sitting-posture correction of the present invention;
Fig. 4 is a schematic diagram of the pressure sensor array arrangement of the present invention;
Fig. 5 is a schematic diagram of the touch interaction system combined with gesture recognition of the present invention;
Fig. 6 is a schematic diagram of the positions of the components of the touch interaction system of the present invention;
Fig. 7 is a flow chart of the touch interaction mode combined with gesture recognition of the present invention;
Fig. 8 is a flow chart of the implementation of the whole system of the present invention.
Specific embodiment
The technical solution of the present invention is described in detail below with reference to the accompanying drawings.
The present invention provides an efficient interaction system based on a closed cockpit. As shown in Fig. 1, a variable-focus camera is placed at each of the front, rear, left, and right ends of the closed cockpit (labels 1, 2, 3, 4 in Fig. 1). Each camera is mounted on a servo gimbal that can be steered by control signals, driving the camera to rotate in different directions to present different scenes. Four rear projections are arranged inside the cockpit (labels 5, 6, 7, 8 in Fig. 1); the image information acquired by the four cameras is transmitted to these rear projections over data lines, conveying external image information to the occupant of the closed cockpit. A fixed swivel seat (only its base is fixed) is arranged at the center of the interior (label 9 in Fig. 1). The invention adopts the somatosensory interaction mode based on sitting-posture correction and the touch interaction mode combined with gesture recognition.
For the somatosensory interaction mode based on sitting-posture correction, a sitting-posture model based on the user's pressure distribution is first established: the pressure data collected by the sensors arranged at each position of the seat enter the system master controller, which computes the user's current sitting posture. Next, gyroscopes distributed at key positions of the user's body acquire the current somatosensory information; to suppress jitter of the gyroscope readings caused by irregular external disturbances, the gyroscope data are passed through a Kalman filter. As shown in Fig. 2, labels 10 and 12 are the pressure sensors arranged in the seat cushion and seat back, and 11 and 13 are the gyroscopes arranged on the user's head and trunk. The master controller processes the somatosensory information together with the current sitting posture and changes the response threshold of the corresponding somatosensory action according to the current posture. For example, if the pressure sensors detect that the user's pressure distribution is concentrated mainly on the front part of the seat (meaning the user habitually sits forward), the controller raises the trigger threshold of the lean-forward posture when judging the user's somatosensory posture; this avoids inaccurate somatosensory responses caused by differences in users' sitting habits. Finally, the controller outputs the user's current somatosensory state for interaction, as shown in Fig. 3.
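The patent names Kalman filtering for the gyroscope data without giving parameters; the sketch below is a generic one-dimensional Kalman filter applied per axis, with all noise variances chosen arbitrarily for illustration.

```python
class Kalman1D:
    """Scalar Kalman filter for one gyroscope axis (illustrative parameters)."""
    def __init__(self, process_var: float = 1e-4, measurement_var: float = 1e-2):
        self.q = process_var       # process noise variance
        self.r = measurement_var   # measurement noise variance
        self.x = 0.0               # state estimate
        self.p = 1.0               # estimate variance

    def update(self, z: float) -> float:
        self.p += self.q                   # predict: uncertainty grows
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct with measurement z
        self.p *= 1.0 - k
        return self.x

# One filter per axis; raw gyroscope samples are smoothed before posture solving.
filters = [Kalman1D() for _ in range(3)]
smoothed = [f.update(z) for f, z in zip(filters, (0.02, -0.01, 0.98))]
```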
In the somatosensory interaction mode based on sitting-posture correction, the current somatosensory state output by the system master controller is used to adjust the focus points of the external cameras and the deflection angles of the servo gimbals, an embodiment that mimics how people interact with the real world. For example, when a person leans toward a particular direction, the gaze focuses on a particular object, which is then presented to the eye more clearly. Mapping this scene into the closed cockpit: when the user's posture leans forward, the camera facing forward is aimed by the servo at its base and focuses moderately according to the lean angle, realizing observation of the object in front.
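A sketch of this mapping is given below: each recognized posture addresses one camera station, and the focus setpoint scales with the lean angle. The station layout, angle clamp, and focus law are illustrative assumptions, not the patent's control law.

```python
# Which camera/gimbal each somatosensory state addresses (assumed layout).
STATE_TO_STATION = {"front": 0, "behind": 1, "left": 2, "right": 3}

def somatosensory_command(state: str, lean_angle_deg: float):
    """Return (station, gimbal_deflection_deg, focus_setpoint), or None when idle."""
    if state == "idle":
        return None
    station = STATE_TO_STATION[state]
    deflection = max(-30.0, min(30.0, lean_angle_deg))   # clamp to assumed gimbal travel
    focus = min(1.0, abs(lean_angle_deg) / 30.0)         # lean harder -> focus more tightly
    return station, deflection, focus
```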
In the present invention, the method of constructing the user's sitting-posture model from the pressure sensor values at each position of the seat is as follows: nine pressure sensors are arranged on each of the seat cushion and seat back of the user's seat, forming 3 × 3 arrays (labels 14 and 15 in Fig. 4). By establishing coordinates and computing the pressure value at each coordinate point, the user's sitting posture can be calculated. For the subsequent computation, the ratio of the pressure of each row and each column of the coordinate system to the total pressure is calculated separately, after which comparison yields the user's sitting-posture state. The calculation formulas are as follows:
MAX_row = max{row_F_1, row_F_2, row_F_3}
MAX_col = max{col_F_1, col_F_2, col_F_3}
location(x, y) = (row_max, col_max)
In the above formulas, F(i, j) is the signal acquired by the pressure sensor in the i-th row and j-th column of the 3 × 3 array, and row_F_i and col_F_j denote the aggregate pressure of the i-th row and j-th column; location(x, y) is the coordinate of the point of maximum stress on the array, and row_max and col_max are the row and column indices corresponding to MAX_row and MAX_col, with (x, y) ∈ {(1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2), (3,3)}.
The user's current sitting posture is solved from location(x, y) and the three-axis gyroscope signal gyroscope(x, y, z):
state = function_correction[MAX_row-back, MAX_col-back, MAX_row-seat, MAX_col-seat, location(x, y)_back, location(x, y)_seat, gyroscope(x, y, z)]

In the above formula, MAX_row-back, MAX_col-back, and location(x, y)_back are the MAX_row, MAX_col, and location(x, y) values obtained from the seat-back pressure sensor array, and MAX_row-seat, MAX_col-seat, and location(x, y)_seat are the corresponding values obtained from the seat-cushion pressure sensor array; function_correction[·] is the sitting-posture recognition function, and state ∈ {front, behind, left, right, idle}, i.e. the user's sitting posture is one of the five states forward, backward, left, right, and centered.
For the touch interaction mode combined with gesture recognition, a touch screen is arranged facing the seat to receive the user's touch input, while a Leap Motion is arranged below the touch screen to receive specific gesture information from the user. Since the touch screen position in this embodiment is relatively fixed but the screen may be fairly large, the user's hand can easily move beyond the Leap Motion's recognition range; given that the user's gestures must fall within the Leap Motion's triangular-pyramid detection volume, a matching one-dimensional servo gimbal is designed to dynamically track the user's gestures and solve this problem. The specific embodiment of the one-dimensional gimbal is as follows: a servo gimbal with one degree of freedom is installed at the base of the Leap Motion, expanding the gesture detection range. The implementation model is shown in Fig. 5, where (a) shows the user's gesture outside the Leap Motion detection range and (b) shows the gesture brought into the detection range by multi-step gimbal adjustment; label 16 denotes the effective gesture recognition range of the Leap Motion, 17 the touch screen, 18 the Leap Motion, and 19 the one-dimensional servo gimbal. The rotation angle of the servo gimbal is provided by the system master controller, which judges the position of the user's hand from the touch information received by the touch screen and, after calculation, outputs the corresponding rotation angle to the gimbal, thereby realizing dynamic tracking of the user's gestures by the Leap Motion. Fig. 6 shows the installation positions of the components of the touch interaction system: label 20 is the one-dimensional servo gimbal, 21 the Leap Motion mounted on the gimbal, 22 the touch screen, and 23 the effective range of the Leap Motion.
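The tracking rule can be sketched as follows: the horizontal coordinate of the last touch is mapped to a servo angle so that the Leap Motion's detection cone stays under the user's hand. The screen width, mounting geometry, and angle range are assumed values.

```python
import math

SCREEN_WIDTH_MM = 600.0    # assumed touch screen width
SENSOR_OFFSET_MM = 150.0   # assumed distance from the Leap Motion to the screen plane
MAX_DEFLECTION_DEG = 45.0  # assumed travel of the one-dimensional servo gimbal

def gimbal_angle_for_touch(touch_x_mm: float) -> float:
    """Aim the 1-DOF gimbal at the horizontal position of the last touch.

    The hand is assumed to hover near where it last touched, so pointing the
    sensor axis there keeps the gesture inside the detection cone."""
    dx = touch_x_mm - SCREEN_WIDTH_MM / 2.0             # offset from screen center
    angle = math.degrees(math.atan2(dx, SENSOR_OFFSET_MM))
    return max(-MAX_DEFLECTION_DEG, min(MAX_DEFLECTION_DEG, angle))

# A touch near the right edge computes to about +62 degrees, clamped to +45.
print(gimbal_angle_for_touch(580.0))
```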
In the touch interaction mode combined with gesture recognition, gesture recognition has higher priority than touch recognition, and the system master controller considers the result of gesture recognition first. This design mainly accounts for emergencies in which the user has no time to perform a series of touch operations to handle an urgent event, so using gestures on top of the touch interface improves interaction efficiency. The program flow is shown in Fig. 7.
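A minimal sketch of this arbitration, assuming commands arrive as tagged events in a queue: the two-level priority (gesture before touch) is from the patent, while the queue mechanics are illustrative.

```python
import heapq

GESTURE, TOUCH = 0, 1  # lower value = higher priority

class CommandArbiter:
    """Drains pending commands so gesture instructions preempt touch ones."""
    def __init__(self):
        self._queue = []
        self._seq = 0  # preserves arrival order within one priority level

    def submit(self, source: int, command: str) -> None:
        heapq.heappush(self._queue, (source, self._seq, command))
        self._seq += 1

    def next_command(self):
        if self._queue:
            _, _, command = heapq.heappop(self._queue)
            return command
        return None

arbiter = CommandArbiter()
arbiter.submit(TOUCH, "open_status_panel")
arbiter.submit(GESTURE, "emergency_stop")
print(arbiter.next_command())  # -> "emergency_stop": the gesture runs first
```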
In the present invention, the system master controller is a general-purpose microprocessor. Buffer elements such as springs or rubber should be arranged where the fixed base of the swivel seat contacts the cockpit floor, so as to isolate external disturbances. Fig. 8 shows the complete implementation flow of the two interaction modes.
The embodiment merely illustrates the technical idea of the present invention and does not limit its scope of protection; any change made on the basis of the technical scheme according to the technical idea proposed by the present invention falls within the scope of protection of the present invention.

Claims (6)

1. An efficient interaction system based on a closed cockpit, characterized in that: the interaction system comprises a closed cockpit; one camera is installed on each of the four directions (front, rear, left, right) of the outer side of the closed cockpit, each camera being mounted on a servo gimbal; one rear projection is installed on each of the four directions of the inner side of the closed cockpit, each rear projection being connected by a data line to the aforementioned camera facing the same direction so as to display the image acquired by the corresponding camera; a fixed swivel seat is installed at the center of the interior of the closed cockpit, pressure sensor arrays are arranged in the seat back and seat cushion of the fixed swivel seat, the user sits on the fixed swivel seat, and gyroscopes are installed on the user's head and trunk; a touch screen is arranged in front of the fixed swivel seat, and a gesture recognition device is arranged below the touch screen; the focusing control terminals of the cameras, the control terminals of the servo gimbals, the outputs of the pressure sensor arrays, the outputs of the gyroscopes, the output of the touch screen, and the output of the gesture recognition device are each connected with a system master controller.
2. The efficient interaction system based on a closed cockpit according to claim 1, characterized in that: the interaction system adopts a somatosensory interaction mode based on sitting-posture correction and a touch interaction mode combined with gesture recognition.
3. The efficient interaction system based on a closed cockpit according to claim 2, characterized in that: in the somatosensory interaction mode based on sitting-posture correction, the data acquired by the pressure sensor arrays in the seat back and seat cushion of the fixed swivel seat and by the gyroscopes on the user's head and trunk are transmitted in real time to the system master controller; the system master controller solves the user's current sitting-posture information from the data acquired by the pressure sensor arrays, solves the user's current somatosensory posture information from the data acquired by the gyroscopes, and corrects the response thresholds triggered by the different somatosensory postures according to the user's current sitting-posture information, thereby obtaining the user's final current somatosensory posture; the system master controller controls the focus point of each camera and the deflection angle of each servo gimbal according to the user's current somatosensory posture to complete the somatosensory interaction.
4. The efficient interaction system based on a closed cockpit according to claim 3, characterized in that: 3 × 3 pressure sensor arrays are arranged on the seat back and the seat cushion of the fixed swivel seat, and the signal acquired by each pressure sensor is calculated according to the following formulas:
MAX_row = max{row_F_1, row_F_2, row_F_3}
MAX_col = max{col_F_1, col_F_2, col_F_3}
location(x, y) = (row_max, col_max)
In the above formulas, F(i, j) is the signal acquired by the pressure sensor in the i-th row and j-th column of the 3 × 3 array, and row_F_i and col_F_j denote the aggregate pressure of the i-th row and j-th column; location(x, y) is the coordinate of the point of maximum stress on the array, and row_max and col_max are the row and column indices corresponding to MAX_row and MAX_col, with (x, y) ∈ {(1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2), (3,3)};
The user's current sitting posture is solved from location(x, y) and the three-axis gyroscope signal gyroscope(x, y, z):
state = function_correction[MAX_row-back, MAX_col-back, MAX_row-seat, MAX_col-seat, location(x, y)_back, location(x, y)_seat, gyroscope(x, y, z)]
In the above formula, MAX_row-back, MAX_col-back, and location(x, y)_back are the MAX_row, MAX_col, and location(x, y) values obtained from the seat-back pressure sensor array, and MAX_row-seat, MAX_col-seat, and location(x, y)_seat are the corresponding values obtained from the seat-cushion pressure sensor array; function_correction[·] is the sitting-posture recognition function, and state ∈ {front, behind, left, right, idle}, i.e. the user's sitting posture is one of the five states forward, backward, left, right, and centered, whose corresponding somatosensory postures are, in order, leaning forward, leaning back, leaning left, leaning right, and centered;
When the user remains in a certain sitting posture for a long time, that sitting posture is regarded as the user's habitual sitting posture, and the system master controller raises the response trigger threshold of the somatosensory posture corresponding to that sitting posture.
5. The efficient interaction system based on a closed cockpit according to claim 2, characterized in that: in the touch interaction mode combined with gesture recognition, the user inputs instructions by tapping and sliding fingers on the touch screen while the gesture recognition device recognizes the user's gesture instructions, and the system master controller executes gesture instructions at a higher priority than the instructions acquired by the touch screen.
6. The efficient interaction system based on a closed cockpit according to claim 5, characterized in that: the gesture recognition device is mounted on a one-dimensional servo gimbal whose deflection angle is controlled by the system master controller; the system master controller judges the position of the user's hand from the input information received by the touch screen and deflects the one-dimensional servo gimbal accordingly, thereby adjusting the detection range of the gesture recognition device.
CN201810869753.4A 2018-08-02 2018-08-02 An efficient interaction system based on a closed cockpit Pending CN109254649A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810869753.4A CN109254649A (en) An efficient interaction system based on a closed cockpit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810869753.4A CN109254649A (en) An efficient interaction system based on a closed cockpit

Publications (1)

Publication Number Publication Date
CN109254649A (en) 2019-01-22

Family

ID=65049031

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810869753.4A Pending CN109254649A (en) An efficient interaction system based on a closed cockpit

Country Status (1)

Country Link
CN (1) CN109254649A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130271618A1 (en) * 2012-04-13 2013-10-17 Samsung Electronics Co., Ltd. Camera apparatus and control method thereof
CN103530501A (en) * 2013-09-12 2014-01-22 西安交通大学 Stress aid decision making experimental device and method based on interaction of multiple sensing channels
US20140118270A1 (en) * 2012-10-26 2014-05-01 Qualcomm Incorporated System and method for providing infrared gesture interaction on a display
CN104486543A (en) * 2014-12-09 2015-04-01 北京时代沃林科技发展有限公司 Equipment and method for controlling cloud deck camera by intelligent terminal in touch manner
CN104740869A (en) * 2015-03-26 2015-07-01 北京小小牛创意科技有限公司 True environment integrated and virtuality and reality combined interaction method and system
CN106200952A (en) * 2016-07-04 2016-12-07 歌尔股份有限公司 A kind of method monitoring user behavior data and wearable device
CN106325487A (en) * 2015-07-03 2017-01-11 中兴通讯股份有限公司 Method for achieving sensomotor function, and terminal
US20170024587A1 (en) * 2015-07-24 2017-01-26 Kyocera Corporation Electronic device
CN107232822A (en) * 2016-08-31 2017-10-10 浙江玛拉蒂智能家具科技有限公司 A kind of Intelligent seat based on gesture recognition, intelligent interactive system and method
CN206711354U (en) * 2017-02-27 2017-12-05 安徽大学 Sitting posture correction and eye protection integrated device


Similar Documents

Publication Publication Date Title
CN106527722B (en) Exchange method, system and terminal device in virtual reality
CN108170279B (en) Eye movement and head movement interaction method of head display equipment
CN107221223B (en) Virtual reality cockpit system with force/tactile feedback
CN1304931C (en) Head carried stereo vision hand gesture identifying device
CN105224069B (en) A kind of augmented reality dummy keyboard input method and the device using this method
CN102253713B (en) Towards 3 D stereoscopic image display system
WO2020221311A1 (en) Wearable device-based mobile robot control system and control method
CN109044651A (en) Method for controlling intelligent wheelchair and system based on natural gesture instruction in circumstances not known
WO2007053116A1 (en) Virtual interface system
US11036296B2 (en) Electronic device and control method thereof
CN109976338A (en) A kind of multi-modal quadruped robot man-machine interactive system and method
CN107103309A (en) A kind of sitting posture of student detection and correcting system based on image recognition
CN107885124A (en) Brain eye cooperative control method and system in a kind of augmented reality environment
WO2012106978A1 (en) Method for controlling man-machine interaction and application thereof
CN106377228A (en) Monitoring and hierarchical-control method for state of unmanned aerial vehicle operator based on Kinect
CN103207667A (en) Man-machine interaction control method and application thereof
Song et al. Detection of movements of head and mouth to provide computer access for disabled
US20210117663A1 (en) Control apparatus, information processing system, control method, and program
CN108052901B (en) Binocular-based gesture recognition intelligent unmanned aerial vehicle remote control method
Corradini et al. Multimodal speech-gesture interface for handfree painting on a virtual paper using partial recurrent neural networks as gesture recognizer
Vasisht et al. Human computer interaction based eye controlled mouse
US10444831B2 (en) User-input apparatus, method and program for user-input
CN104765454A (en) Human muscle movement perception based menu selection method for human-computer interaction interface
CN109254649A (en) A kind of high efficiency interactive system based on closed cockpit
CN109298710A (en) Double-wheel self-balancing car owner based on human-computer interaction is dynamic to follow composite control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190122