CN201698326U - Interactive image positioning action recognition system - Google Patents

Interactive image positioning action recognition system

Info

Publication number
CN201698326U
CN201698326U · CN201020120281U
Authority
CN
China
Prior art keywords
interactive
signal input
unit
action
input end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 201020120281
Other languages
Chinese (zh)
Inventor
郝宏贤 (Hao Hongxian)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN 201020120281 priority Critical patent/CN201698326U/en
Application granted granted Critical
Publication of CN201698326U publication Critical patent/CN201698326U/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The utility model relates to an interactive image positioning action recognition system, which comprises an interactive first signal input end, an interactive second signal input end, an image positioning auxiliary end, an information receiving and processing end and a display end. The interactive first signal input end is connected with the interactive second signal input end and cooperates with the image positioning auxiliary end; the interactive second signal input end is connected with the information receiving and processing end, which in turn is connected with the display end. With this technical scheme, a player no longer needs to sit on a fixed seat while playing a game: the game is controlled through corresponding actions such as waving the arms, the system recognizes the operator's actions and feeds the recognition information into the game for corresponding control, and the player therefore exercises while playing. A round of play typically leaves the player sweating, so the system achieves the purpose of combining game entertainment with physical exercise.

Description

Interactive image positioning action recognition system
Technical field
The utility model relates to an action recognition system, and in particular to an action recognition system that performs interactive image positioning by means of motion-sensing remote control. It can be applied to the fields of gaming and device control.
Background art
At present, computers and televisions are conventionally controlled with a keyboard, a mouse or a television remote controller. Computer and television games are likewise controlled with an ordinary handle, game commands being entered through the buttons on the handle.
With this control mode the operator usually sits in front of the computer or television and operates the game by continuously pressing the operating keys on the handle with the fingers. The player is fixed on a seat for long periods with the eyes fixed on the display screen, which easily causes back and waist pain as well as myopia. While enjoying the excitement the game brings, players therefore often do serious harm to their own health, and the harm is particularly great for teenagers who are still in the period of physical development.
Content of the utility model
In order to solve the problems of the prior art, namely that the operator sits in front of the computer or television and operates the game by continuously pressing the operating keys on the handle, so that the player is fixed on a seat for long periods with the eyes fixed on the display screen, suffers back and waist pain and myopia, and, in the case of teenagers in the period of physical development, sustains physical harm, the utility model provides an interactive image positioning action recognition system. Applied to the fields of gaming and device control, the system effectively solves these problems of the prior art.
The technical scheme adopted by the utility model to solve the problems of the prior art is to provide an interactive image positioning action recognition system which comprises: an interactive first signal input end, an interactive second signal input end, an image positioning auxiliary end, an information receiving and processing end and a display end, wherein the interactive first signal input end is connected with the interactive second signal input end and cooperates with the image positioning auxiliary end, the interactive second signal input end is connected with the information receiving and processing end, and the information receiving and processing end is connected with the display end.
According to a preferred technical scheme of the utility model, the interactive first signal input end is an auxiliary control handle comprising: a first action sensing unit, a first control unit, a first action information transmitting unit and a first power supply unit, wherein the first action sensing unit is connected with the first control unit, the first control unit is connected with the first action information transmitting unit, and the first power supply unit is connected with the first control unit.
According to a preferred technical scheme of the utility model, the interactive second signal input end is a main control handle comprising: a first action information receiving unit, a second action sensing unit, a second control unit, a second power supply unit, an image positioning receiving unit and a data transmission unit, wherein the first action information receiving unit, the second action sensing unit and the image positioning receiving unit are each connected with the second control unit, and the second control unit is connected with the second power supply unit and the data transmission unit.
According to a preferred technical scheme of the utility model, the interactive first signal input end is connected with the interactive second signal input end through the first action information transmitting unit and the first action information receiving unit, and the connection is wired or wireless.
According to a preferred technical scheme of the utility model, the image positioning auxiliary end comprises an image positioning auxiliary reference unit and a third power supply unit, the image positioning auxiliary reference unit being connected with the third power supply unit.
According to a preferred technical scheme of the utility model, the image positioning auxiliary reference unit is an infrared image positioning auxiliary reference unit.
According to a preferred technical scheme of the utility model, the information receiving and processing end comprises a data receiving unit and a data synthesis processing unit, the data receiving unit being connected with the data synthesis processing unit.
According to a preferred technical scheme of the utility model, the information receiving and processing end comprises an information receiving end and a data processing end, which are mutually independent and connected through a USB interface.
According to a preferred technical scheme of the utility model, the interactive second signal input end is connected with the information receiving and processing end through the data transmission unit and the data receiving unit.
According to a preferred technical scheme of the utility model, the display end is the display of a television or a computer.
With the technical scheme of the utility model, the player no longer needs to sit on a fixed seat while playing; instead the game is controlled through corresponding actions such as waving the arms. The system recognizes the operator's actions and feeds them into the game for corresponding control, so the player exercises while playing. A round of play typically leaves the player sweating, and the purpose of exercising while enjoying game entertainment is thus achieved.
Description of drawings
Fig. 1 is a first structural schematic diagram of the interactive image positioning action recognition system of the utility model;
Fig. 2 is a second structural schematic diagram of the interactive image positioning action recognition system of the utility model;
Fig. 3 is a schematic diagram of the module structure of the interactive first signal input end in the interactive image positioning action recognition system of the utility model;
Fig. 4 is a schematic diagram of the module structure of the interactive second signal input end in the interactive image positioning action recognition system of the utility model;
Fig. 5 is a schematic diagram of the module structure of the image positioning auxiliary end in the interactive image positioning action recognition system of the utility model;
Fig. 6 is a schematic diagram of the module structure of the information receiving and processing end in the interactive image positioning action recognition system of the utility model.
Detailed description of the embodiments
The technical scheme of the utility model is described in detail below with reference to the accompanying drawings.
In the figures: interactive first signal input end 101, interactive second signal input end 102, image positioning auxiliary end 103, information receiving and processing end 104, information receiving end 1041, data processing end 1042, display end 105, first action sensing unit 301, first control unit 302, first action information transmitting unit 303, first power supply unit 304, first action information receiving unit 401, second action sensing unit 402, second control unit 403, second power supply unit 404, image positioning receiving unit 405, data transmission unit 406, image positioning auxiliary reference unit 501, third power supply unit 502, data receiving unit 601, data synthesis processing unit 602.
Referring to Fig. 1, the first structural schematic diagram of the interactive image positioning action recognition system of the utility model, the system comprises: an interactive first signal input end 101, an interactive second signal input end 102, an image positioning auxiliary end 103, an information receiving and processing end 104 and a display end 105, wherein the interactive first signal input end 101 is connected with the interactive second signal input end 102 and cooperates with the image positioning auxiliary end 103, the interactive second signal input end 102 is connected with the information receiving and processing end 104, and the information receiving and processing end 104 is connected with the display end 105.
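For readers who find a code view easier to follow, the connection topology described above can be summarised as a small component graph. This is only an illustrative sketch and not part of the utility model; all identifiers are hypothetical Python names chosen to mirror the reference numerals.

```python
# Illustrative sketch of the system topology of Fig. 1 (hypothetical names; not part of the patent).
COMPONENTS = {
    101: "interactive first signal input end (auxiliary handle)",
    102: "interactive second signal input end (main handle)",
    103: "image positioning auxiliary end (infrared reference)",
    104: "information receiving and processing end",
    105: "display end (television or computer display)",
}

# Directed links: which end feeds which, as described for Fig. 1.
LINKS = [
    (101, 102),  # auxiliary handle -> main handle (wired or wireless)
    (103, 102),  # infrared reference is sensed by the main handle's positioning receiver
    (102, 104),  # main handle -> information receiving and processing end
    (104, 105),  # synthesized control commands drive the display end
]

if __name__ == "__main__":
    for src, dst in LINKS:
        print(f"{COMPONENTS[src]}  ->  {COMPONENTS[dst]}")
```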
In Fig. 1, the display end 105 is the display screen of a television, and the information receiving and processing end 104 is an independent data processing terminal which receives the input signals of the interactive first signal input end 101 and the interactive second signal input end 102 and performs data synthesis processing on them.
Referring to Fig. 2, the second structural schematic diagram of the interactive image positioning action recognition system of the utility model, the system likewise comprises: an interactive first signal input end 101, an interactive second signal input end 102, an image positioning auxiliary end 103, an information receiving and processing end 104 and a display end 105, wherein the interactive first signal input end 101 is connected with the interactive second signal input end 102 and cooperates with the image positioning auxiliary end 103, the interactive second signal input end 102 is connected with the information receiving and processing end 104, and the information receiving and processing end 104 is connected with the display end 105.
In Fig. 2, the information receiving and processing end 104 comprises an information receiving end 1041 and a data processing end 1042, which are mutually independent and connected through a USB interface. The information receiving end 1041 receives the input signals of the interactive first signal input end 101 and the interactive second signal input end 102, and the data processing end 1042 performs data synthesis processing on the signals forwarded by the information receiving end 1041. The data processing end 1042 is the host of a computer, and the display end 105 is the display of the computer.
In Fig. 1 and Fig. 2, two infrared lamps are installed on the image positioning auxiliary end 103. The infrared lamps emit infrared signals used to locate the position of the second signal input end 102, i.e. the main control handle, relative to the display end 105.
Referring to Fig. 3, the schematic diagram of the module structure of the interactive first signal input end in the interactive image positioning action recognition system of the utility model: in the technical scheme of the utility model the interactive first signal input end 101 is configured as an auxiliary control handle comprising a first action sensing unit 301, a first control unit 302, a first action information transmitting unit 303 and a first power supply unit 304, wherein the first action sensing unit 301 is connected with the first control unit 302, the first control unit 302 is connected with and controls the first action information transmitting unit 303, and the first power supply unit 304 is connected with and powers the first control unit 302.
The first action sensing unit 301 senses the operator's hand movements and calculates motion measurement data on direction, speed and acceleration; in practice this is realised by a three-axis acceleration chip installed in the first action sensing unit 301.
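As a rough illustration of how speed and direction can be derived from a three-axis acceleration chip, the sketch below numerically integrates raw acceleration samples. It is only one assumed signal chain; the 100 Hz sample rate, axes and function names are hypothetical and are not specified by the utility model.

```python
# Minimal sketch: deriving speed and direction from three-axis acceleration samples.
# The names and the 100 Hz sample rate are illustrative assumptions only.
from dataclasses import dataclass
from math import sqrt

SAMPLE_PERIOD_S = 0.01  # assume the acceleration chip is read at 100 Hz

@dataclass
class MotionState:
    vx: float = 0.0
    vy: float = 0.0
    vz: float = 0.0

    def update(self, ax: float, ay: float, az: float) -> None:
        """Integrate one acceleration sample (m/s^2) into velocity (m/s)."""
        self.vx += ax * SAMPLE_PERIOD_S
        self.vy += ay * SAMPLE_PERIOD_S
        self.vz += az * SAMPLE_PERIOD_S

    def speed(self) -> float:
        return sqrt(self.vx ** 2 + self.vy ** 2 + self.vz ** 2)

    def direction(self) -> tuple:
        """Unit vector of motion, or zeros while the handle is at rest."""
        s = self.speed()
        return (0.0, 0.0, 0.0) if s == 0.0 else (self.vx / s, self.vy / s, self.vz / s)

if __name__ == "__main__":
    state = MotionState()
    for sample in [(0.0, 9.0, 0.0)] * 20:  # a short upward swing lasting 0.2 s
        state.update(*sample)
    print("speed m/s:", round(state.speed(), 3), "direction:", state.direction())
```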
Referring to Fig. 4, the schematic diagram of the module structure of the interactive second signal input end in the interactive image positioning action recognition system of the utility model: in the technical scheme of the utility model the interactive second signal input end 102 is configured as a main control handle comprising a first action information receiving unit 401, a second action sensing unit 402, a second control unit 403, a second power supply unit 404, an image positioning receiving unit 405 and a data transmission unit 406. The first action information receiving unit 401, the second action sensing unit 402 and the image positioning receiving unit 405 are each connected with the second control unit 403, and the second control unit 403 is connected with the second power supply unit 404 and the data transmission unit 406.
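The second control unit essentially merges three input paths, namely the auxiliary handle's forwarded data, the main handle's own motion sensing and the positioning receiver, into one outgoing frame for the data transmission unit. The sketch below shows one way such aggregation could look; the frame layout and names are hypothetical and serve only to illustrate the described data flow.

```python
# Hypothetical sketch of the main handle's aggregation step (frame layout invented for illustration).
from dataclasses import dataclass

@dataclass
class HandleMotion:
    """Motion measurement data produced by an action sensing unit."""
    direction: tuple
    speed: float
    acceleration: tuple

@dataclass
class OutgoingFrame:
    """What the data transmission unit 406 would forward to the receiving end."""
    auxiliary: HandleMotion   # forwarded via the first action information receiving unit 401
    main: HandleMotion        # from the second action sensing unit 402
    pointer_xy: tuple         # from the image positioning receiving unit 405

def aggregate(aux: HandleMotion, main: HandleMotion, pointer_xy: tuple) -> OutgoingFrame:
    """Combine the three input paths handled by the second control unit 403 into one frame."""
    return OutgoingFrame(auxiliary=aux, main=main, pointer_xy=pointer_xy)
```

In the hardware this role belongs to the second control unit 403; the sketch is purely a data-flow illustration.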
The interactive first signal input end 101 is connected with the interactive second signal input end 102 through the first action information transmitting unit 303 and the first action information receiving unit 401; the connection may be wired or wireless. In the preferred technical scheme of the utility model a wired connection between the interactive first signal input end 101 and the interactive second signal input end 102 is used, as shown in Fig. 1 and Fig. 2 of the accompanying drawings.
The second action sensing unit 402 likewise senses the operator's hand movements and calculates motion measurement data on direction, speed and acceleration; in practice this is realised by a three-axis acceleration chip installed in the second action sensing unit 402.
Referring to Fig. 5, the schematic diagram of the module structure of the image positioning auxiliary end in the interactive image positioning action recognition system of the utility model: the image positioning auxiliary end 103 comprises an image positioning auxiliary reference unit 501 and a third power supply unit 502, the image positioning auxiliary reference unit 501 being connected with and powered by the third power supply unit 502.
In the preferred technical scheme of the utility model, the image positioning auxiliary reference unit 501 is an infrared image positioning auxiliary reference unit 501: it emits infrared signals through infrared lamps, which are used to locate the position of the second signal input end 102, i.e. the main control handle, relative to the display end 105, thereby realising the determination and control of relative position in the game.
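A common way to turn two fixed infrared lamps into a pointing position, used by comparable motion-sensing controllers, is to track where the two lamps appear on the handle's infrared sensor and map their midpoint back to screen coordinates. The sketch below illustrates that idea; the sensor resolution, the linear mapping and all names are assumptions made for illustration, since the patent does not specify the algorithm.

```python
# Hypothetical sketch: estimating the main handle's pointing position from the two
# infrared lamps of the image positioning auxiliary end.  The 1024x768 IR sensor
# resolution and the linear mapping are illustrative assumptions only.
SENSOR_W, SENSOR_H = 1024, 768    # resolution of the handle's IR receiver (assumed)
SCREEN_W, SCREEN_H = 1920, 1080   # resolution of the display end (assumed)

def pointer_position(spot_a: tuple, spot_b: tuple) -> tuple:
    """Map the midpoint of the two IR spots seen by the handle to screen coordinates.

    spot_a, spot_b: (x, y) pixel positions of the two lamps on the handle's IR sensor.
    Moving the handle left makes the spots drift right on its sensor, hence the flip.
    """
    mid_x = (spot_a[0] + spot_b[0]) / 2.0
    mid_y = (spot_a[1] + spot_b[1]) / 2.0
    screen_x = (1.0 - mid_x / SENSOR_W) * SCREEN_W
    screen_y = (mid_y / SENSOR_H) * SCREEN_H
    return (round(screen_x), round(screen_y))

if __name__ == "__main__":
    # Handle aimed slightly left of centre: the spots appear right of the sensor's middle.
    print(pointer_position((600, 380), (660, 382)))
```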
Referring to Fig. 6, the schematic diagram of the module structure of the information receiving and processing end in the interactive image positioning action recognition system of the utility model: the information receiving and processing end 104 comprises a data receiving unit 601 and a data synthesis processing unit 602, the data receiving unit 601 being connected with the data synthesis processing unit 602. As required, the information receiving and processing end 104 may be designed as an independent data processing terminal, or as a mutually independent information receiving end and data processing end connected through a USB interface.
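To make the split between the data receiving unit and the data synthesis processing unit concrete, the sketch below unpacks one incoming byte frame into its motion fields and reduces it to a single game control command. The frame format, byte order, thresholds and names are assumptions made purely for illustration; the utility model does not define a transmission protocol.

```python
# Hypothetical sketch of the information receiving and processing end: a receiving step
# (unpack one frame from the data transmission unit) followed by a synthesis step
# (turn the motion data into a coarse control command).  The 16-byte little-endian
# frame layout is an illustrative assumption, not defined by the patent.
import struct

FRAME_FORMAT = "<hhhhhhhh"  # 8 signed 16-bit fields: aux ax/ay/az, main ax/ay/az, pointer x/y

def receive(frame: bytes) -> dict:
    """Data receiving unit 601: unpack one raw frame into named fields."""
    aux_ax, aux_ay, aux_az, ax, ay, az, px, py = struct.unpack(FRAME_FORMAT, frame)
    return {"aux": (aux_ax, aux_ay, aux_az), "main": (ax, ay, az), "pointer": (px, py)}

def synthesize(sample: dict) -> str:
    """Data synthesis processing unit 602: reduce a sample to a coarse game command."""
    ax, ay, az = sample["main"]
    if ay > 500:
        return f"SWING_UP at {sample['pointer']}"
    if ay < -500:
        return f"SWING_DOWN at {sample['pointer']}"
    return f"POINT at {sample['pointer']}"

if __name__ == "__main__":
    raw = struct.pack(FRAME_FORMAT, 0, 0, 0, 12, 900, -30, 640, 360)
    print(synthesize(receive(raw)))
```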
In use of the interactive image positioning action recognition system of the utility model, the operator holds the interactive first signal input end 101 (the auxiliary handle) in one hand and the interactive second signal input end 102 (the main handle) in the other, and swings the arms to input control signals according to the game state shown on the display end 105. Because the interactive first signal input end 101 is connected with the interactive second signal input end 102, the interactive second signal input end 102 can perceive the control signals of the interactive first signal input end 101. The interactive second signal input end 102 determines its position relative to the display end 105 with the aid of the image positioning auxiliary end 103, and sends the corresponding motion measurement data on position, direction, speed and acceleration to the information receiving and processing end 104 for data synthesis processing; the resulting control commands are then fed to the display end 105 to control the game.
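Putting the pieces together, the runtime behaviour described above amounts to a read-locate-send-synthesize loop. The outline below sketches only that control flow; every function name is a hypothetical stand-in for a hardware or driver call that the patent leaves unspecified.

```python
# Hypothetical outline of the runtime loop described above (illustrative stand-ins only).
import time

def read_auxiliary_handle():        return (0.0, 0.0, 0.0)      # first signal input end 101
def read_main_handle():             return (0.0, 9.0, 0.0)      # second signal input end 102
def locate_relative_to_display():   return (640, 360)           # via positioning auxiliary end 103
def synthesize_command(aux, main, pos):  return ("POINT", pos)  # info receiving/processing end 104
def send_to_display(cmd):           print("display end 105 executes:", cmd)

def run(iterations: int = 3, period_s: float = 0.05) -> None:
    for _ in range(iterations):
        aux = read_auxiliary_handle()         # forwarded through the main handle
        main = read_main_handle()             # the main handle's own motion sensing
        pos = locate_relative_to_display()    # infrared-based position relative to the display
        send_to_display(synthesize_command(aux, main, pos))
        time.sleep(period_s)

if __name__ == "__main__":
    run()
```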
With the technical scheme of the utility model, the player no longer needs to sit on a fixed seat while playing; instead the game is controlled through corresponding actions such as waving the arms. The system recognizes the operator's actions and feeds them into the game for corresponding control, so the player exercises while playing. A round of play typically leaves the player sweating, and the purpose of exercising while enjoying game entertainment is thus achieved.
The above is a further detailed description of the utility model in conjunction with specific preferred technical schemes, and it shall not be concluded that the specific implementation of the utility model is limited to these descriptions. For a person of ordinary skill in the technical field of the utility model, several simple deductions or substitutions may be made without departing from the concept of the utility model, and all of them shall be regarded as falling within the protection scope of the utility model.

Claims (10)

1. An interactive image positioning action recognition system, characterized in that it comprises: an interactive first signal input end (101), an interactive second signal input end (102), an image positioning auxiliary end (103), an information receiving and processing end (104) and a display end (105), wherein the interactive first signal input end (101) is connected with the interactive second signal input end (102) and cooperates with the image positioning auxiliary end (103), the interactive second signal input end (102) is connected with the information receiving and processing end (104), and the information receiving and processing end (104) is connected with the display end (105).
2. The interactive image positioning action recognition system according to claim 1, characterized in that the interactive first signal input end (101) is an auxiliary control handle comprising: a first action sensing unit (301), a first control unit (302), a first action information transmitting unit (303) and a first power supply unit (304), wherein the first action sensing unit (301) is connected with the first control unit (302), the first control unit (302) is connected with the first action information transmitting unit (303), and the first power supply unit (304) is connected with the first control unit (302).
3. The interactive image positioning action recognition system according to claim 1, characterized in that the interactive second signal input end (102) is a main control handle comprising: a first action information receiving unit (401), a second action sensing unit (402), a second control unit (403), a second power supply unit (404), an image positioning receiving unit (405) and a data transmission unit (406), wherein the first action information receiving unit (401), the second action sensing unit (402) and the image positioning receiving unit (405) are each connected with the second control unit (403), and the second control unit (403) is connected with the second power supply unit (404) and the data transmission unit (406).
4. The interactive image positioning action recognition system according to claim 2 or 3, characterized in that the interactive first signal input end (101) is connected with the interactive second signal input end (102) through the first action information transmitting unit (303) and the first action information receiving unit (401), and the connection is wired or wireless.
5. The interactive image positioning action recognition system according to claim 1, characterized in that the image positioning auxiliary end (103) comprises: an image positioning auxiliary reference unit (501) and a third power supply unit (502), the image positioning auxiliary reference unit (501) being connected with the third power supply unit (502).
6. The interactive image positioning action recognition system according to claim 5, characterized in that the image positioning auxiliary reference unit (501) is an infrared image positioning auxiliary reference unit (501).
7. The interactive image positioning action recognition system according to claim 1, characterized in that the information receiving and processing end (104) comprises: a data receiving unit (601) and a data synthesis processing unit (602), the data receiving unit (601) being connected with the data synthesis processing unit (602).
8. The interactive image positioning action recognition system according to claim 1 or 7, characterized in that the information receiving and processing end (104) comprises an information receiving end (1041) and a data processing end (1042), which are mutually independent and connected through a USB interface.
9. The interactive image positioning action recognition system according to claim 1, characterized in that the interactive second signal input end (102) is connected with the information receiving and processing end (104) through the data transmission unit (406) and the data receiving unit (601).
10. The interactive image positioning action recognition system according to claim 1, characterized in that the display end (105) is the display of a television or a computer.
CN 201020120281 2010-02-23 2010-02-23 Interactive image positioning action recognition system Expired - Fee Related CN201698326U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201020120281 CN201698326U (en) 2010-02-23 2010-02-23 Interactive image positioning action recognition system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201020120281 CN201698326U (en) 2010-02-23 2010-02-23 Interactive image positioning action recognition system

Publications (1)

Publication Number Publication Date
CN201698326U (en) 2011-01-05

Family

ID=43399573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201020120281 Expired - Fee Related CN201698326U (en) 2010-02-23 2010-02-23 Interactive image positioning action recognition system

Country Status (1)

Country Link
CN (1) CN201698326U (en)

Similar Documents

Publication Publication Date Title
JP7028917B2 (en) How to fade out an image of a physics object
US20080220865A1 (en) Interactive playstation controller
US10617941B2 (en) Computer game interface
CN102033606A (en) Mobile terminal being capable of implementing man-machine interaction and method thereof
CN109011570A (en) Somatic sensation television game interactive approach and system
CN107551536A (en) A kind of intelligent mahjong
CN102755745A (en) Whole-body simulation game equipment
CN201453626U (en) Massage apparatus integrated with touch screen controller
US20180169520A1 (en) Vibration feedback system and vibration feedback method
CN101146284A (en) A smart mobile phone platform
WO2021232698A1 (en) Vision guidance-based mobile gaming system and mobile gaming response method
CN201638149U (en) Mobile terminal capable of achieving man-machine interaction
CN201698326U (en) Interactive image positioning action recognition system
CN105988601A (en) Input apparatus, display apparatus and control method thereof
CN103191010A (en) Massage armchair with virtual reality functions
CN202536778U (en) Massage chair with virtual reality function
CN201796335U (en) Action recognition system for image positioning
CN204637535U (en) The automatic sensing apparatus of human body in virtual game
CN105477858A (en) Real-time interaction somatosensory interaction intelligent human body wearing device
CN203043530U (en) Improved dancing blanket
CN202892879U (en) Video game controller and auxiliary displayer
CN202916918U (en) Motion sensing game input device
WO2022137376A1 (en) Method, computer-readable medium, and information processing device
CN203169983U (en) Body sense remote-control input game device
CN202033720U (en) Master-slave controller capable of realizing wireless connection

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
DD01 Delivery of document by public notice

Addressee: Hao Hongxian

Document name: Notification to Pay the Fees

DD01 Delivery of document by public notice

Addressee: Hao Hongxian

Document name: Notification of Decision on Request for Restoration of Right

DD01 Delivery of document by public notice

Addressee: Hao Hongxian

Document name: Notification to Pay the Fees

DD01 Delivery of document by public notice

Addressee: Hao Hongxian

Document name: Notification of Termination of Patent Right

C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110105

Termination date: 20140223