CN207780717U - Aerial imaging interaction device

Aerial imaging interaction device

Info

Publication number
CN207780717U
Authority
CN
China
Prior art keywords
infrared
light
air
interaction device
memory space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201820158784.4U
Other languages
Chinese (zh)
Inventor
黄浩
刘丹
范铭达
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yong Micro Mdt Infotech Ltd
Original Assignee
Shanghai Yong Micro Mdt Infotech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Yong Micro Mdt Infotech Ltd
Priority to CN201820158784.4U
Application granted
Publication of CN207780717U
Legal status: Expired - Fee Related (current)
Anticipated expiration

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The aerial imaging interaction device of the utility model includes: a housing with an accommodating space inside, the accommodating space forming a light exit portion on at least one side of the housing; a display unit arranged in the accommodating space, with its light-emitting surface facing outward through the light exit portion so as to form an aerial imaging region outside the light exit portion; and an infrared emitting assembly and an opposing infrared receiving assembly arranged in the accommodating space on the two sides of the display unit, with the light emission direction of the infrared emitting assembly and the light reception direction of the infrared receiving assembly both oriented toward the aerial imaging region, so as to form a gesture detection region at least partially overlapping the aerial imaging region. The user can therefore perform gesture operations directly on the displayed image and receive feedback from the displayed content, a what-you-see-is-what-you-get experience that greatly improves the user experience.

Description

Aerial imaging interaction device
Technical field
The utility model relates to the field of human-computer interaction, and more particularly to an aerial imaging interaction device.
Background art
Human-computer interaction, as a natural mode of interaction, is receiving more and more attention. Compared with the traditional interaction mode based on the graphical user interface (GUI), the vision-based gesture interface (Vision-Based Gesture Interface, VBGI) frees the user from the constraints of keyboard and mouse, making it a natural and unconstrained way of interacting. Vision-based gesture interfaces are widely used in fields such as smart spaces, augmented reality, and ubiquitous computing, and have gradually become a research hotspot both at home and abroad.
However, because VBGI is contactless and ambiguous, dual cameras and LED infrared sensors are generally used to recognize the various gesture operations and thereby control a host. This scheme requires the user to be familiar with many gesture operations, and its recognition accuracy is limited, so misoperation occurs easily; this is the well-known "Midas Touch Problem".
In existing technical solutions, the displayed content is spatially separated from the interaction area, or the user must wear special equipment, which makes the system inconvenient to use, difficult to learn, and bulky.
Summary of the utility model
In view of the above deficiencies of the prior art, the purpose of the utility model is to provide an aerial imaging interaction device that solves the problems in the prior art.
To achieve the above and other related purposes, the utility model provides an aerial imaging interaction device, including: a housing with an accommodating space inside, the accommodating space forming a light exit portion on at least one side of the housing; a display unit arranged in the accommodating space, with its light-emitting surface facing outward through the light exit portion so as to form an aerial imaging region outside the light exit portion; and an infrared emitting assembly and an opposing infrared receiving assembly arranged in the accommodating space on the two sides of the display unit, with the light emission direction of the infrared emitting assembly and the light reception direction of the infrared receiving assembly both oriented toward the aerial imaging region, so as to form a gesture detection region at least partially overlapping the aerial imaging region.
In an embodiment of the utility model, the light exit portion is arranged at the top of the housing, and the display unit, the infrared emitter, and the infrared camera are all located below the light exit portion.
In an embodiment of the utility model, the infrared emitting assembly includes: an infrared emitter; and at least one mirror arranged in the light emission direction of the infrared emitter to reflect the light emitted by the infrared emitter toward the aerial imaging region.
In an embodiment of the utility model, the infrared receiving assembly includes: an infrared receiver; and at least one mirror arranged to receive the infrared light coming from the aerial imaging region and reflect the received light toward the infrared receiver.
In an embodiment of the utility model, the infrared receiving assembly includes an infrared receiver.
In an embodiment of the utility model, the type of the infrared receiver includes an infrared camera.
In an embodiment of the utility model, the aerial imaging interaction device further includes a control host arranged in the accommodating space and electrically connected to the display unit, the infrared emitting assembly, and the infrared receiving assembly.
In an embodiment of the utility model, the aerial imaging interaction device further includes a control host arranged in the accommodating space below the display unit and electrically connected to the display unit, the infrared emitting assembly, and the infrared receiving assembly.
In an embodiment of the utility model, the light exit portion is an opening.
In an embodiment of the utility model, the light exit portion is an opening covered with a light-transmitting member.
As described above, the aerial imaging interaction device of the utility model includes: a housing with an accommodating space inside, the accommodating space forming a light exit portion on at least one side of the housing; a display unit arranged in the accommodating space, with its light-emitting surface facing outward through the light exit portion so as to form an aerial imaging region outside the light exit portion; and an infrared emitting assembly and an opposing infrared receiving assembly arranged in the accommodating space on the two sides of the display unit, with the light emission direction of the infrared emitting assembly and the light reception direction of the infrared receiving assembly both oriented toward the aerial imaging region, so as to form a gesture detection region at least partially overlapping the aerial imaging region. The user can therefore perform gesture operations directly on the displayed image and receive feedback from the displayed content, a what-you-see-is-what-you-get experience that greatly improves the user experience.
Description of the drawings
Fig. 1 is a schematic plan view of the interior of the aerial imaging interaction device in an embodiment of the utility model.
Fig. 2 is a schematic perspective view of the aerial imaging interaction device in an embodiment of the utility model.
Description of reference numerals
1 Aerial imaging interaction device
11 Housing
111 Light exit portion
12 Display unit
121 Aerial imaging region
13 Infrared emitting assembly
131 Infrared emitter
132 Mirror
14 Infrared receiving assembly
141 Infrared receiver
142 Mirror
15 Control host
Detailed description of the embodiments
The embodiments of the utility model are illustrated below by way of specific examples, and those skilled in the art can easily understand the other advantages and effects of the utility model from the content disclosed in this specification. The utility model can also be implemented or applied through other, different specific embodiments, and the details in this specification can be modified or changed from different viewpoints and for different applications without departing from the spirit of the utility model. It should be noted that, as long as they do not conflict, the features of the following embodiments can be combined with one another.
It should be noted that the drawings provided with the following embodiments only illustrate the basic concept of the utility model in a schematic way. They show only the components related to the utility model, rather than the number, shape, and size of the components in an actual implementation; in practice the form, number, and proportions of the components may vary arbitrarily, and the component layout may be considerably more complex.
Referring to Fig. 1, a schematic plan view of the interior of the aerial imaging interaction device 1 in an embodiment of the utility model is shown.
The aerial imaging interaction device includes: a housing 11, a display unit 12, an infrared emitting assembly 13, and an infrared receiving assembly 14.
The housing 11 has an accommodating space inside, and the accommodating space forms a light exit portion 111 on at least one side of the housing 11. In an embodiment of the utility model, the housing 11 is a box of arbitrary shape, for example a rectangular box, a cylinder, or another three-dimensional shape; the bottom of the housing 11 rests on a surface (such as the ground), and its top carries the light exit portion 111. The light exit portion 111 may be an open aperture, or an aperture covered with a light-transmitting member, the light-transmitting member being, for example, glass.
The display unit 12 is arranged in the accommodating space, with its light-emitting surface facing outward through the light exit portion 111, so as to form an aerial imaging region 121 outside the light exit portion 111. In this embodiment, the display unit 12 is, for example, an aerial projection device; it is located below the light exit portion 111, for example below the transparent member (such as glass), and emits light upward.
As shown in Fig. 2, a schematic perspective view of the aerial imaging device in one embodiment, the housing 11 is a rectangular box, and the display unit 12 forms an image in the air above the housing 11.
The infrared emitting assembly 13 and the opposing infrared receiving assembly 14 are arranged in the accommodating space, on the two sides of the display unit 12 (preferably on opposite sides), with the light emission direction of the infrared emitting assembly 13 and the light reception direction of the infrared receiving assembly 14 both oriented toward the aerial imaging region 121, so as to form a gesture detection region at least partially overlapping the aerial imaging region 121.
The overlap between the gesture detection region and the aerial imaging region 121 is illustrated in Fig. 1: when the user places a hand 2 in the aerial imaging region 121, the infrared light emitted by the infrared emitting assembly 13 reaches the aerial imaging region 121 and is reflected by the user's hand to the infrared receiving assembly 14, so that the user's gesture can be detected; the arrows in the figure indicate the light path of this process.
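To make this detection step concrete, the following minimal sketch (illustrative only and not part of the patent text; the homography values, the brightness threshold, and all function names are assumptions of this write-up) shows how the bright reflection of infrared light from the user's hand, as seen by an infrared camera in the receiving assembly, could be reduced to a single touch point expressed in the plane of the aerial image.

```python
# Minimal sketch, not from the patent: convert reflected-IR pixels captured by the
# receiving assembly's camera into a touch point in the aerial image plane.
import numpy as np

# Hypothetical planar homography from camera pixels to image-plane millimetres,
# assumed to be obtained once during calibration of the assembled device.
H = np.array([[0.42, 0.00, -12.0],
              [0.00, 0.41,  -9.5],
              [0.00, 0.00,   1.0]])

def blob_centroid(ir_frame: np.ndarray, threshold: int = 200):
    """Centroid (u, v) of pixels brighter than the threshold, or None if no reflection."""
    ys, xs = np.nonzero(ir_frame > threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def to_image_plane(uv):
    """Map a camera pixel into aerial-image-plane coordinates through the homography."""
    u, v = uv
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

# Example: a synthetic frame with a bright patch standing in for the user's hand 2.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[200:220, 300:330] = 255
uv = blob_centroid(frame)
if uv is not None:
    print("touch point (mm):", to_image_plane(uv))
```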
In this embodiment, the infrared emitting assembly 13 and the infrared receiving assembly 14 are arranged below the light exit portion 111.
In an embodiment of the utility model, the infrared emitting assembly 13 includes: an infrared emitter 131, for example an infrared laser emitter; and at least one mirror 132 arranged in the light emission direction of the infrared emitter 131 to reflect the light emitted by the infrared emitter 131 toward the aerial imaging region 121.
In an embodiment of the utility model, the infrared receiving assembly 14 includes: an infrared receiver 141, i.e. an infrared sensor, such as a CMOS sensor, used for imaging; and at least one mirror 142 arranged to receive the infrared light coming from the aerial imaging region 121 and reflect the received light toward the infrared receiver 141.
Using mirrors folds the light path, which helps to reduce the size of the whole system and improve its integration. It should be noted that although both the infrared emitting assembly 13 and the infrared receiving assembly 14 shown in this embodiment include mirrors, in other embodiments only one of them may include a mirror, so the present embodiment is not limiting.
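As a rough illustration of why folding helps (this back-of-the-envelope model and its numbers are assumptions of this write-up, not figures from the patent), the sketch below estimates the enclosure depth needed to house an optical path of a given length when it is folded by one or more mirrors.

```python
# Rough illustration, not from the patent: each fold lets the same optical path length
# fit into a shallower enclosure, at the cost of a small clearance around each mirror.
def enclosure_depth(path_length_mm: float, n_folds: int, clearance_mm: float = 10.0) -> float:
    """Approximate depth needed for an optical path folded n_folds times."""
    return path_length_mm / (n_folds + 1) + clearance_mm * n_folds

for folds in (0, 1, 2):
    print(f"{folds} fold(s): about {enclosure_depth(300.0, folds):.0f} mm of depth")
# 0 folds: ~300 mm, 1 fold: ~160 mm, 2 folds: ~120 mm
```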
In an embodiment of the utility model, the aerial imaging interaction device 1 further includes a control host 15 arranged in the accommodating space and electrically connected to the display unit 12, the infrared emitting assembly 13, and the infrared receiving assembly 14. The control host 15 may be implemented by a general-purpose computer or by an embedded processing hardware device comprising a processor, a memory, and so on.
The memory may include random access memory (RAM) and may further include non-volatile memory, for example at least one disk storage device.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Thus, the control host 15 can use existing software techniques to control the content shown by the display unit 12, perform gesture recognition on the user gesture information detected by means of the infrared emitting assembly 13 and the infrared receiving assembly 14, and in turn make the display unit 12 change the displayed content accordingly. In other words, the device of the utility model gives the user the experience of controlling the system by performing gestures directly on the aerial image.
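The patent leaves this software side to existing techniques; purely as an illustration, the loop below shows one way a control host could tie the infrared receiving assembly to the display unit. The camera, display, and recognize_gesture interfaces are hypothetical, since the utility model claims only the hardware arrangement.

```python
# Hypothetical control loop for the control host 15; all interfaces here are assumptions.
import time

def run_interaction_loop(camera, display, recognize_gesture):
    """Poll the infrared camera, classify the gesture, and update the aerial display."""
    while True:
        frame = camera.capture_ir_frame()      # frame from the infrared receiving assembly 14
        gesture = recognize_gesture(frame)     # e.g. "tap", "swipe_left", or None
        if gesture == "tap":
            display.activate_item_under_cursor()
        elif gesture == "swipe_left":
            display.next_page()
        elif gesture == "swipe_right":
            display.previous_page()
        time.sleep(1 / 60)                     # roughly once per display refresh
```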
Of course, gesture recognition and projection display techniques are themselves existing; the hardware device provided by the utility model is an improvement that can cooperate with such existing software, but the software techniques themselves are not part of the utility model. What the utility model realizes is the structure of the aerial imaging interaction device 1 in which the gesture detection region overlaps the aerial imaging region 121.
In conclusion, the aerial imaging interaction device of the utility model includes: a housing with an accommodating space inside, the accommodating space forming a light exit portion on at least one side of the housing; a display unit arranged in the accommodating space, with its light-emitting surface facing outward through the light exit portion so as to form an aerial imaging region outside the light exit portion; and an infrared emitting assembly and an opposing infrared receiving assembly arranged in the accommodating space on the two sides of the display unit, with the light emission direction of the infrared emitting assembly and the light reception direction of the infrared receiving assembly both oriented toward the aerial imaging region, so as to form a gesture detection region at least partially overlapping the aerial imaging region. The user can therefore perform gesture operations directly on the displayed image and receive feedback from the displayed content, a what-you-see-is-what-you-get experience that greatly improves the user experience.
The utility model effectively overcomes various shortcomings of the prior art and has high industrial value.
The above embodiments only illustrate the principles and effects of the utility model and are not intended to limit it. Anyone skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the utility model. Accordingly, all equivalent modifications or changes completed by persons of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the utility model shall still be covered by the claims of the utility model.

Claims (10)

1. An aerial imaging interaction device, characterized by comprising:
a housing with an accommodating space inside, the accommodating space forming a light exit portion on at least one side of the housing;
a display unit arranged in the accommodating space, with its light-emitting surface facing outward through the light exit portion so as to form an aerial imaging region outside the light exit portion; and
an infrared emitting assembly and an opposing infrared receiving assembly arranged in the accommodating space and respectively located on the two sides of the display unit, with the light emission direction of the infrared emitting assembly and the light reception direction of the infrared receiving assembly both oriented toward the aerial imaging region, so as to form a gesture detection region at least partially overlapping the aerial imaging region.
2. The aerial imaging interaction device according to claim 1, characterized in that the light exit portion is arranged at the top of the housing, and the display unit, the infrared emitter, and the infrared camera are all located below the light exit portion.
3. The aerial imaging interaction device according to claim 1, characterized in that the infrared emitting assembly comprises:
an infrared emitter; and
at least one mirror arranged in the light emission direction of the infrared emitter to reflect the light emitted by the infrared emitter toward the aerial imaging region.
4. The aerial imaging interaction device according to claim 1, characterized in that the infrared receiving assembly comprises:
an infrared receiver; and
at least one mirror arranged to receive the infrared light coming from the aerial imaging region and reflect the received light toward the infrared receiver.
5. The aerial imaging interaction device according to claim 1, characterized in that the infrared receiving assembly comprises an infrared receiver.
6. The aerial imaging interaction device according to claim 4 or 5, characterized in that the type of the infrared receiver includes an infrared camera.
7. The aerial imaging interaction device according to claim 1, characterized by further comprising: a control host arranged in the accommodating space and electrically connected to the display unit, the infrared emitting assembly, and the infrared receiving assembly.
8. The aerial imaging interaction device according to claim 2, characterized by further comprising: a control host arranged in the accommodating space below the display unit and electrically connected to the display unit, the infrared emitting assembly, and the infrared receiving assembly.
9. The aerial imaging interaction device according to claim 1, characterized in that the light exit portion is an opening.
10. The aerial imaging interaction device according to claim 1, characterized in that the light exit portion is an opening covered with a light-transmitting member.
CN201820158784.4U 2018-01-30 2018-01-30 Aerial imaging interaction device Expired - Fee Related CN207780717U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201820158784.4U CN207780717U (en) 2018-01-30 2018-01-30 Aerial imaging interaction device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201820158784.4U CN207780717U (en) 2018-01-30 2018-01-30 Aerial imaging interaction device

Publications (1)

Publication Number Publication Date
CN207780717U true CN207780717U (en) 2018-08-28

Family

ID=63211054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201820158784.4U Expired - Fee Related CN207780717U (en) 2018-01-30 2018-01-30 Aerial imaging interaction device

Country Status (1)

Country Link
CN (1) CN207780717U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110119208A (en) * 2019-05-15 2019-08-13 京东方科技集团股份有限公司 Suspend display imaging device and the display touch control method that suspends
WO2020228512A1 (en) * 2019-05-15 2020-11-19 京东方科技集团股份有限公司 Suspension display imaging apparatus and suspension display touch-control method

Similar Documents

Publication Publication Date Title
US8723789B1 (en) Two-dimensional method and system enabling three-dimensional user interaction with a device
US9207773B1 (en) Two-dimensional method and system enabling three-dimensional user interaction with a device
US9600078B2 (en) Method and system enabling natural user interface gestures with an electronic system
KR102335132B1 (en) Multi-modal gesture based interactive system and method using one single sensing system
US10048779B2 (en) Virtual hand based on combined data
CN105579929B (en) Human-computer interaction based on gesture
US20170351324A1 (en) Camera-based multi-touch interaction apparatus, system and method
US20150089453A1 (en) Systems and Methods for Interacting with a Projected User Interface
US9454260B2 (en) System and method for enabling multi-display input
TWI559174B (en) Gesture based manipulation of three-dimensional images
CN108255292A (en) Air imaging interaction systems, method, control device and storage medium
CN103729054A (en) Multi display device and control method thereof
KR20200043110A (en) Display apparatus and control method thereof
TW201214243A (en) Optical touch system and object detection method therefor
US20190114034A1 (en) Displaying an object indicator
US10664090B2 (en) Touch region projection onto touch-sensitive surface
CN105468209A (en) Virtual two-dimensional positioning module of input device and virtual input device
US20180075294A1 (en) Determining a pointing vector for gestures performed before a depth camera
CN207780717U (en) Air is imaged interaction device
TW201126397A (en) Optical touch control display and method thereof
CN206249278U (en) A kind of laser infrared multi-point interactive contactor control device
US9250748B2 (en) Portable electrical input device capable of docking an electrical communication device and system thereof
CN201886453U (en) Frustrated total internal reflection multi-touch device
CN108062186A (en) A kind of multifunction laser dummy keyboard
TW201324245A (en) Human-machine interface device and application method thereof

Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180828

Termination date: 20190130