CN104238735A - Device with gesture sensor - Google Patents

Device with gesture sensor

Info

Publication number
CN104238735A
Authority
CN
China
Prior art keywords
gesture
image
device
processing unit
gesture sensor
Prior art date
Application number
CN201410089925.8A
Other languages
Chinese (zh)
Inventor
吴宗祐
柯怡贤
陈念泽
Original Assignee
原相科技股份有限公司
Priority date
Filing date
Publication date
Priority claimed from CN201310233948.7
Application filed by 原相科技股份有限公司
Priority to CN201410089925.8A
Publication of CN104238735A


Abstract

The invention discloses a device with a gesture sensor. The gesture sensor comprises an image sensing unit and a processing unit. The image sensing unit captures an image of at least one gesture performed by a user. The processing unit is electrically connected to the image sensing unit and sends a control instruction corresponding to the gesture image, so that the device can be operated accordingly.

Description

Device with gesture sensor

Technical field

The present invention relates to a device having a gesture sensor.

Background

Many devices commonly used in daily life, such as water taps, toilets and automatic doors, use proximity switches. For example, the infrared sensor water taps and infrared sensor toilets installed in the lavatories of public places such as hospitals, department stores, stations and restaurants all use proximity switches with infrared sensors.

However, a proximity switch generally provides only an on/off function. Taking the infrared sensor water tap as an example, such a tap can supply water and stop supplying water, but it cannot control the flow rate of the water flow as a conventional tap can (here and in the following description, "flow rate" refers to the amount of fluid passing through the water outlet per unit time), so an infrared sensor water tap cannot provide the user with water flows of multiple different flow rates.

Summary of the invention

The invention provides a water discharging device that controls a water flow by means of a gesture sensor and a control valve.

The invention provides a water tap that uses a gesture sensor and can provide water flows of multiple different flow rates.

The invention provides a toilet that uses a gesture sensor and can provide water flows of multiple different water output amounts.

An embodiment of the invention provides a device with a gesture sensor. The device is a water discharging device and comprises a water outlet body, a control valve and a gesture sensor. The water outlet body has a water outlet for providing a water flow. The control valve is installed in the water outlet body and controls the water flow. The gesture sensor comprises an image sensing unit and a processing unit. The image sensing unit captures at least one gesture image of a gesture performed by a user. The processing unit is electrically connected to the image sensing unit and sends a control instruction corresponding to the gesture image to the control valve. The control instruction comprises a first flow instruction or a second flow instruction, and the control valve changes the flow rate of the water flow to a first flow rate or a second flow rate according to the first flow instruction or the second flow instruction, respectively, where the first flow rate is greater than the second flow rate.

Another embodiment of the invention provides another water discharging device, which comprises a water outlet body, a control valve and a gesture sensor. The water outlet body has a water outlet for providing a water flow. The control valve is installed in the water outlet body and controls the water flow. The gesture sensor comprises an image sensing unit and a processing unit. The image sensing unit captures at least one gesture image of a gesture performed by a user. The processing unit is electrically connected to the image sensing unit and sends a control instruction corresponding to the gesture image to the control valve. The control instruction comprises a first water output instruction or a second water output instruction. The control valve changes the water output amount of the water flow to a first output amount or a second output amount according to the first water output instruction or the second water output instruction, respectively, where the first output amount is greater than the second output amount.

Another embodiment of the invention provides a device with a gesture sensor. The device is a water tap and comprises a tap body, a control valve and a gesture sensor. The tap body has a water outlet for providing a water flow. The control valve is installed in the tap body and controls the water flow. The gesture sensor comprises an image sensing unit and a processing unit. The image sensing unit captures at least one gesture image of a gesture performed by a user. The processing unit is electrically connected to the image sensing unit and sends at least one control instruction corresponding to the gesture image to the control valve. The control instruction comprises a decrement instruction or an increment instruction. The control valve decreases the flow rate of the water flow according to the decrement instruction and increases the flow rate of the water flow according to the increment instruction.

Another embodiment of the invention provides a device with a gesture sensor. The device is a toilet and comprises a toilet bowl, a water supply unit, a control valve and a gesture sensor. The toilet bowl has a flush port. The water supply unit is connected to the toilet bowl and has a water outlet communicating with the flush port; the water outlet delivers water to the flush port. The control valve is installed in the water supply unit and controls the water flow. The gesture sensor comprises an image sensing unit and a processing unit. The image sensing unit captures at least one gesture image of a gesture performed by a user. The processing unit is electrically connected to the image sensing unit and sends at least one control instruction corresponding to the gesture image to the control valve. The control instruction comprises a first flushing instruction or a second flushing instruction. The control valve sets the water output amount of the water flow to a first output amount according to the first flushing instruction and to a second output amount according to the second flushing instruction, where the first output amount is greater than the second output amount.

Another embodiment of the invention provides a device with a gesture sensor. The device is a display device and comprises a display unit and a gesture sensor. The gesture sensor comprises an image sensing unit and a processing unit, and the processing unit is electrically connected to the image sensing unit. The image sensing unit captures at least one gesture image of a gesture performed by a user, and the processing unit sends at least one gesture control signal corresponding to the at least one gesture image to the display unit, so that the operation of the display unit is controlled according to the gesture control signal.

Another embodiment of the invention provides a device with a gesture sensor. The device is a satellite navigation device and comprises a display element, a controller and a gesture sensor. The controller establishes a signal link with the display element to send map data and coordinate data to the display element for display. The gesture sensor comprises an image sensing unit and a processing unit. The image sensing unit captures at least one gesture image of a gesture performed by a user. The processing unit is electrically connected to the image sensing unit and establishes a signal link with the controller, wherein the processing unit sends at least one gesture control signal corresponding to the at least one gesture image to the controller, and the controller controls the manner in which the display element displays the map data and the coordinate data according to the gesture control signal.

Another embodiment of the invention provides a device with a gesture sensor. The device is a golf practice assisting device and comprises a practice machine, an indicator unit and a gesture sensor. The gesture sensor comprises an image sensing unit and a processing unit. When the user moves, the image sensing unit captures a silhouette of the user, where the silhouette comprises at least one hand image and one leg image, and an angle is formed between the hand image and the leg image. The processing unit establishes signal links with the image sensing unit and the indicator unit and judges, according to a built-in numerical range, whether the angle falls within the numerical range; when the angle does not fall within the numerical range, the processing unit sends an indication signal to the indicator unit to inform the user.
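A minimal sketch of this angle check is given below. It assumes the hand image and the leg image have each been reduced to a direction vector in the image plane and that the built-in numerical range is expressed in degrees; the range and the vector representation are illustrative assumptions, not values from the patent.

```python
# Sketch: compare the angle between the hand image and the leg image with a
# built-in numerical range, and signal when it falls outside that range.
import math

ANGLE_RANGE_DEG = (80.0, 100.0)  # hypothetical built-in numerical range

def angle_between(v1: tuple[float, float], v2: tuple[float, float]) -> float:
    """Angle in degrees between two direction vectors in the image plane."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def needs_indication(hand_vec, leg_vec) -> bool:
    """True when the angle falls outside the built-in range, i.e. when the
    indicator unit should be driven to inform the user."""
    angle = angle_between(hand_vec, leg_vec)
    return not (ANGLE_RANGE_DEG[0] <= angle <= ANGLE_RANGE_DEG[1])
```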

Based on the above, the water discharging device, water tap, toilet, display device and golf practice assisting device provided by the invention all use a gesture sensor, so that the user can control and operate these devices without touching them. For example, in the case of the water discharging device, the water tap and the toilet, the gesture sensor allows the user to control these devices to provide water flows of multiple different flow rates or different water output amounts without touching a switch.

For a further understanding of the techniques of the invention, refer to the following detailed description and drawings. However, the accompanying drawings and appendices are provided for reference and illustration only and are not intended to limit the invention.

Brief description of the drawings

Fig. 1A is a schematic view of a water discharging device according to an embodiment of the invention.

Fig. 1B is a circuit block diagram of the water discharging device in Fig. 1A.

Fig. 1C is a schematic view of the water discharging device in Fig. 1A in the closed state.

Fig. 2A is a cross-sectional view of a water tap according to an embodiment of the invention.

Fig. 2B is a schematic view of the display element in Fig. 2A viewed from the display surface.

Fig. 3 is a cross-sectional view of a water tap according to another embodiment of the invention.

Fig. 4A is a perspective view of a toilet according to an embodiment of the invention.

Fig. 4B is a cross-sectional view taken along line I-I in Fig. 4A.

Fig. 4C is a cross-sectional view taken along line II-II in Fig. 4B.

Fig. 5A is a cross-sectional view of a toilet according to another embodiment of the invention.

Fig. 5B is a circuit block diagram of the toilet in Fig. 5A.

Fig. 6A is a schematic view of a display device according to an embodiment of the invention.

Fig. 6B is a circuit block diagram of the display device in Fig. 6A.

Fig. 7A is a schematic view of a satellite navigation device according to an embodiment of the invention.

Fig. 7B is a circuit block diagram of the satellite navigation device in Fig. 7A.

Fig. 8A is a schematic view of a golf practice assisting device according to an embodiment of the invention.

Fig. 8B shows an image of the user captured by the image sensing unit of Fig. 8A.

Fig. 8C is a circuit block diagram of the golf practice assisting device in Fig. 8A.

The reference numerals are described as follows:

54: operation panel

100: water discharging device

110: water outlet body

112, 212, 412: water outlet

114: space

116a: output pipe

116b: input pipe

118: tank

120, 820: gesture sensor

121, 825: light-emitting source

122, 821: image sensing unit

123, 823: processing unit

130, 230, 430: control valve

140, 240, 440, 541, 640, 710: display element

142, 442: light-emitting element

144, 444: light-transmitting indicator plate

200, 300: water tap

210: tap body

242: display surface

400, 500: toilet

410: water supply unit

414a: front face

414b: back face

450: toilet bowl

452: flush port

454: basin

542: control module

560: heating cushion

C1: counterclockwise direction

C2: clockwise direction

E1, R1: light

F1, F2: water flow

H1, H2: hand

W1, W2: wire

600: display device

610: display unit

620, 720: controller

650, 740: indicator element

700: satellite navigation device

701: microphone

630, 702: loudspeaker

721: positioning receiver module

722: database

723: signal processing unit

800: golf practice assisting device

810: practice machine

811: lane

812: ball

824: display

830: indicator unit

L1: first axis

L2: second axis

θ: angle

Detailed description of embodiments

Fig. 1A is a schematic view of a water discharging device according to an embodiment of the invention. Referring to Fig. 1A, the water discharging device 100 can output a water flow F1 and may be a water tap, a toilet or a shower head. The water discharging device 100 comprises a water outlet body 110, a gesture sensor 120 and a control valve 130. The control valve 130 is installed in the water outlet body 110, controls the water flow F1, and may be a solenoid valve. The gesture sensor 120 can detect various gestures made by the hand H1 of the user and, according to these gestures, send corresponding control instructions to the control valve 130 to command the control valve 130 to turn the water flow F1 on or off, or to change the flow rate or the water output amount of the water flow F1.

It should be noted that the water output amount mentioned above refers to the amount of water output by the water discharging device 100. The water output amount may be measured in units of volume, such as litres, millilitres or gallons, or in units of weight, such as kilograms, grams or pounds. In addition, the water output amount can be determined by how long the control valve 130 keeps the water flow F1 open: the longer the control valve 130 keeps the water flow F1 open, the larger the water output amount; conversely, the shorter the control valve 130 keeps the water flow F1 open, the smaller the water output amount.

The water outlet body 110 can store water and has a water outlet 112 and a space 114 for holding water. The water in the space 114 can flow out from the water outlet 112, so the water outlet 112 can provide the water flow F1. In the embodiment shown in Fig. 1A, the water outlet body 110 may comprise a tank 118, an output pipe 116a and an input pipe 116b, where the output pipe 116a and the input pipe 116b are both installed in the tank 118. The output pipe 116a has the water outlet 112, and the input pipe 116b can guide water into the space 114.

Although the water outlet body 110 shown in Fig. 1A comprises the tank 118, the output pipe 116a and the input pipe 116b, in other embodiments the water outlet body 110 may simply be a pipe without the tank 118. The water outlet body 110 can therefore be implemented in many ways, and Fig. 1A illustrates only one kind of water outlet body 110 by way of example.

Figure 1B is the circuit box schematic diagram of the discharging device in Figure 1A.Refer to Figure 1A and Figure 1B, gesture sensor 120 comprises light emitting source 121, image sensing unit 122 and processing unit 123, and wherein processing unit 123 is electrically connected light emitting source 121 and image sensing unit 122.Light emitting source 121 can emit beam the hand H1 of E1 to user, and wherein light E1 can be visible ray (visible light) or invisible light (invisible light), and this invisible light is such as infrared ray.In addition, light emitting source 121 can be infrared light-emitting diode (Infrared Light-Emitting Diode, Infrared LED).

The image sensing unit 122 is adjacent to the light-emitting source 121 and can capture images, including dynamic images. These images can be formed by the reflection of the light E1, so the image sensing unit 122 can capture images formed by invisible light, such as images formed by infrared light. The image sensing unit 122 may be a complementary metal-oxide-semiconductor (CMOS) image sensor or a charge-coupled device (CCD).

When the hand H1 of the user makes various control gestures, for example making a fist, opening the palm, waving, or rotating the palm in the counterclockwise direction C1 or the clockwise direction C2 (as shown in Fig. 1A), the hand H1 reflects the light E1 into reflected light R1, and the image sensing unit 122 receives the light R1 and captures images from it. In this way, the image sensing unit 122 can capture various gesture images from the various control gestures made by the hand H1, and these gesture images are formed by the reflection of the light E1 (i.e. the light R1).
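As an illustration of how a brightly illuminated hand region could be separated from the background in a captured frame, the following sketch thresholds an 8-bit grayscale frame and returns the centre of the bright pixels. The threshold value and the NumPy array representation are assumptions made for illustration and are not specified by the patent.

```python
# Sketch: isolate the region lit by the reflected light R1 by thresholding,
# then return its centre so later stages can recognize or track the hand.
import numpy as np

def hand_centre(frame: np.ndarray, threshold: int = 200):
    """frame: 8-bit grayscale image; returns (x, y) of the bright region or None."""
    mask = frame >= threshold          # pixels bright enough to be reflected light
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                    # no hand-like bright region in this frame
    return float(xs.mean()), float(ys.mean())
```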

Should be noted that, in the present embodiment, gesture sensor 120 comprises the light emitting source 121 of the E1 that can emit beam, but in other embodiments, gesture sensor 120 also can not comprise light emitting source 121, and utilizes image sensing unit 122 directly to capture the image of hand H1.Specifically, hand H1 energy extraneous ray of reflecting, it is such as come from indoor lamp source or the sun of outdoor.The above-mentioned extraneous light that image sensing unit 122 can reflect from hand H1 carrys out pick-up image, the various control gesture acquisition various gestures image thus can done from hand H1 equally.So above-mentioned gesture image may also be and formed by extraneous light, and do not limit only by light E1 reflect (i.e. light R1) form.

The processing unit 123 can be electrically connected to the control valve 130 by a wire, or can establish a signal link with the control valve 130 using a wireless technology such as Bluetooth. The processing unit 123 can send a plurality of control instructions to the control valve 130 in correspondence with these gesture images, to command the control valve 130 to turn the water flow F1 on or off, or to change the flow rate or the water output amount of the water flow F1. The processing unit 123 is, for example, a digital signal processor (DSP), and can use an algorithm to judge whether the images captured by the image sensing unit 122 contain a gesture image corresponding to a control instruction, so as to recognize these control gestures.
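The correspondence between recognized gestures and control instructions can be pictured as a simple lookup, as in the sketch below. The gesture labels and instruction names are illustrative assumptions; the patent does not fix a particular encoding.

```python
# Sketch: map a recognized gesture label to the control instruction that the
# processing unit would send to the control valve.
GESTURE_TO_INSTRUCTION = {
    "open_palm": "START",              # start instruction (activates the valve)
    "fist": "CLOSE",                   # close instruction
    "palm_rotate_ccw": "FIRST_FLOW",   # high flow rate
    "palm_rotate_cw": "SECOND_FLOW",   # low flow rate
}

def instruction_for(gesture_label: str):
    """Return the control instruction for a gesture, or None if unrecognized."""
    return GESTURE_TO_INSTRUCTION.get(gesture_label)
```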

The processing unit 123 holds identification data, and the algorithm used by the processing unit 123 may be an object recognition algorithm or an object tracking algorithm. When the processing unit 123 uses an object recognition algorithm, the processing unit 123 can judge, according to the identification data, whether the images captured by the image sensing unit 122 contain an object shaped like a hand, and further judge whether the posture of this object matches one of the gesture images. When the processing unit 123 confirms that the posture of the object matches a particular gesture image, the processing unit 123 sends the control instruction corresponding to that gesture image to the control valve 130 in order to control the control valve 130.

When the processing unit 123 uses an object tracking algorithm, the processing unit 123 can judge, according to the identification data, whether the motion trajectory of an object in the consecutive images captured by the image sensing unit 122 matches one of the gesture images, where the object may have a specific shape, for example the shape of a hand or the shape of an electronic device such as a mobile phone or a game controller. When the processing unit 123 confirms that the motion trajectory of the object matches the motion of a particular gesture image, the processing unit 123 sends the control instruction corresponding to that gesture image to control the control valve 130.
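One simple way an object-tracking stage could distinguish the clockwise from the counterclockwise palm rotation is to accumulate the signed area swept by the tracked hand centre over consecutive frames, as sketched below. The coordinate convention and the mapping of the sign to a direction are assumptions made for illustration, not the patent's method.

```python
# Sketch: classify a tracked trajectory of hand-centre positions as a
# clockwise or counterclockwise rotation using its accumulated signed area.
def rotation_direction(points):
    """points: list of (x, y) hand-centre positions from consecutive frames."""
    signed_area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        signed_area += x1 * y2 - x2 * y1   # twice the signed triangle area
    # With image coordinates (y growing downward), a negative sum is assumed
    # here to correspond to a counterclockwise motion as seen on screen.
    return "counterclockwise" if signed_area < 0 else "clockwise"
```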

The control instructions sent by the processing unit 123 comprise a first flow instruction or a second flow instruction. The control valve 130 can change the flow rate of the water flow F1 to a first flow rate according to the first flow instruction and to a second flow rate according to the second flow instruction, where the first flow rate is greater than the second flow rate. Therefore, through the first flow instruction or the second flow instruction, the processing unit 123 can command the control valve 130 to change the flow rate of the water flow F1 accordingly.

In the embodiment shown in Fig. 1A, the gesture image corresponding to the first flow instruction may be a counterclockwise palm rotation (the hand H1 rotating in the counterclockwise direction C1), and the gesture image corresponding to the second flow instruction may be a clockwise palm rotation (the hand H1 rotating in the clockwise direction C2). Therefore, when the user opens the hand H1 and rotates it in the counterclockwise direction C1, the control valve 130 adjusts the flow rate of the water flow F1 to a high flow rate (the first flow rate); when the user opens the hand H1 and rotates it in the clockwise direction C2, the control valve 130 adjusts the flow rate of the water flow F1 to a low flow rate (the second flow rate). In this way, the user can rotate the palm clockwise or counterclockwise to obtain water flows F1 of different flow rates.

In addition, in other embodiments the control instructions may also comprise a decrement instruction or an increment instruction. For example, when the hand H1 makes a counterclockwise palm-rotation gesture, the processing unit 123 can send an increment instruction to command the control valve 130 to increase the flow rate of the water flow F1; when the hand H1 makes a clockwise palm-rotation gesture, the processing unit 123 can send a decrement instruction to command the control valve 130 to reduce the flow rate of the water flow F1. Thus, when the user wants a large water flow F1, the hand H1 can make the counterclockwise palm-rotation gesture once or several times in succession; when the user wants a small water flow F1, the hand H1 can make the clockwise palm-rotation gesture once or several times in succession.
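A minimal sketch of this stepwise adjustment follows; the step size and the flow-rate limits are illustrative assumptions rather than values given in the patent.

```python
# Sketch: apply increment / decrement instructions to a normalised flow rate,
# keeping the result within the valve's limits.
MIN_FLOW, MAX_FLOW, STEP = 0.0, 1.0, 0.1   # hypothetical limits and step size

def apply_flow_instruction(flow_rate: float, instruction: str) -> float:
    if instruction == "INCREMENT":
        return min(MAX_FLOW, flow_rate + STEP)
    if instruction == "DECREMENT":
        return max(MIN_FLOW, flow_rate - STEP)
    return flow_rate                        # other instructions leave the rate unchanged

# Repeating the counterclockwise gesture raises the rate step by step:
rate = 0.3
for _ in range(3):
    rate = apply_flow_instruction(rate, "INCREMENT")   # rate is now about 0.6
```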

The control instructions may also comprise a water-on instruction and a water-off instruction, where the water-on instruction and the water-off instruction correspond to two different gesture images. The water-on instruction commands the control valve 130 to open the water flow F1, and the water-off instruction commands the control valve 130 to close the water flow F1. Specifically, when the control valve 130 has been activated but the water flow F1 has not yet been produced, the hand H1 can make the control gesture corresponding to the water-on instruction in front of the gesture sensor 120, so that the image sensing unit 122 captures the gesture image. The processing unit 123 then sends the water-on instruction to the control valve 130 according to this gesture image, and the control valve 130 opens the water outlet 112 according to the water-on instruction so that the water flow F1 starts to flow out.

When the water flow F1 is on but the user wants to turn it off, the user can make the control gesture corresponding to the water-off instruction in front of the gesture sensor 120, so that the image sensing unit 122 captures the gesture image. The processing unit 123 then sends the water-off instruction to the control valve 130 according to this gesture image, and the control valve 130 closes the water outlet 112 according to the water-off instruction so that the water flow F1 stops flowing.

The control instructions may also comprise a first water output instruction or a second water output instruction. The control valve 130 can change the water output amount of the water flow F1 to a first output amount according to the first water output instruction and to a second output amount according to the second water output instruction, where the first output amount is greater than the second output amount. Therefore, through the first water output instruction or the second water output instruction, the processing unit 123 can command the control valve 130 to change the water output amount of the water flow F1 accordingly. In addition, the processing unit 123 can set, according to the first water output instruction or the second water output instruction, the length of time for which the control valve 130 keeps the water flow F1 open, and thereby change the water output amount.

For example, according to the first water output instruction the processing unit 123 can set the control valve 130 to let the water flow F1 keep flowing for 10 seconds, while according to the second water output instruction the processing unit 123 can set the control valve 130 to let the water flow F1 keep flowing for 5 seconds. Thus, provided the overall flow rate of the water flow F1 remains unchanged, the first output amount is greater than the second output amount, and the control valve 130 can provide different water output amounts.
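The relationship between valve-open duration and water output amount can be illustrated with the sketch below, using the 10-second and 5-second durations from this example; the flow rate value is an assumption made for illustration.

```python
# Sketch: with the flow rate unchanged, a longer open duration yields a
# larger water output amount.
FLOW_RATE_L_PER_S = 0.2                      # hypothetical constant flow rate

def output_amount(open_duration_s: float) -> float:
    """Discharged amount in litres for a given valve-open duration."""
    return FLOW_RATE_L_PER_S * open_duration_s

first_output = output_amount(10.0)   # first water output instruction -> 2.0 L
second_output = output_amount(5.0)   # second water output instruction -> 1.0 L
assert first_output > second_output
```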

The water discharging device 100 may further comprise a display element 140. The display element 140 is electrically connected to the processing unit 123, or can establish a signal link with the processing unit 123 using a wireless technology such as Bluetooth. The display element 140 can show the state of the water discharging device 100, for example whether the water discharging device 100 is in an activated state or a closed state. The activated state means that the control valve 130 has been activated and can receive instructions from the processing unit 123 to turn the water flow F1 on or off, or to change the flow rate or water output amount of the water flow F1. Therefore, when the water discharging device 100 is in the activated state, the control valve 130 can immediately control the water flow F1 according to the control gestures made by the user.

The closed state means that the control valve 130 is not activated. When the water discharging device 100 is in the closed state, the control valve 130 keeps the water outlet 112 closed and the water off until the control valve 130 receives a start instruction sent by the processing unit 123 and is activated. The start instruction corresponds to a start gesture made by the user, and the image sensing unit 122 can capture, from the start gesture, a gesture start image formed by the light R1 or by ambient light. According to this gesture start image, the processing unit 123 can command the display element 140 to show the activated state and activate the control valve 130.

When the water discharging device 100 is in the closed state, the control valve 130 essentially does not respond to the user's gestures unless the hand H1 makes the start gesture. It should also be noted that the start instruction is different from the water-on instruction described above: the start instruction activates the control valve 130 so that the control valve 130 can accept control from the processing unit 123, whereas the water-on instruction merely commands the control valve 130 to open the water flow F1. The start instruction used to activate the control valve 130 is therefore not the same as the water-on instruction used to open the water flow F1.
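The distinction between the start/close instructions and the water-on/water-off instructions can be pictured as a small state machine, sketched below; the state and instruction names are an assumed reading of the behaviour described above, not terms defined by the patent.

```python
# Sketch: the valve only reacts to water-on / water-off instructions after a
# start instruction has activated it; a close instruction deactivates it.
class ValveState:
    def __init__(self):
        self.activated = False   # closed state until the start gesture arrives
        self.water_on = False

    def handle(self, instruction: str) -> None:
        if instruction == "START":
            self.activated = True
        elif instruction == "CLOSE":
            self.activated = False
            self.water_on = False
        elif self.activated and instruction == "WATER_ON":
            self.water_on = True
        elif self.activated and instruction == "WATER_OFF":
            self.water_on = False
        # all other instructions are ignored while the device is closed
```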

In the embodiment of Fig. 1A, the display element 140 may comprise a light-emitting element 142 and a light-transmitting indicator plate 144. The light-emitting element 142 is, for example, a light-emitting diode (LED) or a cold cathode fluorescent lamp (CCFL), and the light-transmitting indicator plate 144 may be a light-transmitting polymethyl methacrylate (PMMA, i.e. acrylic) substrate or a glass plate. The light-transmitting indicator plate 144 shows the gestures and functions corresponding to the various instructions (such as the start instruction, the first flow instruction, the second flow instruction, the water-on instruction and the water-off instruction).

For example, the surface of the light-transmitting indicator plate 144 may bear text and/or patterns indicating that the counterclockwise palm-rotation gesture corresponds to a high-flow-rate water flow F1 (the first flow rate) and the clockwise palm-rotation gesture corresponds to a low-flow-rate water flow F1 (the second flow rate). The user can thus operate the water discharging device 100 according to the content shown by the display element 140. In other embodiments, the display element 140 may be a liquid crystal display or an organic light-emitting diode display, so the display element 140 does not necessarily comprise the light-transmitting indicator plate 144.

Fig. 1 C is discharging device in Figure 1A simplified diagram when being in closed condition.Refer to Figure 1A and Fig. 1 C, in the present embodiment, when discharging device 100 is in starting state, processing unit 123 can carry out order display element 140 according to gesture startup image and show starting state.Now, illuminating part 142 can be luminous towards instruction light-passing board 144, to make instruction light-passing board 144 shinny, as shown in Figure 1A.

Conversely, when the water discharging device 100 is in the closed state, the processing unit 123 can command the display element 140 to show the closed state according to a gesture close image. The closed state corresponds to a close gesture made by the hand H1, and the gesture close image is formed by the light R1 or by ambient light. The image sensing unit 122 can capture the gesture close image, and the processing unit 123 can turn off the display element 140 according to the gesture close image and send a close instruction to the control valve 130. As a result, the display element 140 shows the closed state and the light-emitting element 142 stops emitting light, as shown in Fig. 1C, and the control valve 130 enters the closed state.

In addition, in the present embodiment, the gesture start image corresponding to the start instruction may be an open palm (as shown in Fig. 1A), and the gesture close image corresponding to the close instruction may be a fist (as shown in Fig. 1C). Accordingly, the surface of the light-transmitting indicator plate 144 may bear text and/or patterns indicating that the open palm corresponds to activating the water discharging device 100 and the fist corresponds to closing the water discharging device 100.

It should be noted that, in the present embodiment, the control gestures corresponding to the first flow instruction and the second flow instruction are the counterclockwise and clockwise palm rotations respectively, the start gesture corresponding to the start instruction is the open palm, and the close gesture corresponding to the close instruction is the fist. However, in other embodiments, the gestures corresponding to the first flow instruction and the second flow instruction may be gestures other than the counterclockwise and clockwise palm rotations, such as waving, and the start gesture and the close gesture may be gestures other than the open palm and the fist. The start gesture, close gesture and control gestures described above may therefore be any combination of making a fist, opening the palm, waving, rotating the palm clockwise, rotating the palm counterclockwise and other gestures; the present embodiment does not limit the motions of the start gesture, the close gesture or the control gestures.

It is worth mentioning that the water discharging device 100 may be a water tap or a toilet, i.e. the gesture sensor 120 can be applied to a water tap. Embodiments in which the water discharging device 100 is a water tap are described in detail below with reference to Fig. 2A, Fig. 2B and Fig. 3. The water taps shown in Fig. 2A, Fig. 2B and Fig. 3 have technical features similar to those of the water discharging device 100, and the features that the water taps share with the water discharging device 100, such as the way the gesture sensor recognizes gestures, are in principle not described again below.

Fig. 2 A is the diagrammatic cross-section of the water swivel of one embodiment of the invention.Refer to Fig. 2 A, water swivel 200 comprises faucet body 210, gesture sensor 120 and operation valve 230.Operation valve 230 is installed in faucet body 210, and can control the current F2 that faucet body 210 exports, and the gesture that gesture sensor 120 can be done according to the hand H1 of user controls operation valve 230, can operate to make water swivel 200 according to the gesture of hand H1.

The tap body 210 has a water outlet 212 for providing the water flow F2. The control valve 230 is installed in the tap body 210 and controls the water flow F2; the control valve 230 may be a solenoid valve. The gesture sensor 120 is located above the water outlet 212 and comprises the light-emitting source 121, the image sensing unit 122 and the processing unit 123. In the present embodiment, the processing unit 123 can be electrically connected to the control valve 230 by a wire W1, but in other embodiments the processing unit 123 can establish a signal link with the control valve 230 using a wireless technology such as Bluetooth. Via the wire W1 or the wireless technology, the gesture sensor 120 can send control instructions to control the control valve 230.

The light-emitting source 121 can emit light E1 toward the hand H1 of the user, and the light E1 forms reflected light R1 after being reflected by the hand H1. Using the light E1, the image sensing unit 122 can capture various gesture images from the various control gestures made by the hand H1, and these control gestures may include making a fist, opening the palm, waving, rotating the palm clockwise, rotating the palm counterclockwise or other gestures, where the gesture images are formed by the reflection of the light E1 (i.e. the light R1). The processing unit 123 can send a plurality of control instructions to the control valve 230 in correspondence with these gesture images, so the gesture sensor 120 can control the control valve 230 according to the gestures of the hand H1. The gesture images may also be formed by ambient light reflected from the hand H1.

The control instructions may comprise a decrement instruction or an increment instruction. Specifically, the control valve 230 can decrease the flow rate of the water flow F2 according to the decrement instruction and increase the flow rate of the water flow F2 according to the increment instruction. The decrement instruction and the increment instruction correspond to different control gestures. For example, in the embodiment shown in Fig. 2A, the control gesture corresponding to the increment instruction may be a palm rotation in the counterclockwise direction C1, and the control gesture corresponding to the decrement instruction may be a palm rotation in the clockwise direction C2.

Following the above, when the gesture sensor 120 detects that the hand H1 makes a palm rotation in the counterclockwise direction C1, the image sensing unit 122 captures the gesture image (counterclockwise palm rotation) from the reflection of the light E1 or of ambient light, and the processing unit 123 sends the increment instruction to the control valve 230 according to this gesture image. The control valve 230 then increases the flow rate of the water flow F2 according to the increment instruction.

When the gesture sensor 120 detects that the hand H1 makes a palm rotation in the clockwise direction C2, the image sensing unit 122 captures the gesture image (clockwise palm rotation) from the reflection of the light E1 or of ambient light, and the processing unit 123 sends the decrement instruction to the control valve 230 according to this gesture image. The control valve 230 then reduces the flow rate of the water flow F2 according to the decrement instruction. The gesture sensor 120 can thus control the control valve 230 by counterclockwise and clockwise palm rotations, thereby increasing and decreasing the flow rate of the water flow F2.

In addition, similarly to the water discharging device 100 described above, the control instructions may also comprise a first flow instruction, a second flow instruction, a first water output instruction and a second water output instruction, each corresponding to a different control gesture, where these control gestures may be any combination of making a fist, opening the palm, waving and other gestures. The image sensing unit 122 can capture various gesture images from these gestures, and the processing unit 123 can send the first flow instruction, the second flow instruction, the first water output instruction or the second water output instruction to the control valve 230 according to these gesture images.

The first flow instruction and the second flow instruction are used to change the flow rate of the water flow F2, and the first water output instruction and the second water output instruction are used to change the water output amount of the water flow F2. Specifically, the control valve 230 can change the flow rate of the water flow F2 to a first flow rate according to the first flow instruction and to a second flow rate according to the second flow instruction, where the first flow rate is greater than the second flow rate. The control valve 230 can also change the water output amount of the water flow F2 to a first output amount according to the first water output instruction and to a second output amount according to the second water output instruction, where the first output amount is greater than the second output amount. In addition, the control valve 230 can set the first output amount and the second output amount by changing how long it keeps the water flow F2 open.

In addition, similarly to the water discharging device 100 described above, the control instructions may also comprise a water-on instruction or a water-off instruction, each corresponding to one of the gesture images. The water-on instruction commands the control valve 230 to open the water flow F2, and the water-off instruction commands the control valve 230 to close the water flow F2. The processing unit 123 can send the water-on instruction or the water-off instruction to the control valve 230 according to the different gesture images. The control valve 230 can open the water outlet 212 according to the water-on instruction so that the water flow F2 starts to flow out, and close the water outlet 212 according to the water-off instruction so that the water flow F2 stops flowing.

When the user wants the water tap 200 to supply water and produce the water flow F2, the user can make the control gesture corresponding to the water-on instruction in front of the gesture sensor 120, so that the gesture sensor 120 controls the control valve 230 to open the water flow F2. When the user wants to turn the tap 200 off and stop the water flow F2, the user can make the control gesture corresponding to the water-off instruction in front of the gesture sensor 120, so that the gesture sensor 120 controls the control valve 230 to close the water flow F2. In this way, the hand H1 of the user can open and close the water tap 200 by gestures without touching it, reducing contact between the hand H1 and germs. In addition, the control gesture corresponding to the water-on instruction may be the same as that of the increment instruction, and the control gesture corresponding to the water-off instruction may be the same as that of the decrement instruction.

In the present embodiment, the gesture sensor 120 can be installed on the tap body 210 and located above the water outlet 212, as shown in Fig. 2A. As can be seen from Fig. 2A, the gesture sensor 120 located above the water outlet 212 is exposed on the upper half of the tap body 210, so the user can easily find the position of the gesture sensor 120, and the hand H1 can conveniently make various gestures above the tap body 210, which facilitates operating the water tap 200.

The water tap 200 may further comprise a display element 240. The display element 240 can be electrically connected to the processing unit 123, or can establish a signal link with the processing unit 123 using a wireless technology such as Bluetooth. The structure of the display element 240 in the present embodiment is substantially the same as that of the display element 140 described above, i.e. the display element 240 may comprise a light-emitting element (not shown in Fig. 2A) and a light-transmitting indicator plate (not shown in Fig. 2A), so the structure of the display element 240 is not described again. In other embodiments, the display element 240 may also be a liquid crystal display or an organic light-emitting diode display, so the display element 240 in Fig. 2A does not necessarily comprise a light-transmitting indicator plate.

The display element 240 has a display surface 242, and the display surface 242 shows, with text and/or patterns, the gestures and functions corresponding to the various instructions (such as the first flow instruction, the second flow instruction, the water-on instruction and the water-off instruction). When the display element 240 comprises a light-transmitting indicator plate, the light-transmitting indicator plate has the display surface 242 and the text and/or patterns can be printed on the display surface 242. When the display element 240 is a liquid crystal display or an organic light-emitting diode display, the display surface 242 can show a picture containing the text and/or patterns.

Fig. 2 B is the schematic diagram that the display element in Fig. 2 A is watched from display surface.Refer to Fig. 2 A and Fig. 2 B, in the present embodiment, the word that display surface 242 can show " waving ", " clenching fist ", " palm rotates clockwise " and " palm rotates counterclockwise " etc. represent gesture, and the word of the function of display these gestures corresponding, i.e. " startup ", " closedown ", " tap of fetching boiling water, add flow " and " closed tap, subtract flow ".

In this way, the user can learn from the display surface 242 that, to start the water tap 200, the hand H1 should make the waving gesture; to close the tap 200, the hand H1 should make a fist; to open the tap or increase the flow rate of the water flow F2, the hand H1 should rotate the palm counterclockwise; and to close the tap or reduce the flow rate of the water flow F2, the hand H1 should rotate the palm clockwise.

In addition, the display element 240 can show the state of the water tap 200, for example whether the water tap 200 is in the activated state or the closed state. As with the water discharging device 100 described above, the activated state means that the control valve 230 has been activated and can accept the control instructions sent by the processing unit 123 (such as the first flow instruction, the first water output instruction and the water-on instruction) to turn the water flow F2 on or off, or to change the flow rate or water output amount of the water flow F2.

The closed state means that the control valve 230 is not activated. When the water tap 200 is in the closed state, the control valve 230 keeps the water outlet 212 closed and the water off until the control valve 230 receives the start instruction sent by the processing unit 123 and is activated. The start instruction corresponds to a start gesture made by the user, and the image sensing unit 122 can capture, from the start gesture, a gesture start image formed by the light R1 or by ambient light. The processing unit 123 can send the start instruction according to the gesture start image, so that the display element 240 shows the activated state and the control valve 230 is activated.

When the water tap 200 is in the closed state, the control valve 230 essentially does not respond to the user's gestures unless the hand H1 makes the start gesture. In addition, the start instruction is used to activate the control valve 230 so that the control valve 230 can accept control from the processing unit 123, whereas the water-on instruction is used to command the control valve 230 to open the water flow F2. The start instruction is therefore not the same as the water-on instruction.

Regarding the closed state, the image sensing unit 122 can further capture, from a close gesture made by the hand H1, a gesture close image formed by reflection of the light E1 or of ambient light, and the processing unit 123 can turn off the display element 240 according to the gesture close image, for example by stopping the light-emitting element of the display element 240 from emitting light. The processing unit 123 can also send a close instruction to the control valve 230 according to the gesture close image, so that the control valve 230 enters the closed state. The start gesture and the close gesture described above may be any combination of making a fist, opening the palm, waving, rotating the palm clockwise, rotating the palm counterclockwise and other gestures.

Fig. 3 is the diagrammatic cross-section of the water swivel of another embodiment of the present invention.Refer to Fig. 3, the water swivel 300 of the present embodiment is similar to aforementioned water swivel 200, and the identical technical characteristic of both water swivels 200 and 300 no longer repeated description in principle.But, water swivel 200 and 300 still has difference therebetween, and it is: in water swivel 300, gesture sensor 120 is positioned at below water delivering orifice 212, and be not positioned at the flow path of current F2, wherein faucet body 210 hides gesture sensor 120, as shown in Figure 3.

Because the tap body 210 shields the gesture sensor 120, the tap body 210 can block part of the ambient light from reaching the image sensing unit 122 of the gesture sensor 120. The tap body 210 can therefore reduce the background light noise entering the image sensing unit 122, improving the sensing accuracy of the gesture sensor 120 and helping to reduce the probability of the water tap 300 being operated incorrectly because of noise.
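Shielding matters because gesture detection often compares the current frame against the background, so background light fluctuations would otherwise appear as spurious foreground. The sketch below shows simple frame differencing under that assumption; the arrays and threshold are illustrative, not taken from the patent.

```python
# Sketch: simple frame differencing; with less background light reaching the
# sensor, the difference image is dominated by the hand rather than by noise.
import numpy as np

def foreground_mask(frame: np.ndarray, background: np.ndarray, threshold: int = 30):
    """Return a boolean mask of pixels that changed noticeably vs. the background."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff >= threshold
```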

In addition, in the present embodiment the gesture sensor 120 can be electrically connected to the control valve 230 and the display element 240 by a wire W2, so that the processing unit 123 can send instructions to the control valve 230 and the display element 240. In other embodiments, the processing unit 123 can establish signal links with the control valve 230 and the display element 240 using a wireless technology such as Bluetooth. Via the wire W2 or the wireless technology, the gesture sensor 120 can therefore also control the control valve 230 and the display element 240.

Besides the water tap 200, the water discharging device 100 may also be a toilet, i.e. the gesture sensor 120 can also be applied to a toilet. Embodiments in which the water discharging device 100 is a toilet are described in detail below with reference to Fig. 4A to Fig. 4C, Fig. 5A and Fig. 5B. The toilets shown in Fig. 4A to Fig. 4C, Fig. 5A and Fig. 5B are similar to the water discharging device 100, and the features they share with the water discharging device 100, such as the way the gesture sensor recognizes gestures, are in principle not described again below.

Fig. 4A is a perspective view of a toilet according to an embodiment of the invention, and Fig. 4B is a cross-sectional view taken along line I-I in Fig. 4A. Referring to Fig. 4A and Fig. 4B, the toilet 400 comprises a water supply unit 410, a gesture sensor 120, a control valve 430 and a toilet bowl 450. The water supply unit 410 provides the water that flushes away excrement and has a water outlet 412. The toilet bowl 450 is connected to the water supply unit 410 and has a flush port 452 and a basin 454. The flush port 452 communicates with the water outlet 412, and the water outlet 412 can deliver a water flow (not shown in Fig. 4A or Fig. 4B) to the flush port 452, so that the water in the water supply unit 410 can flow into the basin 454 via the water outlet 412 and the flush port 452.

The water supply unit 410 has a front face 414a and a back face 414b, where the front face 414a lies between the basin 454 and the back face 414b, and the gesture sensor 120 is arranged on the front face 414a. Thus, when the user uses the toilet 400 to urinate, the user can make a control gesture in front of the gesture sensor 120 so that the gesture sensor 120 can detect the control gesture. The control valve 430 is installed in the water supply unit 410 and controls the water flow.

The gesture sensor 120 can be electrically connected to the control valve 430 by a wire, or can establish a signal link with the control valve 430 using a wireless technology such as Bluetooth. The gesture sensor 120 can therefore send instructions to the control valve 430 to make the control valve 430 open and close the water flow. In the present embodiment, the water supply unit 410 may be a water tank, but in other embodiments the toilet 400 may be a tankless toilet and the water supply unit 410 may be a water pipe, so the water supply unit 410 is not limited to being a water tank.

Gesture sensor 120 comprises light emitting source 121, image sensing unit 122 and processing unit 123.When light emitting source 121 emits beam E1 to hand H1, the various control gesture acquisition various gestures image that image sensing unit 122 can be done from hand H1, wherein these gesture images are the light E1 after namely being reflected by light R1() formed.Processing unit 123 can send steering order to operation valve 430 according to these gesture image correspondences, and wherein steering order comprises the first flushing instruction or the second flushing instruction.In addition, gesture sensor 120 can not comprise light emitting source 121, and above-mentioned gesture image can be formed by the extraneous light reflected from hand H1.

The control valve 430 can set the water output amount of the water flow to a first output amount according to the first flushing instruction and to a second output amount according to the second flushing instruction, where the first output amount is greater than the second output amount, and the control valve 430 can control the water output amount by how long it keeps the water flow open.

When the user uses the toilet 400 to urinate, since the gesture sensor 120 is arranged on the front face 414a, the hand H1 can make the control gesture corresponding to the second flushing instruction in front of the gesture sensor 120 to produce a water flow with a small output amount and thereby save water. When the user uses the toilet 400 to defecate, the hand H1 can make the control gesture corresponding to the first flushing instruction in front of the gesture sensor 120 to produce a water flow with a large output amount and thereby ensure that the excrement is flushed away. Each control gesture may be making a fist, opening the palm, waving, rotating the palm clockwise, rotating the palm counterclockwise or another gesture.

The toilet 400 may further comprise a display element 440, and the display element 440 can show whether the toilet 400 is in the activated state or the closed state. The display element 440 is electrically connected to the processing unit 123, or can establish a signal link with the processing unit 123 using a wireless technology such as Bluetooth. The processing unit 123 can therefore send instructions to the display element 440 to control the display element 440. In addition, the processing unit 123 of the present embodiment can also send a start instruction or a close instruction, where the start instruction and the close instruction are produced in the same way as in the previous embodiments and are therefore not described again.

Fig. 4 C is the diagrammatic cross-section that in Fig. 4 B, II-II section along the line illustrates.Refer to Fig. 4 B and Fig. 4 C, display element 440 comprises illuminating part 442 and instruction light-passing board 444.Illuminating part 442 is such as light emitting diode or cathode fluorescent tube, and indicates light-passing board 444 to can be polymethyl methacrylate base plate or the glass plate of printing opacity, and has operation screen.Operation screen can show gesture corresponding to various instruction (such as the first flushing instruction and the second flushing instruction) and function, and operation screen available word and/or pattern show.When display element 440 shows operation screen, illuminating part 442 is luminous towards instruction light-passing board 444, to make instruction light-passing board 444 shinny.

Fig. 5 A is the diagrammatic cross-section of the toilet of another embodiment of the present invention, and Fig. 5 B is the circuit box schematic diagram of the toilet in Fig. 5 A.Refer to Fig. 5 A and Fig. 5 B, the toilet 500 of the present embodiment is similar in appearance to toilet 400, and the identical technical characteristic of both toilets 400 and 500 no longer repeated description in principle.But toilet 400 and 500 still has difference therebetween, it is that toilet 500 also comprises control module 542.

Specifically, the control module 542 is, for example, a processor and is electrically connected to the gesture sensor 120, where the control module 542, a display element 541 and the gesture sensor 120 can be integrated into an operation panel 54, as shown in Fig. 5A. The user can thus operate the operation panel 54 by gestures and thereby control the control valve 430 to produce water flows with different output amounts.

In addition, the toilet 500 may further comprise a heating cushion 560, which is installed on the toilet bowl 450 and electrically connected to the control module 542. The image sensing unit 122 can capture a gesture temperature-control image from a temperature-control gesture made by the user, and the processing unit 123 can send an instruction to the control module 542 according to the gesture temperature-control image, so that the control module 542 can control the temperature of the heating cushion 560 according to the gesture temperature-control image. The temperature-control gesture may be any combination of making a fist, opening the palm, waving, rotating the palm clockwise, rotating the palm counterclockwise and other gestures.

It is worth mentioning that, in other embodiments, the toilet 500 may also comprise another gesture sensor 120, i.e. the toilet 500 may include at least two gesture sensors 120. This additional gesture sensor 120 can be arranged on the front face 414a of the water supply unit 410 (see Fig. 4B), and the toilet lid of the toilet 500, for example the heating cushion 560, can be fitted with a changeover switch (not shown).

When the user uses the toilet 500 to urinate, the toilet lid can be lifted, which triggers the changeover switch. The changeover switch then activates the gesture sensor 120 on the water supply unit 410 and closes the gesture sensor 120 of the operation panel 54 or puts it into a sleep state. The user can thus make the control gesture corresponding to the second flushing instruction to the front gesture sensor 120, saving the amount of water the toilet 500 uses for flushing.

When the user uses the toilet 500 to defecate, the toilet lid can be lowered onto the toilet bowl 450, which triggers the changeover switch. The changeover switch then activates the gesture sensor 120 of the operation panel 54 and closes the gesture sensor 120 on the water supply unit 410 or puts it into a sleep state. The user sitting on the toilet 500 can thus make the control gesture corresponding to the first flushing instruction to the operation panel 54, producing a water flow with a large output amount and ensuring that the excrement is flushed away. In addition, the gesture sensor 120 in Fig. 5A can be moved to the front face 414a of the water supply unit 410 (see Fig. 4B), and the toilets 400 and 500 shown in Fig. 4A to Fig. 4C, Fig. 5A and Fig. 5B are merely illustrative and do not limit the invention.
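A minimal sketch of the lid-triggered changeover described in these two paragraphs follows; the sensor names and the boolean lid state are assumptions made for illustration.

```python
# Sketch: the changeover switch selects which of the two gesture sensors is
# active depending on whether the toilet lid is raised or lowered.
def active_sensor(lid_raised: bool) -> str:
    # lid raised  -> sensor on the front face of the water supply unit
    # lid lowered -> sensor integrated in the operation panel 54
    return "front_sensor" if lid_raised else "panel_sensor"

def sleeping_sensor(lid_raised: bool) -> str:
    return "panel_sensor" if lid_raised else "front_sensor"
```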

Fig. 6 A is the simplified diagram of the display device of one embodiment of the invention.Refer to Fig. 6 A, display device 600 can allow user operate in a non-contact manner, and namely user can carry out operation display device 600 without touching switch or telepilot.Display device 600 can be furnished and should not to be caught the environment that tactile mode operates user, and such as kitchen, bathroom or hospital, therefore display device 600 can be a bathroom television, a kitchen TV or a Medical Devices TV.

A bathroom television can be placed in a bathroom and can operate in a humid environment such as a bathroom, having better water resistance and moisture resistance than an ordinary television. A kitchen television can be placed in a kitchen and can operate in an environment of high temperature and cooking fumes. A medical-equipment television can be the display of a therapeutic or diagnostic instrument; for example, it can serve as the display screen of an endoscope, a magnetic resonance imaging (MRI) machine, a computed tomography (CT) scanner or a helical tomotherapy machine.

Taking the kitchen television as an example, a user in the kitchen may be handling ingredients or cooking, so that both hands are covered with oil, water, or food (such as flour or meat), making it inconvenient to touch the television directly or to operate it with a remote control. The display device 600 of the embodiment of the invention allows the user to operate the display device without touching it.

Fig. 6 B is the circuit box schematic diagram of display device in Fig. 6 A.Refer to Fig. 6 A and Fig. 6 B.Display device 600 comprises display unit 610 and gesture sensor 120.Display unit 610 receivable channel signal and display frame, and can be in fact televisor, be such as LCD TV, plasm TV, organic light emission TV or cathode-ray tube TV (Cathode Ray Tube TV, CRT TV).In addition, above-mentioned channel signals comprises sound signal and vision signal.

The gesture sensor 120 can control the display unit 610; with the gesture sensor 120, the user can operate the television without touching the television set (i.e., the display unit 610) or a remote control. Specifically, the display unit 610 may comprise a controller 620 and a display element 640 electrically connected to the controller 620. The controller 620 comprises a motherboard and electronic components mounted on the motherboard. The display element 640 can display images and may have pixels; the display element 640 is, for example, a liquid crystal module (LCM), an organic light-emitting diode panel, or a plasma display panel.

The gesture sensor 120 and the display unit 610 are linked by signals. For example, the processing unit 123 of the gesture sensor 120 can be electrically connected to the controller 620 of the display unit 610 by wires or a circuit board. Alternatively, the processing unit 123 and the controller 620 may both have wireless transceiver modules and can use them to establish a wireless link with each other, where the wireless transceiver module is, for example, an infrared transceiver module or a Bluetooth transceiver module.

Accordingly, the gesture sensor 120 can detect the various gestures made by the user's hand H1 and send corresponding control signals to the controller 620 according to these gestures, so as to instruct the controller 620 to control the display element 640. The method by which the gesture sensor 120 recognizes gestures has been explained in the foregoing embodiments and is not repeated here.

When the user's hand H1 makes various control gestures, such as clenching into a fist, opening the palm, or waving, or when the palm rotates along the counterclockwise direction C1 or the clockwise direction C2, the light E1 is reflected by the hand H1 as light R1, and the image sensing unit 122 can receive the light R1 and capture images from it. In this way, the image sensing unit 122 can capture various gesture images from the various control gestures made by the hand H1, and these gesture images are formed by the reflection of the light E1 (i.e., the light R1).

It should be noted that, in the present embodiment, the gesture sensor 120 comprises the light emitting source 121 capable of emitting the light E1, but in other embodiments the gesture sensor 120 may omit the light emitting source 121 and the image sensing unit 122 may capture images of the hand H1 directly. Specifically, the hand H1 can reflect ambient light, coming for example from an indoor lamp or from the sun outdoors. The image sensing unit 122 can capture images from the ambient light reflected by the hand H1 and can thus likewise capture various gesture images from the various control gestures of the hand H1. The gesture images may therefore also be formed by ambient light and are not limited to being formed only by the reflection of the light E1 (i.e., the light R1).

The gesture control signal sent by the processing unit 123 may comprise multiple instructions. In one embodiment, the control signal comprises a turn-on instruction and a turn-off instruction, which correspond to two different gesture images. The turn-on instruction instructs the controller 620 to turn on the display element 640, and the turn-off instruction instructs the controller 620 to turn off the display element 640. Specifically, the controller 620 has a switch module (not illustrated), and the controller 620 controls whether power is supplied to the display element 640 through the switch module.

When the display unit 610 is in the off state, that is, the display device 600 still receives power from an external power source but the display element 640 is not powered, the user can make the control gesture corresponding to the turn-on instruction in front of the gesture sensor 120 so that the image sensing unit 122 captures this control gesture image. The processing unit 123 then sends the turn-on instruction to the controller 620 according to this control gesture image, so that the display element 640 is powered and thereby turned on.

When the display unit 610 is in the on state, for example while the display device 600 is showing images, the user can make the control gesture corresponding to the turn-off instruction with the hand H1. The processing unit 123 then sends the turn-off instruction to the controller 620 according to the control gesture image captured by the image sensing unit 122, so as to shut down the display unit 610.

In other embodiments, the gesture control signal sent by the processing unit 123 also comprises channel switching instructions. The controller 620 can switch the channel received by the display unit 610 according to a channel switching instruction. The channel switching instructions are, for example, a channel-up instruction, a channel-down instruction, or a direct channel-switching instruction, and each corresponds to a different control gesture. For example, opening the hand H1 and moving it upward corresponds to the channel-up instruction, while opening the hand H1 and moving it downward corresponds to the channel-down instruction. If the user shows a set of digits with the fingers, this represents switching directly to the channel of that number.

That is, the controller 620 can, according to the channel-up instruction, switch the reception band sequentially from the current band to a higher band, and can, according to the channel-down instruction, switch the reception band sequentially from the current band to a lower band. Specifically, the controller 620 has a receiver module (not shown). When the user makes the gesture corresponding to one of the channel switching instructions, the processing unit 123 sends that channel switching instruction to the controller 620 to control the receiver module to switch the reception band.

In another embodiment, the gesture control signal may comprise audio instructions. The processing unit 123 sends an audio instruction to instruct the controller 620 to adjust the volume of the display unit 610. The audio instructions are, for example, a volume-up instruction and a volume-down instruction, each corresponding to a different control gesture. For example, the open hand H1 rotating counterclockwise corresponds to the volume-up instruction, and the open hand H1 rotating clockwise corresponds to the volume-down instruction. The processing unit 123 recognizes the instruction corresponding to the user's gesture and then instructs the controller 620 to turn the volume of the display unit 610 up or down. Specifically, the display unit 610 has a loudspeaker 630 electrically connected to the controller 620, and after the controller 620 receives the audio instruction sent by the processing unit 123, it adjusts the volume by controlling the loudspeaker 630.
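
To make the mapping between gesture images and the control instructions discussed in the last few paragraphs easier to follow, here is a hedged Python sketch of a dispatcher that turns a recognized gesture label into an instruction for the controller. The gesture names and instruction strings are illustrative assumptions; the patent fixes only the idea of the mapping, not its exact contents.

```python
# Illustrative dispatcher: recognized gesture label -> control instruction for
# the display controller. All labels below are assumptions for illustration.

GESTURE_TO_INSTRUCTION = {
    "open_palm_move_up": "channel_up",
    "open_palm_move_down": "channel_down",
    "palm_rotate_counterclockwise": "volume_up",
    "palm_rotate_clockwise": "volume_down",
    "open_palm": "power_on",
    "fist": "power_off",
}

def dispatch(gesture, finger_count=None):
    """Map a gesture (or a number shown with the fingers) to a controller instruction."""
    if finger_count is not None:
        # Showing a number with the fingers switches directly to that channel.
        return ("switch_to_channel", finger_count)
    return ("instruction", GESTURE_TO_INSTRUCTION.get(gesture, "ignore"))

print(dispatch("palm_rotate_counterclockwise"))  # ('instruction', 'volume_up')
print(dispatch(None, finger_count=7))            # ('switch_to_channel', 7)
```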

In the foregoing embodiments, the processing unit 123 and the controller 620 maintain a signal link at all times. In other embodiments, however, the signal link between the processing unit 123 and the controller 620 is started only after the image sensing unit 122 captures a start-connection gesture image of the user, and the processing unit 123 can cut off the signal link to the controller 620 after the image sensing unit 122 captures a stop-connection gesture image of the user.

That is, before a signal link is established between the processing unit 123 and the controller 620, the display device 600 essentially does not react to the user's gestures, but the user can still operate the display device 600 in conventional ways, for example by touching it or by using a remote control. Specifically, when the user wants to control the operation of the display device 600 with gestures, the user makes the start-connection gesture. After the image sensing unit 122 captures the start-connection gesture image, the processing unit 123 transmits an enable instruction according to this image and thereby establishes a signal link with the controller 620. When the user wants to go back to operating the display device 600 in the conventional way, the user makes the stop-connection gesture. After the image sensing unit 122 captures the stop-connection gesture image, the processing unit 123 interrupts the signal link with the controller 620.
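
The start/stop link behaviour amounts to a simple gate in front of the gesture dispatcher: gestures are ignored until a start-connection gesture is seen, and ignored again after a stop-connection gesture. A minimal sketch follows, assuming the open-palm/fist labels used as examples later in this embodiment.

```python
# Minimal gating sketch: gestures only reach the controller while the link is up.
# "open_palm" as the start gesture and "fist" as the stop gesture are assumptions
# taken from the example mapping mentioned later in this embodiment.

class GestureLink:
    def __init__(self):
        self.linked = False

    def handle(self, gesture):
        if not self.linked:
            if gesture == "open_palm":       # start-connection gesture (assumed)
                self.linked = True
                return "link_established"
            return "ignored"                 # device reacts only to conventional input
        if gesture == "fist":                # stop-connection gesture (assumed)
            self.linked = False
            return "link_terminated"
        return f"forward_to_controller:{gesture}"

link = GestureLink()
print(link.handle("wave"))        # ignored
print(link.handle("open_palm"))   # link_established
print(link.handle("wave"))        # forward_to_controller:wave
```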

The display device 600 may further comprise an indicator element 650. The indicator element 650 is electrically connected to the processing unit 123 or establishes a signal connection with it using wireless technology (such as Bluetooth). The indicator element 650 can be used to display the connection state between the controller 620 and the processing unit 123. That is, after the image sensing unit 122 captures the start-connection gesture image made by the user, the processing unit 123 establishes the signal link with the controller 620 according to this image and orders the indicator element 650 to display an active state. After the image sensing unit 122 captures the stop-connection gesture image made by the user, the processing unit 123 interrupts the signal link with the controller 620 and turns off the indicator element 650.

In the embodiment of Fig. 6 A, indicator elment 650 can comprise pilot lamp, and its pilot lamp is such as light emitting diode (LED) or cathode fluorescent tube (Cold Cathode Fluorescent Lamp, CCFL).For example, when setting up signal link between processing unit 123 and controller 620, pilot lamp is luminous, to show starting state.And between processing unit 123 and controller 620 during look-at-me line, pilot lamp stops luminous, and show line closed condition.

In addition, in other embodiments, the gestures and functions corresponding to the various instructions mentioned above (such as the enable instruction, the stop-connection instruction, the channel switching instructions, the audio instructions, the turn-on instruction, and the turn-off instruction) can be shown on the display element 640 using on-screen display (OSD) technology, so as to explain how to operate the display device 600 with gestures.

For example, the display element 640 can show text and/or graphics indicating that the gesture of rotating the palm counterclockwise corresponds to turning the volume up and that the gesture of rotating the palm clockwise corresponds to turning the volume down. In this way, the user can operate the display device 600 according to the content shown by the display element 640.

In addition, in the present embodiment, the start-connection gesture image corresponding to the enable instruction can be an open palm, and the stop-connection gesture image corresponding to the stop-connection instruction can be a clenched fist. Accordingly, the surface of the display element 640 can carry text and/or graphics indicating that an open palm corresponds to the start-connection gesture function and that a clenched fist corresponds to the stop-connection gesture function. It should be noted that the start-connection gesture, the stop-connection gesture, the turn-on gesture, the turn-off gesture, and the channel-switching gestures described above can each be a clenched fist, an open palm, a wave, a clockwise palm rotation, a counterclockwise palm rotation, or any combination of these and other gestures. The present embodiment does not limit the actions of these control gestures or the functions corresponding to them.

It is worth mentioning that the display device can also be a satellite navigation device. The embodiment in which the display device is a satellite navigation device is described in detail below with Fig. 7A and Fig. 7B. The satellite navigation device shown in Fig. 7A and Fig. 7B has technical features similar to those of the previous embodiments, so repeated content, such as the method by which the gesture sensor recognizes gestures, is in principle not described again below.

Fig. 7 A is the simplified diagram of the Satellite Navigation Set of one embodiment of the invention.Fig. 7 B is the circuit box schematic diagram of Fig. 7 A Satellite guider.In the present embodiment, Satellite Navigation Set 700 comprises display element 710, controller 720 and gesture sensor 120.

The display element 710 can display images and may have pixels; the display element 710 is, for example, a liquid crystal module (LCM), an organic light-emitting diode panel, or a plasma display panel.

The controller 720 is installed inside or outside the display element 710 and establishes a signal link with the display element 710 so that map data and coordinate data can be sent to the display element 710 for display. Referring to Fig. 7B, the controller 720 comprises a position receiver module 721, a database 722, and a signal processing unit 723. The database 722 stores at least one set of map data. The position receiver module 721 can be a global positioning system (GPS) receiver for receiving at least one satellite signal.

The signal processing unit 723 establishes signal links with the position receiver module 721, the database 722, and the display element 710. Specifically, after receiving the satellite signal received by the position receiver module 721, the signal processing unit 723 performs signal processing to convert the satellite signal into coordinate data. This coordinate data generally refers to the coordinates of the location of the satellite navigation device 700. As the vehicle driven by the user moves with the satellite navigation device 700, the position receiver module 721 continuously receives satellite signals and passes them to the signal processing unit 723, so that the coordinate data is continuously updated. In addition, the signal processing unit 723 retrieves the map data and sends the map data together with the coordinate data to the display element 710 for display.

After the user inputs a destination address, the signal processing unit 723 receives the input data and calculates target coordinate data corresponding to the destination address. The signal processing unit 723 also calculates at least one set of path data according to the target coordinate data, the coordinate data, and the map data. The path data comprises the coordinate points that may be passed through from the coordinate data to the target coordinate data, and the line connecting these coordinate points is the path from the user's location to the destination address.

The signal processing unit 723 can also control the display element 710 to present the map data, the coordinate data, and the path data. In one embodiment, the signal processing unit 723 can control the display element 710 to display the map data, coordinate data, and path data in different display modes, for example a planar (2D) map display or a stereoscopic (3D) map display, and these display modes can be switched between each other.

That is, the display element 710 can display the map data, coordinate data, and path data in different display modes. In the embodiment of the invention, the gesture sensor 120 can instruct the controller 720 according to the gestures made by the user's hand H1, so that the display mode in which the display element 710 presents the map data, coordinate data, and path data is controlled by the gestures of the hand H1. This is described in detail below.

Please refer to Fig. 7 A and Fig. 7 B, in the present embodiment, gesture sensor 120 comprises light emitting source 121, image sensing unit 122 and processing unit 123.The signal processing unit 723 that processing unit 123 can utilize wire to send a telegram here in connection control device 720, but in other embodiments, processing unit 123 can utilize wireless technology, such as Bluetooth technology, sets up signal be connected with controller 720.Utilize wire or wireless technology, gesture sensor 120 can send gesture control signal and carry out instruction control unit 720.

The light emitting source 121 can emit the light E1 toward the user's hand H1, and the light E1 forms the light R1 after being reflected by the hand H1. Using the light E1, the image sensing unit 122 can capture various gesture images from the various control gestures made by the hand H1; these control gestures can include clenching a fist, opening the palm, waving, rotating the palm clockwise, rotating the palm counterclockwise, or other gestures, and the gesture images are formed by the reflection of the light E1 (i.e., the light R1). In other embodiments, the gesture images can also be formed by ambient light reflected from the hand H1.

The processing unit 123 can send multiple gesture control signals to the signal processing unit 723 of the controller 720 according to these gesture images, and the signal processing unit 723 of the controller 720 then controls the display element 710 to change the display mode of the map data and the coordinate data according to the gesture control signals.

Specifically, the gesture control signals can comprise a first switching instruction and a second switching instruction, where the first switching instruction switches the planar map display to the stereoscopic map display and the second switching instruction switches the stereoscopic map display back to the planar map display. The first switching instruction and the second switching instruction correspond to different gestures: for example, the gesture corresponding to the first switching instruction is showing two fingers, and the gesture corresponding to the second switching instruction is showing three fingers. When the processing unit 123 recognizes the gesture corresponding to the first switching instruction, it transmits the first switching instruction to the controller 720 so that the map data is shown in the stereoscopic display mode, and when the processing unit 123 recognizes the gesture corresponding to the second switching instruction, it transmits the second switching instruction to the controller 720 so that the display mode is switched back to the planar map display.

In addition, the gesture control signals may further comprise a zoom-in instruction, a zoom-out instruction, and a move instruction, each corresponding to a different control gesture. For example, the gesture corresponding to the zoom-in instruction is the thumb and forefinger gradually opening from a closed state, and the gesture corresponding to the zoom-out instruction is the thumb and forefinger gradually closing from an open state. After the image sensing unit 122 captures the user's gesture image, the processing unit 123 sends the corresponding gesture control signal to the controller 720 to control the map data presented on the display element 710 to be locally enlarged, reduced, or moved.
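
A compact way to picture the map-control signals described in the last two paragraphs is a state holder for the current display mode and zoom level. The sketch below assumes the two-finger/three-finger mode gestures and pinch-style zoom gestures mentioned above; the zoom factor is an illustrative assumption.

```python
# Illustrative map-display state for the navigation embodiment. The gesture
# labels and the 1.25 zoom factor are assumptions; only the existence of the
# first/second switching, zoom-in, and zoom-out instructions comes from the text.

class MapDisplayState:
    def __init__(self):
        self.mode = "2d"     # planar (2D) display by default
        self.zoom = 1.0

    def apply(self, gesture):
        if gesture == "two_fingers":        # first switching instruction: 2D -> 3D
            self.mode = "3d"
        elif gesture == "three_fingers":    # second switching instruction: 3D -> 2D
            self.mode = "2d"
        elif gesture == "pinch_open":       # zoom-in instruction
            self.zoom *= 1.25
        elif gesture == "pinch_close":      # zoom-out instruction
            self.zoom /= 1.25
        return (self.mode, round(self.zoom, 3))

state = MapDisplayState()
print(state.apply("two_fingers"))   # ('3d', 1.0)
print(state.apply("pinch_open"))    # ('3d', 1.25)
```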

In addition, the user can also control the turning on and off of the satellite navigation device 700 by gestures. Specifically, the user can make a turn-on gesture and a turn-off gesture. The turn-on gesture is, for example, opening and closing the palm three times in succession and then holding the palm open for about three seconds, and the turn-off gesture is, for example, opening and closing the palm three times in succession and then holding the fist for about three seconds. When the gesture image captured by the image sensing unit 122 is the turn-on gesture image, the processing unit 123 sends a turn-on instruction to the controller 720, and the controller 720 turns on the satellite navigation device 700 according to the turn-on instruction. When the gesture image captured by the image sensing unit 122 is the turn-off gesture image, the processing unit 123 sends a turn-off instruction to the controller 720, and the controller 720 turns off the satellite navigation device 700 according to the turn-off instruction.
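
Because the turn-on and turn-off gestures described here are temporal patterns (the palm opened and closed three times, then held for about three seconds), recognizing them needs a short history of palm states with timestamps. The following Python sketch is one plausible way to detect such a pattern; the three cycles and the three-second hold come from the paragraph above, while the frame format and everything else are assumptions.

```python
# Sketch of a temporal detector for the on/off gestures described above:
# three open/close cycles of the palm followed by holding one state ~3 seconds.
# The frame format (timestamp, palm_open flag) and thresholds are assumptions.

def detect_power_gesture(frames, hold_seconds=3.0, required_cycles=3):
    """frames: list of (timestamp, palm_open) ordered in time.
    Returns "power_on", "power_off", or None."""
    if not frames:
        return None
    # Count open<->close transitions; two transitions make one full cycle.
    transitions = sum(1 for a, b in zip(frames, frames[1:]) if a[1] != b[1])
    cycles = transitions // 2
    # Measure how long the final palm state has been held.
    last_state = frames[-1][1]
    hold_start = frames[-1][0]
    for t, state in reversed(frames):
        if state != last_state:
            break
        hold_start = t
    held = frames[-1][0] - hold_start
    if cycles >= required_cycles and held >= hold_seconds:
        return "power_on" if last_state else "power_off"
    return None

frames = [(0.0, True), (0.5, False), (1.0, True), (1.5, False),
          (2.0, True), (2.5, False), (3.0, True), (6.2, True)]
print(detect_power_gesture(frames))  # power_on
```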

The satellite navigation device 700 of the embodiment of the invention can comprise a microphone 701 electrically connected to the controller 720. When the microphone 701 is turned on, the user can input the destination address through the microphone 701. After receiving the data input by the user, the controller 720 calculates the target coordinate data corresponding to the destination address. In the embodiment of the invention, the user can also use a gesture to turn on the microphone 701 in order to input the destination address.

Specifically, the image sensing unit 122 captures a control gesture image, and the processing unit 123 sends a voice input instruction or an end instruction to the controller 720 according to the gesture image. That is, the gesture control signals sent by the processing unit 123 can also comprise a voice input instruction and an end instruction, which correspond to different control gestures. When the controller 720 receives the voice input instruction sent by the processing unit 123, the controller 720 turns on the microphone 701 according to the voice input instruction to receive the voice information input by the user, and when the controller 720 receives the end instruction, it turns off the microphone 701 according to the end instruction. It should be noted that although the user inputs the destination address through the microphone 701 in the present embodiment, the user can also input the destination address through other interfaces, such as a touch panel, and is not limited to the microphone 701 of the present embodiment.

In other embodiments, the satellite navigation device 700 comprises a loudspeaker 702 for emitting audio signals. The loudspeaker 702 is electrically connected to the controller 720. It should be noted that the loudspeaker 702 and the microphone 701 can be separate, independent elements or can be integrated into an audio transceiver module. In the present embodiment, the user can turn the voice navigation service on or off by gestures.

Specifically, after the image sensing unit 122 captures the voice-on gesture made by the user, the processing unit 123 sends a voice-on instruction to the controller 720. The controller 720 converts the path data into audio information and controls the loudspeaker 702 to emit voice according to the audio information, guiding the user to drive along the path data. After the image sensing unit 122 captures the voice-off gesture made by the user, the processing unit 123 sends a voice-off instruction to the controller 720, and the controller 720 stops transmitting audio data to the loudspeaker 702. In other words, the gesture control signals sent by the processing unit 123 can also comprise a voice-on instruction and a voice-off instruction, which correspond to different control gestures.

The satellite navigation device 700 of the present embodiment also comprises an indicator element 740. The indicator element 740 can be an indicator lamp, and it is electrically connected to the processing unit 123 or establishes a signal connection with it using wireless technology (such as Bluetooth). The indicator element 740 can be used to display whether the controller 720 has established a signal link with the processing unit 123. That is, after the image sensing unit 122 captures the start-connection gesture image made by the user, the processing unit 123 establishes the signal link with the controller 720 according to this image and orders the indicator element 740 to display an active state.

In one embodiment, the signal link between the processing unit 123 and the controller 720 is started only after the image sensing unit 122 captures a start-connection gesture image of the user. That is, before a signal link is established between the processing unit 123 and the controller 720, the satellite navigation device 700 essentially does not react to the user's gestures, but the user can still operate the satellite navigation device by other means.

When the user wants to stop operating the satellite navigation device 700 by gestures, the user can make the stop-connection gesture. After the image sensing unit 122 captures the stop-connection gesture image, the processing unit 123 can interrupt the signal link with the controller 720 and order the indicator element 740 to turn off.

In the present embodiment, the gestures and functions corresponding to the various instructions mentioned above (such as the enable instruction, the zoom-in instruction, the zoom-out instruction, the voice input instruction, and the end instruction) can be shown on the display element 710 using on-screen display (OSD) technology, so as to explain how to operate the satellite navigation device 700 with gestures. Based on the above, while driving the vehicle, the user can control the operations of the satellite navigation device, such as inputting an address, zooming the map in or out, or turning on voice navigation, without the hand having to touch the device directly.

The device with a gesture sensor of the embodiment of the invention can also be a golf practice assisting device. The golf practice assisting device of the present embodiment is described in detail below with Fig. 8A, Fig. 8B, and Fig. 8C. Fig. 8A shows a simplified schematic diagram of the golf practice assisting device of an embodiment of the invention, Fig. 8B shows a user image captured by the image sensing unit of Fig. 8A, and Fig. 8C shows a circuit block diagram of the golf practice assisting device in Fig. 8A. The golf practice assisting device of the embodiment of the invention uses a gesture sensor to help the user adjust his or her motion.

The golf practice assisting device 800 comprises an exercising machine 810, a gesture sensor 820, and an indicating member 830. A signal link is established between the gesture sensor 820 and the indicating member 830.

The exercising machine 810 can comprise a practice lane 811 and a practice ball 812, where the practice lane 811 is designed to simulate the conditions of a golf course. The user can practice putting or swinging at the practice ball 812 placed on the practice lane 811. The practice lane 811 can be a swing practice mat or a putting practice mat. Fig. 8A shows the user standing on the practice lane 811, holding a club with the hand H2 and aiming at the practice ball 812 to practice a swing. During the user's swing, the hand H2 moves along a movement track T, and at the moment the club strikes the practice ball 812, the user's hand H2 is exactly at the lowest point of the movement track T. For a user practicing the swing or the putt, whether the striking posture is correct greatly influences the result of the stroke.

Please refer to Fig. 8 B, gesture sensor 820 comprises image sensing unit 821 and processing unit 823.Processing unit 823 is set up signal with image sensing unit 821 and indicating member 830 and is linked.When user's action, image sensing unit 821 captures at least one user's image.And aforesaid user's image is the user's image when the hand H2 of user is in movement locus T extreme lower position.User's image can present user side, and comprises at least one hand image and a leg image.Aforesaid hand image can comprise the image of palm and upper arm, and aforesaid leg image can comprise the image of thigh and shank.

The gesture sensor 820 can also comprise a light emitting source 825. The light emitting source 825 emits light toward the user and is electrically connected to the processing unit 823; the image sensing unit 821 is adjacent to the light emitting source 825, and the user image is formed by the reflection of this light. In one embodiment, the light is invisible light. In other embodiments, the light can also be sunlight or an indoor light source.

The processing unit 823 receives the user image data and identifies the hand image data and the leg image data in order to analyze them. Specifically, the processing unit 823 defines a first axis L1 from the hand image data and defines a second axis L2 from the leg image data. An angle θ is formed between the first axis L1 and the second axis L2, and this angle θ corresponds, in effect, to the angle between the user's arm and leg at the moment of impact.
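
Since the processing unit derives the angle θ from two axes fitted to the hand and leg image data, the core computation is just the angle between two 2-D direction vectors. A minimal sketch follows, assuming each axis is given by two endpoint coordinates in the image; the endpoint representation is an assumption, as the patent only states that the axes are defined from the image data.

```python
# Minimal sketch: angle (in degrees) between the first axis L1 (hand) and the
# second axis L2 (leg), each described by two image-coordinate endpoints.

import math

def axis_angle_deg(axis1, axis2):
    (x1a, y1a), (x1b, y1b) = axis1
    (x2a, y2a), (x2b, y2b) = axis2
    v1 = (x1b - x1a, y1b - y1a)
    v2 = (x2b - x2a, y2b - y2a)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

# Example: hand axis tilted 45 degrees, leg axis vertical.
print(round(axis_angle_deg(((0, 0), (1, 1)), ((0, 0), (0, 1))), 1))  # 45.0
```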

The processing unit 823 has at least one built-in numerical range, which represents the range of the angle θ under a standard swing motion. In other embodiments, the processing unit 823 can have several different built-in numerical ranges, each corresponding to a different set of conditions. For example, when the user practices the swing, the numerical range is approximately 10 to 170 degrees, and when the user practices the putt, the numerical range is approximately 2 to 85 degrees. In addition, the numerical range can also be set according to the user's height and the kind of club used during practice.

In one embodiment, the gesture sensor 820 can also comprise a display 824, which can be a liquid crystal display or a touch panel. The processing unit 823 can present the aforementioned conditions, such as height, club kind, and exercise kind, as options on the display 824 for the user to choose from. In one embodiment, the user selects the conditions by gesture before starting practice. For example, the display 824 shows two options, swing and putt, and the two options correspond to different gestures: the gesture corresponding to the swing option is extending one finger, and the gesture corresponding to the putt option is extending two fingers. After the image sensing unit 821 captures the gesture made by the user, the image is sent to the processing unit 823. The processing unit 823 recognizes the user's gesture image to determine the conditions input by the user and computes the applicable numerical range. After the processing unit 823 calculates the angle θ, it judges, according to the defined numerical range, whether the angle θ falls within the range; when the angle θ does not fall within the numerical range, the processing unit 823 can send an indicator signal to the indicating member 830 to remind the user.
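
Putting the built-in numerical ranges and the range check together, the decision the processing unit makes can be sketched as follows. The 10 to 170 degree and 2 to 85 degree ranges are the example values given above; the function names and return values are illustrative assumptions.

```python
# Sketch of the range check described above. The two ranges are the example
# values given in the text; the return values are assumptions for illustration.

ANGLE_RANGES_DEG = {
    "swing": (10.0, 170.0),   # example range for swing practice
    "putt": (2.0, 85.0),      # example range for putting practice
}

def check_posture(theta_deg, practice_type):
    low, high = ANGLE_RANGES_DEG[practice_type]
    if low <= theta_deg <= high:
        return "green_light"            # posture within the standard range
    return "red_light_and_prompt"       # indicator signal sent to remind the user

print(check_posture(95.0, "putt"))   # red_light_and_prompt
print(check_posture(95.0, "swing"))  # green_light
```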

The indicating member 830 can be an indicator lamp and/or a loudspeaker. The indicator lamp can be one or more LEDs of different colors, used to show the measurement result or to give an alert with correction instructions. When the angle θ falls within the numerical range, the user's striking posture is correct and the indicator lamp shows a green light; when the angle θ falls outside the numerical range, the user's posture deviates too much at the moment of impact and the indicator lamp shows a red light. In addition, the loudspeaker can emit various prompts related to the detection result, for example instructing the user by voice how to adjust the posture, or emitting a prompt tone.

In other embodiments, the golf practice assisting device 800 can be used to measure the speed of the user's swing. Specifically, the image sensing unit 821 can continuously capture multiple user images at different time points and send them to the processing unit 823. These user images include the image of the user when the hand H2 is at the lowest position of the movement track. These user images can present the user's profile and comprise at least one hand image.

After the processing unit 823 receives the user image data, it identifies the hand image data. The processing unit 823 can calculate the relative distance between the hand H2 and the gesture sensor 820 according to the size of the area occupied by the hand image in the user image. Specifically, the processing unit 823 can also comprise a database, and the database stores a reference table that records the relation between the area occupied by the hand image in the user image and the relative distance. Therefore, after the processing unit 823 analyzes the size occupied by each of these hand images in the user images, it can consult the reference table and learn the relative distance between the user's hand H2 and the gesture sensor 820 at different time points. The processing unit 823 can then calculate a striking speed of the user from the change of the hand image size over time, in particular the instantaneous speed of the hand when the hand H2 moves to the lowest point of the movement track.
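
The distance-from-size lookup and the speed estimate described in this paragraph can be illustrated with a small reference table and a finite difference over time. The table entries, the interpolation, and the frame timestamps below are assumptions; only the idea of relating hand-image area to distance and differentiating it over time comes from the text.

```python
# Illustrative sketch: hand-image area -> relative distance via a reference
# table (with linear interpolation), then striking speed from the change of
# distance between consecutive frames. All numbers are assumed for illustration.

# (hand image area in pixels, relative distance in centimetres), sorted by area.
REFERENCE_TABLE = [(500, 120.0), (1000, 85.0), (2000, 60.0), (4000, 40.0)]

def area_to_distance(area):
    table = REFERENCE_TABLE
    if area <= table[0][0]:
        return table[0][1]
    if area >= table[-1][0]:
        return table[-1][1]
    for (a0, d0), (a1, d1) in zip(table, table[1:]):
        if a0 <= area <= a1:
            t = (area - a0) / (a1 - a0)
            return d0 + t * (d1 - d0)

def striking_speed(samples):
    """samples: list of (timestamp_s, hand_image_area); returns peak speed in cm/s."""
    distances = [(t, area_to_distance(a)) for t, a in samples]
    speeds = [abs(d1 - d0) / (t1 - t0)
              for (t0, d0), (t1, d1) in zip(distances, distances[1:])]
    return max(speeds) if speeds else 0.0

print(round(striking_speed([(0.00, 800), (0.05, 1600), (0.10, 3200)]), 1))  # 580.0
```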

The database of the processing unit 823 further stores a velocity range. The processing unit 823 analyzes the striking speed according to these user image data and judges whether the striking speed falls within the velocity range. When the striking speed does not fall within the velocity range, the processing unit 823 also transmits an indicator signal to the indicating member 830 to inform the user.

Since the striking speed may differ depending on whether the user is swinging or putting, the database of the processing unit 823 can store several velocity ranges corresponding to different situations. Before practicing, the user can first set the exercise kind, and the processing unit 823 then selects the applicable velocity range according to the information input by the user.

In summary, the discharging device, the faucet, and the toilet described in the above embodiments of the invention use the gesture sensor and the operation valve to control the water flow, so the user can make various gestures toward the gesture sensor to control the operation valve. In this way, by means of the gesture sensor, the user can, without touching a switch, make the discharging device, the faucet, and the toilet provide water flows of multiple different flow rates or different flushing volumes. In addition, the display device described in the embodiments of the invention uses the gesture sensor and the controller to control the operation of the display device, so in environments such as a kitchen or a bathroom the user can operate the display device without touching it. Furthermore, the golf practice assisting device of the embodiments of the invention uses the gesture sensor to help the user adjust the striking motion.

The above are merely possible embodiments of the present invention, and all equivalent changes and modifications made according to the claims of the present application shall fall within the scope of the present invention.

Claims (64)

1. A device with a gesture sensor, characterized in that the device is a discharging device and comprises:
a water outlet body, having a water delivering orifice for providing a water flow;
an operation valve, installed in the water outlet body and used for controlling the water flow; and
a gesture sensor, comprising:
an image sensing unit, for capturing at least one gesture image made by a user; and
a processing unit, electrically connected to the image sensing unit, wherein the processing unit correspondingly sends at least one control instruction to the operation valve according to the at least one gesture image, the at least one control instruction comprises a first flow instruction or a second flow instruction, the operation valve correspondingly changes the flow rate of the water flow to a first flow rate or a second flow rate according to the first flow instruction or the second flow instruction, and the first flow rate is greater than the second flow rate.
2. The device with a gesture sensor as claimed in claim 1, wherein the action of the at least one gesture image is clenching a fist, opening the palm, waving, rotating the palm clockwise, or rotating the palm counterclockwise.
3. The device with a gesture sensor as claimed in claim 1, wherein the gesture sensor further comprises a light emitting source for emitting a light toward the user, the light emitting source is electrically connected to the processing unit, the image sensing unit is adjacent to the light emitting source, and the at least one gesture image is formed by reflection of the light.
4. have the device of gesture sensor as claimed in claim 3, wherein this light is invisible light.
5. The device with a gesture sensor as claimed in claim 3, further comprising a display element electrically connected to the processing unit, wherein the image sensing unit is further configured to capture a gesture start image made by the user, and the processing unit starts the operation valve according to the gesture start image and orders the display element to display a starting state.
6. The device with a gesture sensor as claimed in claim 5, wherein the action of the gesture start image is clenching a fist, opening the palm, or waving.
7. The device with a gesture sensor as claimed in claim 5, wherein the display element comprises a light emitting part and an indication light-transmitting plate, and when the display element displays the starting state, the light emitting part emits light toward the indication light-transmitting plate.
8. The device with a gesture sensor as claimed in claim 5, wherein the image sensing unit is further configured to capture a gesture close image made by the user, and the processing unit closes the display element and the operation valve according to the gesture close image.
9. The device with a gesture sensor as claimed in claim 8, wherein the action of the gesture close image is clenching a fist, opening the palm, or waving.
10. The device with a gesture sensor as claimed in claim 1, wherein the at least one control instruction further comprises a water-on instruction or a water-off instruction, the operation valve opens the water delivering orifice according to the water-on instruction so that the water flow starts to flow out, and the operation valve closes the water delivering orifice according to the water-off instruction so that the water flow stops flowing out.
11. A device with a gesture sensor, characterized in that the device is a faucet and comprises:
a faucet body, having a water delivering orifice for providing a water flow;
an operation valve, installed in the faucet body and used for controlling the water flow; and
a gesture sensor, comprising:
an image sensing unit, for capturing at least one gesture image made by a user; and
a processing unit, electrically connected to the image sensing unit, wherein the processing unit correspondingly sends at least one control instruction to the operation valve according to the at least one gesture image, the at least one control instruction comprises a decrement instruction or an increment instruction, the operation valve decreases the flow rate of the water flow step by step according to the decrement instruction, and the operation valve increases the flow rate of the water flow step by step according to the increment instruction.
12. The device with a gesture sensor as claimed in claim 11, wherein the action of the at least one gesture image is clenching a fist, opening the palm, waving, rotating the palm clockwise, or rotating the palm counterclockwise.
13. The device with a gesture sensor as claimed in claim 11, wherein the gesture sensor further comprises a light emitting source for emitting a light toward the user, the light emitting source is electrically connected to the processing unit, the image sensing unit is adjacent to the light emitting source, and the at least one gesture image is formed by reflection of the light.
14. as the device with gesture sensor of claim 13, and wherein this light is invisible light.
15. The device with a gesture sensor as claimed in claim 13, further comprising a display element electrically connected to the processing unit, wherein the image sensing unit is further configured to capture a gesture start image, formed by reflection of the light, from a start gesture made by the user, and the processing unit starts the operation valve according to the gesture start image and orders the display element to display a starting state.
16. The device with a gesture sensor as claimed in claim 15, wherein the image sensing unit is further configured to capture a gesture close image, formed by reflection of the light, from a close gesture made by the user, and the processing unit closes the display element and the operation valve according to the gesture close image.
17. The device with a gesture sensor as claimed in claim 16, wherein the action of the gesture close image is clenching a fist, opening the palm, or waving.
18. The device with a gesture sensor as claimed in claim 11, wherein the at least one control instruction further comprises a water-on instruction or a water-off instruction, the operation valve opens the water delivering orifice according to the water-on instruction so that the water flow starts to flow out, and the operation valve closes the water delivering orifice according to the water-off instruction so that the water flow stops flowing out.
19. The device with a gesture sensor as claimed in claim 11, wherein the gesture sensor is located below the water delivering orifice and is not located in the flow path of the water flow.
20. The device with a gesture sensor as claimed in claim 11, wherein the gesture sensor is located above the water delivering orifice.
21. A device with a gesture sensor, characterized in that the device is a discharging device and comprises:
a water outlet body, having a water delivering orifice for providing a water flow;
an operation valve, installed in the water outlet body and used for controlling the water flow; and
a gesture sensor, comprising:
an image sensing unit, for capturing at least one gesture image made by a user; and
a processing unit, electrically connected to the image sensing unit, wherein the processing unit correspondingly sends at least one control instruction to the operation valve according to the at least one gesture image, the at least one control instruction comprises a first water output instruction or a second water output instruction, the operation valve changes the water output amount of the water flow to a first water output amount or a second water output amount according to the first water output instruction or the second water output instruction, and the first water output amount is greater than the second water output amount.
22. The device with a gesture sensor as claimed in claim 21, wherein the action of the at least one gesture image is clenching a fist, opening the palm, waving, rotating the palm clockwise, or rotating the palm counterclockwise.
23. The device with a gesture sensor as claimed in claim 21, wherein the gesture sensor further comprises a light emitting source for emitting a light toward the user, the light emitting source is electrically connected to the processing unit, the image sensing unit is adjacent to the light emitting source, and the at least one gesture image is formed by reflection of the light.
24. as the device with gesture sensor of claim 23, and wherein this light is invisible light.
25. The device with a gesture sensor as claimed in claim 23, further comprising a display element electrically connected to the processing unit, wherein the image sensing unit is further configured to capture a gesture start image, formed by reflection of the light, from a start gesture made by the user, and the processing unit starts the operation valve according to the gesture start image and orders the display element to display a starting state.
26. The device with a gesture sensor as claimed in claim 25, wherein the action of the gesture start image is clenching a fist, opening the palm, or waving.
27. The device with a gesture sensor as claimed in claim 25, wherein the display element comprises a light emitting part and an indication light-transmitting plate, and when the display element displays the starting state, the light emitting part emits light toward the indication light-transmitting plate.
28. The device with a gesture sensor as claimed in claim 25, wherein the image sensing unit is further configured to capture a gesture close image, formed by reflection of the light, from a close gesture made by the user, and the processing unit closes the display element and the operation valve according to the gesture close image.
29. The device with a gesture sensor as claimed in claim 28, wherein the action of the gesture close image is clenching a fist, opening the palm, or waving.
30. The device with a gesture sensor as claimed in claim 21, wherein the at least one control instruction comprises a water-on instruction or a water-off instruction, the operation valve opens the water delivering orifice according to the water-on instruction so that the water flow starts to flow out, and the operation valve closes the water delivering orifice according to the water-off instruction so that the water flow stops flowing out.
31. A device with a gesture sensor, characterized in that the device is a toilet and comprises:
a toilet seat body, having a flush port;
a water unit, connected to the toilet seat body and having a water delivering orifice communicating with the flush port, the water delivering orifice being used to output a water flow to the flush port;
an operation valve, installed in the water unit and used for controlling the water flow; and
a gesture sensor, comprising:
an image sensing unit, for capturing at least one gesture image made by a user; and
a processing unit, electrically connected to the image sensing unit, wherein the processing unit correspondingly sends at least one control instruction to the operation valve according to the at least one gesture image, the at least one control instruction comprises a first flushing instruction or a second flushing instruction, the operation valve controls the water output amount of the water flow to be a first water output amount according to the first flushing instruction, the operation valve controls the water output amount of the water flow to be a second water output amount according to the second flushing instruction, and the first water output amount is greater than the second water output amount.
32. The device with a gesture sensor as claimed in claim 31, wherein the action of the at least one gesture image is clenching a fist, opening the palm, waving, rotating the palm clockwise, or rotating the palm counterclockwise.
33. The device with a gesture sensor as claimed in claim 31, wherein the gesture sensor further comprises a light emitting source for emitting a light toward the user, the light emitting source is electrically connected to the processing unit, the image sensing unit is adjacent to the light emitting source, and the at least one gesture image is formed by reflection of the light.
34. as the device with gesture sensor of claim 33, and wherein this light is invisible light.
35. The device with a gesture sensor as claimed in claim 33, further comprising a display element electrically connected to the processing unit, wherein the image sensing unit is further configured to capture a gesture start image, formed by reflection of the light, from a start gesture made by the user, and the processing unit starts the operation valve according to the gesture start image and orders the display element to display an operation screen.
36. The device with a gesture sensor as claimed in claim 35, wherein the display element comprises a light emitting part and an indication light-transmitting plate, and when the display element displays the operation screen, the light emitting part emits light toward the indication light-transmitting plate.
37. The device with a gesture sensor as claimed in claim 35, further comprising a control module electrically connected to the processing unit, wherein the control module, the display element and the gesture sensor are integrated into an operation panel.
38. The device with a gesture sensor as claimed in claim 37, further comprising a heating cushion installed on the toilet seat body, wherein the control module is electrically connected to the heating cushion, the image sensing unit is further configured to capture a gesture temperature-control image, formed by reflection of the light, from a temperature-control gesture made by the user, and the control module controls the temperature of the heating cushion according to the gesture temperature-control image.
39. The device with a gesture sensor as claimed in claim 35, wherein the image sensing unit is further configured to capture a gesture close image, formed by reflection of the light, from a close gesture made by the user, and the processing unit closes the display element and the operation valve according to the gesture close image.
40. The device with a gesture sensor as claimed in claim 39, wherein the action of the gesture close image is clenching a fist, opening the palm, or waving.
41. The device with a gesture sensor as claimed in claim 31, wherein the toilet seat body has a notch, the water unit is a water tank having a back surface and a front surface located between the notch and the back surface, and the gesture sensor is arranged on the front surface.
42. A device with a gesture sensor, characterized in that the device is a display device and comprises:
a display unit; and
a gesture sensor, comprising:
an image sensing unit, for capturing at least one gesture image made by a user; and
a processing unit, electrically connected to the image sensing unit, wherein the processing unit correspondingly sends at least one gesture control signal to the display unit according to the at least one gesture image, and the operation of the display unit is controlled according to the gesture control signal.
43. The device with a gesture sensor as claimed in claim 42, wherein the action of the at least one gesture image is clenching a fist, opening the palm, waving, rotating the palm clockwise, or rotating the palm counterclockwise.
44. The device with a gesture sensor as claimed in claim 42, wherein the gesture sensor further comprises a light emitting source for emitting a light toward the user, the light emitting source is electrically connected to the processing unit, the image sensing unit is adjacent to the light emitting source, and the at least one gesture image is formed by reflection of the light.
45. as the device with gesture sensor of claim 44, and wherein this light is invisible light.
46. The device with a gesture sensor as claimed in claim 44, further comprising an indicator element electrically connected to the processing unit, wherein the image sensing unit is further configured to capture a start-connection gesture image made by the user, and the processing unit starts the display unit according to the start-connection gesture image and orders the indicator element to display a starting state.
47. The device with a gesture sensor as claimed in claim 46, wherein the action of the start-connection gesture image is clenching a fist, opening the palm, or waving.
48. The device with a gesture sensor as claimed in claim 46, wherein the indicator element comprises an indicator lamp, and when the indicator element displays the starting state, the indicator lamp emits light.
49. The device with a gesture sensor as claimed in claim 46, wherein the image sensing unit is further configured to capture a stop-connection gesture image made by the user, and the processing unit interrupts the signal link with the display unit according to the stop-connection gesture image and closes the indicator element.
50. as the device with gesture sensor of claim 42, and wherein this display device is a bathroom television, a Medical Devices TV or a kitchen TV.
51. The device with a gesture sensor as claimed in claim 50, wherein the at least one gesture control signal comprises a channel switching instruction, and the display unit switches the channel it receives according to the channel switching instruction.
52. A device with a gesture sensor, characterized in that the device is a satellite navigation device and comprises:
a display element;
a controller, establishing a signal connection with the display element so as to send map data and coordinate data to the display element for display; and
a gesture sensor, comprising:
an image sensing unit, for capturing at least one gesture image made by a user; and
a processing unit, electrically connected to the image sensing unit and establishing a signal link with the controller, wherein the processing unit correspondingly sends at least one gesture control signal to the controller according to the at least one gesture image, and the controller controls, according to the gesture control signal, the mode in which the display element displays the map data and the coordinate data.
53. The device with a gesture sensor as claimed in claim 52, wherein the controller comprises:
a position receiver module, for receiving a satellite signal;
a database, storing the map data; and
a signal processing unit, establishing signal links with the position receiver module, the database and the display element, wherein the signal processing unit receives and processes the satellite signal to obtain the coordinate data, and sends the coordinate data and the map data to the display element for display.
54. The device with a gesture sensor as claimed in claim 52, further comprising a microphone electrically connected to the controller, wherein the at least one gesture control signal comprises a voice input instruction, the controller opens the microphone according to the voice input instruction to receive voice data, and the controller processes the voice data to obtain target coordinate data and path data.
55. The device with a gesture sensor as claimed in claim 54, further comprising a loudspeaker establishing a signal connection with the controller, wherein the at least one gesture control signal comprises a voice-on instruction, and the controller converts the path data into audio data according to the voice-on instruction and sends the audio data to the loudspeaker to start voice navigation.
56. The device with a gesture sensor as claimed in claim 55, wherein the at least one gesture control signal comprises a voice-off instruction, and the controller stops transmitting the audio data according to the voice-off instruction.
57. The device with a gesture sensor as claimed in claim 52, wherein the at least one gesture control signal comprises a zoom-in instruction, a zoom-out instruction and a move instruction, and the controller zooms or moves the map data presented on the display element according to the zoom-in instruction, the zoom-out instruction or the move instruction.
58. The device with a gesture sensor as claimed in claim 57, wherein the at least one gesture control signal comprises at least one switching instruction, and the controller switches the display mode of the map according to the switching instruction.
59. A device with a gesture sensor, characterized in that the device is a golf practice assisting device and comprises:
an indicating member; and
a gesture sensor, comprising:
an image sensing unit, wherein when a user practices a motion with an exercising machine, the image sensing unit captures a profile image of the user, the profile image comprises at least one hand image and a leg image, and an angle is formed between the hand image and the leg image; and
a processing unit, establishing signal links with the image sensing unit and the indicating member, wherein the processing unit judges whether the angle falls within a built-in numerical range, and when the angle does not fall within the numerical range, the processing unit makes the indicating member send indication information.
60. The device with a gesture sensor as claimed in claim 59, wherein the gesture sensor further comprises a light emitting source for emitting a light toward the user, the light emitting source is electrically connected to the processing unit, and the image captured by the image sensing unit is formed by reflection of the light.
61. The device with a gesture sensor as claimed in claim 60, wherein the light is invisible light.
62. The device with a gesture sensor as claimed in claim 59, wherein the indicating member is an indicator lamp and/or a loudspeaker, and when the angle falls outside the numerical range, the processing unit transmits an indicator signal to turn on the indicator lamp and/or the loudspeaker.
63. The device with a gesture sensor as claimed in claim 59, wherein the processing unit further comprises a database, the database stores a reference table, and the processing unit compares the size of the hand image with the reference table to obtain a relative distance between the hand of the user and the gesture sensor.
64. The device with a gesture sensor as claimed in claim 63, wherein when the user swings, the processing unit calculates a striking speed of the user according to the change of the size of the hand image over time.
CN201410089925.8A 2013-06-13 2014-03-12 Device with gesture sensor CN104238735A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201310233948 2013-06-13
CN201310233948.7 2013-06-13
CN201410089925.8A CN104238735A (en) 2013-06-13 2014-03-12 Device with gesture sensor

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201811132462.3A CN109343709A (en) 2013-06-13 2014-03-12 Device with gesture sensor
CN201811131636.4A CN109240506A (en) 2013-06-13 2014-03-12 Device with gesture sensor
CN201811131610.XA CN109343708A (en) 2013-06-13 2014-03-12 Device with gesture sensor
CN201410089925.8A CN104238735A (en) 2013-06-13 2014-03-12 Device with gesture sensor

Related Child Applications (3)

Application Number Title Priority Date Filing Date
CN201811131610.XA Division CN109343708A (en) 2013-06-13 2014-03-12 Device with gesture sensor
CN201811131636.4A Division CN109240506A (en) 2013-06-13 2014-03-12 Device with gesture sensor
CN201811132462.3A Division CN109343709A (en) 2013-06-13 2014-03-12 Device with gesture sensor

Publications (1)

Publication Number Publication Date
CN104238735A true CN104238735A (en) 2014-12-24

Family

ID=52226979

Family Applications (4)

Application Number Title Priority Date Filing Date
CN201811131636.4A CN109240506A (en) 2013-06-13 2014-03-12 Device with gesture sensor
CN201811131610.XA CN109343708A (en) 2013-06-13 2014-03-12 Device with gesture sensor
CN201811132462.3A CN109343709A (en) 2013-06-13 2014-03-12 Device with gesture sensor
CN201410089925.8A CN104238735A (en) 2013-06-13 2014-03-12 Device with gesture sensor


Country Status (1)

Country Link
CN (4) CN109240506A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105840897A (en) * 2016-04-16 2016-08-10 合肥九源环境科技有限公司 Tap water filtering faucet based on gesture recognition and using method
WO2018103303A1 (en) * 2016-12-11 2018-06-14 方翠芹 Power line carrier-based smart toilet
CN108594885A (en) * 2018-03-30 2018-09-28 上海思愚智能科技有限公司 Intelligent temperature control method and control device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109899578A (en) * 2019-02-28 2019-06-18 上海与德通讯技术有限公司 A kind of intelligent tap, control method, electronic equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080256494A1 (en) * 2007-04-16 2008-10-16 Greenfield Mfg Co Inc Touchless hand gesture device controller
CN101349944A (en) * 2008-09-03 2009-01-21 宏碁股份有限公司 Gesticulation guidance system and method for controlling computer system by touch control gesticulation
US20090178011A1 (en) * 2008-01-04 2009-07-09 Bas Ording Gesture movies
US20090288712A1 (en) * 2003-03-11 2009-11-26 Edo Lang Method for controlling the water supply in a sanitary installation
CN101853568A (en) * 2010-04-13 2010-10-06 鸿富锦精密工业(深圳)有限公司;鸿海精密工业股份有限公司 Gesture remote control device
TWM438671U (en) * 2012-05-23 2012-10-01 Tlj Intertech Inc Hand gesture manipulation electronic apparatus control system
TWM441814U (en) * 2012-06-29 2012-11-21 Chip Goal Electronics Corp Motion detecting device
CN103080439A (en) * 2010-09-08 2013-05-01 Toto株式会社 Automatic faucet



Also Published As

Publication number Publication date
CN109343708A (en) 2019-02-15
CN109240506A (en) 2019-01-18
CN109343709A (en) 2019-02-15


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20141224