CN106371682A - Gesture recognition system based on proximity sensor and method thereof - Google Patents
- Publication number: CN106371682A (application CN201610819454.0A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- data
- proximity sensor
- sensing
- unit
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
Abstract
The invention discloses a gesture recognition system based on a proximity sensor and a method thereof. The system comprises a sensor recognition unit, a gesture predefining unit, a gesture data processing unit and a gesture matching unit, wherein the sensor recognition unit comprises at least one sensor recognition panel arranged on a screen or a shell of a terminal, and is used for forming static gestures through proximity or touch; the gesture predefining unit is used for matching the preset static gestures with corresponding operations; the gesture data processing unit is used for collecting data of the sensor recognition unit, performing gesture algorithm recognition on the data and acquiring the gesture data; the gesture matching unit is used for matching according to the gesture data and the preset static gestures so as to acquire the corresponding operation. The method comprises the following steps: collecting the data of the sensor recognition unit comprising at least one sensor recognition panel; performing gesture algorithm recognition on the data, and acquiring the gesture data; matching according to the gesture data and the preset static gestures so as to acquire the corresponding operation. According to the system and method disclosed by the invention, the influence of ambient light can be eliminated, and optical crosstalk is removed.
Description
Technical field
The present invention relates to the technical field of gesture recognition, and more particularly, to a gesture recognition system based on a proximity sensor and a method thereof.
Background art
Gesture recognition is an emerging user interface mode, commonly used in applications such as building and industrial control panels, where the user can interact with the equipment simply through movements or gestures. It is particularly important in scenarios where a touch screen interface is difficult to use, such as moist environments, situations where the user is wearing gloves, or situations where the control panel is difficult to reach.
There are many traditional gesture recognition approaches. For example, vision-based gesture recognition developed comparatively early and is relatively mature, but it places strict requirements on the equipment and environment, so its applicability is rather limited. Another approach is sensor-based recognition. Gesture sensors currently on the market mainly rely on RGB/infrared light for auxiliary detection; this approach suffers from single-function gesture recognition, a comparatively large influence from ambient light, and relatively low accuracy.
Summary of the invention
The main object of the present invention is to propose a gesture recognition system based on a proximity sensor and a method thereof, intended to overcome the shortcomings of traditional gesture recognition: single function, a comparatively large influence from ambient light, and relatively low accuracy.
To achieve the above object, the present invention proposes a gesture recognition system based on a proximity sensor, comprising:
a sensing recognition unit, including at least one sensing recognition panel arranged on the screen or shell of a terminal, for forming a static gesture through approach or touch;
a gesture predefining unit, for matching preset static gestures with corresponding operations;
a gesture data processing unit, for receiving the data of the sensing recognition unit collected by the proximity sensor, performing gesture algorithm recognition on the data, and obtaining gesture data;
a gesture matching unit, for performing matching according to the gesture data and the preset static gestures to obtain the corresponding operation.
Optionally, the capacitance of a sensing recognition panel that is approached or touched is close to the saturation value.
Optionally, the gesture data processing unit is further configured to receive, within a preset acquisition time, the data of the sensing recognition unit collected by the proximity sensor again at every preset time interval.
Optionally, the preset time is measured in milliseconds.
Optionally, the system further includes an application execution unit for executing the corresponding operation.
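The gesture predefining unit and gesture matching unit described above amount to a lookup from static-gesture patterns to operations. The following is a minimal illustrative sketch, assuming each static gesture is encoded as a tuple of flags marking which sensing recognition panels are approached or touched; the pattern encoding, function names, and operation names are assumptions for demonstration, not taken from the patent:

```python
# Hypothetical table mapping static-gesture patterns to operations.
# A pattern flags which sensing recognition panels are approached/touched.
PRESET_GESTURES = {
    (1, 1, 0, 0): "volume_up",
    (0, 0, 1, 1): "volume_down",
}

def predefine_gesture(pattern, operation, table=PRESET_GESTURES):
    """Gesture predefining unit: bind a preset static gesture to an operation."""
    table[tuple(pattern)] = operation

def match_gesture(gesture_data, table=PRESET_GESTURES):
    """Gesture matching unit: look up the recognized gesture data,
    returning None when no preset static gesture matches."""
    return table.get(tuple(gesture_data))

predefine_gesture((1, 0, 0, 1), "take_screenshot")
print(match_gesture((1, 0, 0, 1)))   # take_screenshot
print(match_gesture((1, 1, 1, 1)))   # None
```

In this reading, the optional application execution unit would simply dispatch on the operation name that `match_gesture` returns.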
In addition, to achieve the above object, the present invention also proposes a gesture recognition method based on a proximity sensor, comprising:
receiving the data of at least one sensing recognition panel collected by a proximity sensor;
performing gesture algorithm recognition on the data to obtain gesture data;
matching the gesture data against preset static gestures to obtain the corresponding operation.
Optionally, the capacitance of a sensing recognition panel that is approached or touched is close to the saturation value.
Optionally, within a preset acquisition time, the data of the at least one sensing recognition panel collected by the proximity sensor is received again at every preset time interval.
Optionally, the preset time is measured in milliseconds.
Optionally, after the corresponding operation is obtained, the method further comprises the step of executing the corresponding operation.
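The collect-recognize steps of the method above can be sketched as a polling loop that re-reads the panels at a preset interval within a preset acquisition time. This is a sketch under stated assumptions only: the sampling window, the near-saturation threshold, the numeric scale, and the `read_panels` callback are all illustrative, as the patent does not specify them:

```python
import time

SATURATION = 255               # assumed full-scale capacitance reading
THRESHOLD = 0.9 * SATURATION   # assumed "close to saturation" criterion

def recognize(readings):
    """Gesture-algorithm step: flag each panel whose capacitance
    is close to the saturation value as approached or touched."""
    return tuple(1 if r >= THRESHOLD else 0 for r in readings)

def acquire_gesture(read_panels, window_s=0.1, interval_s=0.005):
    """Collect panel data repeatedly within a preset acquisition time,
    re-reading at every preset interval (milliseconds in the patent),
    and return the last recognized pattern."""
    pattern = None
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        pattern = recognize(read_panels())
        time.sleep(interval_s)
    return pattern

# Example with a stubbed sensor read: panels 0 and 2 near saturation.
print(acquire_gesture(lambda: [250, 12, 240, 3]))  # (1, 0, 1, 0)
```

The returned pattern would then be handed to the matching step to obtain the corresponding operation.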
The gesture recognition system based on a proximity sensor and the method thereof proposed by the present invention process the change of capacitance in a sensing recognition panel when a hand approaches or touches it; after the data has been collected, the current gesture operation is identified and judged by an algorithm, and the corresponding operation is then executed. The present invention can automatically eliminate the influence of ambient light and exclude optical crosstalk.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing each embodiment of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a structural diagram of a gesture recognition system based on a proximity sensor according to the first embodiment of the present invention;
Fig. 4 is a schematic diagram of a sensing recognition panel being approached or touched according to the second embodiment of the present invention;
Fig. 5 is a schematic diagram of data processing based on a sensing recognition panel according to an embodiment of the present invention;
Fig. 6 is a structural diagram of a gesture recognition system based on a proximity sensor according to the third embodiment of the present invention;
Fig. 7 is a flow chart of a gesture recognition method based on a proximity sensor according to the fourth embodiment of the present invention;
Fig. 8 is a flow chart of a gesture recognition method based on a proximity sensor according to the fifth embodiment of the present invention;
Fig. 9 is a flow chart of a gesture recognition method based on a proximity sensor according to the sixth embodiment of the present invention.
The realization of the objects, functional characteristics, and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the embodiments
It should be appreciated that the specific embodiments described herein are merely intended to explain the present invention and are not intended to limit the present invention.
A mobile terminal implementing each embodiment of the present invention will now be described with reference to the drawings. In the following description, suffixes such as "module", "part", or "unit" used to represent elements are used merely to facilitate the explanation of the present invention and have no specific meaning in themselves; therefore, "module" and "part" may be used interchangeably.
Mobile terminals may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. Hereinafter it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing each embodiment of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like. Fig. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required to be implemented; more or fewer components may alternatively be implemented. The elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast-associated information, or a server that receives previously generated broadcast signals and/or broadcast-associated information and sends them to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-associated information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB), an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H), and the like. The broadcast receiving module 111 may receive signals broadcast by using various types of broadcast systems. In particular, the broadcast receiving module 111 may receive digital broadcasts by using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the data broadcasting system of forward link media (MediaFLO), and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to be suitable for various broadcast systems providing broadcast signals as well as the above-mentioned digital broadcasting systems. Broadcast signals and/or broadcast-associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access of the mobile terminal. This module may be internally or externally coupled to the terminal. The wireless Internet access technologies involved in this module may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and the like.
The location information module 115 is a module for checking or obtaining location information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system) module. According to current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, the method used for calculating position and time information uses three satellites and corrects errors of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating the current location information in real time.
The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on a display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or sent via the wireless communication unit 110; two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 may receive sounds (audio data) in an operational mode such as a phone call mode, a recording mode, or a voice recognition mode, and can process such sounds into audio data. In the case of the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 and output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and sending audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (for example, a touch-sensitive component that detects changes in resistance, pressure, capacitance, and the like caused by being touched), a jog wheel, a jog switch, and the like. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (for example, the open or closed state of the mobile terminal 100), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 1410, which will be described below in connection with the gesture recognition system.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating the user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (hereinafter referred to as an "identifying device") may take the form of a smart card; therefore, the identifying device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, electric power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or may be used to transfer data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which electric power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. Various command signals or electric power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display. Some of these displays may be constructed to be transparent to allow the user to view from the outside; these may be called transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light-emitting diode) display. According to a particular desired embodiment, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 may, when the mobile terminal is in a mode such as a call signal reception mode, a call mode, a recording mode, a voice recognition mode, or a broadcast reception mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide output to notify of the occurrence of an event of the mobile terminal 100. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs for processing and control operations executed by the controller 180, or may temporarily store data that has been output or is to be output (for example, phone books, messages, still images, videos, etc.). Moreover, the memory 160 may store data regarding the various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, card-type memory (e.g., SD or DX memory, etc.), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, and the like. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls, and the like. In addition, the controller 180 may include a multimedia module 1810 for reproducing (or playing back) multimedia data; the multimedia module 1810 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture-drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate electric power required for operating each element and component.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented by using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, embodiments such as processes or functions may be implemented with separate software modules that allow at least one function or operation to be performed. Software code may be implemented by a software application (or program) written in any suitable programming language, and the software code may be stored in the memory 160 and executed by the controller 180.
So far, the mobile terminal has been described in terms of its functions. Below, for the sake of brevity, a slide-type mobile terminal, among various types of mobile terminals such as folder-type, bar-type, swing-type, and slider-type mobile terminals, will be described as an example. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in Fig. 1 may be constructed to operate with communication systems that transmit data via frames or packets, including wired and wireless communication systems as well as satellite-based communication systems.
A communication system in which the mobile terminal according to the present invention is operable will now be described with reference to Fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include, for example, frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), global system for mobile communications (GSM), and the like. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to Fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to form an interface with a public switched telephone network (PSTN) 290. The MSC 280 is also configured to form an interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL, or xDSL. It will be understood that the system as shown in Fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by a multidirectional antenna or an antenna pointing in a specific direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, and each frequency assignment has a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a base transceiver subsystem (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to broadly refer to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, each sector of a specific BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a broadcast transmitter (BT) 295 sends broadcast signals to the mobile terminals 100 operating within the system. The broadcast receiving module 111, as shown in Fig. 1, is provided at the mobile terminal 100 to receive the broadcast signals sent by the BT 295. In Fig. 2, several global positioning system (GPS) satellites 300 are shown. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it will be understood that useful positioning information may be obtained with any number of satellites. The GPS module 115, as shown in Fig. 1, is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking technology, other technologies capable of tracking the position of the mobile terminal may be used. In addition, at least one GPS satellite 300 may optionally or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communication. Each reverse link signal received by a particular base station 270 is processed within that particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including coordination of soft handoff procedures between the BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for forming an interface with the PSTN 290. Similarly, the PSTN 290 forms an interface with the MSC 280, the MSC forms an interface with the BSCs 275, and the BSCs 275 correspondingly control the BSs 270 to send forward link signals to the mobile terminals 100.
Based on the above-described mobile terminal hardware structure and communication system, the embodiments of the method of the present invention are proposed.
As shown in Fig. 3, the first embodiment of the present invention proposes a gesture recognition system based on a proximity sensor, including a sensing recognition unit 10, a gesture predefining unit 20, a gesture data processing unit 30, and a gesture matching unit 40.
The sensing recognition unit 10 includes at least one sensing recognition panel arranged on the screen or shell of a terminal and is used for forming a static gesture through approach or touch. Usually, one proximity sensor 1410 may correspond to several sensing recognition panels, and the terminal may also be provided with multiple proximity sensors.
The gesture predefinition unit 20 is connected to the sensing recognition unit 10 and is used to match preset static gestures with corresponding operations. When a gesture is being predefined and the gesture operation begins, the capacitance of the sensing recognition panel of the proximity sensor changes according to the approach of the hand; the approaching or touching part may include a finger, a finger pad, or a palm, and the present invention is not limited in this respect. In this embodiment, the capacitance of an approached or touched sensing recognition panel approaches the saturation value.
The gesture data processing unit 30 is connected to the sensing recognition unit 10 and is used to receive the data of the sensing recognition unit 10 collected by the proximity sensor 1410, and to perform gesture algorithm recognition on that data to obtain gesture data. The gesture matching unit 40 is connected to the gesture data processing unit 30 and the gesture predefinition unit 20, matches the gesture data against the preset static gestures, and obtains the corresponding operation.
In this embodiment, the static gesture is determined from the change in capacitance of a sensing recognition panel after it is approached or touched, and the operation matched with that static gesture is then performed. This approach is only slightly affected by ambient light and achieves a relatively high recognition accuracy.
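The division of labor among the units described above can be sketched as a small pipeline: predefined gestures mapped to operations, raw capacitance turned into gesture data, and a lookup that yields the operation. This is an illustrative outline only; the gesture encodings, threshold value, and operation names are hypothetical, not taken from the patent.

```python
# Illustrative sketch of the unit pipeline described above.
# Gesture encodings and operation names are hypothetical examples.

# Gesture predefinition unit: preset static gestures mapped to operations.
# A gesture is encoded here as a tuple of per-panel binary states
# (1 = panel approached/touched, 0 = panel free).
PREDEFINED_GESTURES = {
    (1, 1): "unlock_screen",   # both panels covered, e.g. a full grip
    (1, 0): "volume_up",       # only panel s0 covered
    (0, 1): "volume_down",     # only panel s1 covered
}

def process_gesture_data(raw_capacitance, threshold):
    """Gesture data processing unit: binarize raw capacitance readings."""
    return tuple(1 if c >= threshold else 0 for c in raw_capacitance)

def match_gesture(gesture_data):
    """Gesture matching unit: look up the operation for a static gesture."""
    return PREDEFINED_GESTURES.get(gesture_data)  # None when no match

# Example: readings near the saturation value on both panels.
gesture = process_gesture_data([980, 1010], threshold=900)
print(match_gesture(gesture))  # prints "unlock_screen"
```

The dictionary stands in for the gesture predefinition unit; in a real terminal the table would be populated during the predefinition phase described above.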
The second embodiment of the present invention proposes a gesture recognition system based on a proximity sensor, including a sensing recognition unit 10, a gesture predefinition unit 20, a gesture data processing unit 30, and a gesture matching unit 40.
The sensing recognition unit 10 includes at least one sensing recognition panel arranged on the screen or housing of the terminal and is used to form a static gesture through approach or touch. Generally, one proximity sensor 1410 may correspond to several sensing recognition panels, and the terminal may also be provided with multiple proximity sensors.
The gesture predefinition unit 20 is connected to the sensing recognition unit 10 and is used to match preset static gestures with corresponding operations. When a gesture is being predefined and the gesture operation begins, the capacitance of the sensing recognition panel of the proximity sensor changes according to the approach of the hand; the approaching or touching part may include a finger, a finger pad, or a palm, and the present invention is not limited in this respect. In this embodiment, the capacitance of an approached or touched sensing recognition panel approaches the saturation value.
The gesture data processing unit 30 is connected to the sensing recognition unit 10 and is used to receive the data of the sensing recognition unit 10 collected by the proximity sensor 1410, and to perform gesture algorithm recognition on that data to obtain gesture data. In this embodiment, the gesture data processing unit is further configured to receive, within a preset acquisition time and once every preset time interval, the data of the sensing recognition unit collected by the proximity sensor; the preset time is measured in milliseconds, while the preset acquisition time is used to complete the input of a static gesture. Referring to Figure 4, in this embodiment the sensing recognition unit 10 may include sensing recognition panels s0, s1, ..., sn. When n is 1, a finger is simulated approaching the s0 panel and then moving away from it, followed by approaching the s1 panel and moving away from it. The horizontal axis is time and the vertical axis is capacitance; c0 is the maximum saturation value, and c3 is the minimum value when no gesture is nearby. When the terminal is held statically, the capacitance values are all close or equal to the saturation value. Δt is the allowed time difference between the multiple sensing recognition panels, and can also be understood as the preset acquisition time in this embodiment. In this embodiment n is 1, giving two sensing recognition panels, so Δt is the time difference between sensing recognition panel s0 and sensing recognition panel s1.
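The sampling scheme of Figure 4, polling the panels once every preset time interval (milliseconds) throughout a fixed acquisition window, can be sketched as below. The interval and window lengths and the `read_panels` callback are illustrative assumptions, not values from the patent.

```python
import time

def acquire_samples(read_panels, interval_ms=20, acquisition_ms=500):
    """Poll the sensing recognition panels once every `interval_ms`
    milliseconds for `acquisition_ms` milliseconds in total (the preset
    acquisition time, i.e. the window for one static gesture input)."""
    samples = []
    steps = acquisition_ms // interval_ms
    for _ in range(steps):
        samples.append(read_panels())       # one reading per panel
        time.sleep(interval_ms / 1000.0)    # preset time, in milliseconds
    return samples

# Example with a fake sensor: two panels (n = 1), capacitance rising
# toward the saturation value c0 as the hand approaches.
readings = iter([(100, 100), (600, 580), (990, 1000)] + [(1000, 1000)] * 50)
samples = acquire_samples(lambda: next(readings), interval_ms=1, acquisition_ms=5)
print(len(samples))  # prints 5: five samples collected in the window
```

In a real device the callback would query the proximity sensor driver; the point of the sketch is only the fixed-rate polling inside a bounded acquisition window.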
Referring now to Figure 5, Figure 5 shows the gesture data processing for moving from holding the front of the terminal to holding its back. Only a single threshold is set here: a reading above this value is mapped to 1, and a reading below it to 0, where 1 represents approach and 0 represents moving away. In the example of Figure 5, during normal holding the hand approaches the s0 panel and the s1 panel almost simultaneously: the value of s0 goes 0 → 1 → 1, and the value of s1 goes 0 → 1 → 1. The required timing varies with the speed of the gesture, typically a time interval of tens to hundreds of milliseconds.
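The single-threshold scheme of Figure 5, mapping each capacitance reading to 1 (approach) or 0 (away) and checking that the two panel sequences transition nearly simultaneously, might be sketched like this. The threshold value, sample data, and lag tolerance are hypothetical illustrations of the Δt idea, not the patent's implementation.

```python
def binarize(readings, threshold):
    """Map capacitance readings to 1 (at/above threshold: approach)
    or 0 (below threshold: away), as in Figure 5."""
    return [1 if r >= threshold else 0 for r in readings]

def panels_covered_together(s0_bits, s1_bits, max_lag=1):
    """Check that s0 and s1 were approached almost simultaneously:
    their first transitions to 1 differ by at most `max_lag` samples
    (playing the role of the allowed time difference Δt)."""
    try:
        first0, first1 = s0_bits.index(1), s1_bits.index(1)
    except ValueError:
        return False        # at least one panel never saw an approach
    return abs(first0 - first1) <= max_lag

# Grip example from Figure 5: both panels go 0 -> 1 -> 1.
s0 = binarize([200, 950, 990], threshold=900)   # -> [0, 1, 1]
s1 = binarize([180, 940, 1000], threshold=900)  # -> [0, 1, 1]
print(panels_covered_together(s0, s1))          # prints True
```

A sequential gesture (s0 first, s1 much later) would fail the lag check, which is how a simultaneous grip can be distinguished from a swipe across the panels.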
The gesture matching unit 40 is connected to the gesture data processing unit 30 and the gesture predefinition unit 20, matches the gesture data against the preset static gestures, and obtains the corresponding operation.
In this embodiment, the static gesture is determined from the change in capacitance of a sensing recognition panel after it is approached or touched, and the operation matched with that static gesture is then performed. This approach is only slightly affected by ambient light and achieves a relatively high recognition accuracy.
As shown in Figure 6, the third embodiment of the present invention proposes a gesture recognition system based on a proximity sensor, including a sensing recognition unit 10, a gesture predefinition unit 20, a gesture data processing unit 30, a gesture matching unit 40, and an application execution unit 50.
The sensing recognition unit 10 includes at least one sensing recognition panel arranged on the screen or housing of the terminal and is used to form a static gesture through approach or touch. Generally, one proximity sensor 1410 may correspond to several sensing recognition panels, and the terminal may also be provided with multiple proximity sensors.
The gesture predefinition unit 20 is connected to the sensing recognition unit 10 and is used to match preset static gestures with corresponding operations. When a gesture is being predefined and the gesture operation begins, the capacitance of the sensing recognition panel of the proximity sensor changes according to the approach of the hand; the approaching or touching part may include a finger, a finger pad, or a palm, and the present invention is not limited in this respect. In this embodiment, the capacitance of an approached or touched sensing recognition panel approaches the saturation value.
The gesture data processing unit 30 is connected to the sensing recognition unit 10 and is used to receive the data of the sensing recognition unit 10 collected by the proximity sensor 1410, and to perform gesture algorithm recognition on that data to obtain gesture data. In this embodiment, the gesture data processing unit is further configured to receive, within a preset acquisition time and once every preset time interval, the data of the sensing recognition unit collected by the proximity sensor; the preset time is measured in milliseconds, while the preset acquisition time is used to complete the input of a static gesture. Referring to Figure 4, in this embodiment the sensing recognition unit 10 may include sensing recognition panels s0, s1, ..., sn. When n is 1, a finger is simulated approaching the s0 panel and then moving away from it, followed by approaching the s1 panel and moving away from it. The horizontal axis is time and the vertical axis is capacitance; c0 is the maximum saturation value, and c3 is the minimum value when no gesture is nearby. When the terminal is held statically, the capacitance values are all close or equal to the saturation value. Δt is the allowed time difference between the multiple sensing recognition panels, and can also be understood as the preset acquisition time in this embodiment. In this embodiment n is 1, giving two sensing recognition panels, so Δt is the time difference between sensing recognition panel s0 and sensing recognition panel s1.
Referring now to Figure 5, Figure 5 shows the gesture data processing for moving from holding the front of the terminal to holding its back. Only a single threshold is set here, and this threshold may be slightly below the saturation value: a reading above it is mapped to 1, and a reading below it to 0, where 1 represents approach and 0 represents moving away. In the example of Figure 5, during normal holding the hand approaches the s0 panel and the s1 panel almost simultaneously: the value of s0 goes 0 → 1 → 1, and the value of s1 goes 0 → 1 → 1. The required timing varies with the speed of the gesture, typically a time interval of tens to hundreds of milliseconds.
The gesture matching unit 40 is connected to the gesture data processing unit 30 and the gesture predefinition unit 20, matches the gesture data against the preset static gestures, and obtains the corresponding operation.
The application execution unit 50 is connected to the gesture matching unit 40 and is used to execute the corresponding operation.
In this embodiment, the static gesture is determined from the change in capacitance of a sensing recognition panel after it is approached or touched, and the operation matched with that static gesture is then performed. This approach is only slightly affected by ambient light and achieves a relatively high recognition accuracy.
As shown in Figure 7, the present invention further provides a gesture recognition method based on a proximity sensor, including the steps of:
s1: receiving the data, collected by the proximity sensor, of at least one sensing recognition panel;
The at least one sensing recognition panel is arranged on the screen or housing of the terminal and is used to form a static gesture through approach or touch. Generally, one proximity sensor 1410 may correspond to several sensing recognition panels, and the terminal may also be provided with multiple proximity sensors. When a gesture operation begins, the capacitance of the sensing recognition panel of the proximity sensor changes according to the approach of the hand; the approaching or touching part may include a finger, a finger pad, or a palm, and the present invention is not limited in this respect. In this embodiment, the capacitance of an approached or touched sensing recognition panel approaches the saturation value.
s2: performing gesture algorithm recognition on the data to obtain gesture data;
The data may include the number of the approached or touched sensing recognition panel, its location, and so on, from which the gesture data is interpreted;
s3: matching the gesture data against preset static gestures to obtain the corresponding operation.
The preset static gestures correspond one-to-one with the operations they are meant to achieve. When the gesture data is matched against the preset static gestures, if no match succeeds the method returns to step s1; if the match succeeds, the corresponding operation is obtained.
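Steps s1 to s3, including the return to s1 when no gesture matches, can be outlined as a simple retry loop. All names here (the sensor read callback, the toy recognition rule, the gesture table, the attempt budget) are illustrative assumptions, not the patent's implementation.

```python
def recognize(read_panel_data, gesture_table, max_attempts=3):
    """s1: receive panel data; s2: run gesture algorithm recognition;
    s3: match against preset static gestures. On failure, return to s1."""
    for _ in range(max_attempts):
        data = read_panel_data()                             # s1
        gesture = tuple(1 if v >= 900 else 0 for v in data)  # s2 (toy rule)
        operation = gesture_table.get(gesture)               # s3
        if operation is not None:
            return operation                                 # match succeeded
    return None                                              # no match found

table = {(1, 1): "wake_display"}
reads = iter([(100, 120), (950, 980)])           # first read: no gesture yet
print(recognize(lambda: next(reads), table))     # prints "wake_display"
```

The first iteration sees no panel near saturation and loops back; the second read, with both panels above the threshold, matches the preset gesture and yields its operation.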
In this embodiment, the static gesture is determined from the change in capacitance of a sensing recognition panel after it is approached or touched, and the operation matched with that static gesture is then performed. This approach is only slightly affected by ambient light and achieves a relatively high recognition accuracy.
As shown in Figure 8, the present invention also provides a gesture recognition method based on a proximity sensor, including the steps of:
s1: within a preset acquisition time, receiving, once every preset time interval, the data of at least one sensing recognition panel collected by the proximity sensor;
The at least one sensing recognition panel is arranged on the screen or housing of the terminal and is used to form a static gesture through approach or touch. Generally, one proximity sensor 1410 may correspond to several sensing recognition panels, and the terminal may also be provided with multiple proximity sensors. When a gesture operation begins, the capacitance of the sensing recognition panel of the proximity sensor changes according to the approach of the hand; the approaching or touching part may include a finger, a finger pad, or a palm, and the present invention is not limited in this respect. In this embodiment, the capacitance of an approached or touched sensing recognition panel approaches the saturation value.
The preset time is measured in milliseconds, while the preset acquisition time is used to complete the input of a static gesture.
Referring to Figure 4, in this embodiment the sensing recognition unit 10 may include sensing recognition panels s0, s1, ..., sn. When n is 1, a finger is simulated approaching the s0 panel and then moving away from it, followed by approaching the s1 panel and moving away from it. The horizontal axis is time and the vertical axis is capacitance; c0 is the maximum saturation value, and c3 is the minimum value when no gesture is nearby. When the terminal is held statically, the capacitance values are all close or equal to the saturation value. Δt is the allowed time difference between the multiple sensing recognition panels, and can also be understood as the preset acquisition time in this embodiment. In this embodiment n is 1, giving two sensing recognition panels, so Δt is the time difference between sensing recognition panel s0 and sensing recognition panel s1.
Referring now to Figure 5, Figure 5 shows the gesture data processing for moving from holding the front of the terminal to holding its back. Only a single threshold is set here: a reading above this value is mapped to 1, and a reading below it to 0, where 1 represents approach and 0 represents moving away. In the example of Figure 5, during normal holding the hand approaches the s0 panel and the s1 panel almost simultaneously: the value of s0 goes 0 → 1 → 1, and the value of s1 goes 0 → 1 → 1. The required timing varies with the speed of the gesture, typically a time interval of tens to hundreds of milliseconds.
s2: performing gesture algorithm recognition on the data to obtain gesture data;
The data may include the number of the approached or touched sensing recognition panel, its location, and so on, from which the gesture data is interpreted;
s3: matching the gesture data against preset static gestures to obtain the corresponding operation.
The preset static gestures correspond one-to-one with the operations they are meant to achieve. When the gesture data is matched against the preset static gestures, if no match succeeds the method returns to step s1; if the match succeeds, the corresponding operation is obtained.
In this embodiment, the static gesture is determined from the change in capacitance of a sensing recognition panel after it is approached or touched, and the operation matched with that static gesture is then performed. This approach is only slightly affected by ambient light and achieves a relatively high recognition accuracy.
As shown in Figure 9, the present invention further provides a gesture recognition method based on a proximity sensor, including the steps of:
s1: within a preset acquisition time, receiving, once every preset time interval, the data of the sensing recognition unit, including at least one sensing recognition panel, collected by the proximity sensor;
The at least one sensing recognition panel is arranged on the screen or housing of the terminal and is used to form a static gesture through approach or touch. Generally, one proximity sensor 1410 may correspond to several sensing recognition panels, and the terminal may also be provided with multiple proximity sensors. When a gesture operation begins, the capacitance of the sensing recognition panel of the proximity sensor changes according to the approach of the hand; the approaching or touching part may include a finger, a finger pad, or a palm, and the present invention is not limited in this respect. In this embodiment, the capacitance of an approached or touched sensing recognition panel approaches the saturation value.
The preset time is measured in milliseconds, while the preset acquisition time is used to complete the input of a static gesture.
Referring to Figure 4, in this embodiment the sensing recognition unit 10 may include sensing recognition panels s0, s1, ..., sn. When n is 1, a finger is simulated approaching the s0 panel and then moving away from it, followed by approaching the s1 panel and moving away from it. The horizontal axis is time and the vertical axis is capacitance; c0 is the maximum saturation value, and c3 is the minimum value when no gesture is nearby. When the terminal is held statically, the capacitance values are all close or equal to the saturation value. Δt is the allowed time difference between the multiple sensing recognition panels, and can also be understood as the preset acquisition time in this embodiment. In this embodiment n is 1, giving two sensing recognition panels, so Δt is the time difference between sensing recognition panel s0 and sensing recognition panel s1.
Referring now to Figure 5, Figure 5 shows the gesture data processing for moving from holding the front of the terminal to holding its back. Only a single threshold is set here: a reading above this value is mapped to 1, and a reading below it to 0, where 1 represents approach and 0 represents moving away. In the example of Figure 5, during normal holding the hand approaches the s0 panel and the s1 panel almost simultaneously: the value of s0 goes 0 → 1 → 1, and the value of s1 goes 0 → 1 → 1. The required timing varies with the speed of the gesture, typically a time interval of tens to hundreds of milliseconds.
s2: performing gesture algorithm recognition on the data to obtain gesture data;
The data may include the number of the approached or touched sensing recognition panel, its location, and so on, from which the gesture data is interpreted;
s3: matching the gesture data against preset static gestures to obtain the corresponding operation.
The preset static gestures correspond one-to-one with the operations they are meant to achieve. When the gesture data is matched against the preset static gestures, if no match succeeds the method returns to step s1; if the match succeeds, the corresponding operation is obtained.
s4: executing the corresponding operation.
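With step s4 added, the full method is a match-then-execute pipeline. The sketch below wires an execution step onto the matching result; the operation names and handlers are hypothetical stand-ins for the application execution unit, not functionality described in the patent.

```python
# Hypothetical operation handlers standing in for the application
# execution unit of step s4.
HANDLERS = {
    "mute": lambda: "audio muted",
    "screenshot": lambda: "screen captured",
}

def run_pipeline(gesture_data, gesture_table):
    """s3: match gesture data against preset static gestures;
    s4: execute the corresponding operation when a match is found."""
    operation = gesture_table.get(gesture_data)   # s3
    if operation is None:
        return None        # no match: the method would return to s1
    return HANDLERS[operation]()                  # s4: execute

table = {(1, 0): "mute", (0, 1): "screenshot"}
print(run_pipeline((1, 0), table))  # prints "audio muted"
```

In a deployed system the handlers would invoke terminal functions (waking the display, taking a screenshot, and so on) rather than returning strings; the strings simply make the control flow observable.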
In this embodiment, the static gesture is determined from the change in capacitance of a sensing recognition panel after it is approached or touched, and the operation matched with that static gesture is then performed. This approach is only slightly affected by ambient light and achieves a relatively high recognition accuracy.
The gesture recognition method based on a proximity sensor proposed by the present invention, and its system, are mainly based on static gesture recognition with sensing recognition panels. Without touching the screen, buttons, or other controls of the terminal, the operation the user wishes to perform can be completed according to the defined gestures, improving the user's interactive experience with the terminal. The influence of ambient light can be eliminated and optical crosstalk excluded. The present invention features high recognition accuracy and low power consumption, and can be widely applied to end products such as mobile phones, tablets, and computers.
It should be noted that, herein, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. In the absence of further restrictions, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be realized by means of software plus a necessary general hardware platform, or of course by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product. This computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for causing a terminal device (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to execute the method described in each embodiment of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit the scope of the claims of the present invention. Any equivalent structural or flow transformation made using the contents of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.
Claims (10)
1. A gesture recognition system based on a proximity sensor, characterized by comprising:
a sensing recognition unit, including at least one sensing recognition panel arranged on the screen or housing of a terminal, for forming a static gesture through approach or touch;
a gesture predefinition unit, for matching preset static gestures with corresponding operations;
a gesture data processing unit, for receiving the data of the sensing recognition unit collected by the proximity sensor, performing gesture algorithm recognition on the data, and obtaining gesture data;
a gesture matching unit, for matching the gesture data against the preset static gestures and obtaining the corresponding operation.
2. The gesture recognition system based on a proximity sensor according to claim 1, characterized in that the capacitance of an approached or touched sensing recognition panel approaches the saturation value.
3. The gesture recognition system based on a proximity sensor according to claim 1, characterized in that the gesture data processing unit is further configured to receive, within a preset acquisition time and once every preset time interval, the data of the sensing recognition unit collected by the proximity sensor.
4. The gesture recognition system based on a proximity sensor according to claim 3, characterized in that the preset time is measured in milliseconds.
5. The gesture recognition system based on a proximity sensor according to claim 1, characterized by further comprising an application execution unit for executing the corresponding operation.
6. A gesture recognition method based on a proximity sensor, characterized by comprising:
receiving the data, collected by a proximity sensor, of at least one sensing recognition panel;
performing gesture algorithm recognition on the data to obtain gesture data;
matching the gesture data against preset static gestures to obtain the corresponding operation.
7. The gesture recognition method based on a proximity sensor according to claim 6, characterized in that the capacitance of an approached or touched sensing recognition panel approaches the saturation value.
8. The gesture recognition method based on a proximity sensor according to claim 6, characterized in that, within a preset acquisition time, the data of the at least one sensing recognition panel collected by the proximity sensor is received once every preset time interval.
9. The gesture recognition method based on a proximity sensor according to claim 8, characterized in that the preset time is measured in milliseconds.
10. The gesture recognition method based on a proximity sensor according to claim 6, characterized by further comprising, after obtaining the corresponding operation, the step of executing the corresponding operation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610819454.0A CN106371682A (en) | 2016-09-13 | 2016-09-13 | Gesture recognition system based on proximity sensor and method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610819454.0A CN106371682A (en) | 2016-09-13 | 2016-09-13 | Gesture recognition system based on proximity sensor and method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106371682A true CN106371682A (en) | 2017-02-01 |
Family
ID=57896746
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610819454.0A Pending CN106371682A (en) | 2016-09-13 | 2016-09-13 | Gesture recognition system based on proximity sensor and method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106371682A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106484271A (en) * | 2016-09-20 | 2017-03-08 | 努比亚技术有限公司 | A kind of device, method and mobile terminal controlling mobile terminal based on user gesture |
CN108109581A (en) * | 2018-01-16 | 2018-06-01 | 深圳鑫亿光科技有限公司 | Interactive LED display and its display methods |
CN109782616A (en) * | 2018-12-29 | 2019-05-21 | 青岛海尔空调器有限总公司 | Control method, device, storage medium and computer equipment based on induction arrays |
CN110244849A (en) * | 2019-06-17 | 2019-09-17 | Oppo广东移动通信有限公司 | Close to recognition methods, device, mobile terminal and storage medium |
CN110521219A (en) * | 2017-03-03 | 2019-11-29 | 共生国际大学 | Enable interactive wearable device as the system and method for the education supplement for listening barrier individual |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102663364A (en) * | 2012-04-10 | 2012-09-12 | 四川长虹电器股份有限公司 | Imitated 3D gesture recognition system and method |
CN102749995A (en) * | 2012-06-19 | 2012-10-24 | 上海华勤通讯技术有限公司 | Mobile terminal and mobile terminal commanding and controlling method |
US20150070299A1 (en) * | 2013-09-11 | 2015-03-12 | Samsung Electro-Mechanics Co., Ltd. | Touch sensor to recognize gesture and method of controlling the same |
- 2016-09-13: CN CN201610819454.0A patent/CN106371682A/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102663364A (en) * | 2012-04-10 | 2012-09-12 | 四川长虹电器股份有限公司 | Imitated 3D gesture recognition system and method |
CN102749995A (en) * | 2012-06-19 | 2012-10-24 | 上海华勤通讯技术有限公司 | Mobile terminal and mobile terminal commanding and controlling method |
US20150070299A1 (en) * | 2013-09-11 | 2015-03-12 | Samsung Electro-Mechanics Co., Ltd. | Touch sensor to recognize gesture and method of controlling the same |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106484271A (en) * | 2016-09-20 | 2017-03-08 | 努比亚技术有限公司 | A kind of device, method and mobile terminal controlling mobile terminal based on user gesture |
CN110521219A (en) * | 2017-03-03 | 2019-11-29 | 共生国际大学 | Enable interactive wearable device as the system and method for the education supplement for listening barrier individual |
CN108109581A (en) * | 2018-01-16 | 2018-06-01 | 深圳鑫亿光科技有限公司 | Interactive LED display and its display methods |
CN108109581B (en) * | 2018-01-16 | 2018-12-25 | 深圳鑫亿光科技有限公司 | Interactive LED display and its display methods |
CN109782616A (en) * | 2018-12-29 | 2019-05-21 | 青岛海尔空调器有限总公司 | Control method, device, storage medium and computer equipment based on induction arrays |
CN109782616B (en) * | 2018-12-29 | 2022-01-21 | 青岛海尔空调器有限总公司 | Control method and device based on induction array, storage medium and computer equipment |
CN110244849A (en) * | 2019-06-17 | 2019-09-17 | Oppo广东移动通信有限公司 | Close to recognition methods, device, mobile terminal and storage medium |
CN110244849B (en) * | 2019-06-17 | 2022-04-15 | Oppo广东移动通信有限公司 | Proximity recognition method and device, mobile terminal and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104750420B (en) | Screenshotss method and device | |
CN106453963A (en) | Mobile terminal antenna switching device and method | |
CN106911806A (en) | A kind of method of PUSH message, terminal, server and system | |
CN106371682A (en) | Gesture recognition system based on proximity sensor and method thereof | |
CN104777982B (en) | Method and device for switching terminal input method | |
CN106412301A (en) | Screen control system based on proximity sensor and mobile terminal | |
CN106657602A (en) | Conversation screen-off system based on proximity sensor and mobile terminal | |
CN105554710A (en) | Message display method and device | |
CN106383707A (en) | Picture display method and system | |
CN106250130A (en) | A kind of mobile terminal and the method for response button operation | |
CN106406621B (en) | A kind of mobile terminal and its method for handling touch control operation | |
CN106354268A (en) | Screen control system based on proximity sensor and mobile terminal | |
CN105183830B (en) | picture browsing method and device | |
CN106412328A (en) | Method and device for obtaining user feedback information | |
CN107071161A (en) | The aggregation display method and mobile terminal of icon in a kind of status bar | |
CN106161790A (en) | A kind of mobile terminal and control method thereof | |
CN106125946A (en) | A kind of control method, mobile terminal and helmet | |
CN105740668A (en) | Terminal screen locking device and method | |
CN106484271A (en) | A kind of device, method and mobile terminal controlling mobile terminal based on user gesture | |
CN106648052A (en) | Application triggering system based on proximity detector and mobile terminal | |
CN106341502A (en) | Proximity sensor-based application triggering system and mobile terminal | |
CN106843684A (en) | A kind of device and method, the mobile terminal of editing screen word | |
CN106254596A (en) | A kind of kneading identification system based on proximity transducer and mobile terminal | |
CN106534498A (en) | Control device and method of application folder and mobile terminal | |
CN106254597A (en) | A kind of gripping identification system based on proximity transducer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20170201 |