CN107688385A - Control method and device - Google Patents

Control method and device

Info

Publication number
CN107688385A
CN107688385A (application CN201610629333.XA)
Authority
CN
China
Prior art keywords
user
control
behavioural characteristic
eyeball
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610629333.XA
Other languages
Chinese (zh)
Inventor
涂畅
张扬
王砚峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sogou Technology Development Co Ltd
Original Assignee
Beijing Sogou Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sogou Technology Development Co Ltd filed Critical Beijing Sogou Technology Development Co Ltd
Priority to CN201610629333.XA priority Critical patent/CN107688385A/en
Publication of CN107688385A publication Critical patent/CN107688385A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Abstract

An embodiment of the present invention provides a control method and device. The method includes: collecting a user's eye behavioral features; when the eye behavioral features meet a preset condition, determining a control object according to the eye behavioral features and determining a control command corresponding to the eye behavioral features; and controlling the control object according to the control command. The embodiment enables an electronic device to be controlled without manual operation by the user, which is simple and convenient, frees the user's hands, and improves the user's operating efficiency.

Description

Control method and device
Technical field
Embodiments of the present invention relate to the field of electronic device technology, and in particular to a control method and device.
Background art
With the development of touch technology, electronic devices with touch-sensing units, such as mobile phones, tablet computers, notebook computers, and remote controls, are widely used. A user touches or approaches the touch-sensing unit with an operating body such as a finger or stylus, and the device uses the obtained contact-position information to perform the control operation. However, because the user must use a finger or stylus to control the device, at least one hand is occupied, which interferes with the user's other activities. When both of the user's hands are occupied by other tasks, the electronic device cannot be controlled conveniently. The device-control methods provided by the prior art therefore suffer from the defect of being inconvenient to use.
Summary of the invention
Embodiments of the present invention provide a control method and device that detect a user's eye behavioral features to control an electronic device, which is convenient for the user and improves the user experience.
To this end, embodiments of the present invention provide the following technical solutions:
In a first aspect, an embodiment of the present invention provides a control method, including: collecting a user's eye behavioral features; when the eye behavioral features meet a preset condition, determining a control object according to the eye behavioral features and determining a control command corresponding to the eye behavioral features; and controlling the control object according to the control command.
In a second aspect, an embodiment of the present invention provides a control device, including: an acquisition module configured to collect a user's eye behavioral features; a determining module configured to, when the eye behavioral features collected by the acquisition module meet a preset condition, determine a control object according to the eye behavioral features and determine a control command corresponding to the eye behavioral features; and a control module configured to control the control object according to the control command generated by the determining module.
In a third aspect, an embodiment of the present invention provides a control device, including a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs containing instructions for: collecting a user's eye behavioral features; when the eye behavioral features meet a preset condition, determining a control object according to the eye behavioral features and determining a control command corresponding to the eye behavioral features; and controlling the control object according to the control command.
With the control method and device provided by the embodiments of the present invention, a control object and a control command can be determined according to the collected eye behavioral features of the user, and a control operation can be performed on the control object. The method enables the user to control an electronic device without manual operation, which is simple and convenient, frees the user's hands, and improves the user's operating efficiency.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the drawings in the following description show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 schematically shows a scenario to which an embodiment of the present invention is applicable;
Fig. 2 is a flowchart of a control method provided by an embodiment of the present invention;
Fig. 3 is a flowchart of a control method provided by another embodiment of the present invention;
Fig. 4 schematically shows another scenario to which an embodiment of the present invention is applicable;
Fig. 5 is a schematic diagram of a control device provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of a control device provided by an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a server in an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention provide a control method and device that detect a user's eye behavioral features to control an electronic device, which is convenient for the user, improves the user's operating efficiency, and enhances the user experience.
To help those skilled in the art better understand the technical solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings of the embodiments. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, which shows an exemplary application scenario of an embodiment of the present invention. The method provided by the embodiment may be applied in the scenario shown in Fig. 1, in particular in an electronic device 100 such as that shown in Fig. 1. The electronic device 100 may be any existing electronic device or any electronic device under development or developed in the future, including but not limited to a desktop computer, a laptop computer, or a mobile terminal (including smartphones, feature phones, and various tablet computers). As shown in Fig. 1, the user interface of the electronic device 100 may include various display objects, such as display object 101 and display object 102. When operating the electronic device 100, a user 200 may move his or her eyeballs, for example from sight line 1 to the position pointed to by sight line 2 as shown in Fig. 1. In one possible application scenario, the method and device provided by the embodiment collect the eye behavioral features of the user 200. When the user moves his or her eyeballs so that the focus point moves from display object 101 (corresponding to sight line 1) to display object 102 (corresponding to sight line 2), the method and device determine that the moving direction of the eyeballs is from top to bottom and take the object the user is watching as the control object, so that the display object 102 the user wants to view is moved to the center of the device screen, which is convenient for viewing. Thus, the control object and the control command are determined according to the collected eye behavioral features, and the control object is controlled according to the control command. Of course, this is only an exemplary illustration; the method and device provided by the embodiments of the present invention may also be applied to other scenarios, which are not limited here. It should be noted that the above application scenario is shown only to facilitate understanding of the present invention; embodiments of the present invention are not restricted in this regard and may be applied to any applicable scenario.
The control method of the embodiments of the present invention is described below with reference to Fig. 2 and Fig. 3.
Referring to Fig. 2, which is a flowchart of a control method provided by an embodiment of the present invention. As shown in Fig. 2, the method may include:
S201: collect the user's eye behavioral features.
The control device provided by the present invention may include an acquisition module for collecting the user's eye behavioral features; the acquisition module may specifically be a camera. Of course, the control device may also obtain the eye behavioral features through an external channel, for example from another electronic device or apparatus communicatively connected to the control device. It should be noted that when the control device obtains the eye behavioral features through an external channel, the other electronic device or apparatus may be a stand-alone device with its own processor and an acquisition module for collecting the eye behavioral features. Alternatively, the control device may obtain the eye behavioral features through an externally connected acquisition module; in this case, the external acquisition module may be a module without a processor that simply implements the function of collecting the eye behavioral features.
The eye behavioral features may include, but are not limited to, any one or a combination of: the moving direction of the user's eyeballs, the moving speed of the eyeballs, the moving distance of the eyeballs, the starting position and/or end position of the eyeball movement, the size of the user's pupils, whether the user's eyes are open or closed, the spacing between the user's upper and lower eyelids, the number of times the user blinks, and the frequency at which the user blinks. Of course, these are only exemplary illustrations; the present invention does not limit the type of eye behavioral features collected. In a specific implementation, the control device may collect multiple frames of images of the user's eyes and obtain the eye behavioral features by comparing the frames. For example, images of the user's eyes may be captured at regular intervals, and changes in the eye state may be obtained by comparing adjacent frames. Of course, changes in the user's eye state may also be identified by other means and used as the eye behavioral features.
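The frame-to-frame comparison described above can be sketched as a small feature extractor. The `EyeSample` record, the pixel coordinate convention (y growing downward), and the field names are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass
import math

@dataclass
class EyeSample:
    t: float      # capture time (seconds)
    x: float      # horizontal gaze coordinate (pixels)
    y: float      # vertical gaze coordinate (pixels, growing downward)
    pupil: float  # pupil diameter (arbitrary units)

def eye_features(prev: EyeSample, curr: EyeSample) -> dict:
    """Compare two consecutive eye samples and derive behavioral features."""
    dx, dy = curr.x - prev.x, curr.y - prev.y
    distance = math.hypot(dx, dy)
    dt = curr.t - prev.t
    # The dominant axis decides the reported moving direction.
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return {
        "direction": direction,
        "distance": distance,
        "speed": distance / dt if dt > 0 else 0.0,
        "pupil_change": curr.pupil - prev.pupil,
    }
```

A real implementation would obtain the gaze coordinates and pupil size from camera frames; here they are taken as given so the comparison step itself stays visible.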
S202: when the user's eye behavioral features meet a preset condition, determine a control object according to the eye behavioral features and determine a control command corresponding to the eye behavioral features.
In a specific implementation, the eye behavioral features meeting a preset condition includes any one or a combination of the following:
(1) The moving distance and/or moving speed of the user's eyeballs exceeds a set threshold. For example, when the eyeballs move quickly, this may indicate that the user wants to switch the object being viewed; when the moving distance exceeds a set threshold, this may likewise indicate that the user wants to switch the object being viewed. In either case, the eye behavioral features may be determined to meet the preset condition. The threshold may be set empirically or as needed.
(2) The moving direction of the user's eyeballs matches a preset direction.
(3) The starting position and/or end position of the eyeball movement is a preset position.
(4) The size of the user's pupils increases or decreases.
(5) The spacing between the user's upper and lower eyelids increases or decreases.
(6) The number of times the user blinks matches a preset count.
(7) The frequency at which the user blinks exceeds a set threshold.
(8) Other conditions.
These are only exemplary illustrations; those skilled in the art may also set other conditions as the preset condition, which is not limited here.
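Conditions (1) and (2) above can be expressed as a single predicate over the extracted features. The numeric thresholds, the dictionary keys, and the default preset direction are illustrative assumptions, not values specified by the patent:

```python
def meets_precondition(features: dict,
                       distance_threshold: float = 80.0,
                       speed_threshold: float = 400.0,
                       preset_direction: str = "down") -> bool:
    """Return True when any configured trigger condition is satisfied:
    (1) moving distance or speed above a set threshold, or
    (2) moving direction matching the preset direction."""
    if features.get("distance", 0.0) > distance_threshold:
        return True
    if features.get("speed", 0.0) > speed_threshold:
        return True
    if features.get("direction") == preset_direction:
        return True
    return False
```

The remaining conditions (blink counts, pupil size, eyelid spacing) would be extra clauses of the same shape; thresholds would in practice be set empirically, as the text notes.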
In a specific implementation, when determining the control object corresponding to the eye behavioral features, the object the user is watching may be taken as the control object. For example, in some embodiments, the display object corresponding to the focus point of the user's eyes may be obtained and determined to be the control object. As shown in Fig. 1, the focus point corresponding to the user's sight line 1 is display object 101; if the collected eye behavioral feature is pupil dilation, display object 101 is taken as the control object, and the corresponding control command may be to enlarge display object 101. As another example, in some embodiments, the movement track of the user's eyeballs may be obtained, the starting position or end position of the track determined, and the display object corresponding to that position determined to be the control object. As shown in Fig. 1, the display object corresponding to the starting position of the track is 101 and the display object corresponding to the end position is 102; in this case, the region of display object 102 at the end position of the track may be taken as the control object. In other embodiments, the display object corresponding to the starting position of the track may be determined to be the control object: as shown in Fig. 4, display object 103 at the starting position of the track is taken as the control object and is moved to the position where it coincides with control 104.
In some embodiments, the method further includes: in response to a trigger operation by the user, displaying a first indicated object, where the first indicated object is used to indicate the movement track or focus position of the user's eyeballs. Determining the control object according to the eye behavioral features then includes: when the user's eyeballs are focused on the first indicated object, taking the first indicated object as the control object; or, when the first indicated object coincides with a first control, taking the first control as the control object.
In a specific implementation, the control command may be a command to move a display object, a command to enlarge or shrink a display object, a command to click or double-click a control object, and so on, which is not limited here. When determining the control command corresponding to the eye behavioral features, the control command may be determined according to a pre-saved correspondence between eye behavioral features and control commands.
Several possible implementations of how to determine the control object and the corresponding control command according to the eye behavioral features are described below.
In some embodiments, determining the control command corresponding to the eye behavioral features includes: determining the moving direction of the user's eyeballs, determining the moving direction of the control object according to the moving direction of the eyeballs, and generating a control command that moves the control object in that direction. For example, as shown in Fig. 1, assume the collected eye behavioral features are the moving direction, moving speed, and moving distance of the user's eyeballs. When the moving speed and/or moving distance exceeds a set threshold, the eye behavioral features are determined to meet the preset condition. When determining the control object, the entire region at the end position of the eyeball movement track (corresponding to display object 102) is taken as the control object. Then, the moving direction of the eyeballs is determined to be from top to bottom, so the moving direction of the control object is determined to be from bottom to top, and the generated control command moves the control object from bottom to top.
In some embodiments, determining the control command corresponding to the eye behavioral features includes: obtaining a preset target position and generating a control command that moves the control object to the target position. Taking Fig. 1 as an example again, assume that when determining the control object, the entire region at the end position of the eyeball movement track (corresponding to display object 102) is taken as the control object. The preset target position is the center of the display screen of the electronic device, so the generated control command moves the control object to the center of the screen. In this embodiment, there is no need to determine the moving direction of the control object; the control object is simply moved to the preset target position.
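The move-to-preset-target embodiment can be sketched as an incremental animation toward the screen center. The step size, the screen dimensions, and the function names are illustrative assumptions:

```python
import math

def step_toward(pos, target, step=20.0):
    """Advance a control object's (x, y) position one step toward the
    preset target; snap to the target on the final step."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= step:
        return target, True   # arrived
    scale = step / dist
    return (pos[0] + dx * scale, pos[1] + dy * scale), False

def move_to_center(pos, screen=(1080, 1920), step=20.0):
    """Repeatedly step the object until it reaches the screen center,
    the preset target position of this embodiment."""
    target = (screen[0] / 2, screen[1] / 2)
    arrived = False
    while not arrived:
        pos, arrived = step_toward(pos, target, step)
    return pos
```

Stepping (rather than teleporting) matches the later passage where movement can be stopped mid-way when the eyeballs stop moving.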
In some embodiments, the method may further include: when it is detected that the user's eyeballs stop moving, or that the user's eyes are focused on a preset position, stopping the operation of moving the control object. For example, when the control object has been moved to the center of the screen and the user's eyeballs stare at the center and stop moving, the movement of the control object may be stopped. As another example, if the user's eyeballs focus on a preset position, this may indicate that the display object at that position is exactly the object the user wants to view; in that case, the operation of moving the control object is stopped.
In some embodiments, the method may further include: when it is detected that the user's eyeballs have stopped moving and the user's eyes are focused on an edge of the display unit of the electronic device, controlling the control object to keep moving. In some implementations, if the user's eyeballs stop moving but remain focused on an edge of the screen, the control object may be controlled to keep moving until the user's focus point leaves the edge. Specifically, when the eyeballs focus on the top edge of the screen, the displayed objects are controlled to keep moving upward; when they focus on the bottom edge, to keep moving downward; on the left edge, to keep moving left; and on the right edge, to keep moving right.
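The edge-triggered continuous scrolling can be sketched as a lookup from the gaze focus point to a scroll direction. The margin width and the coordinate convention (origin top-left, y growing downward) are illustrative assumptions:

```python
def edge_scroll(focus, screen, margin=16):
    """Map a gaze focus point resting at a screen edge to a continuous
    scroll direction; return None when the focus leaves every edge,
    which stops the continuous movement."""
    x, y = focus
    w, h = screen
    if y <= margin:
        return "up"
    if y >= h - margin:
        return "down"
    if x <= margin:
        return "left"
    if x >= w - margin:
        return "right"
    return None
```

Calling this once per captured frame yields exactly the behavior described: movement persists while the focus stays on an edge and stops as soon as it leaves.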
In some embodiments, determining the control command corresponding to the eye behavioral features includes: when it is determined that the size of the user's pupils or the spacing between the upper and lower eyelids has increased, generating a command to enlarge the control object; when it is determined that the size of the pupils or the eyelid spacing has decreased, generating a command to shrink the control object. For example, the user dilates his or her pupils by opening the eyes wide; the device captures the wide-open eyes and dilated pupils and enlarges the map shown on the display screen. The user contracts his or her pupils by narrowing the eyes, and the map shown on the screen is shrunk. Opening the eyes wide manifests as an increase in the spacing between the upper and lower eyelids, that is, the eyelids open wider; similarly, narrowing the eyes manifests as a decrease in that spacing.
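The pupil-size / eyelid-spacing zoom rule can be sketched as follows. The change threshold and the convention of acting on whichever signal changed more are illustrative assumptions:

```python
def zoom_command(pupil_change: float, eyelid_change: float,
                 threshold: float = 0.2):
    """Generate a zoom command from the change in pupil size or in the
    spacing between the upper and lower eyelids; None means no command."""
    # Act on whichever of the two signals changed by the larger amount.
    if abs(pupil_change) >= abs(eyelid_change):
        change = pupil_change
    else:
        change = eyelid_change
    if change > threshold:
        return "zoom_in"    # eyes opened wide / pupils dilated
    if change < -threshold:
        return "zoom_out"   # eyes narrowed / pupils contracted
    return None
```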
In some embodiments, determining the control command corresponding to the eye behavioral features includes any one or a combination of the following:
(1) When the number of times the user blinks is a first preset count, generating a command to enlarge the control object. In a specific implementation, the command may be generated when the number of consecutive blinks within a preset time period is the first preset count; for example, blinking once within 2 seconds may be set as the command to enlarge the control object.
(2) When the number of blinks is a second preset count, generating a command to shrink the control object. In a specific implementation, the command may be generated when the number of blinks within a preset time period is the second preset count; for example, blinking twice within 2 seconds may be set as the command to shrink the control object.
(3) When the number of blinks is a third preset count, generating a command to click the control object. In a specific implementation, the command may be generated when the number of blinks within a preset time period is the third preset count; for example, blinking four times within 2 seconds may be set as the command to click the control object.
(4) When the number of blinks is a fourth preset count, generating a command to double-click the control object. In a specific implementation, the command may be generated when the number of blinks within a preset time period is the fourth preset count; for example, blinking six times within 2 seconds may be set as the command to double-click the control object.
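The four blink-count rules above, using the example counts from the text (1, 2, 4, and 6 blinks within the 2-second window), reduce to a lookup table. The command names are illustrative assumptions:

```python
BLINK_COMMANDS = {
    1: "zoom_in",       # first preset count  -> enlarge control object
    2: "zoom_out",      # second preset count -> shrink control object
    4: "click",         # third preset count  -> click control object
    6: "double_click",  # fourth preset count -> double-click control object
}

def blink_command(blinks_in_window: int):
    """Map the number of blinks observed in the preset time window to a
    control command; return None for counts with no assigned command."""
    return BLINK_COMMANDS.get(blinks_in_window)
```

This is one concrete form of the "pre-saved correspondence between eye behavioral features and control commands" mentioned earlier.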
Of course, these are only exemplary illustrations, and the present invention is not limited thereto.
S203: control the control object according to the control command.
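Executing the determined command on the control object (step S203) can be sketched as a small dispatcher over a minimal object state. The state fields and the scale factors are illustrative assumptions:

```python
def apply_command(obj: dict, command: str) -> dict:
    """Execute a control command on a minimal control-object state."""
    if command == "zoom_in":
        obj["scale"] = obj.get("scale", 1.0) * 1.25
    elif command == "zoom_out":
        obj["scale"] = obj.get("scale", 1.0) * 0.8
    elif command == "click":
        obj["clicks"] = obj.get("clicks", 0) + 1
    elif command == "double_click":
        obj["clicks"] = obj.get("clicks", 0) + 2
    # Unrecognized commands leave the state unchanged.
    return obj
```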
To help those skilled in the art understand more clearly how the embodiments of the present application work in a concrete scenario, a specific example is described below. It should be noted that the example is provided only to aid understanding; embodiments of the present invention are not limited to it.
The method shown in Fig. 1 and Fig. 2 is illustrated below with an example. In one possible application scenario, the method and device provided by the present invention may be used while the user interacts with a map application. The user opens the map application to view a map. The control device provided by the present invention captures the user's eyeballs in real time through a camera and identifies the user's operation intent. For example, by capturing the moving direction of the user's eyeballs relative to the direction in which the user faces the map, the map display interface is moved up, down, left, or right; when the eyeballs stop moving and stare at the map, the map display interface stops moving. To illustrate: images of the user's eyes may be acquired at set intervals. When the user's eyeballs shift to the left, the camera captures this image in real time; after comparing it with the previous image, the user's shift direction is obtained, so the display interface of the map application can be moved from left to right in real time. When the region the user wants to view reaches the middle of the screen, the user's eyeballs stop shifting and face the middle region of the map; the camera captures this change in real time, and the map stops moving at once. Similarly, when the eyeballs shift to the right, the map application captures the eyeball movement through the camera and moves the region the user wants to view leftward to the center of the screen. Likewise, by tracking the eyeballs, upward, downward, upper-left, upper-right, lower-left, lower-right, and other movements can be performed. In addition, when the user concentrates and the pupils dilate (or the eyes open wide), the map is enlarged by capturing the pupil-dilation (or eye-opening) information; and when the pupils contract (or the eyes narrow), the map is shrunk. The embodiments of the present invention realize the corresponding control by collecting eye behavioral features, thereby meeting the user's various interaction needs. In this way, the user is freed from tedious manual operation and can perform basic positioning, panning, and zooming of an application's display objects without touching the screen.
The description above takes the entire region of the display object corresponding to the user's focus point as the control object. Of course, those skilled in the art will appreciate that a single element corresponding to the focus point may also be taken as the control object. Another example is described below with reference to Fig. 3 and Fig. 4.
S301: in response to a trigger operation by the user, display a first indicated object.
The first indicated object is used to indicate the movement track or focus position of the user's eyeballs, and may for example be the arrow-shaped cursor 103 shown in Fig. 4. The trigger operation may be a preset contact operation, such as the user touching or long-pressing the screen. Of course, it may also be a contactless operation, such as the user blinking twice within a preset time. The type of trigger operation is not limited here; any operation that can be distinguished from other operations and can trigger the display of the first indicated object will do.
S302: when the user's eyeballs are focused on the first indicated object, take the first indicated object as the control object.
S303: collect the moving direction and/or movement track of the user's eyeballs, and determine the control command.
For example, the moving direction of the user's eyeballs may be collected to determine the moving direction of the first indicated object and move it accordingly. As shown in Fig. 4, when the moving direction of the eyeballs is determined to be from top to bottom, the first indicated object 103 is moved from top to bottom. As another example, the movement track of the eyeballs may be collected, and the first indicated object 103 moved from the position focused on by sight line 3 to the position focused on by sight line 4.
S304: move the first indicated object according to the control command.
S305: collect the user's eye behavioral features; when the first indicated object coincides with a first control, take the first control as the control object.
As shown in Fig. 4, display object 104 is a control that can realize a certain function. Through the above operations, the first indicated object is moved from the position focused on by sight line 3 to the position focused on by sight line 4, at which point the first indicated object 103 coincides or partially coincides with display object 104, i.e. the first control. For example, when the user's eye behavioral feature is focusing on the first indicated object and blinking three times, the control coinciding with the first indicated object (such as 104 in Fig. 4) may be taken as the control object, and the corresponding control command generated.
S306: according to the correspondence between eye behavioral features and control commands, generate a command to click the first control.
This is further described with reference to Fig. 4, which shows another exemplary application scenario of an embodiment of the present invention. The method provided by the embodiment may be applied in the scenario shown in Fig. 4, in particular in an electronic device 100 such as that shown in Fig. 4. The electronic device 100 may be any existing electronic device or any electronic device under development or developed in the future, including but not limited to a desktop computer, a laptop computer, or a mobile terminal (including smartphones, feature phones, and various tablet computers). As shown in Fig. 4, the user interface of the electronic device 100 may include various display objects, such as display object 103 and display object 104, where display object 103 may for example be an arrow-shaped cursor and display object 104 may for example be a control that can realize a certain function. When operating the electronic device 100, the user 200 may move his or her eyeballs, for example from sight line 3 to the position pointed to by sight line 4 as shown in Fig. 4. In one possible application scenario, the method and device provided by the embodiment collect the eye behavioral features of the user 200, determine the display object corresponding to the focus point of sight line 3 to be the control object 103, and, according to the moving direction of the eyeballs, move the control object 103 from the position focused on by sight line 3 to the position focused on by sight line 4, for example the position coinciding with control 104; the dotted arrow 103 in Fig. 4 is the control object 103 after the movement. Thus, the control object and the control command are determined according to the collected eye behavioral features, and the control object is controlled according to the control command. Of course, this is only an exemplary illustration; the method and device provided by the embodiments of the present invention may also be applied to other scenarios, which are not limited here. It should be noted that the above application scenario is shown only to facilitate understanding of the present invention; embodiments of the present invention are not restricted in this regard and may be applied to any applicable scenario.
The application scenario of Fig. 4 is further illustrated below, again taking a map application as an example. A cursor (e.g., 103) can be displayed in the map application, and the cursor is moved by tracking the user's eyeballs. When the cursor moves to a specific position, for example when it overlaps the first control 104, blinking twice can perform a click operation on control 104. When the cursor is not over a control region, blinking twice or blinking three times can perform a zoom-in or zoom-out operation, respectively.
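The map-application example above can be sketched as a small dispatch function. This is a minimal illustration under stated assumptions: the function name, command strings, and the rule that unmapped blink counts do nothing are all assumptions, not details from the patent.

```python
# Minimal sketch of the map-application example: two blinks over a control
# click it; over an empty map area, two blinks zoom in and three zoom out.
# Names and the "no_op" fallback are assumptions for illustration.

def handle_blinks(blink_count, cursor_on_control):
    if cursor_on_control:
        return "click_control" if blink_count == 2 else "no_op"
    if blink_count == 2:
        return "zoom_in"
    if blink_count == 3:
        return "zoom_out"
    return "no_op"
```

The same blink count thus maps to different commands depending on whether the cursor overlaps a control, which is the core of this scenario.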
In some embodiments, the method provided by the embodiment of the present invention can control multiple elements, or multiple elements as a whole, as in the application scenario shown in Fig. 1. In some embodiments, the method can control a single element, as in the application scenario shown in Fig. 4, where the cursor 103 can be controlled. In another embodiment, a combination of the two modes can be realized, i.e., both overall control of multiple elements and control of a single element. Taking element movement as an example, the method can collect the motion trajectory of the eyeballs and move a single element, such as the cursor 103, from the position focused by line of sight 3 to the position focused by line of sight 4. When the user's eyeballs move and the end position of the eye motion trajectory is at the edge of the display unit of the electronic device, the multiple elements can be controlled to move as a whole; for example, all display elements shown on the display screen can then be moved up, down, left, or right together. Of course, this is only an exemplary illustration and is not intended to limit the present invention.
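The combined mode described above can be sketched as an edge test on the trajectory's end point. The screen size, the edge margin, and the command names below are assumed values for illustration only.

```python
# Sketch of the combined mode (assumed geometry): move the single element
# along the gaze trajectory, but if the trajectory ends at the display edge,
# pan all displayed elements together instead.

SCREEN_W, SCREEN_H = 1920, 1080
EDGE_MARGIN = 10  # px; end points this close to the border count as "edge"

def at_screen_edge(x, y):
    return (x <= EDGE_MARGIN or y <= EDGE_MARGIN
            or x >= SCREEN_W - EDGE_MARGIN or y >= SCREEN_H - EDGE_MARGIN)

def dispatch_eye_move(end_x, end_y):
    """Choose between single-element movement and whole-view panning."""
    return "pan_all_elements" if at_screen_edge(end_x, end_y) else "move_cursor"
```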
The method provided by the embodiment of the present invention has been described in detail above; the control device provided by the embodiment is described below.
Refer to Fig. 5, a schematic diagram of a control device provided by an embodiment of the present invention.
A control device 500 includes:
an acquisition module, configured to collect user eye behavior features;
a determining module, configured to, when the user eye behavior features collected by the acquisition module meet a preset condition, determine a control object according to the user eye behavior features and determine a control command corresponding to the eye behavior features; and
a control module, configured to realize control of the control object according to the control command generated by the determining module.
In some embodiments, the acquisition module is specifically configured to collect any one or a combination of: the moving direction of the user's eyeballs, the moving speed of the user's eyeballs, the moving distance of the user's eyeballs, the start position and/or end position of the eyeball movement, the size of the user's pupils, whether the user's eyes are open or closed, the gap between the user's upper and lower eyelids, the number of times the user blinks, and the frequency of the user's blinks.
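One possible way (an assumption, not the patent's data model) to bundle the features listed above into a single record for the determining module to inspect:

```python
# Assumed record type gathering the listed eye-behavior features; field
# names, types, and units are illustrative choices, not from the patent.
from dataclasses import dataclass

@dataclass
class EyeBehavior:
    move_direction: tuple   # e.g. unit vector (dx, dy) of eyeball movement
    move_speed: float       # px/s
    move_distance: float    # px
    start_pos: tuple
    end_pos: tuple
    pupil_size: float
    eyes_open: bool         # open vs. closed state
    eyelid_gap: float       # distance between upper and lower eyelids
    blink_count: int
    blink_frequency: float  # blinks per second
```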
In some embodiments, the determining module specifically includes any one or more of the following submodules:
a first determination submodule, configured to determine that the user eye behavior features meet the preset condition when the moving distance and/or moving speed of the user's eyeballs is greater than a set threshold;
a second determination submodule, configured to determine that the user eye behavior features meet the preset condition when the moving direction of the user's eyeballs is the same as a preset direction;
a third determination submodule, configured to determine that the user eye behavior features meet the preset condition when the start position and/or end position of the eyeball movement is a preset position;
a fourth determination submodule, configured to determine that the user eye behavior features meet the preset condition when the size of the user's pupils becomes larger or smaller;
a fifth determination submodule, configured to determine that the user eye behavior features meet the preset condition when the gap between the user's upper and lower eyelids becomes larger or smaller;
a sixth determination submodule, configured to determine that the user eye behavior features meet the preset condition when the number of times the user blinks meets a preset number; and
a seventh determination submodule, configured to determine that the user eye behavior features meet the preset condition when the frequency of the user's blinks is greater than a set threshold.
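The seven determination submodules above can be sketched as predicates over a feature dictionary. Every threshold and preset value below is a placeholder assumption; the patent does not specify concrete values.

```python
# Hedged sketch of the seven "preset condition" checks; a feature set passes
# if any predicate fires. All thresholds/presets are assumed placeholders.

DISTANCE_THRESHOLD = 50.0     # px, assumed
SPEED_THRESHOLD = 200.0       # px/s, assumed
PRESET_DIRECTION = "right"    # assumed
PRESET_POSITIONS = {(0, 0)}   # assumed
PRESET_BLINK_COUNTS = {2, 3}  # assumed
BLINK_FREQ_THRESHOLD = 2.0    # blinks/s, assumed

def meets_preset_condition(f):
    checks = [
        f.get("move_distance", 0) > DISTANCE_THRESHOLD,      # submodule 1
        f.get("move_speed", 0) > SPEED_THRESHOLD,            # submodule 1
        f.get("move_direction") == PRESET_DIRECTION,         # submodule 2
        f.get("start_pos") in PRESET_POSITIONS
            or f.get("end_pos") in PRESET_POSITIONS,         # submodule 3
        f.get("pupil_delta", 0) != 0,                        # submodule 4
        f.get("eyelid_gap_delta", 0) != 0,                   # submodule 5
        f.get("blink_count") in PRESET_BLINK_COUNTS,         # submodule 6
        f.get("blink_frequency", 0) > BLINK_FREQ_THRESHOLD,  # submodule 7
    ]
    return any(checks)
```

Since the module list says "any one or more submodules", `any()` models the disjunctive reading; a real device might instead enable only a subset of checks.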
In some embodiments, the determining module is specifically configured to: obtain the display object corresponding to the focus point of the user's eyes and determine that display object as the control object; or obtain the motion trajectory of the user's eyeballs, determine the start position or end position of the trajectory, and determine the display object corresponding to that start or end position as the control object.
In some embodiments, the determining module is specifically configured to: determine the moving direction of the user's eyeballs, determine the moving direction of the control object according to the moving direction of the user's eyeballs, and generate a control command to move the control object according to the moving direction of the control object; or obtain a preset target position and generate a control command to move the control object to that target position.
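The two command-generation paths of this module can be sketched as two small constructors. The command dictionary layout and names are assumptions chosen for illustration.

```python
# Sketch of the two movement-command paths (assumed command format): derive a
# command from the eyeball's moving direction, or move straight to a target.

def command_from_eye_direction(dx, dy):
    """Move the control object in the same direction as the eyeballs."""
    return {"op": "move_by", "direction": (dx, dy)}

def command_to_preset_target(target_pos):
    """Move the control object to a preset target position."""
    return {"op": "move_to", "target": target_pos}
```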
In some embodiments, the control module is further configured to: stop the operation of moving the control object when it is detected that the user's eyeballs stop moving or the user's eyes focus on a preset position.
In some embodiments, the control module is further configured to: control the control object to keep moving when it is detected that the user's eyeballs stop moving and the user's eyes focus on the edge of the display unit of the electronic device.
In some embodiments, the determining module is specifically configured to: generate a command to enlarge the control object when it is determined that the user's pupil size or the gap between the upper and lower eyelids becomes larger; and generate a command to shrink the control object when it is determined that the user's pupil size or the eyelid gap becomes smaller.
In some embodiments, the determining module is specifically configured to: generate a command to enlarge the control object when the number of blinks is a first preset number; and/or generate a command to shrink the control object when the number of blinks is a second preset number; and/or generate a command to click the control object when the number of blinks is a third preset number; and/or generate a command to double-click the control object when the number of blinks is a fourth preset number.
In some embodiments, the device further includes:
a display module, configured to display a first indication object in response to a trigger operation by the user, where the first indication object is used to indicate the motion trajectory or focus position of the user's eyeballs.
The determining module is further configured to: take the first indication object as the control object when the user's eyeballs focus on the first indication object; or take the first control as the control object when the first indication object overlaps the first control.
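The control-object selection described above can be illustrated with a rectangle-overlap test. The rectangle representation, function names, and the priority given to the overlap case are assumptions for this sketch, not details fixed by the patent.

```python
# Illustrative control-object selection: the first indication object is chosen
# while the gaze focuses on it; an overlapping first control is chosen once
# the indication object coincides with it. Geometry and names are assumed.

def rects_overlap(a, b):
    """a, b are (x, y, width, height) axis-aligned rectangles."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def pick_control_object(indicator, control, gaze_on_indicator):
    if rects_overlap(indicator, control):
        return "first_control"
    if gaze_on_indicator:
        return "first_indication_object"
    return None
```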
The arrangement of each unit or module of the device of the present invention can be implemented with reference to the methods shown in Figs. 1 to 4, and is not repeated here.
Refer to Fig. 6, a block diagram of a control device according to an exemplary embodiment. For example, the device 600 can be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to Fig. 6, the device 600 can include one or more of the following components: a processing component 602, a memory 604, a power supply component 606, a multimedia component 608, an audio component 610, an input/output (I/O) interface 612, a sensor component 614, and a communication component 616.
The processing component 602 generally controls the overall operation of the device 600, such as operations associated with display, telephone calls, data communication, camera operations, and recording operations. The processing component 602 can include one or more processors 620 to execute instructions, so as to complete all or part of the steps of the above method. In addition, the processing component 602 can include one or more modules to facilitate interaction between the processing component 602 and other components; for example, it can include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support operation of the device 600. Examples of such data include instructions for any application program or method operated on the device 600, contact data, phone book data, messages, pictures, video, and so on. The memory 604 can be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The power supply component 606 provides power to the various components of the device 600. It can include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 600.
The multimedia component 608 includes a screen providing an output interface between the device 600 and the user. In some embodiments, the screen can include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it can be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the panel. The touch sensors can sense not only the boundary of a touch or swipe action but also the duration and pressure associated with it. In some embodiments, the multimedia component 608 includes a front camera and/or a rear camera. When the device 600 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or rear camera can receive external multimedia data. Each front or rear camera can be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 610 is configured to output and/or input audio signals. For example, the audio component 610 includes a microphone (MIC) configured to receive external audio signals when the device 600 is in an operating mode, such as a call mode, a recording mode, or a speech recognition mode. The received audio signals can be further stored in the memory 604 or sent via the communication component 616. In some embodiments, the audio component 610 also includes a speaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, which can be a keyboard, a click wheel, buttons, and the like. These buttons can include, but are not limited to: a home button, volume buttons, a start button, and a lock button.
The sensor component 614 includes one or more sensors for providing state assessments of various aspects of the device 600. For example, the sensor component 614 can detect the open/closed state of the device 600 and the relative positioning of components, such as the display and keypad of the device 600; it can also detect a change in position of the device 600 or one of its components, the presence or absence of user contact with the device 600, the orientation or acceleration/deceleration of the device 600, and a change in its temperature. The sensor component 614 can include a proximity sensor, configured to detect the presence of nearby objects without any physical contact, and an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 614 can also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate wired or wireless communication between the device 600 and other devices. The device 600 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 616 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In another exemplary embodiment, the communication component 616 also includes a near-field communication (NFC) module to facilitate short-range communication; for example, the NFC module can be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 600 can be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
Specifically, an embodiment of the present invention provides a control device 600, which includes a memory 604 and one or more programs, where the one or more programs are stored in the memory 604 and are configured to be executed by one or more processors 620, and contain instructions for performing the following operations:
collecting user eye behavior features;
when the user eye behavior features meet a preset condition, determining a control object according to the user eye behavior features and determining a control command corresponding to the eye behavior features; and
realizing control of the control object according to the control command.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, for example the memory 604 including instructions, where the instructions can be executed by the processor 620 of the device 600 to complete the above method. For example, the non-transitory computer-readable storage medium can be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium, where, when the instructions in the storage medium are executed by the processor of an electronic device, the electronic device is enabled to perform a control method, the method including:
collecting user eye behavior features;
when the user eye behavior features meet a preset condition, determining a control object according to the user eye behavior features and determining a control command corresponding to the eye behavior features; and
realizing control of the control object according to the control command.
Fig. 7 is a schematic structural diagram of a server in an embodiment of the present invention. The server 700 can vary considerably depending on configuration or performance, and can include one or more central processing units (CPUs) 722 (for example, one or more processors), a memory 732, and one or more storage media 730 (such as one or more mass storage devices) storing application programs 742 or data 744. The memory 732 and the storage medium 730 can provide transient or persistent storage. The programs stored in the storage medium 730 can include one or more modules (not shown), each of which can include a series of instruction operations on the server. Further, the central processing unit 722 can be configured to communicate with the storage medium 730 and execute, on the server 700, the series of instruction operations in the storage medium 730.
The server 700 can also include one or more power supplies 726, one or more wired or wireless network interfaces 750, one or more input/output interfaces 758, one or more keyboards 756, and/or one or more operating systems 741, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
Other embodiments of the present invention will readily occur to those skilled in the art after considering the specification and practicing the invention disclosed herein. The present application is intended to cover any variations, uses, or adaptations of the present invention that follow its general principles and include common knowledge or conventional technical means in the art not disclosed herein. The specification and embodiments are to be considered exemplary only, with the true scope and spirit of the invention indicated by the following claims.
It should be understood that the present invention is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes can be made without departing from its scope. The scope of the present invention is limited only by the appended claims.
The above are only preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within its scope of protection.
It should be noted that, herein, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprising", "including", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes that element. The present invention can be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types. The present invention can also be practiced in distributed computing environments, where tasks are performed by remote processing devices connected through a communication network; in such environments, program modules can be located in both local and remote computer storage media, including storage devices.
The embodiments in this specification are described in a progressive manner; identical or similar parts between the embodiments can be referred to mutually, and each embodiment focuses on its differences from the others. In particular, since the device embodiments are substantially similar to the method embodiments, their description is relatively simple, and relevant parts can be found in the description of the method embodiments. The device embodiments described above are merely schematic: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e., they can be located in one place or distributed over multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the embodiment, which those of ordinary skill in the art can understand and implement without creative work. The above are only specific implementations of the present invention; it should be pointed out that those skilled in the art can make several improvements and modifications without departing from the principles of the invention, and such improvements and modifications shall also be regarded as within the protection scope of the present invention.

Claims (10)

  1. A control method, characterized by comprising:
    collecting user eye behavior features;
    when the user eye behavior features meet a preset condition, determining a control object according to the user eye behavior features and determining a control command corresponding to the eye behavior features; and
    realizing control of the control object according to the control command.
  2. The method according to claim 1, characterized in that the user eye behavior features comprise any one or a combination of: the moving direction of the user's eyeballs, the moving speed of the user's eyeballs, the moving distance of the user's eyeballs, the start position and/or end position of the eyeball movement, the size of the user's pupils, whether the user's eyes are open or closed, the gap between the user's upper and lower eyelids, the number of times the user blinks, and the frequency of the user's blinks.
  3. The method according to claim 1 or 2, characterized in that the user eye behavior features meeting a preset condition comprises any one or a combination of the following:
    the moving distance and/or moving speed of the user's eyeballs is greater than a set threshold;
    the moving direction of the user's eyeballs is the same as a preset direction;
    the start position and/or end position of the eyeball movement is a preset position;
    the size of the user's pupils becomes larger or smaller;
    the gap between the user's upper and lower eyelids becomes larger or smaller;
    the number of times the user blinks meets a preset number;
    the frequency of the user's blinks is greater than a set threshold.
  4. The method according to claim 1, characterized in that the determining a control object according to the user eye behavior features comprises:
    obtaining the display object corresponding to the focus point of the user's eyes and determining the display object as the control object; or
    obtaining the motion trajectory of the user's eyeballs, determining the start position or end position of the trajectory, and determining the display object corresponding to the start position or end position as the control object.
  5. The method according to claim 1, characterized in that the method further comprises:
    displaying a first indication object in response to a trigger operation by the user, where the first indication object is used to indicate the motion trajectory or focus position of the user's eyeballs;
    the determining a control object according to the user eye behavior features comprises:
    taking the first indication object as the control object when the user's eyeballs focus on the first indication object; or taking the first control as the control object when the first indication object overlaps a first control.
  6. The method according to claim 1, characterized in that the determining a control command corresponding to the eye behavior features comprises:
    determining the moving direction of the user's eyeballs, determining the moving direction of the control object according to the moving direction of the user's eyeballs, and generating a control command to move the control object according to the moving direction of the control object;
    or
    obtaining a preset target position and generating a control command to move the control object to the target position.
  7. The method according to claim 6, characterized in that the method further comprises:
    stopping the operation of moving the control object when it is detected that the user's eyeballs stop moving or the user's eyes focus on a preset position.
  8. The method according to claim 6, characterized in that the method further comprises:
    controlling the control object to keep moving when it is detected that the user's eyeballs stop moving and the user's eyes focus on the edge of the display unit of the electronic device.
  9. A control device, characterized by comprising:
    an acquisition module, configured to collect user eye behavior features;
    a determining module, configured to, when the user eye behavior features collected by the acquisition module meet a preset condition, determine a control object according to the user eye behavior features and determine a control command corresponding to the eye behavior features; and
    a control module, configured to realize control of the control object according to the control command generated by the determining module.
  10. A control device, characterized by comprising a memory and one or more programs, where the one or more programs are stored in the memory and are configured to be executed by one or more processors, the one or more programs containing instructions for performing the following operations:
    collecting user eye behavior features;
    when the user eye behavior features meet a preset condition, determining a control object according to the user eye behavior features and determining a control command corresponding to the eye behavior features; and
    realizing control of the control object according to the control command.
CN201610629333.XA (filed 2016-08-03, priority 2016-08-03) — A control method and device — status: pending; published as CN107688385A.

Publication: CN107688385A, published 2018-02-13. Family ID: 61151296.

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1740951A (en) * 2004-08-25 2006-03-01 西门子公司 Apparatus for controlling equipment by human eyes
CN1889016A (en) * 2006-07-25 2007-01-03 周辰 Eye-to-computer cursor automatic positioning controlling method and system
US20110169730A1 (en) * 2008-06-13 2011-07-14 Pioneer Corporation Sight line input user interface unit, user interface method, user interface program, and recording medium with user interface program recorded
CN103914151A (en) * 2014-04-08 2014-07-09 小米科技有限责任公司 Information display method and device
CN104951070A (en) * 2015-06-02 2015-09-30 无锡天脉聚源传媒科技有限公司 Method and device for manipulating device based on eyes
CN105095849A (en) * 2014-05-23 2015-11-25 财团法人工业技术研究院 Object identification method and device
CN105338192A (en) * 2015-11-25 2016-02-17 努比亚技术有限公司 Mobile terminal and operation processing method thereof

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108563238A (en) * 2018-06-15 2018-09-21 Goertek Technology Co., Ltd. A kind of method, apparatus of remote controlled drone, equipment and system
CN108563238B (en) * 2018-06-15 2021-08-24 Goertek Technology Co., Ltd. Method, device, equipment and system for remotely controlling unmanned aerial vehicle
CN108613683A (en) * 2018-06-26 2018-10-02 WM Smart Mobility Technology (Shanghai) Co., Ltd. On-vehicle navigation apparatus, method and automobile
CN109298782A (en) * 2018-08-31 2019-02-01 Alibaba Group Holding Ltd. Eye movement exchange method, device and computer readable storage medium
CN109298782B (en) * 2018-08-31 2022-02-18 Advanced New Technologies Co., Ltd. Eye movement interaction method and device and computer readable storage medium
CN110514219A (en) * 2019-09-20 2019-11-29 Guangzhou Xiaopeng Motors Technology Co., Ltd. A kind of navigation map display methods, device, vehicle and machine readable media
CN110514219B (en) * 2019-09-20 2022-03-18 Guangzhou Xiaopeng Motors Technology Co., Ltd. Navigation map display method and device, vehicle and machine readable medium
CN110825228A (en) * 2019-11-01 2020-02-21 Tencent Technology (Shenzhen) Co., Ltd. Interaction control method and device, storage medium and electronic device
CN110908513A (en) * 2019-11-18 2020-03-24 Vivo Mobile Communication Co., Ltd. Data processing method and electronic equipment
CN110908513B (en) * 2019-11-18 2022-05-06 Vivo Mobile Communication Co., Ltd. Data processing method and electronic equipment
CN113138659A (en) * 2020-01-16 2021-07-20 7invensun (Shenzhen) Technology Co., Ltd. Method, device and equipment for controlling working mode and storage medium
CN114327082A (en) * 2022-03-04 2022-04-12 Shenzhen Xinrun Fulian Digital Technology Co., Ltd. Method and system for controlling industrial application screen, terminal device and storage medium

Similar Documents

Publication Publication Date Title
CN107688385A (en) A kind of control method and device
CN104571922B (en) Touch-responsive method, apparatus and terminal
CN104090721B (en) terminal control method and device
CN104461255B (en) Page display method and device, electronic equipment
CN106371688A (en) Full-screen single-hand operation method and apparatus
CN104536684B (en) interface display method and device
CN106572299A (en) Camera switching-on method and device
CN108536365A (en) A kind of images share method and terminal
CN106127129A (en) Fingerprint typing reminding method and device
CN104715757A (en) Terminal voice control operation method and device
CN106020796A (en) Interface display method and device
WO2020238647A1 (en) Hand gesture interaction method and terminal
CN103995666A (en) Method and device for setting work mode
CN110908513B (en) Data processing method and electronic equipment
CN107515669A (en) Display methods and device
CN107529699A (en) Control method of electronic device and device
CN106909256A (en) Screen control method and device
CN107704190A (en) Gesture identification method, device, terminal and storage medium
CN105892881A (en) Human-computer interaction method and device, and mobile equipment
CN105335061A (en) Information display method and apparatus and terminal
CN105094297A (en) Display content zooming method and display content zooming device
CN112114653A (en) Terminal device control method, device, equipment and storage medium
CN107544686A (en) Operation performs method and device
CN113747073A (en) Video shooting method and device and electronic equipment
CN103543825A (en) Camera cursor system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180213