CN105787485A - Identification clicking operation device and identification clicking operation method - Google Patents

Identification clicking operation device and identification clicking operation method

Info

Publication number
CN105787485A
CN105787485A
Authority
CN
China
Prior art keywords
depth
clicking operation
infrared
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410826952.9A
Other languages
Chinese (zh)
Other versions
CN105787485B (en)
Inventor
陈悦
王琳
崔恒利
庄凯
王竞
李翔
周扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201410826952.9A priority Critical patent/CN105787485B/en
Publication of CN105787485A publication Critical patent/CN105787485A/en
Application granted granted Critical
Publication of CN105787485B publication Critical patent/CN105787485B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a device and method for identifying a clicking operation, which can reduce the high precision requirement placed on the equipment, are suitable for small-sized devices, reduce identification errors, and improve the use effect and the user experience. In the method, an operation interface comprising at least two interface elements is projected onto a presenting medium; infrared light is projected onto the operation interface; when the user performs a clicking operation on a target element, the operation interface is shot so as to acquire at least one frame of color image and at least two frames of infrared image based on the infrared light; the target element is determined according to the color image; when a first click depth differs from a second click depth, it is determined that the user has performed a clicking operation, so that the clicking operation performed by the user on the target element is identified, wherein the first click depth is the depth of the clicking operation presented by the first infrared image, and the second click depth is the depth of the clicking operation presented by the second infrared image.

Description

Apparatus and method for identifying a clicking operation
Technical field
The present invention relates to the field of gesture recognition, and more particularly, to an apparatus and method for identifying a clicking operation.
Background art
With the development of science and technology, gesture operation has gradually become more important as a means of human-computer interaction. For example, a user's hand gestures can be captured by a camera device to realize gesture operation, and this technology can be applied to a user's operation of a projection operation interface projected onto a screen. In this technology, gesture operations can be divided into plane operations and clicking operations. A plane operation refers to a gesture performed by the user, facing or with their back to the camera device, in the plane formed by the left-right and up-down directions of the body, in other words, a plane parallel or approximately parallel to the above-mentioned screen (hereinafter, for ease of understanding and distinction, called the projection plane), for example a sliding gesture. A clicking operation, also called a depth gesture operation, refers to a gesture performed by the user, facing or with their back to the camera device, along the direction perpendicular to the projection plane.
For the identification of a clicking operation, techniques such as infrared depth detection may be adopted.
However, the above technique measures depth from the texture produced when the infrared light is projected onto the user, and can only distinguish the position at which the texture appears and then take the interface element corresponding to that position as the operation object. Therefore, if the operation interface includes multiple interface elements, particularly interface elements that are close to one another, identifying which icon the user has actually clicked requires the projection plane of the infrared light to coincide perfectly with the projection plane of the operation interface. As a result, the precision required of the equipment is high, the technique is difficult to apply to small devices such as mobile phones, and the identification error is large, which seriously affects the use effect and the user experience.
Summary of the invention
Embodiments of the present invention provide an apparatus and method for identifying a clicking operation, which can reduce the high precision requirement placed on the equipment, are applicable to small devices, and can reduce the identification error, thereby improving the use effect and the user experience.
In a first aspect, a device for identifying a clicking operation is provided. The device comprises: a first projecting unit, configured to project an operation interface onto a presenting medium, the operation interface including at least two interface elements, so that a user can perform a clicking operation on a target element among the at least two interface elements; a first image capturing unit, configured to shoot the operation interface when the user performs the clicking operation on the target element, so as to obtain at least one frame of color image; a second projecting unit, configured to project infrared light onto the operation interface; at least one second image capturing unit, configured to shoot the operation interface when the user performs the clicking operation on the target element, so as to obtain at least two frames of infrared image based on the infrared light; and a processing unit, configured to determine the target element according to the color image, and, when a first click depth differs from a second click depth, to determine that the user has performed a clicking operation, so as to identify that the user has performed the clicking operation on the target element, wherein the first click depth is the depth of the clicking operation presented by a first infrared image among the at least two frames of infrared image, and the second click depth is the depth of the clicking operation presented by a second infrared image among the at least two frames of infrared image.
With reference to the first aspect, in a first implementation of the first aspect, the first image capturing unit and the second image capturing unit share the same camera, and the camera is provided with an infrared filter that can be opened and closed, wherein the color image and the at least two frames of infrared image are obtained at different times: when the infrared filter is open, the first image capturing unit obtains the color image through the camera, and when the infrared filter is closed, the second image capturing unit obtains the infrared image through the camera.
With reference to the first aspect and the above implementation, in a second implementation of the first aspect, the device includes two second image capturing units, and the distance between the two second image capturing units in a first plane direction is greater than or equal to a first preset threshold, wherein the depth direction of the clicking operation is perpendicular to the first plane.
With reference to the first aspect and the above implementations, in a third implementation of the first aspect, the processing unit is specifically configured to identify that the user has performed the clicking operation on the target element when the first click depth differs from the second click depth and the difference between the first click depth and the second click depth is greater than or equal to a second preset threshold, wherein the second preset threshold is determined according to the distance between the second projecting unit and the second image capturing unit in the first plane direction, and the depth direction of the clicking operation is perpendicular to the first plane.
In a second aspect, a method for identifying a clicking operation is provided. The method comprises: projecting an operation interface onto a presenting medium, the operation interface including at least two interface elements, so that a user can perform a clicking operation on a target element among the at least two interface elements; projecting infrared light onto the operation interface; when the user performs the clicking operation on the target element, shooting the operation interface to obtain at least one frame of color image and at least two frames of infrared image based on the infrared light; determining the target element according to the color image; and, when a first click depth differs from a second click depth, determining that the user has performed a clicking operation, so as to identify that the user has performed the clicking operation on the target element, wherein the first click depth is the depth of the clicking operation presented by a first infrared image among the at least two frames of infrared image, and the second click depth is the depth of the clicking operation presented by a second infrared image among the at least two frames of infrared image.
With reference to the second aspect, in a first implementation of the second aspect, shooting the operation interface to obtain at least one frame of color image and at least two frames of infrared image based on the infrared light includes: shooting the operation interface through a camera provided with an infrared filter that can be opened and closed, wherein the color image and the at least two frames of infrared image are obtained at different times, the color image being obtained through the camera when the infrared filter is open, and the at least two frames of infrared image being obtained through the camera when the infrared filter is closed.
With reference to the second aspect and the above implementation, in a second implementation of the second aspect, shooting the operation interface includes: shooting the operation interface through two cameras, wherein the distance between the two cameras in a first plane direction is greater than or equal to a first preset threshold, and wherein the depth direction of the clicking operation is perpendicular to the first plane.
With reference to the second aspect and the above implementations, in a third implementation of the second aspect, identifying that the user has performed the clicking operation on the target element when the first click depth differs from the second click depth includes: identifying that the user has performed the clicking operation on the target element when the first click depth differs from the second click depth and the difference between the first click depth and the second click depth is greater than or equal to a second preset threshold, wherein the second preset threshold is determined according to the distance between the second projecting unit and the second image capturing unit in the first plane direction, and the depth direction of the clicking operation is perpendicular to the first plane.
According to the apparatus and method for identifying a clicking operation of the embodiments of the present invention, when the user performs a clicking operation on a target interface element in the operation interface, the operation interface is shot to obtain a color image, from which the target interface element can be determined among the at least two interface elements included in the operation interface; the operation interface is also shot to obtain infrared images, from which it can be determined that the user has performed a clicking operation, so that the clicking operation on the target interface element can be easily identified. Thus, the high precision requirement on the equipment can be reduced, the solution can be applied to small devices, the identification error can be reduced, and the use effect and the user experience are improved.
Brief description of the drawings
Fig. 1 is a schematic block diagram of a device 100 for identifying a clicking operation according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a camera according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a terminal device to which the method and apparatus for identifying a clicking operation of the present invention are applicable.
Fig. 4 is a schematic flowchart of a method for identifying a clicking operation according to an embodiment of the present invention.
Detailed description of the invention
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Fig. 1 shows a schematic block diagram of a device 100 for identifying a clicking operation according to an embodiment of the present invention. As shown in Fig. 1, the device 100 includes:
a first projecting unit 110, configured to project an operation interface onto a presenting medium, the operation interface including at least two interface elements, so that a user can perform a clicking operation on a target element among the at least two interface elements;
a first image capturing unit 120, configured to shoot the operation interface when the user performs the clicking operation on the target element, so as to obtain at least one frame of color image;
a second projecting unit 130, configured to project infrared light onto the operation interface;
at least one second image capturing unit 140, configured to shoot the operation interface when the user performs the clicking operation on the target element, so as to obtain at least two frames of infrared image based on the infrared light;
a processing unit 150, configured to determine the target element according to the color image, and, when a first click depth differs from a second click depth, to determine that the user has performed a clicking operation, so as to identify that the user has performed the clicking operation on the target element, wherein the first click depth is the depth of the clicking operation presented by a first infrared image among the at least two frames of infrared image, and the second click depth is the depth of the clicking operation presented by a second infrared image among the at least two frames of infrared image.
Below, the function and processing flow of each unit are described in detail.
A. First projecting unit 110
In an embodiment of the present invention, the first projecting unit 110 can project the operation interface onto a presenting medium. The presenting medium may be any of various media such as a wall, a projection screen or a water curtain; the present invention is not particularly limited in this respect. The method and process by which the first projecting unit 110 projects the operation interface can be similar to the prior art and are not described here to avoid repetition.
In an embodiment of the present invention, the operation interface may include a graphical user interface (GUI), which is the most common type of user interface today. For example, the interface displayed by a mobile phone in the powered-on state is a graphical user interface, through which the user interacts with the device hardware via displayed graphics (icons), a virtual keyboard (soft keyboard), a virtual mouse, and so on. The information exchanged between the system and the user during such interaction may include auditory information, visual information, tactile information, motion information, olfactory information, etc.
In an embodiment of the present invention, the user can interact with the system by performing gesture operations on the user interface (specifically, on the interface elements in the user interface).
In an embodiment of the present invention, an interface element may be a graphical element presented in the operation interface.
Specifically, an interface element refers to a visual graphical "element" that can be placed on a user interface, such as a button or a document editing box. Most such elements either have an execution function or cause code to run through an "event" to complete a response. An event is the response of a control to an input operation (for example, a clicking operation); every control has its own set of events, and once a certain event of a control occurs, the corresponding event procedure is executed. Each event object has its own specific name, and the event procedure code is written by the programmer according to its requirements. In an embodiment of the present invention, by way of non-limiting example, interface controls may include tabs, title bars, action bars, progress bars, check boxes, and the like.
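By way of illustration only, the sketch below shows one way such an interface element could be represented in software: a rectangle on the projected operation interface with an event procedure attached to its click event. The names (InterfaceElement, on_click) are illustrative assumptions and are not part of the patent.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class InterfaceElement:
    """Illustrative interface element: a rectangle on the operation interface plus a click handler."""
    name: str
    x: int                         # top-left corner, in operation-interface pixel coordinates
    y: int
    width: int
    height: int
    on_click: Callable[[], None]   # event procedure executed when a click on this element is recognized

    def contains(self, px: int, py: int) -> bool:
        """Return True if the point (px, py) lies within this element's bounds."""
        return self.x <= px < self.x + self.width and self.y <= py < self.y + self.height
```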
It should be understood that the user interfaces listed above are only examples of embodiments of the present invention, and the present invention is not limited thereto; any other user interface that can be presented to the user, for example by projection, for the user to operate on falls within the protection scope of the present invention.
It should be noted that when the operation interface includes at least two interface elements, the system needs to distinguish which interface element the user's clicking operation is directed at; therefore, the superior effect of the present invention is embodied further in scenarios where the operation interface includes at least two interface elements.
Thus, the user can perform a clicking operation on the operation interface, for example, clicking a target interface element among the at least two interface elements. The device 100, specifically the processing unit 150 described later, then needs to identify the clicking operation and the object of the clicking operation (that is, the target interface element) in order to complete the above information interaction.
B. First image capturing unit 120
In an embodiment of the present invention, the first image capturing unit 120 is configured to shoot the operation interface when the user performs the clicking operation on the target element, so as to obtain at least one frame of color image. The color image thus records the position, on the operation interface, of the user's hand (or of the tool used by the user to perform the clicking operation) during the clicking operation, in other words, the positional relationship between the hand and each interface element.
Moreover, any existing device capable of capturing color images can be used as the first image capturing unit 120, for example, the camera of a mobile phone.
C. Second projecting unit 130
In an embodiment of the present invention, the second projecting unit 130 can irradiate infrared light onto the above-mentioned presenting medium, specifically, onto the area of the presenting medium where the operation interface is projected. In an embodiment of the present invention, a device such as an infrared light-emitting diode (IR LED) may be used.
D. At least one second image capturing unit 140
In an embodiment of the present invention, the second image capturing unit 140 is configured to shoot the operation interface when the user performs the clicking operation on the target element, so as to obtain at least two frames of infrared image. These at least two frames of infrared image can record the infrared information, for example the texture, of the object captured while the user performs the clicking operation.
It should be noted that any existing device capable of capturing infrared images can be used as the second image capturing unit 140; alternatively, an infrared filter can be provided on a device capable of capturing color images, so that it can capture infrared images.
Moreover, in an embodiment of the present invention, the second projecting unit 130 and the second image capturing unit 140 can be integrated into the same device or configured separately; the present invention is not particularly limited in this respect.
Optionally, the first image capturing unit and the second image capturing unit share the same camera, and the camera is provided with an infrared filter that can be opened and closed, wherein the color image and the at least two frames of infrared image are obtained at different times: when the infrared filter is open, the first image capturing unit obtains the color image through the camera, and when the infrared filter is closed, the second image capturing unit obtains the infrared image through the camera.
Specifically, as shown in Fig. 2, in an embodiment of the present invention, the first image capturing unit 120 and the second image capturing unit 140 can share the same camera. Specifically, the camera may be a device capable of capturing color images, and by providing an infrared filter on the device, the camera is also enabled to capture infrared images.
Moreover, the infrared filter can be configured to be openable and closable: when the infrared filter is open, the camera can capture color images, and when the infrared filter is closed, the camera can capture infrared images.
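A minimal sketch of such time-multiplexed capture is given below, under the assumption of simple camera.grab_frame() and ir_filter.open()/close() interfaces; these are illustrative placeholders, not a real hardware API.

```python
def capture_color_and_infrared(camera, ir_filter, num_ir_frames=2):
    """Use one camera for both color and infrared capture by toggling the infrared filter.

    Per the embodiment, the color frame and the infrared frames are obtained at
    different times: color while the filter is open, infrared while it is closed.
    """
    ir_filter.open()                                     # filter open: capture a color frame
    color_frame = camera.grab_frame()

    ir_filter.close()                                    # filter closed: capture infrared frames
    ir_frames = [camera.grab_frame() for _ in range(num_ir_frames)]

    return color_frame, ir_frames
```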
Optionally, the device includes two second image capturing units, and the distance between the two second image capturing units in a first plane direction is greater than or equal to a first preset threshold, wherein the depth direction of the clicking operation is perpendicular to the first plane.
Specifically, in an embodiment of the present invention, binocular technology can be adopted to identify the clicking operation. Binocular technology, also called binocular stereo vision, is an important branch of computer vision research; it perceives the objective world by directly simulating the human visual system, and is widely used in pose detection and control of micro-operation systems, robot navigation and aerial survey, three-dimensional non-contact measurement, virtual reality, and other fields. Based on binocular technology, in an embodiment of the present invention, by configuring two second image capturing units 140, contactless depth measurement based on binocular parallax can be realized; compared with using a single second image capturing unit 140, the accuracy of clicking-operation identification can be significantly improved.
In addition, in an embodiment of the present invention, the distance between the two second image capturing units 140 in the configuration plane (the plane perpendicular to the depth direction of the clicking operation), in other words the above-mentioned first preset threshold, can be set according to methods similar to the prior art, and is not described here to avoid repetition.
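For reference, the standard binocular relation underlying such a two-camera configuration is sketched below; the function and its parameter names are illustrative, and a real system would additionally require camera calibration and image rectification.

```python
def depth_from_disparity(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo relation Z = f * B / d.

    focal_length_px : focal length of the (rectified) cameras, in pixels
    baseline_m      : spacing of the two second image capturing units in the first plane
    disparity_px    : horizontal shift of the same hand feature between the two infrared images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px
```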
E. Processing unit 150
Specifically, in an embodiment of the present invention, the processing unit 150 can be communicatively connected with the first image capturing unit 120, so as to obtain the above-mentioned color image from the first image capturing unit 120. Based on a prestored prior model, the processing unit 150 can recognize the user's hand (or the tool used by the user to perform the clicking operation), thereby determine the position of the user's hand, and determine the interface element corresponding to that position (for example, the interface element located within the coverage of the hand, or the interface element whose coverage includes the position of the hand) as the target interface element.
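A simplified sketch of this target-determination step is given below, reusing the illustrative InterfaceElement structure from above; the hand detection itself (the prestored prior model) is assumed to have already produced a hand position in operation-interface coordinates.

```python
def find_target_element(hand_position, elements):
    """Return the interface element whose bounds contain the detected hand position.

    hand_position : (x, y) position of the user's hand recovered from the color image
    elements      : list of InterfaceElement objects making up the operation interface
    Returns None when the hand is not over any element.
    """
    px, py = hand_position
    for element in elements:
        if element.contains(px, py):
            return element
    return None
```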
Moreover, in an embodiment of the present invention, the processing unit 150 can be communicatively connected with the second image capturing unit 140, so as to obtain the above-mentioned two frames of infrared image from the second image capturing unit 140, and can analyze and process the infrared information recorded by the infrared images to identify the depth of the user's hand (in other words, the depth of the clicking operation) in each of the two frames of infrared image. The method and process for identifying the depth of an object (for example, the user's hand) in an infrared image can be similar to the prior art, and are not described here to avoid repetition.
Further, when the depth of the user's hand differs between the two frames of infrared image, the processing unit 150 can identify that the user has performed a clicking operation.
Optionally, the processing unit is specifically configured to identify that the user has performed the clicking operation on the target element when the first click depth differs from the second click depth and the difference between the first click depth and the second click depth is greater than or equal to a second preset threshold, wherein the second preset threshold is determined according to the distance between the second projecting unit and the second image capturing unit in the first plane direction, and wherein the depth direction of the clicking operation is perpendicular to the first plane.
Specifically, in an embodiment of the present invention, when the depth of the user's hand differs between the two frames of infrared image, the processing unit 150 can further evaluate the magnitude of this difference; when the difference is greater than or equal to the preset threshold (that is, the second preset threshold), the processing unit 150 identifies that the user has performed a clicking operation, so that the accuracy of identifying the clicking operation can be further improved.
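Under these assumptions, the click decision reduces to comparing the difference between the two click depths against the second preset threshold, as in the sketch below (units and the threshold value are illustrative).

```python
def is_click(first_click_depth: float, second_click_depth: float, second_threshold: float) -> bool:
    """Recognize a clicking operation from the hand depths in two infrared frames.

    A click is recognized only when the two click depths differ and the difference
    reaches the second preset threshold, which, per the embodiment, is chosen from
    the geometry of the projecting and image capturing units.
    """
    depth_difference = abs(first_click_depth - second_click_depth)
    return depth_difference >= second_threshold
```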
In summary, the processing unit 150 can, through different means, respectively identify whether the user has performed a clicking operation and the target of the clicking operation.
In addition, the process of identifying whether the user has performed a clicking operation and the process of identifying the target of the clicking operation can be performed simultaneously or at different times; the present invention is not particularly limited in this respect.
In addition, in an embodiment of the present invention, the device 100 for identifying a clicking operation can be embedded in, or itself be, a terminal device or system that can project an operation interface outward onto a presenting medium and identify the user's operation by capturing the user's gestures, for example, a mobile phone, a tablet computer, a computing device, an information display device or a motion-sensing game system.
The terminal device (UE, User Equipment) to which the present invention is applicable is introduced by taking the mobile phone shown in Fig. 3 as an example. In an embodiment of the present invention, the mobile phone may include components such as a radio frequency (RF) circuit 210, a memory 220, an input unit 230, a Wireless Fidelity (WiFi) module 270, a display unit 240, a sensor 250, an audio circuit 260, a processor 280, a projecting unit 290 and an image capturing unit 295.
The function and structure of the projecting unit 290 are similar to those of the above-mentioned first projecting unit and second projecting unit, and are not described here to avoid repetition.
Likewise, the function and structure of the image capturing unit 295 are similar to those of the above-mentioned first image capturing unit and second image capturing unit, and are not described here to avoid repetition.
In addition, those skilled in the art will understand that the mobile phone structure shown in Fig. 3 is merely illustrative and not limiting; the mobile phone may include more or fewer components than shown, combine some components, or use a different arrangement of components.
The RF circuit 210 can be used for receiving and sending messages, or for receiving and transmitting signals during a call; in particular, downlink information from a base station is received and passed to the processor 280 for processing, and uplink data of the mobile phone is sent to the base station. Generally, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier (LNA), a duplexer, and so on. In addition, the RF circuit 210 can also communicate with networks and other devices by wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and so on.
The memory 220 can be used to store software programs and modules; the processor 280 executes the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 220. The memory 220 may mainly include a program storage area and a data storage area, wherein the program storage area can store an operating system and application programs required by at least one function (such as a sound playing function, an image playing function, etc.), and the data storage area can store data created according to the use of the mobile phone (such as audio data, a phone book, etc.). In addition, the memory 220 may include a high-speed random access memory, and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or other solid-state storage components.
The input unit 230 can be used to receive input digit or character information and to generate key signal inputs related to the user settings and function control of the mobile phone 200. Specifically, the input unit 230 may include a touch panel 231 and other input devices 232. The touch panel 231, also called a touch screen, can collect the user's touch operations on or near it (such as operations performed on or near the touch panel 231 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program. Optionally, the touch panel 231 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 280, and can receive and execute commands sent by the processor 280. The touch panel 231 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 231, the input unit 230 may also include other input devices 232, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, a power key, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 240 can be used to display information input by the user, information provided to the user, and the various menus of the mobile phone. The display unit 240 may include a display panel 241, which may optionally be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. Further, the touch panel 231 can cover the display panel 241; when the touch panel 231 detects a touch operation on or near it, it transmits the operation to the processor 280 to determine the type of the touch event, and the processor 280 then provides a corresponding visual output on the display panel 241 according to the type of the touch event.
The area of the display panel 241 in which this visual output is recognizable to the human eye can serve as the "viewing area" mentioned later. Although in Fig. 3 the touch panel 231 and the display panel 241 are shown as two separate components realizing the input and output functions of the mobile phone, in some embodiments the touch panel 231 and the display panel 241 may be integrated to realize the input and output functions of the mobile phone.
In addition, the mobile phone 200 may also include at least one sensor 250, such as an attitude sensor, an optical sensor and other sensors.
Specifically, the attitude sensor may also be called a motion sensor. As one kind of motion sensor, a gravity sensor can be cited: the gravity sensor uses an elastic sensing element to form a cantilevered displacer and uses an energy-storage spring made of an elastic sensing element to drive an electrical contact, thereby converting changes in gravity into changes in an electrical signal.
As another kind of motion sensor, an accelerometer sensor can be cited. The accelerometer sensor can detect the magnitude of acceleration in each direction (generally along three axes), and can detect the magnitude and direction of gravity when stationary; it can be used in applications that recognize the attitude of the mobile phone (such as landscape/portrait switching, related games, magnetometer attitude calibration), in vibration-recognition-related functions (such as a pedometer or tap detection), and so on.
In an embodiment of the present invention, the motion sensors listed above can be used as the element for obtaining the "attitude parameter" mentioned later, but the present invention is not limited thereto; any other sensor capable of obtaining the "attitude parameter" also falls within the protection scope of the present invention, for example a gyroscope. The working principle and data processing of the gyroscope can be similar to the prior art and are not described here to avoid repetition.
In addition, in an embodiment of the present invention, other sensors such as a barometer, a hygrometer, a thermometer and an infrared sensor may also be configured as the sensor 250, which is not described further here.
The optical sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 241 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 241 and/or the backlight when the mobile phone is moved close to the ear.
The audio circuit 260, a speaker 261 and a microphone 262 can provide an audio interface between the user and the mobile phone. The audio circuit 260 can convert received audio data into an electrical signal and transmit it to the speaker 261, which converts it into a sound signal for output; on the other hand, the microphone 262 converts a collected sound signal into an electrical signal, which is received by the audio circuit 260 and converted into audio data; the audio data is then output to the processor 280 for processing and sent via the RF circuit 210 to, for example, another mobile phone, or output to the memory 220 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 270, the mobile phone can help the user send and receive e-mails, browse web pages, access streaming media and so on, providing the user with wireless broadband Internet access. Although Fig. 3 shows the WiFi module 270, it is understood that it is not an essential component of the mobile phone 200 and can be omitted as needed without changing the essence of the invention.
The processor 280 is the control center of the mobile phone; it connects the various parts of the whole mobile phone through various interfaces and lines, and performs the various functions of the mobile phone and processes data by running or executing the software programs and/or modules stored in the memory 220 and calling the data stored in the memory 220, thereby monitoring the mobile phone as a whole. Optionally, the processor 280 may include one or more processing units. Preferably, the processor 280 may integrate an application processor and a modem processor, wherein the application processor mainly handles the operating system, the user interface, application programs and the like, and the modem processor mainly handles wireless communication.
It is understood that the above modem processor may also not be integrated into the processor 280.
Further, the processor 280 can serve as the element implementing the above-mentioned processing unit and can perform the same or similar functions as the processing unit.
The mobile phone 200 also includes a power supply (such as a battery) for supplying power to the various components.
Preferably, the power supply can be logically connected with the processor 280 through a power management system, so that functions such as charging management, discharging management and power consumption management are realized through the power management system. Although not shown, the mobile phone 200 may also include a Bluetooth module and the like, which is not described further here.
It should be noted that the mobile phone shown in Fig. 3 is only an example of a terminal device; the present invention is not particularly limited in this respect and can be applied to electronic devices such as mobile phones and tablet computers.
According to the device for identifying a clicking operation of the embodiments of the present invention, when the user performs a clicking operation on a target interface element in the operation interface, the operation interface is shot to obtain a color image, from which the target interface element can be determined among the at least two interface elements included in the operation interface; the operation interface is also shot to obtain infrared images, from which it can be determined that the user has performed a clicking operation, so that the clicking operation on the target interface element can be easily identified. Thus, the high precision requirement on the equipment can be reduced, the solution can be applied to small devices, the identification error can be reduced, and the use effect and the user experience are improved.
Above, the device for identifying a clicking operation according to the embodiments of the present invention has been described in detail with reference to Figs. 1 to 3. Below, the method for identifying a clicking operation according to the embodiments of the present invention is described in detail with reference to Fig. 4.
Fig. 4 shows a schematic flowchart of a method 300 for identifying a clicking operation according to an embodiment of the present invention. As shown in Fig. 4, the method 300 includes:
S310: projecting an operation interface onto a presenting medium, the operation interface including at least two interface elements, so that a user can perform a clicking operation on a target element among the at least two interface elements;
S320: projecting infrared light onto the operation interface;
S330: when the user performs the clicking operation on the target element, shooting the operation interface to obtain at least one frame of color image and at least two frames of infrared image based on the infrared light;
S340: determining the target element according to the color image;
S350: when a first click depth differs from a second click depth, determining that the user has performed a clicking operation, so as to identify that the user has performed the clicking operation on the target element, wherein the first click depth is the depth of the clicking operation presented by a first infrared image among the at least two frames of infrared image, and the second click depth is the depth of the clicking operation presented by a second infrared image among the at least two frames of infrared image.
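Purely as an illustration of the control flow of steps S310 to S350, the sketch below strings the steps together; every helper passed into the function is a placeholder for hardware- or vision-specific code and is not defined by the patent.

```python
def recognize_click(project_interface, project_infrared, capture,
                    locate_target, depth_of_hand, second_threshold):
    """End-to-end sketch of S310-S350 under assumed helper callables."""
    project_interface()                                  # S310: project the operation interface
    project_infrared()                                   # S320: project infrared light onto it

    color_frame, (ir_frame_1, ir_frame_2) = capture()    # S330: one color frame + two infrared frames

    target = locate_target(color_frame)                  # S340: determine the target element
    if target is None:
        return None

    d1 = depth_of_hand(ir_frame_1)                       # S350: compare the two click depths
    d2 = depth_of_hand(ir_frame_2)
    if abs(d1 - d2) >= second_threshold:
        return target                                    # clicking operation on the target identified
    return None
```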
Optionally, shooting the operation interface to obtain at least one frame of color image and at least two frames of infrared image based on the infrared light includes:
shooting the operation interface through a camera provided with an infrared filter that can be opened and closed, wherein the color image and the at least two frames of infrared image are obtained at different times, the color image being obtained through the camera when the infrared filter is open, and the at least two frames of infrared image being obtained through the camera when the infrared filter is closed.
Optionally, shooting the operation interface includes:
shooting the operation interface through two cameras, wherein the distance between the two cameras in a first plane direction is greater than or equal to a first preset threshold, and wherein the depth direction of the clicking operation is perpendicular to the first plane.
Optionally, identifying that the user has performed the clicking operation on the target element when the first click depth differs from the second click depth includes:
identifying that the user has performed the clicking operation on the target element when the first click depth differs from the second click depth and the difference between the first click depth and the second click depth is greater than or equal to a second preset threshold, wherein the second preset threshold is determined according to the distance between the second projecting unit and the second image capturing unit in the first plane direction, and the depth direction of the clicking operation is perpendicular to the first plane.
Specifically, in an embodiment of the present invention, the operation interface can be projected onto a presenting medium. The presenting medium may be any of various media such as a wall, a projection screen or a water curtain; the present invention is not particularly limited in this respect. This process can be similar to the prior art and is not described here to avoid repetition.
In an embodiment of the present invention, the user can interact with the system by performing gesture operations on the user interface (specifically, on the interface elements in the user interface).
It should be noted that when the operation interface includes at least two interface elements, the system needs to distinguish which interface element the user's clicking operation is directed at; therefore, the superior effect of the present invention is embodied further in scenarios where the operation interface includes at least two interface elements.
Thus, the user can perform a clicking operation on the operation interface, for example, clicking a target interface element among the at least two interface elements. Accordingly, the clicking operation and the object of the clicking operation (that is, the target interface element) need to be identified in order to complete the above information interaction.
In an embodiment of the present invention, when the user performs the clicking operation on the target element, the operation interface is shot to obtain at least one frame of color image. The color image thus records the position, on the operation interface, of the user's hand (or of the tool used by the user to perform the clicking operation) during the clicking operation, in other words, the positional relationship between the hand and each interface element.
In an embodiment of the present invention, infrared light can be irradiated onto the above-mentioned presenting medium, specifically, onto the area of the presenting medium where the operation interface is projected.
In an embodiment of the present invention, when the user performs the clicking operation on the target element, the operation interface is shot to obtain at least two frames of infrared image. These at least two frames of infrared image can record the infrared information, for example the texture, of the object captured while the user performs the clicking operation.
It should be noted that any existing device capable of capturing infrared images can be used; alternatively, an infrared filter can be provided on a device capable of capturing color images, so that it can capture infrared images.
Moreover, in an embodiment of the present invention, the device for capturing color images and the device for capturing infrared images can be integrated into the same device or configured separately; the present invention is not particularly limited in this respect.
For example, in an embodiment of the present invention, the device for capturing color images and the device for capturing infrared images can share the same camera. Specifically, the camera may be a device capable of capturing color images, and by providing an infrared filter on the device, the camera is also enabled to capture infrared images.
Moreover, the infrared filter can be configured to be openable and closable: when the infrared filter is open, the camera can capture color images, and when the infrared filter is closed (in other words, shut), the camera can capture infrared images.
In an embodiment of the present invention, binocular technology can be adopted to identify the clicking operation. Binocular technology, also called binocular stereo vision, is an important branch of computer vision research; it perceives the objective world by directly simulating the human visual system, and is widely used in pose detection and control of micro-operation systems, robot navigation and aerial survey, three-dimensional non-contact measurement, virtual reality, and other fields. Based on binocular technology, in an embodiment of the present invention, by configuring two cameras, contactless depth measurement based on binocular parallax can be realized; compared with using a single camera, the accuracy of clicking-operation identification can be significantly improved.
In addition, in an embodiment of the present invention, the distance between the two cameras in the configuration plane (the plane perpendicular to the depth direction of the clicking operation), in other words the above-mentioned first preset threshold, can be set according to methods similar to the prior art, and is not described here to avoid repetition.
In an embodiment of the present invention, based on a prestored prior model, the user's hand (or the tool used by the user to perform the clicking operation) can be recognized, so that the position of the user's hand can be determined, and the interface element corresponding to that position (for example, the interface element located within the coverage of the hand, or the interface element whose coverage includes the position of the hand) can be determined as the target interface element.
Moreover, in an embodiment of the present invention, the infrared information recorded by the infrared images can be analyzed and processed to identify the depth of the user's hand (in other words, the depth of the clicking operation) in each of the two frames of infrared image. The method and process for identifying the depth of an object (for example, the user's hand) in an infrared image can be similar to the prior art, and are not described here to avoid repetition.
Further, when the depth of the user's hand differs between the two frames of infrared image, it can be identified that the user has performed a clicking operation.
In an embodiment of the present invention, when the depth of the user's hand differs between the two frames of infrared image, the magnitude of this difference can be further evaluated; when the difference is greater than or equal to the preset threshold (that is, the second preset threshold), it can be identified that the user has performed a clicking operation, so that the accuracy of identifying the clicking operation can be further improved.
In summary, whether the user has performed a clicking operation and the target of the clicking operation can be identified respectively through different means.
In addition, the process of identifying whether the user has performed a clicking operation and the process of identifying the target of the clicking operation can be performed simultaneously or at different times; the present invention is not particularly limited in this respect.
The method 300 for identifying a clicking operation according to the embodiments of the present invention can be performed by the units of the device 100 for identifying a clicking operation in Fig. 1.
According to the method for identifying a clicking operation of the embodiments of the present invention, when the user performs a clicking operation on a target interface element in the operation interface, the operation interface is shot to obtain a color image, from which the target interface element can be determined among the at least two interface elements included in the operation interface; the operation interface is also shot to obtain infrared images, from which it can be determined that the user has performed a clicking operation, so that the clicking operation on the target interface element can be easily identified. Thus, the high precision requirement on the equipment can be reduced, the solution can be applied to small devices, the identification error can be reduced, and the use effect and the user experience are improved.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present invention.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the systems, devices and units described above can refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
It should be understood that, in the various embodiments of the present invention, the magnitude of the sequence numbers of the above processes does not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
In the several embodiments provided in this application, it should be understood that the disclosed device can be implemented in other ways. For example, the device embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other division manners in actual implementation, for instance, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units can be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In the present invention, when a particular component is described as being located between a first component and a second component, intervening components may or may not be present between the particular component and the first or second component; when a particular component is described as being connected to another component, the particular component may be directly connected to the other component without intervening components, or may be connected not directly but with intervening components.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention essentially, or the part contributing to the prior art, or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the art can readily conceive of changes or substitutions within the technical scope disclosed by the present invention, and such changes or substitutions should all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the scope of the claims.

Claims (8)

1. A device for identifying a clicking operation, characterized in that the device comprises:
a first projecting unit, configured to project an operation interface onto a presentation medium, the operation interface including at least two interface elements, so that a user can perform a clicking operation on a target element among the at least two interface elements;
a first image unit, configured to capture the operation interface when the user performs the clicking operation on the target element, so as to obtain at least one frame of a color image;
a second projecting unit, configured to project infrared light onto the operation interface;
at least one second image unit, configured to capture the operation interface when the user performs the clicking operation on the target element, so as to obtain at least two frames of infrared images based on the infrared light; and
a processing unit, configured to determine the target element according to the color image, and to determine, when a first click depth and a second click depth are different, that the user has performed a clicking operation, so as to identify that the user has performed the clicking operation on the target element, wherein the first click depth is the depth of the clicking operation presented by a first infrared image among the at least two frames of infrared images, and the second click depth is the depth of the clicking operation presented by a second infrared image among the at least two frames of infrared images.
2. The device according to claim 1, characterized in that the first image unit and the second image unit use the same camera, the camera being provided with an infrared filter that can be switched on and off, wherein the color image and the at least two frames of infrared images are obtained at different times: the first image unit obtains the color image through the camera when the infrared filter is switched on, and the second image unit obtains the infrared images through the camera when the infrared filter is switched off.
3. The device according to claim 1 or 2, characterized in that the device includes two second image units, and the distance between the two second image units in a first plane direction is greater than or equal to a first predetermined threshold, wherein the depth direction of the clicking operation is perpendicular to the first plane.
4. The device according to claim 1, characterized in that the processing unit is specifically configured to identify that the user has performed the clicking operation on the target element when the first click depth and the second click depth are different and the difference between the first click depth and the second click depth is greater than or equal to a second predetermined threshold, wherein the second predetermined threshold is determined according to the distance in a first plane direction between the second projecting unit and the second image unit, and wherein the depth direction of the clicking operation is perpendicular to the first plane.
5. A method for identifying a clicking operation, characterized in that the method comprises:
projecting an operation interface onto a presentation medium, the operation interface including at least two interface elements, so that a user can perform a clicking operation on a target element among the at least two interface elements;
projecting infrared light onto the operation interface;
capturing the operation interface when the user performs the clicking operation on the target element, so as to obtain at least one frame of a color image and at least two frames of infrared images based on the infrared light;
determining the target element according to the color image; and
determining, when a first click depth and a second click depth are different, that the user has performed a clicking operation, so as to identify that the user has performed the clicking operation on the target element, wherein the first click depth is the depth of the clicking operation presented by a first infrared image among the at least two frames of infrared images, and the second click depth is the depth of the clicking operation presented by a second infrared image among the at least two frames of infrared images.
6. The method according to claim 5, characterized in that capturing the operation interface so as to obtain at least one frame of a color image and at least two frames of infrared images based on the infrared light includes:
capturing the operation interface with a camera provided with an infrared filter that can be switched on and off, so that the color image and the at least two frames of infrared images are obtained at different times, the color image being obtained through the camera when the infrared filter is switched on, and the at least two frames of infrared images being obtained through the camera when the infrared filter is switched off.
7. The method according to claim 5 or 6, characterized in that capturing the operation interface includes:
capturing the operation interface with two cameras, wherein the distance between the two cameras in a first plane direction is greater than or equal to a first predetermined threshold, and wherein the depth direction of the clicking operation is perpendicular to the first plane.
8. The method according to claim 5, characterized in that determining, when the first click depth and the second click depth are different, that the user has performed a clicking operation includes:
determining that the user has performed the clicking operation on the target element when the first click depth and the second click depth are different and the difference between the first click depth and the second click depth is greater than or equal to a second predetermined threshold, wherein the second predetermined threshold is determined according to the distance in a first plane direction between the second projecting unit and the second image unit, and wherein the depth direction of the clicking operation is perpendicular to the first plane.
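By way of illustration of the click-recognition rule recited in claims 4, 5 and 8, the following is a minimal sketch in Python: a clicking operation on the target element is recognized only when the click depths presented by two infrared frames differ by at least the second predetermined threshold. All function names, variable names, and numeric values in the sketch are illustrative assumptions and are not taken from the patent.

# Illustrative sketch only: names and numbers below are assumptions for
# demonstration and do not come from the patent text.

def recognize_click(target_element, first_click_depth, second_click_depth,
                    second_threshold):
    """Return the clicked element when the click depths presented by two
    infrared frames differ by at least the predetermined threshold;
    otherwise return None."""
    if abs(first_click_depth - second_click_depth) >= second_threshold:
        return target_element
    return None


if __name__ == "__main__":
    # Hypothetical values: depths along the direction perpendicular to the
    # projection plane, expressed in millimetres.
    target = "OK button"       # target element determined from the color image
    depth_in_frame_1 = 512.0   # click depth presented by the first infrared image
    depth_in_frame_2 = 498.0   # click depth presented by the second infrared image
    threshold = 10.0           # second predetermined threshold (assumed value)

    clicked = recognize_click(target, depth_in_frame_1, depth_in_frame_2, threshold)
    if clicked is not None:
        print("Clicking operation recognised on:", clicked)
    else:
        print("No clicking operation recognised")

Comparing two depth readings taken from different infrared frames, rather than relying on a single absolute depth value, is what lets this sketch register motion along the direction perpendicular to the projection plane and ignore a hand that merely hovers over the interface at a constant depth.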
CN201410826952.9A 2014-12-25 2014-12-25 The device and method for identifying clicking operation Active CN105787485B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410826952.9A CN105787485B (en) 2014-12-25 2014-12-25 The device and method for identifying clicking operation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410826952.9A CN105787485B (en) 2014-12-25 2014-12-25 The device and method for identifying clicking operation

Publications (2)

Publication Number Publication Date
CN105787485A true CN105787485A (en) 2016-07-20
CN105787485B CN105787485B (en) 2019-11-26

Family

ID=56389395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410826952.9A Active CN105787485B (en) 2014-12-25 2014-12-25 The device and method for identifying clicking operation

Country Status (1)

Country Link
CN (1) CN105787485B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103329147A (en) * 2010-11-04 2013-09-25 数字标记公司 Smartphone-based methods and systems
CN103927089A (en) * 2013-01-11 2014-07-16 株式会社理光 Method and device for controlling interactive user interface objects
CN104076914A (en) * 2013-03-28 2014-10-01 联想(北京)有限公司 Electronic equipment and projection display method
CN103809880A (en) * 2014-02-24 2014-05-21 清华大学 Man-machine interaction system and method
CN104020853A (en) * 2014-06-23 2014-09-03 暨南大学 Kinect-based system and method for controlling network browser

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107608553A (en) * 2017-09-18 2018-01-19 联想(北京)有限公司 A kind of touch area calibration method and electronic equipment
CN110414393A (en) * 2019-07-15 2019-11-05 福州瑞芯微电子股份有限公司 A kind of natural interactive method and terminal based on deep learning

Also Published As

Publication number Publication date
CN105787485B (en) 2019-11-26

Similar Documents

Publication Publication Date Title
US9801009B2 (en) Location based reminder system and method for controlling the same
CN103473011B (en) A kind of mobile terminal performance detection method, device and mobile terminal
EP3147756A1 (en) Mobile terminal and method of controlling the same
CN103389863B (en) A kind of display control method and device
EP2987244B1 (en) Mobile terminal and control method for the mobile terminal
CN107896279A (en) Screenshotss processing method, device and the mobile terminal of a kind of mobile terminal
KR20160073861A (en) Portable apparatus and method for controlling a location information
CN103365419B (en) A kind of method and apparatus triggering alarm clock control command
CN103455256A (en) Method and terminal for rotating display picture of screen
CN110034876A (en) PUCCH resource instruction, processing method, network side equipment, user terminal
CN109597558A (en) A kind of display control method and terminal device
CN103559731B (en) Method and terminal for displaying lyrics under screen locking state
CN104598476A (en) Message aggregation display method and information display method and relevant device
CN104199596B (en) scene interface switching method and device
CN104869465A (en) Video playing control method and device
CN110149663A (en) Condition handover cancelling method and communication equipment
CN109032486A (en) A kind of display control method and terminal device
CN106170034A (en) A kind of sound effect treatment method and mobile terminal
CN108108113A (en) Webpage switching method and device
CN108898555A (en) A kind of image processing method and terminal device
CN110225180A (en) A kind of content input method and terminal device
CN110147186A (en) A kind of control method and terminal device of application
CN108681427A (en) A kind of method and terminal device of access privilege control
CN108196781A (en) The display methods and mobile terminal at interface
CN105653112A (en) Method and device for displaying floating layer

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant