CN104049807A - Information processing method and electronic equipment - Google Patents


Info

Publication number
CN104049807A
CN104049807A (application CN201310076677.9A; granted publication CN104049807B)
Authority
CN
China
Prior art keywords
operating body
image information
projection operation
image
first position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310076677.9A
Other languages
Chinese (zh)
Other versions
CN104049807B (en)
Inventor
高歌
张柳新
张超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201310076677.9A
Publication of CN104049807A
Application granted
Publication of CN104049807B
Legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses an information processing method for reducing the erroneous operation rate. The method may comprise the following steps: while content to be projected is projected onto a projection operation surface through a projection module, acquiring first image information through an image acquisition unit; judging, according to the first image information, whether the first image information includes a first image corresponding to a first operating body; when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the shadow cast by the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow in the first image information; and judging, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface. The invention further discloses an electronic equipment for implementing the method.

Description

Information processing method and electronic equipment
Technical field
The present invention relates to the field of computer and projection technologies, and in particular to an information processing method and electronic equipment.
Background technology
With the development of science and technology, electronic technology has advanced rapidly and the variety of electronic products keeps growing. Through various types of electronic equipment, people now enjoy the many conveniences that this development brings.
With the popularization of projection technology, a remote control is often needed in order to better control a projection during use. This requires the user to hold a remote control, which is extra hardware, and the user can operate it only after learning the function of each of its keys, which is clearly inconvenient.
In implementing the technical solutions of the embodiments of the present application, the inventors found that at least the following technical problem exists in the prior art:
To increase the interactivity of projection equipment, it is desirable for the user to operate directly on the projection interface without using a remote control. In this mode of operation, however, it must be determined whether the user's hand has actually touched the projection surface. Sometimes the user's hand does not actually touch the projection surface and only the shadow of the hand falls on it, for example when the user's finger performs a click while suspended in the air. If the projection equipment responds to such an operation, the response may be inaccurate and an erroneous operation results.
Summary of the invention
The embodiments of the present invention provide an information processing method and electronic equipment, so as to solve the technical problem of a high erroneous operation rate in the prior art and achieve the technical effect of reducing that rate.
An information processing method, applied to an electronic equipment, the electronic equipment comprising a projection module and an image acquisition unit, the method comprising:
while content to be projected is projected onto a projection operation surface through the projection module, acquiring first image information through the image acquisition unit, the first image information including the image information of the projection operation surface;
judging, according to the first image information, whether the first image information includes a first image corresponding to a first operating body;
when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the shadow cast by the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow of the first operating body in the first image information;
determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface.
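The claimed flow can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the detector is left abstract, and the names (`Detection`, `is_touching`) and the 3-pixel default threshold are ours, not the patent's.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Point = Tuple[float, float]  # pixel coordinates within the first image information

@dataclass
class Detection:
    first_position: Point   # position of the first operating body (e.g. a fingertip)
    second_position: Point  # position of the vertex of its shadow

def detect_operating_body(frame) -> Optional[Detection]:
    """Hypothetical detector: returns None when no first image
    (no operating body) is found in the captured frame."""
    ...

def is_touching(det: Detection, max_distance: float = 3.0) -> bool:
    """Decide contact from the first and second positions: when the
    fingertip and the shadow tip (nearly) coincide in the image, the
    operating body is taken to be touching the projection surface."""
    (x1, y1), (x2, y2) = det.first_position, det.second_position
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 <= max_distance
```

A hovering finger separates from its shadow in the side camera's view, so the two positions drift apart and `is_touching` returns `False`.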
Preferably, the step of determining the first position of the first operating body in the first image information and the second position of the shadow of the first operating body in the first image information comprises: determining, in the first image, a first identification point corresponding to the first operating body and taking the position of the first identification point as the first position; and determining the vertex of the shadow of the first operating body and taking the position of the vertex as the second position.
Preferably, the step of determining, according to the first image of the first operating body in the image information and the second image of the shadow of the first operating body on the projection operation surface, the first position of the first operating body within the preset spatial range and the second position of the shadow of the first operating body on the projection operation surface comprises: determining, according to the first image and the second image, a first coordinate of the first identification point and a second coordinate of the vertex.
Preferably, the step of determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface comprises: judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
Preferably, the step of determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface comprises:
when the first position and the second position coincide in the first image information, determining that the first operating body is in contact with the surface of the projection operation surface, and otherwise determining that it is not; or
when the distance between the first position and the second position in the first image information is not greater than the first preset distance, determining that the first operating body is in contact with the surface of the projection operation surface, and otherwise determining that it is not.
Preferably, the step of judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than the first preset distance, comprises: judging whether the first coordinate and the second coordinate are the same coordinate in the first image information, or judging whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
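The two alternative criteria above (exact coincidence of the coordinates, or a distance not greater than the first preset distance) can each be written as a one-line check. A small sketch, with function names of our choosing:

```python
def positions_coincide(p1, p2):
    """First criterion: the first coordinate and the second coordinate
    are the same coordinate in the first image information."""
    return p1 == p2

def within_preset_distance(p1, p2, first_preset_distance):
    """Second criterion: the distance between the two coordinates does
    not exceed the first preset distance (Euclidean pixel distance)."""
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    return (dx * dx + dy * dy) ** 0.5 <= first_preset_distance
```

The coincidence test is the special case of the distance test with a preset distance of zero; a nonzero threshold tolerates camera noise and finite finger width.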
Preferably, before the step of determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface, the method further comprises:
determining a first area of the first image and a second area of the second image;
judging whether the first area is equal to the second area, obtaining a first judged result;
and the step of determining contact when the first position and the second position coincide in the first image information, and otherwise determining no contact, comprises: when the first position and the second position coincide in the first image information, and the first judged result shows that the first area is equal to the second area, determining that the first operating body is in contact with the surface of the projection operation surface, and otherwise determining that it is not.
Preferably, before the step of determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface, the method further comprises:
determining a first area of the first image and a second area of the second image;
judging whether the first area is equal to the second area, obtaining a first judged result;
and the step of determining contact when the distance between the first position and the second position in the first image information is not greater than the first preset distance, and otherwise determining no contact, comprises: when that distance is not greater than the first preset distance, and the first judged result shows that the first area is equal to the second area, determining that the first operating body is in contact with the surface of the projection operation surface, and otherwise determining that it is not.
Preferably, after the step of determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface, the method further comprises:
obtaining a first operation performed by the first operating body;
responding to the first operation by calling a first function corresponding to the first operation.
Preferably, after the step of determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface, the method further comprises:
monitoring the first operating body, determining that the first operating body does not contact the surface of the projection operation surface at a first moment, and obtaining a monitored result;
determining, according to the monitored result, the first operation performed by the first operating body, so that a corresponding response mode can be determined according to the first operation.
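The monitoring step can be pictured as scanning a sequence of per-frame contact decisions and deriving the operation from the transitions. The following is illustrative only; the "contact then release equals click" rule and all names are our assumptions, as the patent leaves the mapping from monitored result to operation open:

```python
def derive_operation(samples):
    """samples: chronological list of (timestamp, touching) pairs from
    monitoring the first operating body. When the body was in contact
    and is then found not in contact at a first moment, report the
    completed interval as a 'click' operation."""
    press_time = None
    for t, touching in samples:
        if touching and press_time is None:
            press_time = t                    # contact began
        elif not touching and press_time is not None:
            return ("click", press_time, t)   # contact ended at this moment
    return None                               # no completed operation yet
```

A still-pressed or never-pressed sequence yields `None`, so the response mode is chosen only once the operation has actually completed.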
An electronic equipment, comprising a projection module and an image acquisition unit, the electronic equipment further comprising:
an operational module, configured to acquire, through the image acquisition unit, first image information while content to be projected is projected onto a projection operation surface through the projection module, the first image information including the image information of the projection operation surface;
a first judge module, configured to judge, according to the first image information, whether the first image information includes a first image corresponding to a first operating body;
a first determination module, configured to determine, when the first image corresponding to the first operating body exists, according to the first image and a second image of the shadow cast by the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow in the first image information;
a second determination module, configured to determine, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface.
Preferably, the first determination module is specifically configured to: determine, in the first image, a first identification point corresponding to the first operating body, take the position of the first identification point as the first position, determine the vertex of the shadow of the first operating body, and take the position of the vertex as the second position.
Preferably, the first determination module is specifically configured to: determine, according to the first image and the second image, a first coordinate of the first identification point and a second coordinate of the vertex.
Preferably, the first judge module is specifically configured to: judge whether the first position and the second position coincide in the first image information, or judge whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
Preferably, the second determination module is specifically configured to:
when the first position and the second position coincide in the first image information, determine that the first operating body is in contact with the surface of the projection operation surface, and otherwise determine that it is not; or
when the distance between the first position and the second position in the first image information is not greater than the first preset distance, determine that the first operating body is in contact with the surface of the projection operation surface, and otherwise determine that it is not.
Preferably, the first judge module is specifically configured to: judge whether the first coordinate and the second coordinate are the same coordinate in the first image information, or judge whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
Preferably, the electronic equipment further comprises a third determination module and a second judge module;
the third determination module is configured to: determine a first area of the first image and a second area of the second image;
the second judge module is configured to: judge whether the first area is equal to the second area, obtaining a first judged result;
the second determination module is specifically configured to: when the first position and the second position coincide in the first image information, and the first judged result shows that the first area is equal to the second area, determine that the first operating body is in contact with the surface of the projection operation surface, and otherwise determine that it is not.
Preferably, the electronic equipment further comprises a third determination module and a second judge module;
the third determination module is configured to: determine a first area of the first image and a second area of the second image;
the second judge module is configured to: judge whether the first area is equal to the second area, obtaining a first judged result;
the second determination module is specifically configured to: when the distance between the first position and the second position in the first image information is not greater than the first preset distance, and the first judged result shows that the first area is equal to the second area, determine that the first operating body is in contact with the surface of the projection operation surface, and otherwise determine that it is not.
Preferably, the electronic equipment further comprises an acquisition module and a calling module;
the acquisition module is configured to obtain a first operation performed by the first operating body;
the calling module is configured to respond to the first operation by calling a first function corresponding to the first operation.
Preferably, the electronic equipment further comprises a monitoring module and a fourth determination module;
the monitoring module is configured to monitor the first operating body, determine that the first operating body does not contact the surface of the projection operation surface at a first moment, and obtain a monitored result;
the fourth determination module is configured to determine, according to the monitored result, the first operation performed by the first operating body, so that a corresponding response mode can be determined according to the first operation.
The information processing method in the embodiment of the present invention can be applied to an electronic equipment comprising a projection unit and an image acquisition unit, and the method can comprise: while content to be projected is projected onto a projection operation surface through the projection module, acquiring first image information through the image acquisition unit, the first image information including the image information of the projection operation surface; judging, according to the first image information, whether the first image information includes a first image corresponding to a first operating body; when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the shadow cast by the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow in the first image information; and determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface.
In the embodiment of the present invention, the first image information can be acquired through the image acquisition unit, and the first position of the first operating body and the second position of its shadow can be determined from the first image information. From these two positions it can be determined whether the first operating body has actually touched the projection operation surface, and hence whether the corresponding operation of the first operating body should be responded to. This avoids both the case in which the electronic equipment responds to an operation performed while the first operating body is actually suspended in the air, and the case in which the first operating body is actually in contact with the projection operation surface but the electronic equipment cannot respond because it cannot tell whether contact has occurred. The erroneous operation rate is thereby effectively reduced, operating efficiency is improved, and user experience is enhanced.
The image acquisition unit can be a camera, and the present invention needs only a single camera to accomplish the task, saving hardware resources.
Brief description of the drawings
Fig. 1 is the main flowchart of the information processing method in the embodiment of the present invention;
Fig. 2A is a schematic diagram of one possible application scenario in the embodiment of the present invention;
Fig. 2B is a schematic diagram of another possible application scenario in the embodiment of the present invention;
Fig. 2C is a detailed structural view of the electronic equipment in the embodiment of the present invention.
Detailed description of the embodiments
The information processing method in the embodiment of the present invention can be applied to an electronic equipment comprising a projection unit and an image acquisition unit, and the method can comprise: while content to be projected is projected onto a projection operation surface through the projection module, acquiring first image information through the image acquisition unit, the first image information including the image information of the projection operation surface; judging, according to the first image information, whether the first image information includes a first image corresponding to a first operating body; when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the shadow cast by the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow in the first image information; and determining, according to the first position and the second position, whether the first operating body is or is not in contact with the surface of the projection operation surface.
In the embodiment of the present invention, the first image information can be acquired through the image acquisition unit, and the first position of the first operating body and the second position of its shadow can be determined from the first image information. From these two positions it can be determined whether the first operating body has actually touched the projection operation surface, and hence whether the corresponding operation of the first operating body should be responded to. This avoids both the case in which the electronic equipment responds to an operation performed while the first operating body is actually suspended in the air, and the case in which the first operating body is actually in contact with the projection operation surface but the electronic equipment cannot respond because it cannot tell whether contact has occurred. The erroneous operation rate is thereby effectively reduced, operating efficiency is improved, and user experience is enhanced.
The image acquisition unit can be a camera, and the present invention needs only a single camera to accomplish the task, saving hardware resources.
Referring to Fig. 1, the information processing method in the embodiment of the present invention can be applied to an electronic equipment having a projection unit and an image acquisition unit, where the projection unit can project the content to be projected. The main flow of the method is as follows:
Step 101: while content to be projected is projected onto a projection operation surface through the projection module, acquire first image information through the image acquisition unit, the first image information including the image information of the projection operation surface.
In the embodiment of the present invention, while the content to be projected is projected onto the projection operation surface through the projection module, the first image information can be acquired through the image acquisition unit, where the first image information can include the image information of the projection operation surface.
In the embodiment of the present invention, the image acquisition unit can be a camera. The camera can be arranged in the electronic equipment as one of its components, or the camera and the electronic equipment can be two independent devices that communicate with each other.
In the embodiment of the present invention, when projection is performed through the projection module, the region covered by the projection can be called the projection field, which includes both the projection space region and the projection operation surface. The image information corresponding to the projection field can be called the first image information.
The camera should be arranged at the side of the projection field, so that it can acquire the first image information well.
For example, if the camera is arranged at the side of the projection field, the camera can collect the image information of the first operating body even when the first operating body is suspended in the air, and it can also collect the image information on the projection operation surface.
Step 102: according to the first image information, judge whether the first image information includes a first image corresponding to a first operating body.
In the embodiment of the present invention, after the first image information is obtained, it can be analyzed to judge whether it includes the first image corresponding to the first operating body. If the first image is determined to exist, it can be determined that the first operating body exists within the preset spatial range corresponding to the projection operation surface.
That is, in the embodiment of the present invention, after the first image information is obtained, whether the first operating body exists within the preset spatial range corresponding to the projection operation surface can be judged according to the first image information.
In the embodiment of the present invention, the preset spatial range can refer to the projection space region.
Step 103: when there is described the first image corresponding to described the first operating body, the second image of the first operating body shade on described projection operation face according to described the first image and described the first operating body, determines primary importance and the described first operating body shade second place on described first image information of described the first operating body in described the first image information.
In this embodiment, if it is determined that the first image is present in the first image information, it can be determined that the first operating body is located within the preset spatial range corresponding to the projection operation surface.
Meanwhile, because the first operating body casts a shadow, both the first image corresponding to the first operating body and the second image corresponding to the first-operating-body shadow can be determined in the first image information.
Preferably, in this embodiment, the first operating body may be a finger of a user.
After the first image and the second image are determined, the first position corresponding to the first operating body in the first image information and the second position corresponding to the first-operating-body shadow in the first image information can be determined respectively.
Preferably, determining the first position of the first operating body in the first image information and the second position of the first-operating-body shadow in the first image information may specifically be performed as follows. A first identification point of the first image is determined; for example, any point in the first image (that is, any point on the first operating body) may be chosen as the first identification point, or an end point of the first operating body (an end point of the first image) may be chosen as the first identification point. For example, if the first operating body is a user's finger, a point in the first image corresponding to the tip of the finger may be chosen as the first identification point. After the first identification point is determined, its location in the first image information can be determined as the first position. A vertex of the first-operating-body shadow is then determined; for example, if the first operating body is a user's finger, the shadow is also finger-shaped, and a point at the tip of the shadow finger may be determined as the vertex. The location of the vertex in the first image information can be determined as the second position.
Preferably, in this embodiment, after the first identification point and the vertex are determined, a first coordinate of the first identification point and a second coordinate of the vertex can be determined respectively.
In this embodiment, the first coordinate and the second coordinate may be determined in the same coordinate system.
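As an illustration only (not part of the patent), the choice of the first identification point and the shadow vertex can be sketched in Python. The masks, coordinates, and the downward-pointing-finger assumption are all hypothetical:

```python
# A minimal sketch of choosing the first identification point (fingertip) and
# the shadow vertex from segmentation masks. Each mask is a set of (x, y)
# pixels in the same image coordinate system; the "tip" is taken as the point
# with the largest y, which assumes the finger points down toward the surface.

def tip_point(mask):
    """Return the point of the mask furthest along the y axis (the tip)."""
    return max(mask, key=lambda p: p[1])

def identification_points(body_mask, shadow_mask):
    """First coordinate (operating-body tip) and second coordinate (shadow vertex)."""
    return tip_point(body_mask), tip_point(shadow_mask)

body = {(10, 5), (10, 6), (11, 7)}      # hypothetical finger pixels
shadow = {(12, 5), (12, 6), (12, 8)}    # hypothetical shadow pixels
p1, p2 = identification_points(body, shadow)
print(p1, p2)  # (11, 7) (12, 8)
```

Because both tips are read from the same image, the two coordinates are automatically in the same coordinate system, as the embodiment requires.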
In this embodiment, when the projection unit performs projection, the projected content may be located on a wall, on a projection screen, on a desktop, and so on; that is, the projection plane may be a wall, a projection screen, a desktop, etc., and the projection operation surface may likewise be located on a wall, a projection screen, a desktop, etc. The projection operation surface in this embodiment may be located on the projection plane.
Step 104: according to the first position and the second position, determine whether the first operating body is in contact with the surface of the projection operation surface or not in contact with it.
In this embodiment, after the first position and the second position are determined, whether the first operating body has touched the surface of the projection operation surface can be determined according to the first position and the second position.
Preferably, determining whether the first operating body has touched the surface of the projection operation surface according to the first position and the second position may specifically be: judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
If it is determined that the first position and the second position coincide in the first image information, it can be determined that the first operating body has touched the surface of the projection operation surface; otherwise, it can be determined that the first operating body has not touched the surface.
Likewise, if it is determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance, it can be determined that the first operating body has touched the surface of the projection operation surface; otherwise, it can be determined that the first operating body has not touched the surface.
Preferably, judging whether the first position and the second position coincide in the first image information may specifically be: judging whether the first coordinate and the second coordinate are the same coordinate in the first image information. If they are the same coordinate, it can be determined that the first position and the second position coincide; otherwise, it can be determined that they do not coincide.
Preferably, judging whether the distance between the first position and the second position in the first image information is not greater than the first preset distance may specifically be: judging whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance. If it is not greater, it can be determined that the distance between the first position and the second position is not greater than the first preset distance; otherwise, it can be determined that the distance is greater than the first preset distance.
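The coincidence test and the distance test above can be sketched together in Python; this is an illustrative sketch, and the tolerance value is a hypothetical parameter, not one given in the patent:

```python
# A minimal sketch of the touch test: the operating body is judged to touch
# the projection surface when its tip and its shadow's vertex coincide or lie
# within the first preset distance. Coordinates are (x, y) pixel pairs in the
# same coordinate system.
import math

FIRST_PRESET_DISTANCE = 3.0  # hypothetical tolerance, in pixels

def touches_surface(first_coord, second_coord, limit=FIRST_PRESET_DISTANCE):
    """True if the fingertip and the shadow vertex coincide or are close."""
    dx = first_coord[0] - second_coord[0]
    dy = first_coord[1] - second_coord[1]
    return math.hypot(dx, dy) <= limit

print(touches_surface((40, 80), (41, 82)))   # close together -> True
print(touches_surface((40, 80), (60, 20)))   # far apart -> False
```

A nonzero tolerance makes exact coincidence a special case of the distance test, which is why the patent presents the two judgments as alternatives.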
Preferably, in this embodiment, before determining according to the first position and the second position whether the first operating body is in contact with the surface of the projection operation surface, a first area of the first image and a second area of the second image may also be determined. After the first area and the second area are determined, whether the first area equals the second area can be judged to obtain a first judged result, the first judged result indicating whether the first area equals the second area. Whether the first operating body is in contact with the surface of the projection operation surface can then be determined according to the first judged result and a second judged result, where the second judged result indicates whether the first position and the second position coincide, or indicates whether the distance between the first position and the second position is not greater than the first preset distance.
Preferably, in this embodiment, if the first position and the second position coincide in the first image information and the first judged result shows that the first area equals the second area, it can be determined that the first operating body is in contact with the surface of the projection operation surface; otherwise, it can be determined that the first operating body is not in contact with the surface.
For example: if the first position and the second position coincide but the first judged result shows that the first area and the second area are unequal, it can be determined that the first operating body is not in contact with the surface of the projection operation surface; if the first position and the second position do not coincide but the first judged result shows that the first area equals the second area, it can be determined that the first operating body is not in contact with the surface; and if the first position and the second position do not coincide and the first judged result shows that the areas are unequal, it can likewise be determined that the first operating body is not in contact with the surface.
Preferably, in this embodiment, if the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judged result shows that the first area equals the second area, it can be determined that the first operating body is in contact with the surface of the projection operation surface; otherwise, it can be determined that the first operating body is not in contact with the surface.
For example: if the distance between the first position and the second position is not greater than the first preset distance but the first judged result shows that the areas are unequal, it can be determined that the first operating body is not in contact with the surface; if the distance is greater than the first preset distance but the first judged result shows that the areas are equal, it can be determined that the first operating body is not in contact with the surface; and if the distance is greater than the first preset distance and the areas are unequal, it can likewise be determined that the first operating body is not in contact with the surface.
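The combined decision above can be sketched in Python; this is an illustration only, and the distance and area tolerances are hypothetical values:

```python
# A minimal sketch of the combined decision: contact is reported only when
# the two positions are within the first preset distance AND the two image
# areas are (approximately) equal; every other combination is non-contact.
import math

def in_contact(first_coord, second_coord, first_area, second_area,
               max_dist=3.0, area_tol=0):
    """Both judged results must be positive for a contact decision."""
    dx = first_coord[0] - second_coord[0]
    dy = first_coord[1] - second_coord[1]
    positions_close = math.hypot(dx, dy) <= max_dist         # second judged result
    areas_equal = abs(first_area - second_area) <= area_tol  # first judged result
    return positions_close and areas_equal

print(in_contact((10, 10), (11, 11), 120, 120))  # both conditions hold -> True
print(in_contact((10, 10), (11, 11), 120, 95))   # areas differ -> False
print(in_contact((10, 10), (40, 40), 120, 120))  # positions too far -> False
```

Requiring both conditions filters out cases where the fingertip and shadow happen to overlap in the image while the finger is still hovering, which is the stated purpose of the extra area check.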
Preferably, in this embodiment, after determining according to the first position and the second position whether the first operating body is in contact with the surface of the projection operation surface, a first operation performed by the first operating body can also be obtained. In this embodiment, because it has been determined that the first operating body has touched the surface of the projection operation surface, if the first operating body performs the first operation on the projection operation surface, the electronic equipment can obtain the first operation.
After obtaining the first operation, the electronic equipment can respond to it, for example by calling a first function corresponding to the first operation.
For example, if the projection operation surface is a computer desktop, the first operating body is a user's finger, and the first operation is a double click performed with that finger on a first icon on the projection operation surface, the electronic equipment can respond to the first operation by calling the first function corresponding to it; the first function may be, for example, a function that opens a first application interface corresponding to the first icon.
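The operation-to-function correspondence can be pictured as a simple dispatch table; this sketch is purely illustrative, and the operation names and handler are hypothetical, not APIs defined by the patent:

```python
# A minimal sketch of responding to a recognized operation by dispatching to
# the function that corresponds to it (e.g., double click -> open interface).

def open_application_interface(icon):
    """Hypothetical first function: open the interface for the given icon."""
    return f"opened interface for {icon}"

HANDLERS = {
    "double_click": open_application_interface,  # first operation -> first function
}

def respond(operation, target):
    """Call the function corresponding to the operation, if one is registered."""
    handler = HANDLERS.get(operation)
    return handler(target) if handler else None

print(respond("double_click", "first icon"))  # opened interface for first icon
```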
For example, Fig. 2A shows one possible embodiment of the present invention. In Fig. 2A, the first operating body is a user's hand. The image acquisition unit is located to the side of the first operating body, so it can capture both the first image corresponding to the first operating body and the second image corresponding to the first-operating-body shadow. In Fig. 2A, the hand without shading is the first operating body and the shaded hand is the first-operating-body shadow. The first position can be determined from the first operating body and the second position from the shadow; as can be seen in Fig. 2A, the first position and the second position do not coincide, and their distance in the first image information is clearly greater than the first preset distance. The first operating body in Fig. 2A is therefore in a suspended state and has not touched the surface of the projection operation surface; if it performs an operation, the electronic equipment does not respond to it. Moreover, as can be seen in Fig. 2A, the camera is located to the side, so that, from the camera's point of view, the first operating body and its shadow lie in one plane, which ensures that the camera can capture the first image and the second image relatively completely.
For example, Fig. 2B shows another possible embodiment of the present invention. In Fig. 2B, the first operating body is a user's hand. The image acquisition unit is located at a certain angle to the side of the first operating body, so it can capture both the first image corresponding to the first operating body and the second image corresponding to the first-operating-body shadow. In Fig. 2B, the hand without shading is the first operating body and the shaded hand is the first-operating-body shadow. The first position can be determined from the first operating body and the second position from the shadow; as can be seen in Fig. 2B, the first position and the second position coincide, so the first operating body in Fig. 2B has touched the surface of the projection operation surface. If the first operating body performs an operation, the electronic equipment can respond to it. Moreover, as can be seen in Fig. 2B, the camera is located to the side, so that, from the camera's point of view, the first operating body and its shadow lie in one plane, which ensures that the camera can capture the first image and the second image relatively completely.
In this embodiment, the electronic equipment can monitor the first operating body in real time or periodically, for example through the image acquisition unit.
After determining according to the first position and the second position whether the first operating body is in contact with the surface of the projection operation surface, the electronic equipment can continue to monitor the first operating body and determine when it leaves the projection operation surface. When the electronic equipment detects that the first operating body has left the projection operation surface, it can determine that the first operation performed by the first operating body has ended, and can then determine which specific operation the first operation was, so as to determine the corresponding response mode.
For example, when the electronic equipment detects that the first operating body left the projection operation surface at a first moment, it can determine that the first operation performed by the first operating body has ended. The electronic equipment can judge which specific operation the first operation was according to the contact duration and/or the contact trace between the first operating body and the projection operation surface, and can thereby determine the response mode corresponding to the first operation.
For example, if the electronic equipment determines at a second moment of monitoring that the first operating body is in contact with the projection operation surface, and determines at the first moment of monitoring, after the second moment, that the first operating body has left the projection operation surface, the electronic equipment can determine the time difference between the second moment and the first moment. If the time difference is not greater than a preset duration threshold, the electronic equipment can determine that the first operating body performed a single-click operation. If the time difference is greater than the preset duration threshold, the electronic equipment can determine that the first operating body performed a long-press operation, that is, an operation of pressing and holding for a long time.
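The duration-based classification can be sketched in Python; the threshold value is a hypothetical parameter, not one specified by the patent:

```python
# A minimal sketch of classifying a contact by its duration: a short contact
# is a single click, a longer one is a long press. Moments are timestamps in
# seconds (second moment = touch, first moment = release).
PRESET_DURATION_THRESHOLD = 0.4  # hypothetical threshold, in seconds

def classify_contact(touch_moment, release_moment,
                     threshold=PRESET_DURATION_THRESHOLD):
    """Single click if contact lasted no longer than the threshold."""
    duration = release_moment - touch_moment
    return "single_click" if duration <= threshold else "long_press"

print(classify_contact(10.00, 10.25))  # short contact -> single_click
print(classify_contact(10.00, 11.50))  # held down -> long_press
```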
For example, if the electronic equipment determines at a second moment of monitoring that the first operating body is in contact with the projection operation surface, determines at the first moment, after the second moment, that the first operating body has left the projection operation surface, determines at a third moment, after the first moment, that the first operating body is again in contact with the projection operation surface, and determines at a fourth moment, after the third moment, that the first operating body has again left the projection operation surface, the electronic equipment can determine a first time difference between the second moment and the first moment, a second time difference between the first moment and the third moment, and a third time difference between the third moment and the fourth moment. If none of these three time differences is greater than the preset duration threshold, the electronic equipment can determine that the first operating body performed a single operation, for example a double-click operation. If any one of the three time differences is greater than the preset duration threshold, the electronic equipment can determine that the first operating body performed two separate operations, and can determine from the first time difference and the third time difference respectively which operation each of the two was.
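The two-contact case can be sketched as follows; this is an illustration only, and the threshold is again hypothetical:

```python
# A minimal sketch of the four-moment case: four monitored moments give three
# intervals (first contact, gap between contacts, second contact). If all
# three intervals are within the threshold, the gesture is one double click;
# otherwise it is two separate operations, each classified by its duration.
THRESHOLD = 0.4  # hypothetical preset duration threshold, in seconds

def classify_two_contacts(t2, t1, t3, t4, threshold=THRESHOLD):
    """t2 = first touch, t1 = first release, t3 = second touch, t4 = second release."""
    diffs = (t1 - t2, t3 - t1, t4 - t3)
    if all(d <= threshold for d in diffs):
        return "double_click"
    first = "single_click" if diffs[0] <= threshold else "long_press"
    second = "single_click" if diffs[2] <= threshold else "long_press"
    return (first, second)

print(classify_two_contacts(0.0, 0.2, 0.4, 0.6))  # all short -> double_click
print(classify_two_contacts(0.0, 0.2, 1.5, 1.7))  # long gap -> two separate clicks
```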
Preferably, besides judging which specific operation the first operation was according to the contact duration between the first operating body and the projection operation surface, the electronic equipment can also judge this according to the contact trace between the first operating body and the projection operation surface, or according to the contact trace and the contact duration together.
For example, if the electronic equipment determines at the second moment of monitoring that the first operating body is in contact with the projection operation surface, and determines at the first moment, after the second moment, that the first operating body has left the projection operation surface, the electronic equipment can determine the trace that the first operating body swept across the projection operation surface between the second moment and the first moment. For example, the electronic equipment can determine third position information corresponding to the first operating body on the projection operation surface at the second moment, and fourth position information corresponding to the first operating body on the projection operation surface at the first moment. If the third position corresponding to the third position information and the fourth position corresponding to the fourth position information are the same position, or the distance between the third position and the fourth position is not greater than a second preset distance, the electronic equipment can determine that the first operating body did not move on the projection operation surface, so what it performed is likely a single-click operation, a long-press operation, or the like. If the third position and the fourth position are not the same position, or the distance between them is greater than the second preset distance, what the first operating body performed is likely a slide operation.
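The trace test can be sketched in Python as well; the second preset distance here is a hypothetical tolerance, not a value given in the patent:

```python
# A minimal sketch of the trace test: compare where the contact began (third
# position) and where it ended (fourth position). A contact that stays within
# the tolerance is a click/long-press candidate; otherwise it is a slide.
import math

SECOND_PRESET_DISTANCE = 5.0  # hypothetical tolerance, in pixels

def trace_kind(third_pos, fourth_pos, limit=SECOND_PRESET_DISTANCE):
    """'stationary' if the contact did not move beyond the tolerance, else 'slide'."""
    dx = fourth_pos[0] - third_pos[0]
    dy = fourth_pos[1] - third_pos[1]
    return "stationary" if math.hypot(dx, dy) <= limit else "slide"

print(trace_kind((100, 100), (102, 101)))  # barely moved -> stationary
print(trace_kind((100, 100), (180, 140)))  # swept across -> slide
```

Combining this result with the duration classification then distinguishes click, long press, and slide, as the preceding paragraph describes.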
The information processing method of the present invention is introduced below through several specific embodiments; the following embodiments mainly illustrate several possible application scenarios of the method. It should be noted that the embodiments herein serve only to explain the present invention and cannot be used to limit it. Every embodiment that conforms to the inventive concept is within the protection scope of the present invention, and those skilled in the art will naturally know how to make modifications according to the idea of the present invention.
Embodiment one:
The electronic equipment is a PC (personal computer) that has the projection module and an image acquisition unit, the image acquisition unit being a camera. The projection plane is a desktop, and the projection operation surface may be located on the projection plane. In this embodiment, the projection operation surface is the computer desktop.
When the projected content is projected onto the projection operation surface by the projection module, the first image information can be captured by the image acquisition unit, where the first image information can include the projection-operation-surface image information corresponding to the projection operation surface.
In this embodiment, the camera is arranged to the side of the projection area, so that it can better capture the first image information.
After the first image information is obtained, it can be analyzed to judge whether it includes the first image corresponding to the first operating body. If the first image is determined to be present, it can be determined that the first operating body is located within the preset spatial range corresponding to the projection operation surface.
In this embodiment, once it is determined that the first image is present in the first image information, it can correspondingly be determined that the first operating body is located within the preset spatial range, and the second image corresponding to the first-operating-body shadow in the first image information can be determined correspondingly.
After the first image and the second image are determined, the first position corresponding to the first operating body in the first image information and the second position corresponding to the first-operating-body shadow in the first image information can be determined respectively.
In this embodiment, determining the first position corresponding to the first operating body and the second position corresponding to the first-operating-body shadow may be performed as follows. The first identification point of the first image is determined; for example, any point in the first image (that is, any point on the first operating body) may be chosen as the first identification point, or an end point of the first operating body (an end point of the first image) may be chosen as the first identification point. For example, if the first operating body is a user's finger, a point in the first image corresponding to the tip of the finger may be chosen as the first identification point. After the first identification point is determined, its location in the first image information can be determined as the first position. The vertex of the first-operating-body shadow is then determined; for example, if the first operating body is a user's finger, the shadow is also finger-shaped, and a point at the tip of the shadow finger may be determined as the vertex. The location of the vertex in the first image information can be determined as the second position.
In this embodiment, after the first identification point and the vertex are determined, the first coordinate of the first identification point and the second coordinate of the vertex can be determined respectively.
In this embodiment, the first coordinate and the second coordinate may be determined in the same coordinate system.
In this embodiment, after the first position and the second position are determined, whether the first operating body has touched the surface of the projection operation surface can be determined according to the first position and the second position.
Preferably, one way to determine according to the first position and the second position whether the first operating body has touched the surface of the projection operation surface is: judge whether the first position and the second position coincide in the first image information to obtain the second judged result, and determine from the second judged result whether the first operating body has touched the surface of the projection operation surface.
If it is determined that the first position and the second position coincide in the first image information, it can be determined that the first operating body has touched the surface of the projection operation surface.
If it is determined that the first position and the second position do not coincide in the first image information, it can be determined that the first operating body has not touched the surface of the projection operation surface.
In this embodiment, one way to judge whether the first position and the second position coincide in the first image information is: judge according to the first coordinate and the second coordinate whether the first position coincides with the second position, obtaining the second judged result.
For example, it can be judged whether the first coordinate and the second coordinate are the same coordinate; if they are, it can be determined that the first position and the second position coincide, otherwise it can be determined that they do not coincide.
In this embodiment, the first position and the second position are determined to coincide, so it can be determined that the first operating body has touched the surface of the projection operation surface.
After determining according to the first position and the second position that the first operating body has contacted the surface of the projection operation surface, the first operation performed by the first operating body can be obtained. The first operating body in this embodiment can be a user's finger, and the first operation can be a double click performed with that finger on a first icon on the computer desktop; the first icon in this embodiment is the icon corresponding to the word application.
After the first operation is obtained, the first operation can be responded to, and a first function corresponding to the first operation can be called. In the present embodiment, this is opening the operation interface corresponding to Word.
Embodiment 2:
The electronic device is a PC (personal computer). The electronic device has the projection module and an image acquisition unit, where the image acquisition unit is a camera. The projection surface is a desktop, and the projection operation surface can be located on the projection surface. In the present embodiment, the projection operation surface is a computer desktop.
When the content to be projected is projected onto the projection operation surface by the projection module, the first image information can be acquired by the image acquisition unit, where the first image information can include the projection operation surface image information corresponding to the projection operation surface.
In the present embodiment, the camera is arranged at a side of the projection area, so that the first image information can be acquired well.
After the first image information is obtained, the first image information can be analyzed to judge whether the first image information includes the first image corresponding to the first operating body. If it is determined that the first image exists, it can be determined that the first operating body is present within the preset spatial range corresponding to the projection operation surface.
In the present embodiment, it is determined that the first image exists in the first image information, so it can correspondingly be determined that the first operating body is present within the preset spatial range, and the second image corresponding to the shadow of the first operating body in the first image information can correspondingly be determined.
After the first image and the second image are determined, the first position corresponding to the first operating body in the first image information and the second position corresponding to the shadow of the first operating body in the first image information can be determined respectively.
In the present embodiment, determining the first position corresponding to the first operating body and the second position corresponding to the shadow of the first operating body can be done as follows. A first identification point of the first image, i.e. of the first operating body, is determined: for example, any point in the first image (that is, any point on the first operating body) can be chosen as the first identification point, or an end point of the first operating body, i.e. an end point of the first image, can be chosen as the first identification point. For example, if the first operating body is a finger of the user, the point in the first image corresponding to the tip of this finger can be chosen as the first identification point. After the first identification point is determined, its location in the first image information can be determined as the first position. The vertex of the shadow of the first operating body is then determined: for example, if the first operating body is a finger of the user, the shadow of the first operating body also has the shape of a finger, so a point at the tip of this shadow finger can be determined as the vertex, and the location of the vertex in the first image information can be determined as the second position.
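The choice of fingertip and shadow vertex can be sketched as picking the extreme point of each region along some axis. This is a hypothetical illustration only: the region masks, their toy coordinates, and the assumption that the finger points toward increasing y are not from the patent, which requires only that some reproducible point of each image be chosen.

```python
def extreme_point(region, axis=1, largest=True):
    """Return the pixel of `region` (a list of (x, y) tuples) that is
    extreme along `axis` (0 = x, 1 = y). Used here as a stand-in for
    'fingertip' / 'shadow vertex' detection."""
    key = lambda p: p[axis]
    return max(region, key=key) if largest else min(region, key=key)

finger_region = [(10, 5), (10, 6), (11, 7), (12, 9)]   # toy operating-body mask
shadow_region = [(14, 5), (14, 6), (15, 7), (16, 9)]   # toy shadow mask

first_position = extreme_point(finger_region)    # first identification point
second_position = extreme_point(shadow_region)   # shadow vertex
print(first_position, second_position)           # (12, 9) (16, 9)
```

In practice the regions would come from segmenting the camera frame (e.g. by skin color for the finger and by darkness for the shadow), and the pointing axis would depend on camera and projector geometry.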
In the present embodiment, after the first identification point and the vertex are determined, the first coordinate of the first identification point and the second coordinate of the vertex can be determined respectively.
In the present embodiment, when the first coordinate and the second coordinate are determined, they can be determined according to the same coordinate system.
In the present embodiment, after the first position and the second position are determined, whether the first operating body has touched the surface of the projection operation surface can be determined according to the first position and the second position.
Preferably, one way of determining, according to the first position and the second position, whether the first operating body has touched the surface of the projection operation surface can be: judging whether the distance between the first position and the second position in the first image information is not greater than a first preset distance, to obtain a second judgement result; according to the second judgement result, whether the first operating body has touched the surface of the projection operation surface can be determined.
If it is determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance, it can be determined that the first operating body has touched the surface of the projection operation surface.
If it is determined that the distance between the first position and the second position in the first image information is greater than the first preset distance, it can be determined that the first operating body has not touched the surface of the projection operation surface.
In this embodiment, the way of judging whether the distance between the first position and the second position in the first image information is not greater than the first preset distance can be: judging, according to the first coordinate and the second coordinate, whether the distance between the first position and the second position in the first image information is not greater than the first preset distance, to obtain the second judgement result.
For example, it can be judged whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance. If it is determined that this distance is not greater than the first preset distance, it can be determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance; otherwise, it can be determined that the distance between the first position and the second position in the first image information is greater than the first preset distance.
In the present embodiment, it is determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance, so it can be determined that the first operating body has touched the surface of the projection operation surface.
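The distance-threshold variant can be sketched as below. The threshold value and the use of Euclidean pixel distance are assumptions for illustration; the patent only requires that the distance in the first image information be compared with a first preset distance.

```python
import math

FIRST_PRESET_DISTANCE = 5.0  # pixels; illustrative value only

def touches_surface(first_coord, second_coord, threshold=FIRST_PRESET_DISTANCE):
    """Contact is reported when fingertip and shadow vertex are within the
    first preset distance of each other in the image."""
    dx = first_coord[0] - second_coord[0]
    dy = first_coord[1] - second_coord[1]
    return math.hypot(dx, dy) <= threshold

print(touches_surface((120, 340), (123, 344)))  # distance 5.0  -> contact
print(touches_surface((120, 340), (120, 352)))  # distance 12.0 -> hovering
```

Compared with the exact-coincidence test of Embodiment 1, the threshold tolerates segmentation noise; the threshold would in practice be calibrated to the camera resolution and projection geometry.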
After determining, according to the first position and the second position, that the first operating body is in contact with the surface of the projection operation surface, the first operation performed by the first operating body can be obtained.
In the present embodiment, the electronic device can monitor the first operating body in real time or periodically. For example, the electronic device can monitor the first operating body through the image acquisition unit.
After determining that the first operating body is in contact with the surface of the projection operation surface, the electronic device can continue to monitor the first operating body and determine when the first operating body leaves the projection operation surface.
For example, when the electronic device monitors that the first operating body has left the projection operation surface at a first moment, the electronic device can determine that the first operation performed by the first operating body has ended. The electronic device can judge which specific operation the first operation is according to the contact duration and/or contact trajectory between the first operating body and the projection operation surface, and can thereby determine the response mode corresponding to the first operation.
For example, in the present embodiment, the electronic device determines through monitoring that the first operating body is in contact with the projection operation surface at a second moment, and then determines through monitoring that the first operating body has left the projection operation surface at the first moment, which is after the second moment; the electronic device can then determine the time difference between the second moment and the first moment. In the present embodiment, the time difference is not greater than a preset duration threshold, so the electronic device can determine that what the first operating body performed is a single-click operation, and can determine the response mode corresponding to the single-click operation.
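The duration-based classification can be sketched as follows. The threshold value and the "long-press" label for the over-threshold case are assumptions; the patent states only that a contact no longer than the preset duration threshold is treated as a single click.

```python
PRESET_DURATION_THRESHOLD = 0.3  # seconds; illustrative value only

def classify_operation(second_moment, first_moment,
                       threshold=PRESET_DURATION_THRESHOLD):
    """second_moment: time at which contact began;
    first_moment: time at which the operating body left the surface.
    A short contact is classified as a single click."""
    if first_moment - second_moment <= threshold:
        return "single-click"
    return "long-press"  # hypothetical label for longer contacts

print(classify_operation(10.00, 10.15))  # brief tap
print(classify_operation(10.00, 11.20))  # held contact
```

A fuller classifier would also consult the contact trajectory mentioned in the text, e.g. to distinguish a click from a drag.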
In the present embodiment, the first operating body can be a finger of the user, the first operation can be a single-click operation performed by the user with this finger on a first icon on the computer desktop, and the first icon is the icon corresponding to a Word application.
After the first operation is obtained, the first operation can be responded to, and the first function corresponding to the first operation can be called. In the present embodiment, this is selecting the first icon.
Embodiment 3:
The electronic device is a PC (personal computer). The electronic device has the projection module and an image acquisition unit, where the image acquisition unit is a camera. The projection surface is a desktop, and the projection operation surface can be located on the projection surface. In the present embodiment, the projection operation surface is a computer desktop.
When the content to be projected is projected onto the projection operation surface by the projection module, the first image information can be acquired by the image acquisition unit, where the first image information can include the projection operation surface image information corresponding to the projection operation surface.
In the present embodiment, the camera is arranged at a side of the projection area, so that the first image information can be acquired well.
After the first image information is obtained, the first image information can be analyzed to judge whether the first image information includes the first image corresponding to the first operating body. If it is determined that the first image exists, it can be determined that the first operating body is present within the preset spatial range corresponding to the projection operation surface.
In the present embodiment, it is determined that the first image exists in the first image information, so it can correspondingly be determined that the first operating body is present within the preset spatial range, and the second image corresponding to the shadow of the first operating body in the first image information can correspondingly be determined.
After the first image and the second image are determined, the first position corresponding to the first operating body in the first image information and the second position corresponding to the shadow of the first operating body in the first image information can be determined respectively.
In the present embodiment, determining the first position corresponding to the first operating body and the second position corresponding to the shadow of the first operating body can be done as follows. A first identification point of the first image, i.e. of the first operating body, is determined: for example, any point in the first image (that is, any point on the first operating body) can be chosen as the first identification point, or an end point of the first operating body, i.e. an end point of the first image, can be chosen as the first identification point. For example, if the first operating body is a finger of the user, the point in the first image corresponding to the tip of this finger can be chosen as the first identification point. After the first identification point is determined, its location in the first image information can be determined as the first position. The vertex of the shadow of the first operating body is then determined: for example, if the first operating body is a finger of the user, the shadow of the first operating body also has the shape of a finger, so a point at the tip of this shadow finger can be determined as the vertex, and the location of the vertex in the first image information can be determined as the second position.
In the present embodiment, after the first identification point and the vertex are determined, the first coordinate of the first identification point and the second coordinate of the vertex can be determined respectively.
In the present embodiment, when the first coordinate and the second coordinate are determined, they can be determined according to the same coordinate system.
In the present embodiment, after the first position and the second position are determined, whether the first operating body has touched the surface of the projection operation surface can be determined according to the first position and the second position.
Preferably, one way of determining, according to the first position and the second position, whether the first operating body has touched the surface of the projection operation surface can be: judging whether the first position and the second position coincide in the first image information, to obtain a second judgement result; according to the second judgement result, whether the first operating body has touched the surface of the projection operation surface can be determined.
In the present embodiment, after the first image and the second image are obtained, a first area corresponding to the first image and a second area corresponding to the second image can be determined respectively. Whether the first area is equal to the second area can be judged, to obtain a first judgement result, which can indicate whether the first area is equal to the second area.
In the present embodiment, if it is determined that the first position and the second position coincide in the first image information, and the first judgement result indicates that the first area is equal to the second area, it can be determined that the first operating body has touched the surface of the projection operation surface.
In the present embodiment, if it is determined that the first position and the second position do not coincide in the first image information, even though the first judgement result indicates that the first area is equal to the second area, it can be determined that the first operating body has not touched the surface of the projection operation surface.
In the present embodiment, if it is determined that the first position and the second position coincide in the first image information, but the first judgement result indicates that the first area is not equal to the second area, it can be determined that the first operating body has not touched the surface of the projection operation surface.
In the present embodiment, if it is determined that the first position and the second position do not coincide in the first image information, and the first judgement result indicates that the first area is not equal to the second area, it can be determined that the first operating body has not touched the surface of the projection operation surface.
In this embodiment, one way of judging whether the first position and the second position coincide in the first image information can be: judging, according to the first coordinate and the second coordinate, whether the first position coincides with the second position, to obtain the second judgement result.
For example, it can be judged whether the first coordinate and the second coordinate are the same coordinate. If it is determined that the first coordinate and the second coordinate are the same coordinate, it can be determined that the first position and the second position coincide; otherwise, it can be determined that the first position and the second position do not coincide.
In the present embodiment, it is determined that the first position and the second position coincide and that the first area is equal to the second area, so it can be determined that the first operating body has touched the surface of the projection operation surface.
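The combined test of Embodiment 3 can be sketched as below: contact is reported only when the fingertip and shadow vertex coincide and the area of the operating-body image equals the area of its shadow image. Exact pixel-area equality is the strictest reading; a real system would tolerate some noise in both comparisons.

```python
def touches_surface(first_coord, second_coord, first_area, second_area):
    """Combined position + area test: both conditions must hold for the
    operating body to be judged in contact with the surface."""
    positions_coincide = first_coord == second_coord
    areas_equal = first_area == second_area
    return positions_coincide and areas_equal

print(touches_surface((120, 340), (120, 340), 450, 450))  # contact
print(touches_surface((120, 340), (120, 352), 450, 450))  # hovering
print(touches_surface((120, 340), (120, 340), 450, 612))  # area mismatch
```

The area condition exploits the fact that when a finger hovers, its shadow is displaced and distorted relative to the finger, so requiring both conditions makes the determination more robust than either alone.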
After determining that the first operating body is in contact with the surface of the projection operation surface, the first operation performed by the first operating body can be obtained. In the present embodiment, the first operating body can be a finger of the user, and the first operation can be a single-click operation performed by the user with this finger on a video playback interface.
After the first operation is obtained, the first operation can be responded to, and the first function corresponding to the first operation can be called. In the present embodiment, this is pausing the playback of the video content playing in the video playback interface.
Referring to Fig. 2C, the present invention also provides an electronic device. The electronic device can include a projection module and an image acquisition unit, and can further include an operation module 201, a first judgement module 202, a first determination module 203 and a second determination module 204. In the embodiment of the present invention, the image acquisition unit can be a camera.
Preferably, the electronic device can further include a third determination module 205, a second judgement module 206, an acquisition module 207, a calling module 208, a monitoring module 209 and a fourth determination module 210.
The operation module 201 can be configured to acquire first image information by the image acquisition unit when content to be projected is projected onto a projection operation surface by the projection module, the first image information including projection operation surface image information corresponding to the projection operation surface.
The first judgement module 202 can be configured to judge, according to the first image information, whether the first image information includes a first image corresponding to a first operating body.
The first judgement module 202 can specifically be configured to judge whether the first position and the second position coincide in the first image information, or to judge whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
The first judgement module 202 can specifically be configured to judge whether the first coordinate and the second coordinate are the same coordinate in the first image information, or to judge whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
The first determination module 203 can be configured to determine, when the first image corresponding to the first operating body exists, the first position of the first operating body in the first image information and the second position of the shadow of the first operating body in the first image information, according to the first image and a second image of the shadow of the first operating body on the projection operation surface.
The first determination module 203 can specifically be configured to determine, in the first image, a first identification point corresponding to the first operating body and determine the location of the first identification point as the first position, and to determine, in the first image information, the vertex of the shadow of the first operating body and determine the location of the vertex as the second position.
The first determination module 203 can specifically be configured to determine, according to the first image and the second image, the first coordinate of the first identification point and the second coordinate of the vertex.
The second determination module 204 can be configured to determine, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with the surface of the projection operation surface.
The second determination module 204 can specifically be configured to determine that the first operating body is in contact with the surface of the projection operation surface when the first position and the second position coincide in the first image information, and otherwise to determine that the first operating body is not in contact with the surface of the projection operation surface; or to determine that the first operating body is in contact with the surface of the projection operation surface when the distance between the first position and the second position in the first image information is not greater than the first preset distance, and otherwise to determine that the first operating body is not in contact with the surface of the projection operation surface.
The second determination module 204 can specifically be configured to determine that the first operating body is in contact with the surface of the projection operation surface when the first position and the second position coincide in the first image information and the first judgement result indicates that the first area is equal to the second area, and otherwise to determine that the first operating body is not in contact with the surface of the projection operation surface.
The second determination module 204 can specifically be configured to determine that the first operating body is in contact with the surface of the projection operation surface when the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgement result indicates that the first area is equal to the second area, and otherwise to determine that the first operating body is not in contact with the surface of the projection operation surface.
The third determination module 205 can be configured to determine the first area corresponding to the first image and the second area corresponding to the second image.
The second judgement module 206 can be configured to judge whether the first area is equal to the second area, to obtain the first judgement result.
The acquisition module 207 can be configured to obtain the first operation performed by the first operating body.
The calling module 208 can be configured to respond to the first operation and call the first function corresponding to the first operation.
The monitoring module 209 can be configured to monitor the first operating body, determine that the first operating body is no longer in contact with the surface of the projection operation surface at a first moment, and obtain a monitoring result.
The fourth determination module 210 can be configured to determine, according to the monitoring result, the first operation performed by the first operating body, so that the corresponding response mode can be determined according to the first operation.
The information processing method in the embodiment of the present invention can be applied to an electronic device, where the electronic device can include a projection module and an image acquisition unit. The method can include: when content to be projected is projected onto a projection operation surface by the projection module, acquiring first image information by the image acquisition unit, the first image information including projection operation surface image information corresponding to the projection operation surface; judging, according to the first image information, whether the first image information includes a first image corresponding to a first operating body; when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the shadow of the first operating body on the projection operation surface, the first position of the first operating body in the first image information and the second position of the shadow of the first operating body in the first image information; and determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with the surface of the projection operation surface.
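The four method steps just summarized can be sketched end to end as follows. The detection helpers are placeholder callables, not part of the patent; only the control flow (detect the operating body, locate the two positions, decide contact by coincidence) follows the claimed method.

```python
def process_frame(first_image_info, detect_body, locate_points):
    """detect_body(frame) -> bool: is a first image of the operating body
    present? locate_points(frame) -> ((x, y), (x, y)): first position
    (fingertip) and second position (shadow vertex). Returns None when no
    operating body is present, else True/False for contact."""
    if not detect_body(first_image_info):       # step 2: operating body present?
        return None                             # nothing to respond to
    p1, p2 = locate_points(first_image_info)    # step 3: the two positions
    return p1 == p2                             # step 4: contact iff coincident

# Toy frame in which the fingertip and shadow vertex coincide (contact).
frame = {"body": True, "points": ((8, 3), (8, 3))}
result = process_frame(frame,
                       detect_body=lambda f: f["body"],
                       locate_points=lambda f: f["points"])
print(result)  # True
```

Step 1 (projecting the content and grabbing the camera frame) is hardware-dependent and is represented here only by the `first_image_info` input.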
In the embodiment of the present invention, the first image information can be acquired by the image acquisition unit, and from the first image information the first position of the first operating body and the second position of the shadow of the first operating body can be determined, so that whether the first operating body has actually touched the projection operation surface can be determined according to the first position and the second position, and hence whether to respond to the corresponding operation of the first operating body. This avoids the situation in which the electronic device responds to the operation of the first operating body while the first operating body is actually suspended above the surface, because it cannot judge whether the first operating body is in contact with the projection operation surface, as well as the situation in which the first operating body is actually in contact with the projection operation surface but the electronic device cannot respond to its operation because it cannot judge whether the contact has occurred. This effectively reduces the faulty operation rate, improves operating efficiency, and improves the user experience.
The image acquisition unit can be a camera, and the present invention needs only a single camera to accomplish the task, which saves hardware resources.
In the embodiment of the present invention, there can be multiple ways of determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface, which is relatively flexible.
In the embodiment of the present invention, whether the first operating body is in contact with the surface of the projection operation surface can be determined not only from the positions of the first operating body and its shadow, but also from the areas of the first operating body and its shadow, which can make the determination result more accurate.
Those skilled in the art should understand that embodiments of the present invention can be provided as a method, a system, or a computer program product. Therefore, the present invention can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention can take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system) and computer program product according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions also can be stored in energy vectoring computer or the computer-readable memory of other programmable data processing device with ad hoc fashion work, the instruction that makes to be stored in this computer-readable memory produces the manufacture that comprises command device, and this command device is realized the function of appointment in flow process of process flow diagram or a plurality of flow process and/or square frame of block scheme or a plurality of square frame.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus, causing a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions executed on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. Thus, if such modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to encompass them as well.

Claims (20)

1. An information processing method, applied to an electronic device, wherein the electronic device comprises a projection module and an image acquisition unit, the method comprising:
while content to be projected is being projected onto a projection operation surface by the projection module, acquiring first image information by the image acquisition unit, the first image information including projection-operation-surface image information corresponding to the projection operation surface;
judging, according to the first image information, whether the first image information includes a first image corresponding to a first operating body;
when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of a shadow of the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow of the first operating body in the first image information; and
determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with the surface of the projection operation surface.
2. The method according to claim 1, wherein the step of determining the first position of the first operating body in the first image information and the second position of the shadow of the first operating body in the first image information comprises: determining, in the first image, a first identification point corresponding to the first operating body, and taking the position of the first identification point as the first position; and determining, in the first image, the apex of the shadow of the first operating body, and taking the position of the apex as the second position.
3. The method according to claim 2, wherein the step of determining, according to the first image of the first operating body in the image information and the second image of the shadow of the first operating body on the projection operation surface, the first position of the first operating body within the preset spatial range and the second position of the shadow of the first operating body on the projection operation surface comprises: determining, according to the first image and the second image, a first coordinate of the first identification point and a second coordinate of the apex.
4. The method according to claim 3, wherein the step of determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with it comprises: judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
5. The method according to claim 4, wherein the step of determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with it comprises:
when the first position and the second position coincide in the first image information, determining that the first operating body is in contact with the surface of the projection operation surface, and otherwise determining that the first operating body is not in contact with the surface of the projection operation surface; or
when the distance between the first position and the second position in the first image information is not greater than the first preset distance, determining that the first operating body is in contact with the surface of the projection operation surface, and otherwise determining that the first operating body is not in contact with the surface of the projection operation surface.
6. The method according to claim 4, wherein the step of judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than the first preset distance, comprises: judging whether the first coordinate and the second coordinate are the same coordinate in the first image information, or judging whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
7. The method according to claim 5, wherein, before the step of determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with it, the method further comprises:
determining a first area corresponding to the first image and a second area corresponding to the second image; and
judging whether the first area is equal to the second area to obtain a first judgment result;
wherein the step of determining that the first operating body is in contact with the surface of the projection operation surface when the first position and the second position coincide in the first image information, and otherwise determining that the first operating body is not in contact with the surface of the projection operation surface, comprises: when the first position and the second position coincide in the first image information and the first judgment result indicates that the first area is equal to the second area, determining that the first operating body is in contact with the surface of the projection operation surface, and otherwise determining that the first operating body is not in contact with the surface of the projection operation surface.
8. The method according to claim 5, wherein, before the step of determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with it, the method further comprises:
determining a first area corresponding to the first image and a second area corresponding to the second image; and
judging whether the first area is equal to the second area to obtain a first judgment result;
wherein the step of determining that the first operating body is in contact with the surface of the projection operation surface when the distance between the first position and the second position in the first image information is not greater than the first preset distance, and otherwise determining that the first operating body is not in contact with the surface of the projection operation surface, comprises: when the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgment result indicates that the first area is equal to the second area, determining that the first operating body is in contact with the surface of the projection operation surface, and otherwise determining that the first operating body is not in contact with the surface of the projection operation surface.
9. The method according to claim 1, wherein, after the step of determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with it, the method further comprises:
acquiring a first operation performed by the first operating body; and
responding to the first operation by invoking a first function corresponding to the first operation.
10. The method according to claim 1 or 9, wherein, after the step of determining, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with it, the method further comprises:
monitoring the first operating body, and determining that, at a first moment, the first operating body is not in contact with the surface of the projection operation surface, to obtain a monitoring result; and
determining, according to the monitoring result, a first operation performed by the first operating body, so that a corresponding response mode can be determined according to the first operation.
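For illustration only (this sketch is not part of the claimed method), the contact decision of claims 5 and 8 — the first position and the second position must coincide or lie within the first preset distance, and the first area must equal the second area — can be expressed as follows. The pixel threshold and the area tolerance are hypothetical values chosen for the example; the claims leave both open:

```python
import math

# Hypothetical parameters for this sketch only: the claims do not fix
# the "first preset distance" or what counts as "equal" areas.
FIRST_PRESET_DISTANCE_PX = 3.0
AREA_TOLERANCE = 0.05

def is_in_contact(first_position, second_position, first_area, second_area):
    """Decide whether the first operating body contacts the projection
    operation surface: the first identification point (first_position)
    and the apex of the shadow (second_position) must be no farther
    apart than the first preset distance, and the areas of the first
    image (operating body) and second image (shadow) must be equal
    within a tolerance. When the body touches the surface, its shadow
    collapses onto it, so tip and shadow apex coincide and the two
    areas match."""
    dx = first_position[0] - second_position[0]
    dy = first_position[1] - second_position[1]
    close_enough = math.hypot(dx, dy) <= FIRST_PRESET_DISTANCE_PX
    larger = max(first_area, second_area)
    areas_equal = larger == 0 or abs(first_area - second_area) <= AREA_TOLERANCE * larger
    return close_enough and areas_equal
```

With these example values, a fingertip at (120, 80) whose shadow apex is at (121, 81), with body and shadow areas of 500 and 510 pixels, would be judged in contact; the same fingertip with a shadow apex 50 pixels away, or with a shadow only 300 pixels in area, would not.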
11. An electronic device, wherein the electronic device comprises a projection module and an image acquisition unit, the electronic device further comprising:
an operation module, configured to, while content to be projected is being projected onto a projection operation surface by the projection module, acquire first image information by the image acquisition unit, the first image information including projection-operation-surface image information corresponding to the projection operation surface;
a first judgment module, configured to judge, according to the first image information, whether the first image information includes a first image corresponding to a first operating body;
a first determination module, configured to, when the first image corresponding to the first operating body exists, determine, according to the first image and a second image of a shadow of the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow of the first operating body in the first image information; and
a second determination module, configured to determine, according to the first position and the second position, whether the first operating body is in contact with the surface of the projection operation surface or is not in contact with the surface of the projection operation surface.
12. The electronic device according to claim 11, wherein the first determination module is specifically configured to: determine, in the first image, a first identification point corresponding to the first operating body, and take the position of the first identification point as the first position; and determine, in the first image, the apex of the shadow of the first operating body, and take the position of the apex as the second position.
13. The electronic device according to claim 12, wherein the first determination module is specifically configured to: determine, according to the first image and the second image, a first coordinate of the first identification point and a second coordinate of the apex.
14. The electronic device according to claim 13, wherein the first judgment module is specifically configured to: judge whether the first position and the second position coincide in the first image information, or judge whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
15. The electronic device according to claim 14, wherein the second determination module is specifically configured to:
when the first position and the second position coincide in the first image information, determine that the first operating body is in contact with the surface of the projection operation surface, and otherwise determine that the first operating body is not in contact with the surface of the projection operation surface; or
when the distance between the first position and the second position in the first image information is not greater than the first preset distance, determine that the first operating body is in contact with the surface of the projection operation surface, and otherwise determine that the first operating body is not in contact with the surface of the projection operation surface.
16. The electronic device according to claim 14, wherein the first judgment module is specifically configured to: judge whether the first coordinate and the second coordinate are the same coordinate in the first image information, or judge whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
17. The electronic device according to claim 15, wherein the electronic device further comprises a third determination module and a second judgment module;
the third determination module is configured to: determine a first area corresponding to the first image and a second area corresponding to the second image;
the second judgment module is configured to: judge whether the first area is equal to the second area to obtain a first judgment result; and
the second determination module is specifically configured to: when the first position and the second position coincide in the first image information and the first judgment result indicates that the first area is equal to the second area, determine that the first operating body is in contact with the surface of the projection operation surface, and otherwise determine that the first operating body is not in contact with the surface of the projection operation surface.
18. The electronic device according to claim 15, wherein the electronic device further comprises a third determination module and a second judgment module;
the third determination module is configured to: determine a first area corresponding to the first image and a second area corresponding to the second image;
the second judgment module is configured to: judge whether the first area is equal to the second area to obtain a first judgment result; and
the second determination module is specifically configured to: when the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgment result indicates that the first area is equal to the second area, determine that the first operating body is in contact with the surface of the projection operation surface, and otherwise determine that the first operating body is not in contact with the surface of the projection operation surface.
19. The electronic device according to claim 11, wherein the electronic device further comprises an acquisition module and an invoking module;
the acquisition module is configured to acquire a first operation performed by the first operating body; and
the invoking module is configured to respond to the first operation by invoking a first function corresponding to the first operation.
20. The electronic device according to claim 11 or 19, wherein the electronic device further comprises a monitoring module and a fourth determination module;
the monitoring module is configured to monitor the first operating body and determine that, at a first moment, the first operating body is not in contact with the surface of the projection operation surface, to obtain a monitoring result; and
the fourth determination module is configured to determine, according to the monitoring result, a first operation performed by the first operating body, so that a corresponding response mode can be determined according to the first operation.
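Again purely as an illustration (not part of the claims), the monitoring step of claims 10 and 20 — watching the first operating body and finding the first moment at which it is no longer in contact with the projection operation surface — can be sketched over a time-ordered series of contact decisions; representing the monitoring result as (timestamp, in_contact) samples is an assumption of this example:

```python
def first_release_moment(monitoring_results):
    """monitoring_results: time-ordered (timestamp, in_contact) pairs
    produced by repeatedly evaluating whether the operating body
    contacts the surface. Returns the first timestamp at which the
    operating body, having previously been in contact, is observed
    out of contact (the 'first moment' of claims 10 and 20), or None
    if no release occurs. From this moment a first operation such as
    a tap can be derived and a corresponding response mode selected."""
    was_in_contact = False
    for timestamp, in_contact in monitoring_results:
        if in_contact:
            was_in_contact = True
        elif was_in_contact:
            return timestamp
    return None
```

For example, the sample stream [(0, False), (1, True), (2, True), (3, False)] yields a release at timestamp 3, while a body that never touches or never leaves the surface yields None.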
CN201310076677.9A 2013-03-11 2013-03-11 Information processing method and electronic equipment Active CN104049807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310076677.9A CN104049807B (en) Information processing method and electronic equipment


Publications (2)

Publication Number Publication Date
CN104049807A true CN104049807A (en) 2014-09-17
CN104049807B CN104049807B (en) 2017-11-28

Family

ID=51502763

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310076677.9A Active CN104049807B (en) Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN104049807B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107332990A (en) * 2017-06-28 2017-11-07 上海青橙实业有限公司 Control method and projection-capable mobile terminal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101887325A (en) * 2009-05-13 2010-11-17 董兰荣 Optical touch control method and device thereof
CN102841733B (en) * 2011-06-24 2015-02-18 株式会社理光 Virtual touch screen system and method for automatically switching interaction modes
CN102566835A (en) * 2012-01-06 2012-07-11 福州锐达数码科技有限公司 Electronic whiteboard system for image touch and implementation method thereof


Also Published As

Publication number Publication date
CN104049807B (en) 2017-11-28

Similar Documents

Publication Publication Date Title
CN108196759B (en) Icon control method and terminal
CN104123113B Multi-screen display method and device for a multi-system mobile terminal
CN103226434B Method and device for displaying menu information
US9781252B2 (en) Information processing method, system and mobile terminal
CN107077295A Method, device, electronic equipment, display interface, and storage medium for quick split screen
EP2713254A1 (en) Touch event reporting method, device and mobile terminal
US9189152B2 (en) Touch device and method for dynamically setting touch inactive area, and non-transitory recording medium
CN107704157B (en) Multi-screen interface operation method and device and storage medium
CN103207750A (en) Method and device for scaling icon
CN102855056A (en) Terminal and terminal control method
CN103092523A (en) Unlocking method and terminal
CN112465466B (en) Method, device, computer equipment and storage medium for executing flow task
CN105573637A (en) Operation method and device for touch screen equipment
CN104063071A (en) Content input method and device
CN105653177B Selection method and terminal device for clicking an element on a terminal device interface
CN103092518A Accurate touch method for mobile cloud desktop based on remote desktop protocol (RDP)
CN104898880A (en) Control method and electronic equipment
CN103761041A (en) Information processing method and electronic device
CN104035714A (en) Event processing method, device and equipment based on Android system
CN108334267A Cursor-moving method, system, and terminal device
CN103809894B Gesture recognition method and electronic device
CN104898818A (en) Information processing method and electronic equipment
CN104407698A (en) Projecting method and electronic equipment
CN104049807A (en) Information processing method and electronic equipment
CN104407763A (en) Content input method and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant