CN104049807B - Information processing method and electronic equipment - Google Patents

Information processing method and electronic equipment

Info

Publication number
CN104049807B
CN104049807B (application CN201310076677.9A)
Authority
CN
China
Prior art keywords
operating body
image
projection operation
operation face
image information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310076677.9A
Other languages
Chinese (zh)
Other versions
CN104049807A
Inventor
高歌
张柳新
张超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201310076677.9A
Publication of CN104049807A
Application granted
Publication of CN104049807B
Legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses an information processing method for reducing the misoperation rate. The method may include: when a projection module projects content to be projected onto a projection operation surface, collecting first image information with an image acquisition unit; judging, according to the first image information, whether the first image information contains a first image corresponding to a first operating body; when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the first operating body's shadow on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow in the first image information; and determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact it. The invention also discloses an electronic equipment for implementing the method.

Description

Information processing method and electronic equipment
Technical field
The present invention relates to the field of computer and projection technologies, and in particular to an information processing method and an electronic equipment.
Background
With the continuous development of science and technology, electronic technology has also developed rapidly and the variety of electronic products keeps growing. Through all kinds of electronic equipment, people now enjoy the many conveniences that this technological development has brought to daily life.
With the popularization of projection technology, a remote controller is usually needed to control the projection process. The user therefore has to carry an extra piece of hardware and learn the function of each of its keys before being able to operate it, which is clearly inconvenient.
In the course of implementing the technical solution of the embodiments of the present application, the inventors found at least the following technical problem in the prior art:
To make projection equipment more interactive, it is desirable that the user can operate directly on the projected interface without a remote controller. In this mode of operation, however, it is necessary to determine whether the user's hand has actually touched the projection surface. Sometimes the hand does not touch the surface at all and only its shadow falls on it, for example when the user performs a click gesture while the finger is still suspended in the air. If the projection equipment responds to such an operation, the response may be inaccurate and a misoperation results.
Summary of the invention
Embodiments of the present invention provide an information processing method and an electronic equipment, so as to solve the technical problem of a relatively high misoperation rate in the prior art and achieve the technical effect of reducing the misoperation rate.
An information processing method is applied to an electronic equipment that includes a projection module and an image acquisition unit. The method includes:
when the projection module projects content to be projected onto a projection operation surface, collecting first image information with the image acquisition unit, the first image information including projection operation surface image information corresponding to the projection operation surface;
judging, according to the first image information, whether the first image information contains a first image corresponding to a first operating body;
when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the first operating body's shadow on the projection operation surface, a first position of the first operating body in the first image information and a second position of the first operating body's shadow in the first image information;
determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface.
Preferably, the step of determining the first position of the first operating body in the first image information and the second position of the first operating body's shadow in the first image information includes: determining, in the first image, a first identification point corresponding to the first operating body and taking the position of the first identification point as the first position; and determining, in the first image information, the vertex of the first operating body's shadow and taking the position of the vertex as the second position.
Preferably, the step of determining, according to the first image of the first operating body in the image information and the second image of the first operating body's shadow on the projection operation surface, the first position of the first operating body within the preset space range and the second position of the first operating body's shadow on the projection operation surface includes: determining, according to the first image and the second image, a first coordinate of the first identification point and a second coordinate of the vertex.
Preferably, the step of determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface includes: judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
Preferably, the step of determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface includes:
when the first position and the second position coincide in the first image information, determining that the first operating body contacts the surface of the projection operation surface, and otherwise determining that the first operating body does not contact the surface of the projection operation surface; or
when the distance between the first position and the second position in the first image information is not greater than the first preset distance, determining that the first operating body contacts the surface of the projection operation surface, and otherwise determining that the first operating body does not contact the surface of the projection operation surface.
Preferably, the step of judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than the first preset distance, includes: judging whether the first coordinate and the second coordinate are the same coordinate in the first image information, or judging whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
Preferably, before determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface, the method further includes:
determining a first area corresponding to the first image and a second area corresponding to the second image;
judging whether the first area is equal to the second area, to obtain a first judgment result;
and the step of determining that the first operating body contacts the surface of the projection operation surface when the first position and the second position coincide in the first image information, and otherwise determining that the first operating body does not contact the surface of the projection operation surface, includes: when the first position and the second position coincide in the first image information and the first judgment result shows that the first area is equal to the second area, determining that the first operating body contacts the surface of the projection operation surface, and otherwise determining that the first operating body does not contact the surface of the projection operation surface.
Preferably, before determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface, the method further includes:
determining a first area corresponding to the first image and a second area corresponding to the second image;
judging whether the first area is equal to the second area, to obtain a first judgment result;
and the step of determining that the first operating body contacts the surface of the projection operation surface when the distance between the first position and the second position in the first image information is not greater than the first preset distance, and otherwise determining that the first operating body does not contact the surface of the projection operation surface, includes: when the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgment result shows that the first area is equal to the second area, determining that the first operating body contacts the surface of the projection operation surface, and otherwise determining that the first operating body does not contact the surface of the projection operation surface.
Preferably, after determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface, the method further includes:
obtaining a first operation performed by the first operating body;
responding to the first operation and calling a first function corresponding to the first operation.
Preferably, after determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface, the method further includes:
monitoring the first operating body, determining that the first operating body no longer contacts the surface of the projection operation surface at a first moment, and obtaining a monitoring result;
determining, according to the monitoring result, the first operation performed by the first operating body, so as to determine a response mode corresponding to the first operation.
An electronic equipment includes a projection module and an image acquisition unit, and further includes:
an operation module, configured to collect first image information with the image acquisition unit when the projection module projects content to be projected onto a projection operation surface, the first image information including projection operation surface image information corresponding to the projection operation surface;
a first judging module, configured to judge, according to the first image information, whether the first image information contains a first image corresponding to a first operating body;
a first determining module, configured to determine, when the first image corresponding to the first operating body exists, a first position of the first operating body in the first image information and a second position of the first operating body's shadow in the first image information, according to the first image and a second image of the first operating body's shadow on the projection operation surface;
a second determining module, configured to determine, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface.
Preferably, the first determining module is specifically configured to: determine, in the first image, a first identification point corresponding to the first operating body and take the position of the first identification point as the first position; and determine, in the first image information, the vertex of the first operating body's shadow and take the position of the vertex as the second position.
Preferably, the first determining module is specifically configured to: determine, according to the first image and the second image, a first coordinate of the first identification point and a second coordinate of the vertex.
Preferably, the first judging module is specifically configured to: judge whether the first position and the second position coincide in the first image information, or judge whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
Preferably, the second determining module is specifically configured to:
when the first position and the second position coincide in the first image information, determine that the first operating body contacts the surface of the projection operation surface, and otherwise determine that the first operating body does not contact the surface of the projection operation surface; or
when the distance between the first position and the second position in the first image information is not greater than the first preset distance, determine that the first operating body contacts the surface of the projection operation surface, and otherwise determine that the first operating body does not contact the surface of the projection operation surface.
Preferably, the first judging module is specifically configured to: judge whether the first coordinate and the second coordinate are the same coordinate in the first image information, or judge whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
Preferably, the electronic equipment further includes a third determining module and a second judging module;
the third determining module is configured to determine a first area corresponding to the first image and a second area corresponding to the second image;
the second judging module is configured to judge whether the first area is equal to the second area, to obtain a first judgment result;
the second determining module is specifically configured to: when the first position and the second position coincide in the first image information and the first judgment result shows that the first area is equal to the second area, determine that the first operating body contacts the surface of the projection operation surface, and otherwise determine that the first operating body does not contact the surface of the projection operation surface.
Preferably, the electronic equipment further includes a third determining module and a second judging module;
the third determining module is configured to determine a first area corresponding to the first image and a second area corresponding to the second image;
the second judging module is configured to judge whether the first area is equal to the second area, to obtain a first judgment result;
the second determining module is specifically configured to: when the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgment result shows that the first area is equal to the second area, determine that the first operating body contacts the surface of the projection operation surface, and otherwise determine that the first operating body does not contact the surface of the projection operation surface.
Preferably, the electronic equipment further includes an obtaining module and a calling module;
the obtaining module is configured to obtain a first operation performed by the first operating body;
the calling module is configured to respond to the first operation and call a first function corresponding to the first operation.
Preferably, the electronic equipment further includes a monitoring module and a fourth determining module;
the monitoring module is configured to monitor the first operating body, determine that the first operating body no longer contacts the surface of the projection operation surface at a first moment, and obtain a monitoring result;
the fourth determining module is configured to determine, according to the monitoring result, the first operation performed by the first operating body, so as to determine a response mode corresponding to the first operation.
The information processing method in the embodiments of the present invention can be applied to an electronic equipment that includes a projection module and an image acquisition unit. The method may include: when the projection module projects content to be projected onto a projection operation surface, collecting first image information with the image acquisition unit, the first image information including projection operation surface image information corresponding to the projection operation surface; judging, according to the first image information, whether the first image information contains a first image corresponding to a first operating body; when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the first operating body's shadow on the projection operation surface, a first position of the first operating body in the first image information and a second position of the first operating body's shadow in the first image information; and determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface.
In the embodiments of the present invention, the first image information can be collected by the image acquisition unit, and the first position of the first operating body and the second position of the first operating body's shadow can be determined from the first image information. From the first position and the second position it can then be determined whether the first operating body has actually touched the projection operation surface, and therefore whether the corresponding operation of the first operating body should be responded to. This avoids the situation where the electronic equipment, unable to tell whether the first operating body is in contact with the projection operation surface, responds to an operation although the first operating body is actually suspended in the air, as well as the situation where the first operating body is actually in contact with the projection operation surface but its operation cannot be responded to. The misoperation rate is thereby effectively reduced, operating efficiency is improved, and the user experience is improved.
The image acquisition unit may be a camera, and the present invention needs only a single camera to complete the task, which saves hardware resources.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the information processing method in an embodiment of the present invention;
Fig. 2A is a schematic diagram of a possible application scenario in an embodiment of the present invention;
Fig. 2B is a schematic diagram of another possible application scenario in an embodiment of the present invention;
Fig. 2C is a schematic structural diagram of the electronic equipment in an embodiment of the present invention.
Detailed description of the embodiments
The information processing method in the embodiments of the present invention can be applied to an electronic equipment that includes a projection module and an image acquisition unit. The method may include: when the projection module projects content to be projected onto a projection operation surface, collecting first image information with the image acquisition unit, the first image information including projection operation surface image information corresponding to the projection operation surface; judging, according to the first image information, whether the first image information contains a first image corresponding to a first operating body; when the first image corresponding to the first operating body exists, determining, according to the first image and a second image of the first operating body's shadow on the projection operation surface, a first position of the first operating body in the first image information and a second position of the first operating body's shadow in the first image information; and determining, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface.
In the embodiments of the present invention, the first image information can be collected by the image acquisition unit, and the first position of the first operating body and the second position of the first operating body's shadow can be determined from the first image information. From the first position and the second position it can then be determined whether the first operating body has actually touched the projection operation surface, and therefore whether the corresponding operation of the first operating body should be responded to. This avoids the situation where the electronic equipment, unable to tell whether the first operating body is in contact with the projection operation surface, responds to an operation although the first operating body is actually suspended in the air, as well as the situation where the first operating body is actually in contact with the projection operation surface but its operation cannot be responded to. The misoperation rate is thereby effectively reduced, operating efficiency is improved, and the user experience is improved.
The image acquisition unit may be a camera, and the present invention needs only a single camera to complete the task, which saves hardware resources.
Referring to Fig. 1, the information processing method in the embodiment of the present invention can be applied to an electronic equipment that has a projection module and an image acquisition unit, the projection module being used to project the content to be projected. The main flow of the method is as follows:
Step 101: when the projection module projects content to be projected onto a projection operation surface, collect first image information with the image acquisition unit, the first image information including projection operation surface image information corresponding to the projection operation surface.
In the embodiment of the present invention, when the projection module projects the content to be projected onto the projection operation surface, the first image information can be collected by the image acquisition unit, where the first image information may include the projection operation surface image information corresponding to the projection operation surface.
In the embodiment of the present invention, the image acquisition unit may be a camera. The camera may be located in the electronic equipment as a part of it, or the camera and the electronic equipment may be two independent devices, in which case the camera can communicate with the electronic equipment.
In the embodiment of the present invention, when the projection module is projecting, the region covered by the projection may be called the projection area, and the projection area includes both the projection space region and the projection operation surface. The image information corresponding to the projection area may be called the first image information.
The camera should be arranged at the side of the projection area, so that the first image information can be better collected.
For example, if the camera is arranged at the side of the projection area, it can collect the image information of the first operating body even when the first operating body is suspended in the air, and it can also collect the image information on the projection operation surface.
Step 102: judge, according to the first image information, whether the first image information contains a first image corresponding to a first operating body.
In the embodiment of the present invention, after the first image information is obtained, it can be analyzed to judge whether it contains the first image corresponding to the first operating body. If it is determined that the first image exists, it can be determined that the first operating body is present within the preset space range corresponding to the projection operation surface.
That is, in the embodiment of the present invention, after the first image information is obtained, it can be judged from the first image information whether the first operating body is present within the preset space range corresponding to the projection operation surface.
In the embodiment of the present invention, the preset space range may refer to the projection space region.
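By way of illustration only, a minimal Python sketch of one way this judgment could be implemented is given below. It assumes an OpenCV 4.x pipeline and a reference frame of the bare projection operation surface, neither of which is specified by the patent; the threshold and minimum-area values are invented for the example.
```python
import cv2

def find_operating_body(frame_bgr, reference_bgr, min_area=1500):
    """Return the largest foreground contour, or None if no operating body is present."""
    diff = cv2.absdiff(frame_bgr, reference_bgr)               # change against the bare surface
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY)  # illustrative threshold
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)    # OpenCV 4.x return signature
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return largest if cv2.contourArea(largest) >= min_area else None
```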
Step 103: when the first image corresponding to the first operating body exists, determine, according to the first image and a second image of the first operating body's shadow on the projection operation surface, a first position of the first operating body in the first image information and a second position of the first operating body's shadow in the first image information.
In the embodiment of the present invention, if it is determined that the first image exists in the first image information, it can be determined that the first operating body is present within the preset space range corresponding to the projection operation surface. In the embodiment of the present invention, the preset space range may refer to the projection space region.
Meanwhile, since the first operating body casts a shadow, the first image corresponding to the first operating body and the second image corresponding to the first operating body's shadow can accordingly both be determined in the first image information.
Preferably, in the embodiment of the present invention, the first operating body may be a finger of the user.
After the first image and the second image are determined, the first position corresponding to the first operating body in the first image information and the second position corresponding to the first operating body's shadow in the first image information can be determined respectively.
Preferably, determining the first position corresponding to the first operating body in the first image information and the second position corresponding to the first operating body's shadow in the first image information may specifically be done as follows. A first identification point of the first image, that is, a first identification point of the first operating body, is determined: any point in the first image, i.e. any point on the first operating body, may be chosen as the first identification point, or the end point of the first operating body, i.e. the end point of the first image, may be chosen as the first identification point. For example, if the first operating body is a finger of the user, the point in the first image corresponding to the tip of the finger may be chosen as the first identification point. After the first identification point is determined, its position in the first image information can be taken as the first position. The vertex of the first operating body's shadow is then determined: for example, if the first operating body is a finger of the user, the first operating body's shadow is also a finger, and the point at the tip of that shadow finger can be taken as the vertex; the position of the vertex in the first image information can be taken as the second position.
Preferably, in the embodiment of the present invention, after the first identification point and the vertex are determined, a first coordinate of the first identification point and a second coordinate of the vertex can be determined respectively.
In the embodiment of the present invention, the first coordinate and the second coordinate can be determined in the same coordinate system.
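Continuing the illustration, the following sketch picks an identification point on the operating body and the vertex of its shadow in the same image coordinate system. It assumes the two contours have already been segmented (for example by the routine above), and choosing the topmost contour point as the tip is a simplification made for the example, not something the patent prescribes.
```python
import numpy as np

def tip_point(contour):
    """Topmost point of an OpenCV-style contour as an (x, y) pair."""
    pts = contour.reshape(-1, 2)
    x, y = pts[np.argmin(pts[:, 1])]
    return int(x), int(y)

def first_and_second_position(body_contour, shadow_contour):
    first_position = tip_point(body_contour)     # identification point on the operating body
    second_position = tip_point(shadow_contour)  # vertex of the operating body's shadow
    return first_position, second_position
```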
In the embodiment of the present invention, when the projection module projects, the projected content can land on a wall, a projection screen, a desktop and so on; that is, the projection surface can be a wall, a projection screen, a desktop, etc., and the projection operation surface can accordingly be located on the wall, on the projection screen, on the desktop, and so on. In the embodiment of the present invention, the projection operation surface can be located on the projection surface.
Step 104: determine, according to the first position and the second position, whether the first operating body contacts the surface of the projection operation surface or does not contact the surface of the projection operation surface.
In the embodiment of the present invention, after the first position and the second position are determined, whether the first operating body has touched the surface of the projection operation surface can be determined according to the first position and the second position.
Preferably, determining whether the first operating body touches the surface of the projection operation surface according to the first position and the second position may specifically be judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
If it is determined that the first position and the second position coincide in the first image information, it can be determined that the first operating body has touched the surface of the projection operation surface; otherwise it can be determined that the first operating body has not touched the surface of the projection operation surface.
If it is determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance, it can be determined that the first operating body has touched the surface of the projection operation surface; otherwise it can be determined that the first operating body has not touched the surface of the projection operation surface.
Preferably, judging whether the first position and the second position coincide in the first image information may specifically be judging whether the first coordinate and the second coordinate are the same coordinate in the first image information. If it is determined that the first coordinate and the second coordinate are the same coordinate in the first image information, it can be determined that the first position and the second position coincide in the first image information; otherwise it can be determined that they do not coincide.
Preferably, judging whether the distance between the first position and the second position in the first image information is not greater than the first preset distance may specifically be judging whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance. If it is determined that the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance, it can be determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance; otherwise it can be determined that the distance between the first position and the second position in the first image information is greater than the first preset distance.
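In code, the coincidence or distance test reduces to a single comparison in the shared coordinate system. The sketch below assumes pixel coordinates and treats the first preset distance as a tunable constant whose value is purely illustrative.
```python
import math

FIRST_PRESET_DISTANCE = 8.0  # pixels; illustrative value only

def touches_surface(first_position, second_position, threshold=FIRST_PRESET_DISTANCE):
    """Contact test: the two positions coincide or lie within the first preset distance."""
    dx = first_position[0] - second_position[0]
    dy = first_position[1] - second_position[1]
    return math.hypot(dx, dy) <= threshold
```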
Preferably, in the embodiment of the present invention, before determining according to the first position and the second position that the first operating body contacts the surface of the projection operation surface or does not contact it, a first area of the first image and a second area of the second image can also be determined. After the first area and the second area are determined, it can be judged whether the first area is equal to the second area, to obtain a first judgment result, which may be used to indicate whether the first area is equal to the second area. Whether the first operating body contacts the surface of the projection operation surface or not can then be determined according to the first judgment result and a second judgment result, where the second judgment result may indicate whether the first position and the second position coincide, or whether the distance between the first position and the second position is not greater than the first preset distance.
Preferably, in the embodiment of the present invention, if the first position and the second position coincide in the first image information and the first judgment result shows that the first area is equal to the second area, it can be determined that the first operating body contacts the surface of the projection operation surface; otherwise it can be determined that the first operating body does not contact the surface of the projection operation surface.
For example, if the first position and the second position coincide in the first image information but the first judgment result shows that the first area is not equal to the second area, it can be determined that the first operating body does not contact the surface of the projection operation surface; if the first position and the second position do not coincide in the first image information while the first judgment result shows that the first area is equal to the second area, it can be determined that the first operating body does not contact the surface of the projection operation surface; and if the first position and the second position do not coincide in the first image information and the first judgment result shows that the first area is not equal to the second area, it can be determined that the first operating body does not contact the surface of the projection operation surface.
Preferably, in the embodiment of the present invention, if the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgment result shows that the first area is equal to the second area, it can be determined that the first operating body contacts the surface of the projection operation surface; otherwise it can be determined that the first operating body does not contact the surface of the projection operation surface.
For example, if the distance between the first position and the second position in the first image information is not greater than the first preset distance but the first judgment result shows that the first area is not equal to the second area, it can be determined that the first operating body does not contact the surface of the projection operation surface; if the distance between the first position and the second position in the first image information is greater than the first preset distance while the first judgment result shows that the first area is equal to the second area, it can be determined that the first operating body does not contact the surface of the projection operation surface; and if the distance between the first position and the second position in the first image information is greater than the first preset distance and the first judgment result shows that the first area is not equal to the second area, it can be determined that the first operating body does not contact the surface of the projection operation surface.
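A sketch of the combined test follows; the first preset distance and the area tolerance are both invented for illustration, since the patent only requires the two areas to be equal.
```python
import math

def touches_surface_with_area(first_position, second_position,
                              first_area, second_area,
                              first_preset_distance=8.0,
                              area_tolerance=0.15):
    """Contact only when the positions (nearly) coincide AND the two areas match."""
    dx = first_position[0] - second_position[0]
    dy = first_position[1] - second_position[1]
    positions_close = math.hypot(dx, dy) <= first_preset_distance
    # The patent only speaks of the areas being "equal"; the relative tolerance
    # is an added assumption so the comparison tolerates pixel noise.
    areas_equal = abs(first_area - second_area) <= area_tolerance * max(first_area, second_area, 1.0)
    return positions_close and areas_equal
```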
Preferably, in the embodiment of the present invention, after it has been determined according to the first position and the second position that the first operating body contacts the surface of the projection operation surface, or does not contact it, a first operation performed by the first operating body can also be obtained. In the embodiment of the present invention, since it has already been determined that the first operating body touches the surface of the projection operation surface, the electronic equipment can obtain the first operation if the first operating body performs the first operation on the projection operation surface.
After the first operation is obtained, the electronic equipment can respond to the first operation, for example by calling a first function corresponding to the first operation.
For example, the projection operation surface is a computer desktop, the first operating body is a finger of the user, and the first operation is a double-click performed by the user with the finger on a first icon on the projection operation surface. The electronic equipment can then respond to the first operation and call the first function corresponding to the first operation, for example the function of opening a first application interface corresponding to the first icon.
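A minimal sketch of such a response path is given below; the operation name and the handler are hypothetical placeholders for whatever functions the electronic equipment actually registers.
```python
def open_first_application_interface():
    # Hypothetical handler: stands in for opening the interface bound to the first icon.
    print("opening the application interface corresponding to the first icon")

OPERATION_HANDLERS = {
    "double_click_first_icon": open_first_application_interface,  # invented operation name
}

def respond_to_operation(operation_name):
    handler = OPERATION_HANDLERS.get(operation_name)
    if handler is not None:
        handler()  # call the first function corresponding to the first operation
```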
For example, Fig. 2A shows a possible implementation of the present invention. In Fig. 2A, the first operating body is the user's hand. The image acquisition unit is located at the side of the first operating body, so it can collect the first image corresponding to the first operating body and the second image corresponding to the first operating body's shadow. In Fig. 2A, the hand without hatching is the first operating body and the hatched hand is the first operating body's shadow. In Fig. 2A, the first position can be determined from the first operating body and the second position can be determined from the first operating body's shadow; it can be seen that the first position and the second position do not coincide, and that their distance in the first image information is obviously greater than the first preset distance. The first operating body in Fig. 2A is therefore suspended in the air and has not touched the surface of the projection operation surface, so if the first operating body performs an operation, the electronic equipment may decline to respond to it. It can also be seen in Fig. 2A that the camera is located at the side: from the camera's point of view the first operating body and the first operating body's shadow lie in one plane, which ensures that the camera can collect the first image corresponding to the first operating body and the second image corresponding to the first operating body's shadow more completely.
For example, Fig. 2B shows another possible implementation of the present invention. In Fig. 2B, the first operating body is the user's hand. The image acquisition unit is located at a certain angle to the side of the first operating body, so it can collect the first image corresponding to the first operating body and the second image corresponding to the first operating body's shadow. In Fig. 2B, the hand without hatching is the first operating body and the hatched hand is the first operating body's shadow. In Fig. 2B, the first position can be determined from the first operating body and the second position can be determined from the first operating body's shadow; it can be seen that the first position and the second position coincide, so the first operating body in Fig. 2B has touched the surface of the projection operation surface, and if the first operating body performs an operation, the electronic equipment can respond to it. It can also be seen in Fig. 2B that the camera is located at the side: from the camera's point of view the first operating body and the first operating body's shadow lie in one plane, which ensures that the camera can collect the first image corresponding to the first operating body and the second image corresponding to the first operating body's shadow more completely.
In the embodiment of the present invention, the electronic equipment can monitor the first operating body in real time or periodically. For example, the electronic equipment can monitor the first operating body through the image acquisition unit.
After determining, according to the first position and the second position, that the first operating body contacts the surface of the projection operation surface or does not contact it, the electronic equipment can continue to monitor the first operating body and determine when the first operating body leaves the projection operation surface. When the electronic equipment detects that the first operating body has left the projection operation surface, it can determine that the first operation performed by the first operating body has ended, and it can then determine which specific operation the first operation is, so as to determine the response mode corresponding to the first operation.
For example, when the electronic equipment detects that the first operating body has left the projection operation surface at a first moment, the electronic equipment can determine that the first operation performed by the first operating body has ended. The electronic equipment can judge which specific operation the first operation is according to the contact duration and/or the contact trace between the first operating body and the projection operation surface, and can thereby determine the response mode corresponding to the first operation.
For example, if the electronic equipment detects at a second moment that the first operating body is in contact with the projection operation surface, and detects at the first moment, after the second moment, that the first operating body has left the projection operation surface, the electronic equipment can determine the time difference between the second moment and the first moment. If the time difference is not greater than a preset duration threshold, the electronic equipment can determine that the first operating body performed a click operation. If the time difference is greater than the preset duration threshold, the electronic equipment can determine that the first operating body performed a long-press operation, i.e. an operation of pressing and holding for a long time.
For example, if the electronic equipment detects at the second moment that the first operating body is in contact with the projection operation surface, detects at the first moment, after the second moment, that the first operating body has left the projection operation surface, detects at a third moment, after the first moment, that the first operating body is again in contact with the projection operation surface, and detects at a fourth moment, after the third moment, that the first operating body has again left the projection operation surface, then the electronic equipment can determine a first time difference between the second moment and the first moment, a second time difference between the first moment and the third moment, and a third time difference between the third moment and the fourth moment. If none of the three time differences is greater than the preset duration threshold, the electronic equipment can determine that the first operating body performed a single operation, for example a double-click operation. If any of the three time differences is greater than the preset duration threshold, the electronic equipment can determine that the first operating body performed two separate operations, and it can determine, according to the first time difference and the third time difference respectively, which kind of operation each of the two operations is.
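The timing logic above can be sketched as follows; the preset duration threshold is an assumed value, and the moment variables mirror the second, first, third and fourth moments described in the text.
```python
PRESET_DURATION = 0.4  # seconds; assumed threshold, not taken from the patent

def classify_single_touch(second_moment, first_moment, preset=PRESET_DURATION):
    """One touch: a click if it was short, a long press otherwise."""
    return "click" if (first_moment - second_moment) <= preset else "long_press"

def classify_two_touches(second_moment, first_moment,
                         third_moment, fourth_moment, preset=PRESET_DURATION):
    """Two touches: a double click only if all three time differences are short."""
    diffs = (first_moment - second_moment,   # duration of the first contact
             third_moment - first_moment,    # gap between the two contacts
             fourth_moment - third_moment)   # duration of the second contact
    if all(d <= preset for d in diffs):
        return "double_click"                # treated as one operation
    return "two_separate_operations"
```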
Preferably, besides judging which specific operation the first operation is according to the contact duration between the first operating body and the projection operation surface, the electronic equipment can also judge which specific operation the first operation is according to the contact trace between the first operating body and the projection operation surface, or according to both the contact trace and the contact duration.
For example, if the electronic equipment detects at the second moment that the first operating body is in contact with the projection operation surface, and detects at the first moment, after the second moment, that the first operating body has left the projection operation surface, the electronic equipment can determine the trace that the first operating body drew on the projection operation surface between the second moment and the first moment. For example, the electronic equipment can determine third position information corresponding to the first operating body on the projection operation surface at the second moment, and fourth position information corresponding to the first operating body on the projection operation surface at the first moment. If the third position corresponding to the third position information and the fourth position corresponding to the fourth position information are the same position, or the distance between the third position and the fourth position is not greater than a second preset distance, the electronic equipment can determine that the first operating body did not move on the projection operation surface, in which case the first operation is likely to be a single-click operation, a long-press operation, or the like; if the third position and the fourth position are not the same position, or the distance between them is greater than the second preset distance, the first operation is likely to be a slide operation.
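A sketch of this trace test is given below, assuming the third and fourth positions are pixel coordinates on the projection operation surface; the second preset distance is an assumed value.
```python
import math

SECOND_PRESET_DISTANCE = 12.0  # pixels; assumed value

def classify_by_trace(third_position, fourth_position, threshold=SECOND_PRESET_DISTANCE):
    """Slide if the contact point moved more than the second preset distance."""
    moved = math.hypot(third_position[0] - fourth_position[0],
                       third_position[1] - fourth_position[1])
    return "stationary_press" if moved <= threshold else "slide"
```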
The information processing method in the present invention is introduced below through several specific embodiments, which mainly serve to describe several possible application scenarios of the method. It should be noted that the embodiments in the present invention are only used to explain the present invention and cannot be used to limit it. Any embodiment that conforms to the inventive concept falls within the protection scope of the present invention, and those skilled in the art naturally know how to make modifications according to the idea of the present invention.
Embodiment one:
The electronic equipment is a PC (personal computer) that has the projection module and an image acquisition unit, the image acquisition unit being a camera. The projection surface is a desktop, and the projection operation surface can be located on the projection surface. In this embodiment the projection operation surface is the computer desktop.
When the projection module projects the content to be projected onto the projection operation surface, the first image information can be collected by the image acquisition unit, where the first image information may include the projection operation surface image information corresponding to the projection operation surface.
In this embodiment, the camera is arranged at the side of the projection area, so that the first image information can be better collected.
After the first image information is obtained, it can be analyzed to judge whether it contains the first image corresponding to the first operating body. If it is determined that the first image exists, it can be determined that the first operating body is present within the preset space range corresponding to the projection operation surface.
In this embodiment, it is judged that the first image exists in the first image information, so it can accordingly be determined that the first operating body is present within the preset space range, and the second image corresponding to the first operating body's shadow can accordingly be determined in the first image information.
After the first image and the second image are determined, the first position corresponding to the first operating body in the first image information and the second position corresponding to the first operating body's shadow in the first image information can be determined respectively.
In this embodiment, determining the first position corresponding to the first operating body and the second position corresponding to the first operating body's shadow may be done as follows. A first identification point of the first image, that is, a first identification point of the first operating body, is determined: any point in the first image, i.e. any point on the first operating body, may be chosen as the first identification point, or the end point of the first operating body, i.e. the end point of the first image, may be chosen as the first identification point. For example, if the first operating body is a finger of the user, the point in the first image corresponding to the tip of the finger may be chosen as the first identification point. After the first identification point is determined, its position in the first image information can be taken as the first position. The vertex of the first operating body's shadow is then determined: for example, if the first operating body is a finger of the user, the first operating body's shadow is also a finger, and the point at the tip of that shadow finger can be taken as the vertex; the position of the vertex in the first image information can be taken as the second position.
In this embodiment, after the first identification point and the vertex are determined, the first coordinate of the first identification point and the second coordinate of the vertex can be determined respectively.
In this embodiment, the first coordinate and the second coordinate can be determined in the same coordinate system.
In this embodiment, after the first position and the second position are determined, whether the first operating body has touched the surface of the projection operation surface can be determined according to the first position and the second position.
Preferably, one way of determining whether the first operating body touches the surface of the projection operation surface according to the first position and the second position may be: judging whether the first position and the second position coincide in the first image information, to obtain a second judgment result, and then determining according to the second judgment result whether the first operating body has touched the surface of the projection operation surface.
If it is determined that the first position and the second position coincide in the first image information, it can be determined that the first operating body has touched the surface of the projection operation surface.
If it is determined that the first position and the second position do not coincide in the first image information, it can be determined that the first operating body has not touched the surface of the projection operation surface.
In this embodiment, one way of judging whether the first position and the second position coincide in the first image information may be: judging, according to the first coordinate and the second coordinate, whether the first position and the second position coincide, to obtain the second judgment result.
For example, it can be judged whether the first coordinate and the second coordinate are the same coordinate. If it is determined that they are the same coordinate, it can be determined that the first position and the second position coincide; otherwise it can be determined that they do not coincide.
In this embodiment, it is determined that the first position and the second position coincide, so it can be determined that the first operating body has touched the surface of the projection operation surface.
After it is determined according to the first position and the second position that the first operating body contacts the surface of the projection operation surface, the first operation performed by the first operating body can be obtained. In this embodiment the first operating body may be a finger of the user, and the first operation may be a double-click performed by the user with the finger on a first icon on the computer desktop; in this embodiment the first icon is the icon corresponding to a Word application.
After the first operation is obtained, the first operation can be responded to by calling the first function corresponding to the first operation; in this embodiment, for example, the operation interface corresponding to Word is opened.
Embodiment two:
The electronic equipment is a PC (personal computer) that has the projection module and an image acquisition unit, the image acquisition unit being a camera. The projection surface is a desktop, and the projection operation surface can be located on the projection surface. In the present embodiment the projection operation surface is the computer desktop.
When the content to be projected is projected onto the projection operation surface through the projection module, the first image information can be collected through the image acquisition unit, where the first image information can include the projection operation surface image information corresponding to the projection operation surface.
In the present embodiment, the camera is arranged to one side of the projection area, so that the first image information can be collected more reliably.
After the first image information is obtained, the first image information can be analyzed to judge whether the first image corresponding to the first operating body is included in the first image information. If it is determined that the first image is present, it can be determined that the first operating body is present within the preset space range corresponding to the projection operation surface.
In the present embodiment, when it is judged that the first image is present in the first image information, it can correspondingly be determined that the first operating body is present within the preset space range, and the second image corresponding to the shadow of the first operating body in the first image information can correspondingly be determined.
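The patent does not prescribe how the first image (operating body) and the second image (shadow) are separated from the captured frame. The sketch below is one minimal way to do it, assuming a reference frame of the undisturbed projection is available; the thresholds and the function name are assumptions made only for illustration.

```python
import numpy as np

def segment_body_and_shadow(frame: np.ndarray, reference: np.ndarray,
                            change_thresh: float = 30.0, dark_thresh: float = 60.0):
    """Split a captured HxWx3 frame into an operating-body mask (first image)
    and a shadow mask (second image) by comparing it with a reference frame
    of the bare projection. Thresholds are illustrative assumptions."""
    gray = frame.astype(float).mean(axis=2)
    ref = reference.astype(float).mean(axis=2)
    changed = np.abs(gray - ref) > change_thresh        # pixels that deviate from the projection
    shadow_mask = changed & (gray < ref - dark_thresh)  # markedly darker pixels -> shadow
    body_mask = changed & ~shadow_mask                  # remaining changed pixels -> operating body
    return body_mask, shadow_mask
```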
After the first image and the second image are determined, the first position corresponding to the first operating body in the first image information and the second position corresponding to the shadow of the first operating body in the first image information can be determined respectively.
In the present embodiment, determining the first position corresponding to the first operating body and the second position corresponding to the shadow of the first operating body can be done as follows. A first identification point of the first image, that is, a first identification point of the first operating body, is determined: any point of the first image (any point on the first operating body) can be chosen as the first identification point, or the end point of the first image (the end point of the first operating body) can be chosen. For example, if the first operating body is a finger of the user, the point in the first image corresponding to the tip of the finger can be chosen as the first identification point. After the first identification point is determined, its position in the first image information can be defined as the first position. The apex of the shadow of the first operating body is then determined; for example, if the first operating body is a finger of the user, the shadow of the first operating body is also finger-shaped, and the point at the tip of that shadow can be defined as the apex. The position of the apex in the first image information can be defined as the second position.
In the present embodiment, after the first identification point and the apex are determined, a first coordinate of the first identification point and a second coordinate of the apex can be determined respectively.
In the present embodiment, the first coordinate and the second coordinate can be determined in the same coordinate system.
In the present embodiment, after the first position and the second position are determined, whether the first operating body has touched the projection operation surface can be determined according to the first position and the second position.
Preferably, one way of determining, according to the first position and the second position, whether the first operating body has touched the projection operation surface is as follows: judge whether the distance between the first position and the second position in the first image information is not greater than a first preset distance, so as to obtain a second judgment result, and determine from the second judgment result whether the first operating body has touched the projection operation surface.
If it is determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance, it can be determined that the first operating body has touched the projection operation surface.
If it is determined that the distance between the first position and the second position in the first image information is greater than the first preset distance, it can be determined that the first operating body is not in contact with the projection operation surface.
In this embodiment, one way of judging whether the distance between the first position and the second position in the first image information is not greater than the first preset distance is to evaluate that distance according to the first coordinate and the second coordinate, so as to obtain the second judgment result.
For example, it can be judged whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance. If it is, it can be determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance; otherwise, it can be determined that this distance is greater than the first preset distance.
In the present embodiment, once it is determined that the distance between the first position and the second position in the first image information is not greater than the first preset distance, it can be determined that the first operating body has touched the projection operation surface.
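A minimal sketch of this distance test follows, assuming the two pixel coordinates are already available; the 5-pixel value stands in for the first preset distance and is an assumed figure, not one given by the patent.

```python
import math
from typing import Tuple

Point = Tuple[int, int]

def is_touching_by_distance(first_position: Point, second_position: Point,
                            first_preset_distance: float = 5.0) -> bool:
    """Touch is declared when the pixel distance between the first position and
    the second position does not exceed the first preset distance."""
    dx = first_position[0] - second_position[0]
    dy = first_position[1] - second_position[1]
    return math.hypot(dx, dy) <= first_preset_distance
```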
After it is determined according to the first position and the second position that the first operating body is in contact with the projection operation surface, the first operation performed by the first operating body can be obtained.
In the present embodiment, the electronic equipment can monitor the first operating body in real time or at fixed intervals. For example, the electronic equipment monitors the first operating body through the image acquisition unit.
After it is determined that the first operating body is in contact with the projection operation surface, the electronic equipment can continue to monitor the first operating body in order to determine when the first operating body leaves the projection operation surface.
For example, when the electronic equipment detects that the first operating body left the projection operation surface at a first moment, the electronic equipment can determine that the first operation performed by the first operating body has ended. The electronic equipment can then judge, from the contact duration and/or the contact trace between the first operating body and the projection operation surface, which kind of operation the first operation is, and thereby determine the response mode corresponding to the first operation.
For example, in the present embodiment, the electronic equipment detects at a second moment that the first operating body is in contact with the projection operation surface, and detects at the first moment, which is later than the second moment, that the first operating body has left the projection operation surface. The electronic equipment can then determine the time difference between the second moment and the first moment. In the present embodiment the time difference is not greater than a preset duration threshold, so the electronic equipment can determine that the first operating body performed a single-click operation and can determine the response mode corresponding to the single-click operation.
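The classification by contact duration can be sketched as follows; the 0.3 s figure stands in for the preset duration threshold, and the label returned for longer contacts is an assumption rather than something the text specifies.

```python
def classify_operation(second_moment_s: float, first_moment_s: float,
                       preset_duration_s: float = 0.3) -> str:
    """Compare the time difference between contact (second moment) and release
    (first moment) with a preset duration threshold to identify the operation."""
    time_difference = first_moment_s - second_moment_s
    if time_difference <= preset_duration_s:
        return "single_click"
    return "long_press"  # assumed label for contacts longer than the threshold

# Contact at t = 10.00 s, release at t = 10.18 s -> treated as a single click.
print(classify_operation(10.00, 10.18))  # "single_click"
```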
In the present embodiment the first operating body can be a finger of the user, and the first operation can be a single-click performed with the finger on a first icon on the computer desktop, the first icon being the icon corresponding to a word-processing application.
After the first operation is obtained, the first operation can be responded to, and the first function corresponding to the first operation can be called. In the present embodiment, for example, the first icon is selected.
Embodiment three:
The electronic equipment is a PC (personal computer) that has the projection module and an image acquisition unit, the image acquisition unit being a camera. The projection surface is a desktop, and the projection operation surface can be located on the projection surface. In the present embodiment the projection operation surface is the computer desktop.
When the content to be projected is projected onto the projection operation surface through the projection module, the first image information can be collected through the image acquisition unit, where the first image information can include the projection operation surface image information corresponding to the projection operation surface.
In the present embodiment, the camera is arranged to one side of the projection area, so that the first image information can be collected more reliably.
After the first image information is obtained, the first image information can be analyzed to judge whether the first image corresponding to the first operating body is included in the first image information. If it is determined that the first image is present, it can be determined that the first operating body is present within the preset space range corresponding to the projection operation surface.
In the present embodiment, when it is judged that the first image is present in the first image information, it can correspondingly be determined that the first operating body is present within the preset space range, and the second image corresponding to the shadow of the first operating body in the first image information can correspondingly be determined.
After the first image and the second image are determined, the first position corresponding to the first operating body in the first image information and the second position corresponding to the shadow of the first operating body in the first image information can be determined respectively.
In the present embodiment, determining the first position corresponding to the first operating body and the second position corresponding to the shadow of the first operating body can be done as follows. A first identification point of the first image, that is, a first identification point of the first operating body, is determined: any point of the first image (any point on the first operating body) can be chosen as the first identification point, or the end point of the first image (the end point of the first operating body) can be chosen. For example, if the first operating body is a finger of the user, the point in the first image corresponding to the tip of the finger can be chosen as the first identification point. After the first identification point is determined, its position in the first image information can be defined as the first position. The apex of the shadow of the first operating body is then determined; for example, if the first operating body is a finger of the user, the shadow of the first operating body is also finger-shaped, and the point at the tip of that shadow can be defined as the apex. The position of the apex in the first image information can be defined as the second position.
In the present embodiment, after the first identification point and the apex are determined, a first coordinate of the first identification point and a second coordinate of the apex can be determined respectively.
In the present embodiment, the first coordinate and the second coordinate can be determined in the same coordinate system.
In the present embodiment, after the first position and the second position are determined, whether the first operating body has touched the projection operation surface can be determined according to the first position and the second position.
Preferably, one way of determining, according to the first position and the second position, whether the first operating body has touched the projection operation surface is as follows: judge whether the first position and the second position coincide in the first image information, so as to obtain a second judgment result, and determine from the second judgment result whether the first operating body has touched the projection operation surface.
In the present embodiment, after the first image and the second image are obtained, a first area corresponding to the first image and a second area corresponding to the second image can be determined respectively. Whether the first area and the second area are equal can then be judged, so as to obtain a first judgment result that indicates whether the first area and the second area are equal.
In the present embodiment, the first operating body is determined to have touched the projection operation surface only when the first position and the second position coincide in the first image information and the first judgment result shows that the first area and the second area are equal. In every other case, that is, when the positions do not coincide, when the areas are unequal, or both, it can be determined that the first operating body is not in contact with the projection operation surface.
In this embodiment, one way of judging whether the first position and the second position coincide in the first image information is to judge, according to the first coordinate and the second coordinate, whether the first position and the second position coincide, so as to obtain the second judgment result.
For example, it can be judged whether the first coordinate and the second coordinate are the same coordinate. If they are, it can be determined that the first position and the second position coincide; otherwise, it can be determined that they do not coincide.
In the present embodiment, when it is judged that the first position and the second position coincide and the first area and the second area are equal, it can be determined that the first operating body has touched the projection operation surface.
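A sketch of this combined test follows, assuming the two positions and the two pixel areas have already been measured; exact pixel equality of the areas is rarely achievable in practice, so an assumed relative tolerance is used in place of strict equality.

```python
from typing import Tuple

Point = Tuple[int, int]

def is_touching_by_overlap_and_area(first_position: Point, second_position: Point,
                                    first_area_px: int, second_area_px: int,
                                    area_tolerance: float = 0.1) -> bool:
    """Touch is declared only when the first and second positions coincide AND
    the first area (operating body) matches the second area (shadow).
    The 10 % relative tolerance is an illustrative assumption."""
    positions_coincide = first_position == second_position
    larger = max(first_area_px, second_area_px, 1)
    areas_equal = abs(first_area_px - second_area_px) / larger <= area_tolerance
    return positions_coincide and areas_equal
```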
After it is determined that the first operating body is in contact with the projection operation surface, the first operation performed by the first operating body can be obtained. In the present embodiment the first operating body can be a finger of the user, and the first operation can be a single-click performed with the finger on a video playback interface.
After the first operation is obtained, the first operation can be responded to, and the first function corresponding to the first operation can be called. In the present embodiment, for example, the video content playing in the video playback interface is paused.
Referring to Fig. 2C, the present invention also provides an electronic equipment. The electronic equipment can include a projection module and an image acquisition unit, and can further include an operation module 201, a first judge module 202, a first determining module 203 and a second determining module 204. In the embodiment of the present invention, the image acquisition unit can be a camera.
Preferably, the electronic equipment can further include a third determining module 205, a second judge module 206, an acquisition module 207, a calling module 208, a monitoring module 209 and a fourth determining module 210.
The operation module 201 can be used to collect first image information through the image acquisition unit when a content to be projected is projected onto a projection operation surface through the projection module, the first image information including the projection operation surface image information corresponding to the projection operation surface.
The first judge module 202 can be used to judge, according to the first image information, whether a first image corresponding to a first operating body is included in the first image information.
The first judge module 202 can specifically be used to judge whether the first position and the second position coincide in the first image information, or to judge whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
The first judge module 202 can specifically be used to judge whether the first coordinate and the second coordinate are the same coordinate in the first image information, or to judge whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
The first determining module 203 can be used to determine, when the first image corresponding to the first operating body is present, the first position of the first operating body in the first image information and the second position of the shadow of the first operating body in the first image information, according to the first image and the second image of the shadow of the first operating body on the projection operation surface.
The first determining module 203 can specifically be used to determine a first identification point corresponding to the first operating body in the first image, define the position of the first identification point as the first position, determine the apex of the shadow of the first operating body, and define the position of the apex as the second position.
The first determining module 203 can specifically be used to determine, according to the first image and the second image, the first coordinate of the first identification point and the second coordinate of the apex.
The second determining module 204 can be used to determine, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface or not in contact with the projection operation surface.
The second determining module 204 can specifically be used to determine that the first operating body is in contact with the projection operation surface when the first position and the second position coincide in the first image information, and otherwise that the first operating body is not in contact with the projection operation surface; or to determine that the first operating body is in contact with the projection operation surface when the distance between the first position and the second position in the first image information is not greater than the first preset distance, and otherwise that the first operating body is not in contact with the projection operation surface.
The second determining module 204 can specifically be used to determine that the first operating body is in contact with the projection operation surface when the first position and the second position coincide in the first image information and the first judgment result shows that the first area and the second area are equal, and otherwise that the first operating body is not in contact with the projection operation surface.
The second determining module 204 can specifically be used to determine that the first operating body is in contact with the projection operation surface when the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgment result shows that the first area and the second area are equal, and otherwise that the first operating body is not in contact with the projection operation surface.
The third determining module 205 can be used to determine the first area corresponding to the first image and the second area corresponding to the second image.
The second judge module 206 can be used to judge whether the first area and the second area are equal, so as to obtain the first judgment result.
The acquisition module 207 can be used to obtain the first operation performed by the first operating body.
The calling module 208 can be used to respond to the first operation and call the first function corresponding to the first operation.
The monitoring module 209 can be used to monitor the first operating body and determine that the first operating body is no longer in contact with the projection operation surface at the first moment, so as to obtain a monitoring result.
The fourth determining module 210 can be used to determine, according to the monitoring result, the first operation performed by the first operating body, so that the response mode corresponding to the first operation can be determined.
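The module layout above can be pictured as a thin class whose methods mirror the listed modules; the class and method names below are illustrative rather than taken from the patent, and the detection steps are left as placeholders because the text does not fix an algorithm for them.

```python
class ProjectionTouchEquipment:
    """Structural sketch of the electronic equipment; method names mirror the
    modules described above, while the bodies are placeholders."""

    def __init__(self, projection_module, image_acquisition_unit):
        self.projection_module = projection_module            # projects the content
        self.image_acquisition_unit = image_acquisition_unit  # a camera

    def operate(self, content):
        """Operation module 201: project the content and collect the first image information."""
        self.projection_module.project(content)
        return self.image_acquisition_unit.capture()

    def first_judge(self, first_image_information) -> bool:
        """First judge module 202: is the first image (operating body) present?"""
        raise NotImplementedError  # detection method is not fixed by the patent

    def first_determine(self, first_image_information):
        """First determining module 203: return (first_position, second_position)."""
        raise NotImplementedError

    def second_determine(self, first_position, second_position) -> bool:
        """Second determining module 204: contact when the two positions coincide."""
        return first_position == second_position
```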
The information processing method in the embodiment of the present invention can be applied to an electronic equipment that includes a projection module and an image acquisition unit. The method can include: when a content to be projected is projected onto a projection operation surface through the projection module, collecting first image information through the image acquisition unit, the first image information including the projection operation surface image information corresponding to the projection operation surface; judging, according to the first image information, whether a first image corresponding to a first operating body is included in the first image information; when the first image corresponding to the first operating body is present, determining, according to the first image and a second image of the shadow of the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow of the first operating body in the first image information; and determining, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface or not in contact with the projection operation surface.
In the embodiment of the present invention, the first image information can be collected through the image acquisition unit, and the first position of the first operating body and the second position of the shadow of the first operating body can be determined from the first image information. Whether the first operating body has actually touched the projection operation surface can therefore be determined from the first position and the second position, and hence whether the corresponding operation of the first operating body should be responded to. This avoids two kinds of error: responding to the operation of the first operating body when it is in fact suspended above the projection operation surface, merely because the electronic equipment cannot tell whether it is in contact; and failing to respond when the first operating body is in fact in contact with the projection operation surface, again because the electronic equipment cannot tell. The rate of erroneous operations is thereby effectively reduced, operating efficiency is improved, and the user experience is improved.
Moreover, the image acquisition unit can be a camera, and the present invention only needs a single camera to accomplish the task, which saves hardware resources.
In the embodiment of the present invention, there can be a variety of ways of determining, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface, which makes the scheme more flexible.
In the embodiment of the present invention, whether the first operating body is in contact with the projection operation surface can be determined not only from the positions of the first operating body and its shadow but also from their areas, which makes the determination more accurate.
Those skilled in the art should understand that the embodiments of the present invention can be provided as a method, a system, or a computer program product. Therefore, the present invention can take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present invention can take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, the equipment (system) and the computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or other programmable data processing equipment to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing equipment produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be stored in a computer-readable memory capable of guiding a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device which realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operation steps are performed on the computer or other programmable equipment to produce computer-implemented processing, and the instructions executed on the computer or other programmable equipment provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include these changes and modifications.

Claims (18)

  1. An information processing method, applied to an electronic equipment, characterized in that the electronic equipment includes a projection module and an image acquisition unit, and the method includes:
    when a content to be projected is projected onto a projection operation surface through the projection module, collecting first image information through the image acquisition unit, the first image information including projection operation surface image information corresponding to the projection operation surface;
    judging, according to the first image information, whether a first image corresponding to a first operating body is included in the first image information;
    when the first image corresponding to the first operating body is present, determining, according to the first image and a second image of the shadow of the first operating body on the projection operation surface, a first position of the first operating body in the first image information and a second position of the shadow of the first operating body in the first image information;
    determining, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface or not in contact with the projection operation surface;
    if it is determined that the first operating body is in contact with the projection operation surface, monitoring the first operating body to determine that the first operating body has left the projection operation surface at a first moment, so as to obtain a monitoring result; and
    determining, according to the monitoring result, a first operation performed by the first operating body, so that a response mode corresponding to the first operation can be determined.
  2. The method of claim 1, characterized in that the step of determining the first position of the first operating body in the first image information and the second position of the shadow of the first operating body in the first image information includes: determining a first identification point corresponding to the first operating body in the first image, defining the position of the first identification point as the first position, and determining the apex of the shadow of the first operating body in the first image, the position of the apex being the second position.
  3. The method of claim 2, characterized in that the step of determining, according to the first image of the first operating body in the image information and the second image of the shadow of the first operating body on the projection operation surface, the first position of the first operating body within the preset space range on the projection operation surface and the second position includes: determining, according to the first image and the second image, a first coordinate of the first identification point and a second coordinate of the apex.
  4. The method of claim 3, characterized in that the step of determining, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface or not includes: judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
  5. The method of claim 4, characterized in that the step of determining, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface or not includes:
    when the first position and the second position coincide in the first image information, determining that the first operating body is in contact with the projection operation surface, and otherwise determining that the first operating body is not in contact with the projection operation surface; or
    when the distance between the first position and the second position in the first image information is not greater than the first preset distance, determining that the first operating body is in contact with the projection operation surface, and otherwise determining that the first operating body is not in contact with the projection operation surface.
  6. The method of claim 4, characterized in that the step of judging whether the first position and the second position coincide in the first image information, or judging whether the distance between the first position and the second position in the first image information is not greater than the first preset distance, includes: judging whether the first coordinate and the second coordinate are the same coordinate in the first image information, or judging whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
  7. The method of claim 5, characterized in that before determining, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface or not, the method further includes:
    determining a first area corresponding to the first image and a second area corresponding to the second image; and
    judging whether the first area and the second area are equal, so as to obtain a first judgment result;
    wherein the step of determining that the first operating body is in contact with the projection operation surface when the first position and the second position coincide in the first image information, and otherwise determining that the first operating body is not in contact with the projection operation surface, includes: when the first position and the second position coincide in the first image information and the first judgment result shows that the first area and the second area are equal, determining that the first operating body is in contact with the projection operation surface, and otherwise determining that the first operating body is not in contact with the projection operation surface.
  8. The method of claim 5, characterized in that before determining, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface or not, the method further includes:
    determining a first area corresponding to the first image and a second area corresponding to the second image; and
    judging whether the first area and the second area are equal, so as to obtain a first judgment result;
    wherein the step of determining that the first operating body is in contact with the projection operation surface when the distance between the first position and the second position in the first image information is not greater than the first preset distance, and otherwise determining that the first operating body is not in contact with the projection operation surface, includes: when the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgment result shows that the first area and the second area are equal, determining that the first operating body is in contact with the projection operation surface, and otherwise determining that the first operating body is not in contact with the projection operation surface.
  9. The method of claim 1, characterized in that the step of determining, according to the monitoring result, the first operation performed by the first operating body, so that the response mode corresponding to the first operation can be determined, includes:
    obtaining the first operation performed by the first operating body; and
    responding to the first operation by calling a first function corresponding to the first operation.
  10. An electronic equipment, characterized in that the electronic equipment includes a projection module and an image acquisition unit, and further includes:
    an operation module, configured to collect first image information through the image acquisition unit when a content to be projected is projected onto a projection operation surface through the projection module, the first image information including projection operation surface image information corresponding to the projection operation surface;
    a first judge module, configured to judge, according to the first image information, whether a first image corresponding to a first operating body is included in the first image information;
    a first determining module, configured to determine, when the first image corresponding to the first operating body is present, a first position of the first operating body in the first image information and a second position of the shadow of the first operating body in the first image information, according to the first image and a second image of the shadow of the first operating body on the projection operation surface;
    a second determining module, configured to determine, according to the first position and the second position, whether the first operating body is in contact with the projection operation surface or not in contact with the projection operation surface;
    a monitoring module, configured to monitor the first operating body if it is determined that the first operating body is in contact with the projection operation surface, so as to determine that the first operating body has left the projection operation surface at a first moment and obtain a monitoring result; and
    a fourth determining module, configured to determine, according to the monitoring result, a first operation performed by the first operating body, so that a response mode corresponding to the first operation can be determined.
  11. The electronic equipment of claim 10, characterized in that the first determining module is specifically configured to determine a first identification point corresponding to the first operating body in the first image, define the position of the first identification point as the first position, and determine the apex of the shadow of the first operating body in the first image, the position of the apex being the second position.
  12. The electronic equipment of claim 11, characterized in that the first determining module is specifically configured to determine, according to the first image and the second image, a first coordinate of the first identification point and a second coordinate of the apex.
  13. The electronic equipment of claim 12, characterized in that the first judge module is specifically configured to judge whether the first position and the second position coincide in the first image information, or to judge whether the distance between the first position and the second position in the first image information is not greater than a first preset distance.
  14. The electronic equipment of claim 13, characterized in that the second determining module is specifically configured to:
    determine, when the first position and the second position coincide in the first image information, that the first operating body is in contact with the projection operation surface, and otherwise that the first operating body is not in contact with the projection operation surface; or
    determine, when the distance between the first position and the second position in the first image information is not greater than the first preset distance, that the first operating body is in contact with the projection operation surface, and otherwise that the first operating body is not in contact with the projection operation surface.
  15. The electronic equipment of claim 13, characterized in that the first judge module is specifically configured to judge whether the first coordinate and the second coordinate are the same coordinate in the first image information, or to judge whether the distance between the first coordinate and the second coordinate in the first image information is not greater than the first preset distance.
  16. The electronic equipment of claim 14, characterized in that the electronic equipment further includes a third determining module and a second judge module;
    the third determining module is configured to determine a first area corresponding to the first image and a second area corresponding to the second image;
    the second judge module is configured to judge whether the first area and the second area are equal, so as to obtain a first judgment result; and
    the second determining module is specifically configured to determine, when the first position and the second position coincide in the first image information and the first judgment result shows that the first area and the second area are equal, that the first operating body is in contact with the projection operation surface, and otherwise that the first operating body is not in contact with the projection operation surface.
  17. The electronic equipment of claim 14, characterized in that the electronic equipment further includes a third determining module and a second judge module;
    the third determining module is configured to determine a first area corresponding to the first image and a second area corresponding to the second image;
    the second judge module is configured to judge whether the first area and the second area are equal, so as to obtain a first judgment result; and
    the second determining module is specifically configured to determine, when the distance between the first position and the second position in the first image information is not greater than the first preset distance and the first judgment result shows that the first area and the second area are equal, that the first operating body is in contact with the projection operation surface, and otherwise that the first operating body is not in contact with the projection operation surface.
  18. The electronic equipment of claim 10, characterized in that the electronic equipment further includes an acquisition module and a calling module;
    the acquisition module is configured to obtain the first operation performed by the first operating body; and
    the calling module is configured to respond to the first operation by calling a first function corresponding to the first operation.