CN104656903A - Processing method for display image and electronic equipment - Google Patents


Info

Publication number: CN104656903A
Authority: CN (China)
Prior art keywords: display screen, point, input gesture, electronic equipment, pointing object
Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Application number: CN201510095092.0A
Other languages: Chinese (zh)
Inventor: 钟帆
Current assignee: Lenovo Beijing Ltd (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Original assignee: Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd
Priority to CN201510095092.0A
Publication of CN104656903A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an information processing method comprising the following steps: capturing, with an image acquisition device of an electronic device, a first input gesture located above the display screen of the electronic device, wherein the first input gesture comprises a reference point and an indicating line, the line connecting the reference point to its first mapping point on the display screen is perpendicular to the display screen, and the extension of the indicating line corresponds to a second mapping point on the display screen; parsing the first input gesture to obtain first position information of the second mapping point on the display screen; and determining a first display object on the display screen according to the first position information. With the information processing method and electronic device provided by the invention, a target or region can be highlighted, magnified, or expanded without the user touching the display screen, making large-screen interactive devices more efficient and practical and improving the user experience.

Description

Processing method for a display image and electronic device
Technical field
The present invention relates to an information processing method and an electronic device, and more particularly to a processing method for a display image and a corresponding electronic device.
Background art
In application scenarios involving a desktop PC (for example, the Lenovo Horizon) or another close-range large-screen interactive device (for example, a large-screen all-in-one), a user sometimes needs to point at a target or region shown on the display screen without touching the screen, with the finger at some distance from it, so that the target or region is highlighted, magnified, or expanded. A typical scenario: when a large-screen interactive device is used to display a map or a building layout and users discuss information related to a geographic position or a particular building, the building at the position the user's finger points to should be highlighted. One way to meet this need is to use an extra laser pointer aimed at a point on the large screen. However, this requires additional equipment, which is inconvenient for the user, and the laser pointer can only project a bright spot; it cannot make the screen actually highlight or magnify the indicated target. Another approach is to use sensors to detect the user's finger touching the screen or hovering directly above the indicated target, and then to highlight or magnify that target. However, this requires the finger to touch, or hover directly over, the target, which forces an unnatural posture; the user cannot point obliquely at a target, as one would in real life.
Therefore, an urgent problem is how to use a device's existing capabilities to let a user point at a target or region shown on the display screen without touching it and with the finger at some distance from the screen, so that the target or region is highlighted, magnified, or expanded, without requiring extra hardware.
Summary of the invention
To solve the above technical problem in the prior art, according to one aspect of the present invention, an information processing method is provided. The method comprises: capturing, with an image acquisition device of the electronic device, a first input gesture located above the display screen of the electronic device, wherein the first input gesture comprises a reference point and an indicating line, the line connecting the reference point to its first mapping point on the display screen is perpendicular to the display screen, and the extension of the indicating line corresponds to a second mapping point on the display screen; parsing the first input gesture to obtain first position information of the second mapping point on the display screen; and determining a first display object on the display screen based on the first position information.
In addition, according to an embodiment of the present invention, the reference point corresponds to the position of the end of the pointing object of the first input gesture that is away from the display screen, and the indicating line corresponds to the pointing direction of the pointing object. Parsing the first input gesture to obtain the first position information of the second mapping point on the display screen further comprises: obtaining the distance d₁ between the end of the pointing object near the display screen and the display screen, the distance d₂ between the end away from the display screen and the display screen, the length l of the pointing object, the coordinate position l₁ of the vertical projection of the near end onto the display screen, and the coordinate position l₂ of the vertical projection of the far end onto the display screen; and calculating, from d₁, d₂, the length l, l₁, and l₂, the coordinate of the first position, namely the intersection of the line on which the pointing object lies with the display screen.
In addition, according to an embodiment of the present invention, the information processing method further comprises: when the position of the reference point is unchanged but the direction of the indicating line changes, the position of the second mapping point moves to a second position along with the change of direction; in this case it is only necessary to re-obtain the distance d₃ between the near-screen end of the pointing object and the display screen and the coordinate position l₃ of the vertical projection of that end onto the display screen, and then to calculate, from d₃, d₂, the length l, l₃, and l₂, the second position information of the second mapping point on the display screen.
In addition, according to an embodiment of the present invention, the information processing method further comprises: capturing images of the input gesture with the image acquisition device at a predetermined time interval; and, when the position of the reference point changes, re-capturing with the image acquisition device of the electronic device an image of a second input gesture located above the display screen, parsing the second input gesture, and determining a second display object on the display screen.
In addition, according to an embodiment of the present invention, the information processing method further comprises: highlighting or magnifying the first display object on the display screen determined from the first position information.
In addition, according to an embodiment of the present invention, the information processing method further comprises: when the angle between the indicating line and the plane of the display screen equals 90 degrees, determining the image at the first mapping point as the first display object, based on the position of the reference point of the first input gesture.
According to another aspect of the present invention, an electronic device is also provided, comprising: a display screen for showing at least one display object; an image acquisition device for capturing a first input gesture located above the display screen, wherein the first input gesture comprises a reference point and an indicating line, the line connecting the reference point to its first mapping point on the display screen is perpendicular to the display screen, and the extension of the indicating line corresponds to a second mapping point on the display screen; and a central processing unit for parsing the first input gesture, obtaining first position information of the second mapping point on the display screen, and determining a first display object on the display screen based on the first position information.
In addition, according to an embodiment of the present invention, the reference point corresponds to the position of the end of the pointing object of the first input gesture that is away from the display screen, and the indicating line corresponds to the pointing direction of the pointing object. The central processing unit is further configured to: obtain the distance d₁ between the end of the pointing object near the display screen and the display screen, the distance d₂ between the end away from the display screen and the display screen, the length l of the pointing object, the coordinate position l₁ of the vertical projection of the near end onto the display screen, and the coordinate position l₂ of the vertical projection of the far end onto the display screen; and calculate, from d₁, d₂, the length l, l₁, and l₂, the coordinate of the first position, namely the intersection of the line on which the pointing object lies with the display screen.
In addition, according to an embodiment of the present invention, when the position of the reference point is unchanged but the direction of the indicating line changes, the position of the second mapping point moves to a second position along with the change of direction; in this case the central processing unit is further configured to: obtain the distance d₃ between the near-screen end of the pointing object and the display screen and the coordinate position l₃ of the vertical projection of that end onto the display screen, and calculate, from d₃, d₂, the length l, l₃, and l₂, the second position information of the second mapping point on the display screen.
In addition, according to an embodiment of the present invention, the image acquisition device is further configured to capture images of the input gesture at a predetermined time interval and, when the position of the reference point changes, to re-capture an image of a second input gesture located above the display screen of the electronic device; the central processing unit is further configured to parse the second input gesture and determine a second display object on the display screen.
In addition, according to an embodiment of the present invention, the display screen is further configured to highlight or magnify the first display object determined from the first position information.
In addition, according to an embodiment of the present invention, the central processing unit is further configured to: when the angle between the indicating line and the plane of the display screen equals 90 degrees, determine the image at the first mapping point as the first display object, based on the position of the reference point of the first input gesture.
As can be seen, the information processing method and electronic device provided by the invention use the device's existing capabilities to let a user point at a target or region shown on the display screen without touching it and with the finger at some distance from the screen, so that the target or region is highlighted, magnified, or expanded, without any extra hardware; large-screen interactive devices thereby become more efficient and practical, and the user experience is improved.
Brief description of the drawings
Fig. 1 shows a flowchart of an information processing method 100 applied to an electronic device according to an embodiment of the present invention;
Fig. 2 shows a flowchart of an information processing method 200 applied to an electronic device according to an example of the present invention;
Fig. 3 is a schematic diagram of a user interacting with an electronic device 300 by gesture input according to an example of the present invention;
Fig. 4 shows an exemplary block diagram of an electronic device 400 according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a user interacting with an electronic device 500 by gesture input according to an example of the present invention.
Detailed description of the embodiments
Hereinafter, preferred embodiments of the present invention are described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, substantially identical steps and elements are denoted by the same reference numerals, and repeated explanations of them are omitted.
Throughout this specification, "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrase "in one embodiment" or "in an embodiment" do not necessarily all refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Fig. 1 shows a flowchart of an information processing method 100 applied to an electronic device according to an embodiment of the present invention. In general, in one embodiment of the invention, the electronic device can have a display screen that can be set in multiple orientations; in particular, the display screen can be placed horizontally or in a vertical plane perpendicular to the horizontal. Moreover, the electronic device can also have at least two image acquisition devices arranged at different positions, for example near at least two corners of the display screen or on at least two bezels of the device, so that a three-dimensional image of the user's input gesture can be captured by the at least two image acquisition devices.
Below, with reference to Fig. 1, the information processing method 100 for an electronic device according to an embodiment of the invention is described. As shown in Fig. 1, first, in step S110, a first input gesture located above the display screen of the electronic device can be captured by the image acquisition device of the electronic device, wherein the first input gesture can comprise a reference point and an indicating line; the line connecting the reference point to its first mapping point on the display screen is perpendicular to the display screen, and the extension of the indicating line can correspond to a second mapping point on the display screen. In one embodiment of the invention, the reference point can correspond to the end of the pointing object of the first input gesture that is away from the display screen, and the indicating line can correspond to the pointing direction of the pointing object.
Next, in step S120, the first input gesture can be parsed to obtain the first position information of the second mapping point on the display screen. In one embodiment of the invention, this step can specifically comprise: obtaining the distance d₁ between the near-screen end of the pointing object and the display screen, the distance d₂ between the far-screen end and the display screen, the length l of the pointing object, the coordinate position l₁ of the vertical projection of the near-screen end onto the display screen, and the coordinate position l₂ of the vertical projection of the far-screen end onto the display screen; from d₁, d₂, the length l, l₁, and l₂, the coordinate of the first position, namely the intersection of the line on which the pointing object lies with the display screen, can be calculated. In another embodiment of the invention, when the position of the reference point is unchanged but the direction of the indicating line changes, the second mapping point can move to a second position along with the change of direction; in this case it is only necessary to re-obtain the distance d₃ between the near-screen end of the pointing object and the display screen and the coordinate position l₃ of the vertical projection of that end onto the display screen, and then to calculate, from d₃, d₂, the length l, l₃, and l₂, the second position information of the second mapping point on the display screen.
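As a concrete illustration of the solid-geometry step above, the following Python sketch computes the screen intersection by similar triangles from the measured quantities (the function and variable names are illustrative, not taken from the patent; the length l of the pointing object is redundant once d₁, d₂, l₁, and l₂ are known, so it is omitted here):

```python
def screen_intersection(l1, l2, d1, d2):
    """Intersect the line through the pointing object's two ends with the screen.

    l1, l2 : (x, y) screen coordinates of the vertical projections of the
             near-screen end and the far-screen (reference) end.
    d1, d2 : heights of the near end and the far end above the screen, d2 > d1.
    Returns the (x, y) screen coordinate of the second mapping point.
    """
    if d2 <= d1:
        raise ValueError("the reference end must be farther from the screen")
    # Similar triangles: descending from height d1 to 0 extends the
    # end-to-end horizontal displacement by the factor d1 / (d2 - d1).
    t = d1 / (d2 - d1)
    return (l1[0] + t * (l1[0] - l2[0]),
            l1[1] + t * (l1[1] - l2[1]))

# Far end at height 2 over (0, 0), near end at height 1 over (1, 0):
# the pointing line reaches the screen at (2, 0).
print(screen_intersection((1.0, 0.0), (0.0, 0.0), 1.0, 2.0))  # -> (2.0, 0.0)
```

The same formula covers the reduced re-computation above: with the reference end fixed, only the near end's height (d₃) and projection (l₃) need to be re-measured before calling the function again.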
Finally, in step S130, the first display object on the display screen can be determined based on the first position information. In particular, the determined first display object can be highlighted or magnified. In one embodiment of the invention, when the angle between the indicating line and the plane of the display screen equals 90 degrees, the image at the first mapping point can be determined as the first display object based on the position of the reference point of the first input gesture, and then highlighted or magnified. In another embodiment of the invention, images of the input gesture can be captured by the image acquisition device at a predetermined time interval; when the position of the reference point changes, an image of a second input gesture located above the display screen is re-captured by the image acquisition device of the electronic device, and the second input gesture is parsed to determine a second display object on the display screen.
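The periodic capture-and-reparse behaviour of steps S110 to S130 can be sketched as a small processing loop. This is a minimal sketch under stated assumptions: `parse_gesture` stands in for the device's own recognition routine and is assumed to return a (reference point, indicated target) pair per captured frame; both names are hypothetical.

```python
def track_indicated_targets(frames, parse_gesture):
    """Process gesture images captured at a predetermined time interval.

    frames        : iterable of captured gesture images.
    parse_gesture : hypothetical recognizer returning (reference_point, target)
                    for one frame.
    A new display object is determined (and would be highlighted or magnified)
    only when the reference point has moved.
    """
    highlighted = []
    last_ref = None
    for frame in frames:
        ref, target = parse_gesture(frame)
        if ref != last_ref:  # reference point moved: treat as a new input gesture
            highlighted.append(target)
            last_ref = ref
    return highlighted

# Three frames; the reference point moves only once, so only two
# display objects are ever determined.
frames = [("ref_a", "building 1"), ("ref_a", "building 1"), ("ref_b", "building 2")]
print(track_indicated_targets(frames, lambda f: f))  # -> ['building 1', 'building 2']
```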
To illustrate the use of the method in the above embodiments in more detail, an example follows with reference to Fig. 2. Fig. 2 shows a flowchart of an information processing method 200 applied to an electronic device according to an example of the present invention; the electronic device has a display screen that can be placed horizontally or vertically. In this example, the display screen is placed horizontally. Fig. 3 is a schematic diagram of a user interacting with an electronic device 300 by gesture input according to an example of the present invention. As shown in Fig. 3, the electronic device 300 according to this example can comprise a display screen 310 and an image acquisition device, wherein the image acquisition device can comprise at least two cameras; in this example, it comprises four cameras (321, 322, 323, 324). As shown in Fig. 3, the four cameras can be arranged symmetrically on the four bezels of the display screen 310 of the electronic device 300.
Specifically, the information processing method 200 according to this example can comprise the following steps. As shown in Fig. 2, first, in step S210, an input gesture 330 located above the display screen 310 of the electronic device 300 can be captured by the image acquisition device of the electronic device 300; specifically, as shown in Fig. 3, the user's input gesture 330 can be captured by the four cameras 321, 322, 323, 324 located on the four bezels of the display screen 310. In general, the gesture 330 can comprise a reference point 331 and an indicating line 332. As shown in the figure, when the user's input gesture is an extended index finger pointing at a display object 340 on the display screen, once the gesture has been captured by the image acquisition device, a three-dimensional image of the gesture can be recognized; based on three-dimensional recognition technology, the extended index finger of the input gesture 330, its two ends, and the length of the finger can be identified. Such three-dimensional image recognition methods are well known to those skilled in the art and are not repeated here. The reference point 331 of the gesture 330 can be the end 331 of the index finger away from the display screen, and the indicating line 332 can correspond to the pointing direction of the index finger. The line connecting the reference point 331 to its mapping point 333 on the display screen 310 is perpendicular to the display screen 310, and the extension of the indicating line 332 intersects the plane of the display screen 310 at a mapping point 341.
Then, in step S220, the input gesture 330 can be parsed to obtain the position information of the mapping point 341 on the display screen. For example, in one example of the present invention, from the three-dimensional image of the user's input gesture one can identify the distance d₁ between the near-screen end 335 of the extended index finger and the display screen, the distance d₂ between the end 331 of the finger away from the display screen 310 and the display screen 310, the length l of the user's index finger, the coordinate position l₁ of the vertical projection of the near-screen end onto the point 334 of the display screen 310, and the coordinate position l₂ of the vertical projection of the far end 331 onto the display screen 310; from d₁, d₂, the length l, l₁, and l₂, based on the relevant principles of solid geometry, the position coordinate of the intersection point 341 of the line containing the indicating line 332 with the plane of the display screen 310 can be calculated. In another example of the present invention, the line connecting the reference point 331 and the mapping point 333 can be defined as l₃, and the angle θ between the indicating line 332 and l₃ can be obtained; from d₁, d₂, the angle θ, l₁, and l₂, again based on solid geometry, the position coordinate of the intersection point 341 can also be calculated. In general, multiple calculation methods can be used to obtain the position of the mapping point 341 on the display screen 310; they are not enumerated here.
Finally, in step S230, the display object 340 on the display screen 310 can be determined based on the coordinate position of the mapping point 341, and the display object 340 at the mapping point 341 can be highlighted or magnified. In addition, in one embodiment of the invention, when the indicating line 332 is perpendicular to the plane of the display screen 310, the mapping point 333 coincides with the mapping point 341; therefore, based on the position of the reference point 331 of the user's input gesture, the image at the mapping point 333 can be determined as the display object 340 and then highlighted or magnified. In another embodiment of the invention, images of the user's gesture can be captured at a predetermined time interval by the at least two cameras of the electronic device 300; when the position of the reference point 331 of the user's input gesture changes, the gesture is re-captured by the four cameras of the image acquisition device of the electronic device 300, the re-entered gesture is parsed, and the other display object it indicates on the display screen is determined.
As can be seen, by using the information processing method 100 provided by the invention, the existing capabilities of the electronic device allow a user to point at a target or region shown on the display screen without touching it and with the finger at some distance from the screen, so that the target or region is highlighted, magnified, or expanded, without any extra hardware; large-screen interactive devices thereby become more efficient and practical, and the user experience is improved.
Below, an electronic device 400 according to the present invention is explained with reference to Fig. 4. Fig. 4 shows an exemplary block diagram of the electronic device 400 according to an embodiment of the present invention. The electronic device has a display screen that can be set in multiple orientations, in particular horizontally or in a vertical plane perpendicular to the horizontal. Moreover, the electronic device can also have at least two image acquisition devices arranged at different positions, for example near at least two corners of the display screen or on at least two bezels of the device, so that a three-dimensional image of the user's input gesture can be captured by the at least two image acquisition devices.
Below, with reference to Fig. 4, the electronic device 400 according to an embodiment of the invention is described. As shown in Fig. 4, the electronic device 400 can comprise a display screen 410, an image acquisition device 420, and a central processing unit 430.
Specifically, the display screen 410 can be configured to show at least one display object.
The image acquisition device 420 can be used to capture a first input gesture located above the display screen 410 of the electronic device, wherein the first input gesture comprises a reference point and an indicating line; the line connecting the reference point to its first mapping point on the display screen 410 is perpendicular to the display screen 410, and the extension of the indicating line corresponds to a second mapping point on the display screen 410. In one embodiment of the invention, the reference point can correspond to the end of the pointing object of the first input gesture that is away from the display screen 410, and the indicating line can correspond to the pointing direction of the pointing object.
The central processing unit 430 can be used to parse the first input gesture and obtain the first position information of the second mapping point on the display screen 410. In one embodiment of the invention, the central processing unit 430 can be further configured to: obtain the distance d₁ between the near-screen end of the pointing object and the display screen 410, the distance d₂ between the far-screen end and the display screen 410, the length l of the pointing object, the coordinate position l₁ of the vertical projection of the near-screen end onto the display screen 410, and the coordinate position l₂ of the vertical projection of the far-screen end onto the display screen 410; and calculate, from d₁, d₂, the length l, l₁, and l₂, the coordinate of the first position, namely the intersection of the line on which the pointing object lies with the display screen 410. In another embodiment of the invention, when the position of the reference point is unchanged but the direction of the indicating line changes, the second mapping point can move to a second position along with the change of direction; in this case the central processing unit 430 can be further configured to: obtain the distance d₃ between the near-screen end of the pointing object and the display screen 410 and the coordinate position l₃ of the vertical projection of that end onto the display screen 410, and calculate, from d₃, d₂, the length l, l₃, and l₂, the second position information of the second mapping point on the display screen 410.
Further, the central processing unit 430 may also determine a first display object on the display screen 410 based on the first position information, and the display screen 410 may highlight or magnify the first display object so determined. In one embodiment of the invention, the central processing unit 430 may further be used so that, when the angle between the index line and the plane of the display screen 410 equals 90 degrees, the image at the first mapping point is determined to be the first display object based on the position of the reference point of the first input gesture, and the display screen 410 highlights or magnifies that first display object. In another embodiment of the invention, the image acquisition device 420 may further be used to acquire images of the input gesture at predetermined time intervals; when the position of the reference point changes, the image acquisition device 420 of the electronic equipment 400 re-acquires a second input gesture located above the display screen 410 of the electronic equipment 400, and the central processing unit 430 parses the second input gesture to determine a second display object on the display screen 410.
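The 90-degree special case can be detected directly from the endpoint projections: when both ends of the pointing object project onto the same screen point, the index line is vertical and the first and second mapping points coincide. A sketch of that check (the tolerance and names are assumptions, not from the patent):

```python
def is_perpendicular(l1, l2, tol=1e-6):
    """True when the pointing object stands perpendicular to the screen,
    i.e. the vertical projections l1 and l2 of its two ends coincide
    (to within tol), so the mapping point of the reference point and
    the mapping point of the index line are the same point."""
    return abs(l1[0] - l2[0]) <= tol and abs(l1[1] - l2[1]) <= tol
```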
To illustrate in greater detail how the method provided by the invention is used in the embodiments above, an example is given below with reference to Fig. 5. Fig. 5 is a schematic diagram of a user interacting with an electronic equipment 500 through gesture input according to one example of the invention. The electronic equipment 500 has a display screen 510, which may be mounted horizontally or vertically; in this example, the display screen 510 is mounted horizontally. As shown in Fig. 5, the electronic equipment 500 of this example may comprise the display screen 510, an image acquisition device 520, and a central processing unit (not shown) inside the electronic equipment 500, wherein the image acquisition device 520 may comprise at least two cameras; for example, in this example, the image acquisition device 520 of the electronic equipment 500 may comprise four cameras. As shown in Fig. 5, the four cameras may be arranged symmetrically on the four bezels of the display screen 510 of the electronic equipment 500.
Specifically, the display screen 510 of the electronic equipment 500 of this example may be used to show at least one display object, for example display object 540. The image acquisition device 520 may be used to acquire an input gesture 530 located above the display screen 510 of the electronic equipment 500; specifically, as shown in Fig. 5, the gesture 530 input by the user may be acquired by the four cameras on the four bezels of the display screen 510. Typically, the gesture 530 comprises a reference point 531 and an index line 532. As shown in the figure, when the user's input gesture is an extended index finger pointing at the display object 540 on the display screen, then, after the user's input gesture has been captured by the image acquisition device, a three-dimensional image of the gesture can be recognized; based on three-dimensional recognition techniques, the extended index finger of the user's input gesture 530 and its two ends can be identified, and the length of the index finger can be obtained. Such three-dimensional image recognition methods are known to those skilled in the art and are not repeated here. The reference point 531 of the gesture 530 may be the end of the index finger of the user's input gesture 530 that is far from the display screen, and the index line 532 may correspond to the pointing direction of that index finger. The line connecting the reference point 531 and its mapping point 533 on the display screen 510 is perpendicular to the display screen 510, and the extension of the index line 532 intersects the plane of the display screen 510 at a mapping point 541.
The central processing unit may be used to parse the input gesture 530 and obtain position information of the mapping point 541 on the display screen 510. For example, in one example of the invention, the three-dimensional image of the user's input gesture may be used to identify: the distance d1 between the end 535 of the extended index finger near the display screen and the display screen; the distance d2 between the end 531 of the extended index finger far from the display screen 510 and the display screen 510; the length l of the user's index finger; the coordinate position l1 of the point 534 at which the near end of the index finger projects vertically onto the display screen 510; and the coordinate position l2 of the point at which the far end 531 of the index finger projects vertically onto the display screen 510. From d1, d2, the finger length l, and l1 and l2, the position coordinates of the intersection point 541 of the line along the index line 532 with the plane of the display screen 510 can then be calculated using elementary solid geometry. In another example of the invention, the line between the reference point 531 and the mapping point 533 may be defined as l3, and the angle θ between the index line 532 and l3 may be obtained, together with the distance d1 between the near end 535 of the extended index finger and the display screen, the distance d2 between the far end 531 and the display screen 510, the coordinate position l1 of the vertical projection 534 of the near end onto the display screen 510, and the coordinate position l2 of the vertical projection of the far end 531 onto the display screen 510; from d1, d2, the angle θ between the index line 532 and l3, and l1 and l2, the position coordinates of the intersection point 541 can likewise be calculated using elementary solid geometry. In general, many computation methods can be used to obtain the position information of the mapping point 541 on the display screen 510; they are not enumerated one by one here.
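The angle-based variant can be sketched the same way: with the reference point 531 at height d2 above its projection on the screen, the pointing line reaches a horizontal distance of d2·tan(θ) from that projection. One possible reading (the azimuth unit vector is an assumption; the patent only states that the result follows from solid geometry):

```python
import math

def screen_intersection_by_angle(theta, d2, l2, azimuth):
    """Variant computation: theta is the angle between the index line
    and the vertical drop from the reference point, which sits at
    height d2 above screen point l2; azimuth is a unit (x, y) vector
    giving the horizontal pointing direction."""
    r = d2 * math.tan(theta)  # horizontal reach of the pointing line
    return (l2[0] + r * azimuth[0], l2[1] + r * azimuth[1])

# Fingertip 2 cm above (10, 10), far end 6 cm above (6, 6): the line
# drops 4 cm over sqrt(32) cm of horizontal travel, so
# theta = atan(sqrt(32) / 4), and the intersection is again (12.0, 12.0).
theta = math.atan(math.sqrt(32.0) / 4.0)
azimuth = (1 / math.sqrt(2.0), 1 / math.sqrt(2.0))
print(screen_intersection_by_angle(theta, 6.0, (6.0, 6.0), azimuth))
```

Both formulations yield the same point, which makes for a useful consistency check when the measured endpoint heights are noisy.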
The central processing unit may also be used to determine the display object 540 on the display screen 510 based on the coordinate position of the mapping point 541 on the display screen 510. Further, the display screen 510 may highlight or magnify the display object 540 at the mapping point 541. In addition, in one embodiment of the invention, when the index line 532 is perpendicular to the plane of the display screen 510, the mapping point 533 coincides with the mapping point 541; in that case, the central processing unit may further be used to determine the image at the mapping point 533 to be the display object 540 based on the position of the reference point 531 of the user's input gesture, and the display screen 510 may also highlight or magnify the display object 540. In another embodiment of the invention, the image acquisition device 520 may acquire images of the user's gesture at predetermined time intervals through the at least two cameras of the electronic equipment 500; when the position of the reference point 531 of the user's input gesture changes, the image acquisition device 520 of the electronic equipment 500 re-acquires the user's input gesture, and the central processing unit may parse the re-entered gesture and determine the other display object on the display screen 510 indicated by it.
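Once the coordinates of the mapping point 541 are known, determining the display object at that point reduces to a hit test against the bounds of the on-screen objects. A sketch assuming a simple rectangle-based layout (the data model is illustrative; the patent does not specify one):

```python
def hit_test(point, objects):
    """Return the first display object whose bounding rectangle contains
    the mapped screen point, or None if the point hits empty screen.
    Each object is a dict with a 'rect' of (x, y, width, height)."""
    px, py = point
    for obj in objects:
        x, y, w, h = obj["rect"]
        if x <= px <= x + w and y <= py <= y + h:
            return obj
    return None

objects = [{"name": "chart", "rect": (0, 0, 8, 8)},
           {"name": "map", "rect": (8, 8, 8, 8)}]
print(hit_test((12.0, 12.0), objects)["name"])  # → map
```

The object returned here would then be the one the display screen highlights or magnifies.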
It can thus be seen that, by using the electronic equipment 500 provided by the invention, the existing capabilities of the equipment allow the user, without touching the display screen and with the finger at some distance from it, to point at a target or region shown on the display screen so that the target or region is highlighted, magnified or expanded, without any additional hardware. Large-display interactive equipment thereby becomes more efficient and practical, and the user experience is improved.
Those of ordinary skill in the art will recognize that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, in computer software, or in a combination of the two. To describe this interchangeability of hardware and software clearly, the composition and steps of each example have been described above in terms of function. Whether these functions are performed in hardware or in software depends on the particular application and the design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the invention.
It should be appreciated by those skilled in the art that various modifications, combinations, partial combinations and substitutions may be made to the invention depending on design requirements and other factors, provided they fall within the scope of the appended claims and their equivalents.

Claims (12)

1. An information processing method, wherein the method comprises:
acquiring, by an image acquisition device of an electronic equipment, a first input gesture located above a display screen of the electronic equipment, wherein the first input gesture comprises a reference point and an index line, the line connecting the reference point and a first mapping point of the reference point on the display screen is perpendicular to the display screen, and the extension of the index line corresponds to a second mapping point on the display screen;
parsing the first input gesture to obtain first position information of the second mapping point on the display screen; and
determining a first display object on the display screen based on the first position information.
2. The information processing method of claim 1, wherein the reference point corresponds to the position of the end of a pointing object of the first input gesture that is far from the display screen, and the index line corresponds to the pointing direction of the pointing object; and wherein parsing the first input gesture to obtain the first position information of the second mapping point on the display screen further comprises: obtaining the distance d1 between the end of the pointing object near the display screen and the display screen, the distance d2 between the end of the pointing object far from the display screen and the display screen, the length l of the pointing object, the coordinate position l1 of the vertical projection of the near end of the pointing object onto the display screen, and the coordinate position l2 of the vertical projection of the far end of the pointing object onto the display screen; and calculating, from d1, d2, the length l of the pointing object, and l1 and l2, the coordinates of the first position at which the line along the pointing object intersects the display screen.
3. The information processing method of claim 2, further comprising: when the position of the reference point is unchanged but the direction of the index line changes, the position of the second mapping point changes with the index-line direction to a second position; in that case, it is only necessary to re-obtain the distance d3 between the near end of the pointing object and the display screen and the coordinate position l3 of the vertical projection of the near end onto the display screen, and to calculate, from d3, d2, the length l of the pointing object, and l3 and l2, second position information of the second mapping point on the display screen.
4. The information processing method of claim 1, further comprising: acquiring images of the input gesture by the image acquisition device at predetermined time intervals; and, when the position of the reference point changes, re-acquiring, by the image acquisition device of the electronic equipment, an image of a second input gesture located above the display screen of the electronic equipment, and parsing the second input gesture to determine a second display object on the display screen.
5. The information processing method of claim 1, further comprising: highlighting or magnifying the first display object on the display screen determined from the first position information.
6. The information processing method of claim 1, further comprising: when the angle between the index line and the plane of the display screen equals 90 degrees, determining the image at the first mapping point to be the first display object based on the position of the reference point of the first input gesture.
7. An electronic equipment, comprising:
a display screen for showing at least one display object;
an image acquisition device for acquiring a first input gesture located above the display screen of the electronic equipment, wherein the first input gesture comprises a reference point and an index line, the line connecting the reference point and a first mapping point of the reference point on the display screen is perpendicular to the display screen, and the extension of the index line corresponds to a second mapping point on the display screen; and
a central processing unit for parsing the first input gesture to obtain first position information of the second mapping point on the display screen, and for determining a first display object on the display screen based on the first position information.
8. The electronic equipment of claim 7, wherein the reference point corresponds to the position of the end of a pointing object of the first input gesture that is far from the display screen, and the index line corresponds to the pointing direction of the pointing object; and wherein the central processing unit is further configured to: obtain the distance d1 between the end of the pointing object near the display screen and the display screen, the distance d2 between the end of the pointing object far from the display screen and the display screen, the length l of the pointing object, the coordinate position l1 of the vertical projection of the near end of the pointing object onto the display screen, and the coordinate position l2 of the vertical projection of the far end of the pointing object onto the display screen; and calculate, from d1, d2, the length l of the pointing object, and l1 and l2, the coordinates of the first position at which the line along the pointing object intersects the display screen.
9. The electronic equipment of claim 8, wherein, when the position of the reference point is unchanged but the direction of the index line changes, the position of the second mapping point changes with the index-line direction to a second position; in that case, the central processing unit is further configured to: obtain the distance d3 between the near end of the pointing object and the display screen and the coordinate position l3 of the vertical projection of the near end onto the display screen, and calculate, from d3, d2, the length l of the pointing object, and l3 and l2, second position information of the second mapping point on the display screen.
10. The electronic equipment of claim 7, wherein:
the image acquisition device is further configured to acquire images of the input gesture at predetermined time intervals and, when the position of the reference point changes, to re-acquire an image of a second input gesture located above the display screen of the electronic equipment; and
the central processing unit is further configured to parse the second input gesture and determine a second display object on the display screen.
11. The electronic equipment of claim 7, wherein the display screen is further configured to highlight or magnify the first display object determined from the first position information.
12. The electronic equipment of claim 7, wherein the central processing unit is further configured to: when the angle between the index line and the plane of the display screen equals 90 degrees, determine the image at the first mapping point to be the first display object based on the position of the reference point of the first input gesture.
CN201510095092.0A 2015-03-04 2015-03-04 Processing method for display image and electronic equipment Pending CN104656903A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510095092.0A CN104656903A (en) 2015-03-04 2015-03-04 Processing method for display image and electronic equipment

Publications (1)

Publication Number Publication Date
CN104656903A true CN104656903A (en) 2015-05-27

Family

ID=53248136

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510095092.0A Pending CN104656903A (en) 2015-03-04 2015-03-04 Processing method for display image and electronic equipment

Country Status (1)

Country Link
CN (1) CN104656903A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106843669A (en) * 2016-12-06 2017-06-13 北京小度信息科技有限公司 Application interface operating method and device
CN109828660A (en) * 2018-12-29 2019-05-31 深圳云天励飞技术有限公司 A kind of method and device of the control application operating based on augmented reality
CN112114732A (en) * 2020-09-18 2020-12-22 歌尔科技有限公司 Screen content amplifying method and device and computer readable storage medium
CN112532874A (en) * 2020-11-23 2021-03-19 北京三快在线科技有限公司 Method and device for generating plane thermodynamic diagram, storage medium and electronic equipment
CN114089836A (en) * 2022-01-20 2022-02-25 中兴通讯股份有限公司 Labeling method, terminal, server and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1904806A (en) * 2006-07-28 2007-01-31 上海大学 System and method of contactless position input by hand and eye relation guiding
CN102841679A (en) * 2012-05-14 2012-12-26 乐金电子研发中心(上海)有限公司 Non-contact man-machine interaction method and device
US20130267318A1 (en) * 1997-08-22 2013-10-10 Motion Games, Llc Advanced video gaming methods for education and play using camera based inputs
CN103370678A (en) * 2011-02-18 2013-10-23 维塔驰有限公司 Virtual touch device without pointer


Similar Documents

Publication Publication Date Title
CN104656903A (en) Processing method for display image and electronic equipment
JP5422724B1 (en) Electronic apparatus and drawing method
US20160349983A1 (en) Terminal screen shot method and terminal
JP2016053799A5 (en)
US9047001B2 (en) Information processing apparatus, information processing method, and program
CN104020944A (en) Data input method based on sliding block
US20160196034A1 (en) Touchscreen Control Method and Terminal Device
JP5389241B1 (en) Electronic device and handwritten document processing method
US10956030B2 (en) Multi-touch based drawing input method and apparatus
US8954873B2 (en) Information processing apparatus, information processing method and computer readable medium
US9747025B2 (en) Modifying key size on a touch screen based on fingertip location
CN109806585B (en) Game display control method, device, equipment and storage medium
US20140006941A1 (en) Method, system and computer program product for editing a displayed rendering of symbols
JP6202874B2 (en) Electronic device, calibration method and program
EP2767897B1 (en) Method for generating writing data and an electronic device thereof
CN104007920A (en) Method for selecting waveforms on electronic test equipment
CN102750035B (en) The determination method and apparatus of display position of cursor
JP2009110135A (en) Object selecting device
US20190250814A1 (en) Segment Length Measurement Using a Touch Screen System in Response to Gesture Input
CN105511772A (en) Method, device and mobile terminal using gesture operation to trigger touch screen button
TWI419011B (en) Method and system for tracking touch point
CN111506280A (en) Graphical user interface for indicating off-screen points of interest
KR102133262B1 (en) Display integrated input Apparatus
JP5683764B1 (en) Data input system, data input method, data input program, and data input device
CN104808922B (en) control method and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20150527)