CN104346094A - Display processing method and display processing equipment - Google Patents

Display processing method and display processing equipment

Info

Publication number
CN104346094A
CN104346094A
Authority
CN
China
Prior art keywords
viewing area
display processing
area
operating body
contact area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310341711.0A
Other languages
Chinese (zh)
Other versions
CN104346094B (en)
Inventor
陈柯
曾志伟
谢晓辉
卢睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201810153986.4A priority Critical patent/CN108132743B/en
Priority to CN201310341711.0A priority patent/CN104346094B/en
Publication of CN104346094A publication Critical patent/CN104346094A/en
Application granted granted Critical
Publication of CN104346094B publication Critical patent/CN104346094B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82Protecting input, output or interconnection devices
    • G06F21/84Protecting input, output or interconnection devices output devices, e.g. displays or monitors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The invention provides display processing equipment and a display processing method, which are applied to electronic equipment having a touch display region. The display processing method comprises the following steps: obtaining display content; detecting whether an operating body is in contact with the touch display region; when an operating body is detected to be in contact with the touch display region, judging whether a specific part of the operating body has left the touch display region; when the specific part is judged to have left the touch display region, determining a target display region within the touch display region; and displaying the display content in the target display region.

Description

Display processing method and display processing device
Technical field
The present invention relates to the field of graphics processing and, more specifically, to a display processing method and a display processing device.
Background art
In recent years, the use of electronic devices with touch display screens has become increasingly widespread. In such devices, especially when the touch display screen is large, the content displayed by the device is easily seen by other people near the user. Therefore, when a user views personal information on the touch display screen, how to keep that private information from being spied on by people nearby has become a problem of growing concern.
Summary of the invention
In view of the above situation, the invention provides a display processing method and a display processing device that allow a user to view private content in a natural way that matches ordinary operating habits and cognitive habits, so that the personal information being viewed is effectively protected from being seen by people nearby.
According to one embodiment of the invention, a display processing method is provided, applied to an electronic device having a touch display region. The display processing method comprises: obtaining display content; detecting whether an operating body is in contact with the touch display region; when an operating body is detected to be in contact with the touch display region, judging whether a specific part of the operating body has left the touch display region; when the specific part is judged to have left the touch display region, determining a target display region within the touch display region; and displaying the display content in the target display region.
The step of judging whether the specific part of the operating body has left the touch display region comprises: obtaining a preset template; detecting the contour of the contact region between the operating body and the touch display region; and matching the contour against the preset template to determine whether the specific part has left the touch display region.
The display processing method further comprises: determining an initial contact region between the operating body and the touch display region. The step of determining the target display region comprises: when the specific part is detected to have left the touch display region, determining a current contact region between the touch display region and the remainder of the operating body other than the specific part; and determining the target display region based on the initial contact region and the current contact region.
The step of determining the target display region comprises: determining, as the target display region, the portion of the initial contact region other than the current contact region.
The display processing method further comprises: calculating an initial contact area between the operating body and the touch display region. The step of determining the target display region further comprises: when the specific part is detected to have left the touch display region, calculating a current contact area between the touch display region and the remainder of the operating body other than the specific part; judging whether a predetermined condition is satisfied between the current contact area and the initial contact area; and, when the predetermined condition is satisfied, determining the target display region.
The step of judging whether the predetermined condition is satisfied between the current contact area and the initial contact area comprises: judging whether the ratio of the current contact area to the initial contact area is less than a specific threshold.
The display processing method further comprises: detecting the contour of the contact region between the operating body and the touch display region; and determining, based on the contour, the display direction of the display content. The step of displaying the display content in the target display region then comprises: displaying the display content in the target display region according to the display direction.
According to another embodiment of the present invention, a display processing device is provided, applied to an electronic device having a touch display region. The display processing device comprises: an acquiring unit that obtains display content; a detecting unit that detects whether an operating body is in contact with the touch display region; a judging unit that, when an operating body is detected to be in contact with the touch display region, judges whether a specific part of the operating body has left the touch display region; a display region determining unit that, when the specific part is judged to have left the touch display region, determines a target display region within the touch display region; and a display unit that displays the display content in the target display region.
The judging unit comprises: a template acquiring unit that obtains a preset template; a contour detecting unit that detects the contour of the contact region between the operating body and the touch display region; and a matching unit that matches the contour against the preset template to determine whether the specific part has left the touch display region.
The display processing device further comprises: a first contact region determining unit that determines an initial contact region between the operating body and the touch display region. The display region determining unit comprises: a second contact region determining unit that, when the specific part is detected to have left the touch display region, determines a current contact region between the touch display region and the remainder of the operating body other than the specific part; and a target display region determining unit that determines the target display region based on the initial contact region and the current contact region.
The target display region determining unit is configured to determine, as the target display region, the portion of the initial contact region other than the current contact region.
The display processing device further comprises: a first calculating unit that calculates an initial contact area between the operating body and the touch display region. The display region determining unit further comprises: a second calculating unit that, when the specific part is detected to have left the touch display region, calculates a current contact area between the touch display region and the remainder of the operating body other than the specific part; a condition judging unit that judges whether a predetermined condition is satisfied between the current contact area and the initial contact area; and a target display region determining unit that determines the target display region when the predetermined condition is satisfied.
The condition judging unit is configured to judge whether the ratio of the current contact area to the initial contact area is less than a specific threshold.
The display processing device further comprises: a contour detecting unit that detects the contour of the contact region between the operating body and the touch display region; and a direction determining unit that determines, based on the contour, the display direction of the display content. The display unit is configured to display the display content in the target display region according to the display direction.
In the display processing method and display processing device of the embodiments of the present invention, the obtained display content is displayed in a specific target region only when it is determined that an operating body (such as the user's palm) is in contact with the touch display screen and that a specific part of the operating body (such as the thumb of the user's palm) has left the touch display screen. The user's privacy is thereby effectively protected: the personal information being viewed is not seen by other people near the electronic device. The display processing method and display processing device of the embodiments are especially advantageous when several people simultaneously use an electronic device with a large touch display screen.
Brief description of the drawings
Fig. 1 is a flowchart illustrating a display processing method according to an embodiment of the present invention;
Fig. 2A and Fig. 2B are schematic diagrams illustrating a user operation to which the display processing method of the embodiment of the present invention is applied;
Fig. 3A and Fig. 3B are schematic diagrams illustrating the application of the display processing method of the embodiment of the present invention; and
Fig. 4 is a schematic diagram illustrating the main configuration of a display processing device according to an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below with reference to the accompanying drawings.
First, a display processing method according to an embodiment of the present invention is described.
The display processing method of the embodiment of the present invention is applied to an electronic device. The electronic device may be a relatively small device such as a tablet computer or a smartphone, or a relatively large device such as an all-in-one machine or a desktop computer. The electronic device has a touch-sensitive display unit (a touch display screen) that provides a touch display region. The touch display screen can receive the user's touch input and display various display content in the touch display region. The touch-sensitive display unit may be implemented as any of various forms of touch display screen, such as capacitive or resistive.
The display processing method of the embodiment of the present invention is described in detail below with reference to Fig. 1.
In the display processing method of the embodiment of the present invention, as shown in Fig. 1, first, in step S101, the display processing method obtains display content. Specifically, the display processing method may obtain the display content from a storage unit built into or connected to the electronic device in response to a user operation instruction (such as a display command). The display content may take various forms, including still images, moving images, and text. In particular, the display content may be content related to the user's privacy, such as the user's personal information or contact list.
Meanwhile, in step S102, the display processing method detects whether an operating body is in contact with the touch display region. The operating body is, for example, the user's palm. The manner in which the display processing method detects the contact of an operating body via the touch-sensitive display unit is known to those skilled in the art and is not described in detail here. If no operating body is detected to be in contact with the touch display region, the display processing method ends. Alternatively, the display processing method repeats the detection of step S102 until the contact of an operating body is detected.
When an operating body is detected to be in contact with the touch display region, the display processing method proceeds to step S103 and judges whether a specific part of the operating body has left the touch display region. Specifically, when the operating body is the user's palm, the display processing method judges, for example, whether a specific part of the palm (e.g., the part corresponding to the thumb) has left the touch display region.
In a concrete example, a preset template, such as a template of the user's palm, may be stored in advance in the electronic device. In that case, the display processing method obtains the preset template on the one hand and, on the other hand, detects the contour of the contact region between the operating body and the touch display region. The display processing method then matches the contour against the preset template to determine whether the specific part has left the touch display region.
More specifically, after detecting the contour of the contact region, the display processing method may extract a number of feature points from the contour as the feature information of the contour. The feature points may be extracted from the contour using any existing algorithm, which is not described in detail here. Likewise, the preset template may comprise the feature information of a plurality of preset feature points. The display processing method then compares the feature information of the contour with the preset feature information to judge whether the specific part has left the touch display region. For example, the preset template may comprise feature information corresponding to the thumb part, the index finger part, the middle finger part, the ring finger part, and the little finger part. When the display processing method judges, based on the comparison, that the contour only contains feature information corresponding to the index, middle, ring, and little finger parts, it judges that the specific part (the thumb) has left the touch display region. The above example takes the thumb part as the specific part. Of course, according to design needs and user operating habits, the specific part may also be, for example, the thumb part together with the index finger part, or even every part except the little finger part, and so on.
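By way of illustration only (this sketch is not part of the original disclosure), the comparison described above can be expressed roughly as the following Python fragment; the template labels, the upstream feature-matching step that produces matched_labels, and all function names are assumptions introduced for the example.

    # Illustrative sketch: judge whether the specific part (here, the thumb) has
    # left the touch display region by comparing the parts matched in the current
    # contact-region contour against a preset palm template.
    PRESET_TEMPLATE = {"thumb", "index", "middle", "ring", "little"}
    SPECIFIC_PART = {"thumb"}

    def specific_part_has_left(matched_labels):
        """matched_labels: the template parts whose feature points were matched in
        the contour of the current contact region (hypothetical upstream step)."""
        missing = PRESET_TEMPLATE - set(matched_labels)
        # The specific part is judged to have left when exactly its labels are
        # absent while the rest of the palm still matches the template.
        return missing == SPECIFIC_PART

    # Example: only the index, middle, ring, and little finger parts still match.
    print(specific_part_has_left({"index", "middle", "ring", "little"}))  # True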
Of course, the above is only one example of the judgment, and those skilled in the art may, on the basis of the above teaching, judge in other ways whether the specific part has left the touch display region. For example, the thumb usually accounts for a roughly constant proportion of the area of the whole palm, and the display processing method may preset a first threshold based on that proportion. After detecting the contact of the operating body, the display processing method calculates the area of the initial contact region. It then calculates the area of the contact region in real time as the contact region changes, and judges whether the ratio of the reduction in contact area to the area of the initial contact region meets or exceeds the first threshold. When the ratio meets or exceeds the first threshold, the display processing method judges that the specific part has left the touch display region.
Alternatively, the display processing method may calculate the current contact area between the touch display region and the remainder of the operating body other than the specific part, and judge whether a predetermined condition is satisfied between the current contact area and the initial contact area; the target display region is then determined when the predetermined condition is satisfied. Specifically, similarly to the above, the display processing method may preset a second threshold and judge whether the ratio of the current contact area to the initial contact area is less than the second threshold. When the ratio is judged to be less than the second threshold, the display processing method judges that the specific part has left the touch display region.
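The two area-based judgments above can likewise be sketched as follows (illustrative only; the threshold values and function names are placeholders, not values given in the disclosure):

    FIRST_THRESHOLD = 0.15   # assumed proportion of the palm area accounted for by the thumb
    SECOND_THRESHOLD = 0.85  # assumed upper bound on the current/initial area ratio

    def left_by_area_reduction(initial_area, current_area):
        # First variant: the reduction in contact area, relative to the initial
        # contact area, meets or exceeds the first threshold.
        return (initial_area - current_area) / initial_area >= FIRST_THRESHOLD

    def left_by_area_ratio(initial_area, current_area):
        # Second variant: the ratio of the current contact area to the initial
        # contact area is less than the second threshold.
        return current_area / initial_area < SECOND_THRESHOLD

    # Example: the palm initially covers 120 area units; 98 remain after the thumb lifts.
    print(left_by_area_reduction(120.0, 98.0))  # True (reduction ratio is about 0.18)
    print(left_by_area_ratio(120.0, 98.0))      # True (ratio is about 0.82)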
When it is judged that the specific part has left the touch display region, the display processing method proceeds to step S104 and determines a target display region within the touch display region.
Specifically, in one example, as described above, the display processing method may determine the initial contact region between the operating body and the touch display region. Then, when the specific part is detected to have left the touch display region, the display processing method determines the current contact region between the touch display region and the remainder of the operating body other than the specific part, and determines the target display region based on the initial contact region and the current contact region. More specifically, the display processing method may determine, as the target display region, the portion of the initial contact region other than the current contact region.
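As an illustrative sketch (the grid representation of contact regions and the names are assumptions introduced here), this subtraction can be written as a set difference over the sensor cells each contact covers:

    def determine_target_region(initial_region, current_region):
        """Target display region = the portion of the initial contact region no
        longer covered by the current contact region (e.g. where the thumb was)."""
        return set(initial_region) - set(current_region)

    # Toy example on a 1 x 6 strip of cells: the palm covered cells 0-5 and the
    # thumb covered cells 4-5, so cells 4-5 become the target display region.
    initial = {(0, c) for c in range(6)}
    current = {(0, c) for c in range(4)}
    print(sorted(determine_target_region(initial, current)))  # [(0, 4), (0, 5)]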
In another example, the display processing method may determine, by means of an element such as a camera, the current angle between the user's palm and the touch display region, and determine the target display region based on the initial contact region and that angle. Specifically, the display processing method may determine, as the target display region, the region onto which the palm currently projects on the touch display screen. In yet another example, the display processing method may further determine the target display region based on that projection region and the current contact region between the user's palm and the touch display screen; for instance, the region obtained by removing the current contact region from the projection region may be determined as the target display region.
Of course, those skilled in the art may, on the basis of the above teaching, determine the target display region in various other ways, all of which fall within the scope of the present invention.
After the target display region has been determined, in step S105, the display processing method displays the display content in the target display region.
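The overall flow of steps S101 to S105 can be summarized in the following sketch; the helper callables are assumed to be supplied by the electronic device's touch and display subsystems and are not part of the original disclosure.

    def run_display_processing(get_display_content, get_contact_region,
                               judge_specific_part_left, determine_target_region,
                               show_in_region):
        content = get_display_content()          # S101: obtain the display content
        initial_region = get_contact_region()    # S102: detect operating-body contact
        if not initial_region:
            return                               # no contact detected: end (or retry S102)
        while True:
            current_region = get_contact_region()
            if not current_region:
                return                           # palm fully lifted: stop without showing
            if judge_specific_part_left(initial_region, current_region):          # S103
                target = determine_target_region(initial_region, current_region)  # S104
                show_in_region(content, target)  # S105: display content in the target region
                return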
The display processing method of the embodiment of the present invention has been described above with reference to Fig. 1. With this display processing method, the obtained display content is displayed in a specific target region only when a specific part of the operating body (such as the thumb of the user's palm) is determined to have left the touch display screen, so that the user's privacy is effectively protected and the personal information being viewed is not seen by other people near the electronic device. The display processing method and display processing device of the embodiments are especially advantageous when several people simultaneously use an electronic device with a large touch display screen.
In addition, to further improve the user's operating experience, the display processing method may also determine the display direction of the display content according to the contact between the operating body and the touch display region, and display the content in the corresponding direction.
Specifically, as described above, the display processing method may detect the contour of the contact region between the operating body and the touch display region and determine, based on the contour, the display direction of the display content. In particular, the feature-point information in the preset template obtained by the display processing method may be associated with specific direction information, so that when the contour is compared with the preset template, the display direction can be determined from the result of the comparison. The display processing method can then display the display content in the target display region according to the determined display direction.
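One possible illustration (the rotation-reporting matcher and the snapping to four screen orientations are assumptions introduced here, not details from the disclosure) is:

    ORIENTATIONS = (0, 90, 180, 270)  # candidate display directions, in degrees

    def display_direction(template_rotation_deg):
        """Snap the rotation between the preset palm template and the detected
        contour (as reported by a hypothetical matching step) to the nearest
        candidate display direction."""
        rotation = template_rotation_deg % 360
        return min(ORIENTATIONS,
                   key=lambda o: min(abs(rotation - o), 360 - abs(rotation - o)))

    print(display_direction(75))   # 90: content is rotated to face the user's palm
    print(display_direction(-20))  # 0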
In addition, to further protect the user's private content, the display processing method may end the display of the display content after detecting that the user's palm has completely left the touch display region, that is, the private content is no longer shown.
Fig. 2A and Fig. 2B are schematic diagrams illustrating a user operation to which the display processing method of the embodiment of the present invention is applied.
As shown in Fig. 2A, when the user wishes to display private content without it being seen by others, the user can first cover the touch display region 200 of the electronic device with the whole operating body (e.g., the user's palm 201) before the display is performed. On detecting the contact between the user's palm and the touch display region, the display processing method judges whether a specific part of the palm (e.g., the part corresponding to the thumb) has left the touch display region. When it is detected that the user's thumb has left the touch display region (as shown in Fig. 2B), the display processing method determines a target display region, such as the region 202 in Fig. 2B, through the process described with reference to Fig. 1, and displays the user's private content "XXXX" in that target display region. In addition, as described above, the display processing method may also determine the display direction of the private content according to the direction of the user's palm and display the private content in the target display region according to that display direction.
Fig. 3A and Fig. 3B are schematic diagrams illustrating the application of the display processing method of the embodiment of the present invention. Fig. 3A shows the initial contact region 301 determined by the display processing method for the user operation of Fig. 2A. Fig. 3B shows, for the user operation of Fig. 2B, the current contact region 302 and the target display region 303 determined by the process described above, together with the display content "XXXX" shown in the target display region 303. Note that in the example shown in Fig. 3B, the portion of the initial contact region other than the current contact region is determined as the target display region. Alternatively, as described above, a projection region may be determined from the current angle between the user's palm and the touch screen, and that projection region may be determined as the target display region.
The display processing method of the embodiment of the present invention has been described above with reference to Figs. 1 to 3. With this display processing method, the user can view private content in a natural way that matches ordinary operating habits and cognitive habits (for example, by gradually lifting the palm) without the content being seen by people nearby. The display processing method is particularly advantageous when several people simultaneously operate an electronic device with a relatively large touch display screen.
Fig. 4 is the schematic diagram of the main configuration of the display processing device of the diagram embodiment of the present invention.
The display processing device of the embodiment of the present invention is applied to an electronic device. The electronic device may be a relatively small device such as a tablet computer or a smartphone, or a relatively large device such as an all-in-one machine or a desktop computer. The electronic device has a touch-sensitive display unit (a touch display screen) that provides a touch display region. The touch display screen can receive the user's touch input and display various display content in the touch display region. The touch-sensitive display unit may be implemented as any of various forms of touch display screen, such as capacitive or resistive.
Specifically, as shown in Fig. 4, the display processing device 400 of the embodiment of the present invention comprises: an acquiring unit 401, a detecting unit 402, a judging unit 403, a display region determining unit 404, and a display unit 405.
The acquiring unit 401 obtains display content. The detecting unit 402 detects whether an operating body is in contact with the touch display region. The judging unit 403, when an operating body is detected to be in contact with the touch display region, judges whether a specific part of the operating body has left the touch display region. The display region determining unit 404, when the specific part is judged to have left the touch display region, determines a target display region within the touch display region. The display unit 405 displays the display content in the target display region.
In one embodiment, the judging unit 403 comprises a template acquiring unit, a contour detecting unit, and a matching unit (not shown). The template acquiring unit obtains a preset template. The contour detecting unit detects the contour of the contact region between the operating body and the touch display region. The matching unit matches the contour against the preset template to determine whether the specific part has left the touch display region.
In another embodiment, the display processing device 400 further comprises a first contact region determining unit that determines an initial contact region between the operating body and the touch display region. The display region determining unit 404 comprises: a second contact region determining unit that, when the specific part is detected to have left the touch display region, determines the current contact region between the touch display region and the remainder of the operating body other than the specific part; and a target display region determining unit that determines the target display region based on the initial contact region and the current contact region.
In another embodiment, the target display region determining unit is configured to determine, as the target display region, the portion of the initial contact region other than the current contact region.
In another embodiment, the display processing device 400 further comprises a first calculating unit that calculates an initial contact area between the operating body and the touch display region. The display region determining unit further comprises: a second calculating unit that, when the specific part is detected to have left the touch display region, calculates the current contact area between the touch display region and the remainder of the operating body other than the specific part; a condition judging unit that judges whether a predetermined condition is satisfied between the current contact area and the initial contact area; and a target display region determining unit that determines the target display region when the predetermined condition is satisfied.
In another embodiment, the condition judging unit is configured to judge whether the ratio of the current contact area to the initial contact area is less than a specific threshold.
In another embodiment, the display processing device 400 further comprises: a contour detecting unit that detects the contour of the contact region between the operating body and the touch display region; and a direction determining unit that determines, based on the contour, the display direction of the display content. The display unit is configured to display the display content in the target display region according to the display direction.
The configuration and operation of each unit of the display processing device 400 have already been described in detail in connection with the display processing method of Fig. 1 and are not repeated here.
The display processing device of the embodiment of the present invention has been described above with reference to Fig. 4. With this display processing device, the obtained display content is displayed in a specific target region only when a specific part of the operating body (such as the thumb of the user's palm) is determined to have left the touch display screen, so that the user's privacy is effectively protected and the personal information being viewed is not seen by other people near the electronic device. The display processing method and display processing device of the embodiments are especially advantageous when several people simultaneously use an electronic device with a large touch display screen.
The display processing method and the display processing device according to embodiments of the present invention have been described above with reference to Figs. 1 to 4.
It should be noted that, in this specification, the terms "comprise", "include", and any variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that comprises a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Unless otherwise restricted, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or device that comprises that element.
It should also be noted that, in this specification, expressions such as "first ... unit" and "second ... unit" are used only for convenience of description and do not mean that they must be implemented as two or more physically separate units. In fact, as needed, such units may be implemented together as a single unit or implemented as multiple units.
Finally, it should also be noted that the series of processes described above includes not only processes performed in the chronological order described here, but also processes performed in parallel or individually rather than chronologically. For example, in the display processing method shown in Fig. 1, step S101 and step S102 are not limited to being performed in the order shown in Fig. 1, but may be performed in parallel or in the reverse order.
From the above description of the embodiments, those skilled in the art will clearly understand that the present invention can be implemented by software together with the necessary hardware platform, or entirely by hardware. Based on this understanding, all or part of the contribution of the technical solution of the present invention over the background art can be embodied in the form of a software product. The computer software product can be stored in a storage medium, such as a ROM/RAM, magnetic disk, or optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the method described in the embodiments of the present invention or in certain parts of the embodiments.
In embodiments of the present invention, units/modules can be implemented in software for execution by various types of processors. For example, an identified module of executable code may comprise one or more physical or logical blocks of computer instructions, which may, for instance, be built as an object, procedure, or function. Nevertheless, the executable code of an identified module need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, constitute the unit/module and achieve the stated purpose of that unit/module.
When a unit/module can be implemented in software, considering the level of existing hardware technology, those skilled in the art can, cost aside, build corresponding hardware circuits to realize the corresponding functions. Such hardware circuits include conventional very-large-scale integration (VLSI) circuits or gate arrays, as well as existing semiconductors or other discrete elements such as logic chips and transistors. A module can also be implemented with programmable hardware devices, such as field-programmable gate arrays, programmable logic arrays, and programmable logic devices.
The present invention has been described in detail above. Specific examples are used herein to explain the principles and embodiments of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, those of ordinary skill in the art may, according to the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the contents of this description should not be construed as limiting the present invention.

Claims (14)

1. A display processing method, applied to an electronic device, the electronic device having a touch display region, the display processing method comprising:
obtaining display content;
detecting whether an operating body is in contact with the touch display region;
when an operating body is detected to be in contact with the touch display region, judging whether a specific part of the operating body has left the touch display region;
when the specific part is judged to have left the touch display region, determining a target display region within the touch display region; and
displaying the display content in the target display region.
2. The display processing method as claimed in claim 1, wherein the step of judging whether the specific part of the operating body has left the touch display region comprises:
obtaining a preset template;
detecting the contour of the contact region between the operating body and the touch display region; and
matching the contour against the preset template to determine whether the specific part has left the touch display region.
3. The display processing method as claimed in claim 1, further comprising:
determining an initial contact region between the operating body and the touch display region;
wherein the step of determining the target display region comprises:
when the specific part is detected to have left the touch display region, determining a current contact region between the touch display region and the remainder of the operating body other than the specific part; and
determining the target display region based on the initial contact region and the current contact region.
4. The display processing method as claimed in claim 3, wherein the step of determining the target display region comprises:
determining, as the target display region, the portion of the initial contact region other than the current contact region.
5. The display processing method as claimed in claim 1, further comprising:
calculating an initial contact area between the operating body and the touch display region;
wherein the step of determining the target display region further comprises:
when the specific part is detected to have left the touch display region, calculating a current contact area between the touch display region and the remainder of the operating body other than the specific part;
judging whether a predetermined condition is satisfied between the current contact area and the initial contact area; and
when the predetermined condition is satisfied between the current contact area and the initial contact area, determining the target display region.
6. The display processing method as claimed in claim 5, wherein the step of judging whether the predetermined condition is satisfied between the current contact area and the initial contact area comprises:
judging whether the ratio of the current contact area to the initial contact area is less than a specific threshold.
7. The display processing method as claimed in claim 1, further comprising:
detecting the contour of the contact region between the operating body and the touch display region; and
determining, based on the contour, the display direction of the display content;
wherein the step of displaying the display content in the target display region comprises:
displaying the display content in the target display region according to the display direction.
8. A display processing device, applied to an electronic device, the electronic device having a touch display region, the display processing device comprising:
an acquiring unit that obtains display content;
a detecting unit that detects whether an operating body is in contact with the touch display region;
a judging unit that, when an operating body is detected to be in contact with the touch display region, judges whether a specific part of the operating body has left the touch display region;
a display region determining unit that, when the specific part is judged to have left the touch display region, determines a target display region within the touch display region; and
a display unit that displays the display content in the target display region.
9. The display processing device as claimed in claim 8, wherein the judging unit comprises:
a template acquiring unit that obtains a preset template;
a contour detecting unit that detects the contour of the contact region between the operating body and the touch display region; and
a matching unit that matches the contour against the preset template to determine whether the specific part has left the touch display region.
10. The display processing device as claimed in claim 8, further comprising:
a first contact region determining unit that determines an initial contact region between the operating body and the touch display region;
wherein the display region determining unit comprises:
a second contact region determining unit that, when the specific part is detected to have left the touch display region, determines a current contact region between the touch display region and the remainder of the operating body other than the specific part; and
a target display region determining unit that determines the target display region based on the initial contact region and the current contact region.
11. The display processing device as claimed in claim 10, wherein the target display region determining unit is configured to determine, as the target display region, the portion of the initial contact region other than the current contact region.
12. The display processing device as claimed in claim 8, further comprising:
a first calculating unit that calculates an initial contact area between the operating body and the touch display region;
wherein the display region determining unit further comprises:
a second calculating unit that, when the specific part is detected to have left the touch display region, calculates a current contact area between the touch display region and the remainder of the operating body other than the specific part;
a condition judging unit that judges whether a predetermined condition is satisfied between the current contact area and the initial contact area; and
a target display region determining unit that determines the target display region when the predetermined condition is satisfied between the current contact area and the initial contact area.
13. The display processing device as claimed in claim 12, wherein the condition judging unit is configured to judge whether the ratio of the current contact area to the initial contact area is less than a specific threshold.
14. The display processing device as claimed in claim 8, further comprising:
a contour detecting unit that detects the contour of the contact region between the operating body and the touch display region; and
a direction determining unit that determines, based on the contour, the display direction of the display content;
wherein the display unit is configured to display the display content in the target display region according to the display direction.
CN201310341711.0A 2013-08-07 2013-08-07 Display processing method and display processing device Active CN104346094B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810153986.4A CN108132743B (en) 2013-08-07 2013-08-07 Display processing method and display processing apparatus
CN201310341711.0A CN104346094B (en) 2013-08-07 2013-08-07 Display processing method and display processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310341711.0A CN104346094B (en) 2013-08-07 2013-08-07 Display processing method and display processing device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201810153986.4A Division CN108132743B (en) 2013-08-07 2013-08-07 Display processing method and display processing apparatus

Publications (2)

Publication Number Publication Date
CN104346094A true CN104346094A (en) 2015-02-11
CN104346094B CN104346094B (en) 2018-02-27

Family

ID=52501821

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201310341711.0A Active CN104346094B (en) 2013-08-07 2013-08-07 Display processing method and display processing device
CN201810153986.4A Active CN108132743B (en) 2013-08-07 2013-08-07 Display processing method and display processing apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810153986.4A Active CN108132743B (en) 2013-08-07 2013-08-07 Display processing method and display processing apparatus

Country Status (1)

Country Link
CN (2) CN104346094B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105808141A (en) * 2016-03-04 2016-07-27 联想(北京)有限公司 Information processing method and electronic device
CN106155535A (en) * 2015-03-23 2016-11-23 联想(北京)有限公司 Touch screen soft-keyboard input method and device
CN108932101A (en) * 2017-05-27 2018-12-04 南宁富桂精密工业有限公司 interface control method, electronic device and computer readable storage medium
CN111399691A (en) * 2020-04-26 2020-07-10 Oppo广东移动通信有限公司 Screen touch detection method, mobile terminal and computer storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080168403A1 (en) * 2007-01-06 2008-07-10 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US20100026649A1 (en) * 2008-07-31 2010-02-04 Canon Kabushiki Kaisha Information processing apparatus and control method thereof
US20110074703A1 (en) * 2009-09-30 2011-03-31 Black Jonathan S Multi-touch surface interaction
CN102591517A (en) * 2010-12-17 2012-07-18 Lg电子株式会社 Mobile terminal and method for controlling the same
CN103019563A (en) * 2012-12-11 2013-04-03 广东欧珀移动通信有限公司 Method or device for rapidly suspending or muting video player
CN103106040A (en) * 2013-02-25 2013-05-15 广东欧珀移动通信有限公司 Mobile terminal operation method and device using the same
CN103186331A (en) * 2011-12-28 2013-07-03 宇龙计算机通信科技(深圳)有限公司 Display method and system for intelligent terminal interface

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5764222A (en) * 1996-05-28 1998-06-09 International Business Machines Corporation Virtual pointing device for touchscreens
CN101464750B (en) * 2009-01-14 2011-07-13 苏州瀚瑞微电子有限公司 Method for gesture recognition through detecting induction area of touch control panel
CN101556524A (en) * 2009-05-06 2009-10-14 苏州瀚瑞微电子有限公司 Display method for controlling magnification by sensing area and gesture operation
JP2011134271A (en) * 2009-12-25 2011-07-07 Sony Corp Information processor, information processing method, and program
CN103164067B (en) * 2011-12-19 2016-04-27 联想(北京)有限公司 Judge the method and the electronic equipment that touch input
CN102999266B (en) * 2012-11-23 2015-09-30 广东欧珀移动通信有限公司 A kind of mobile terminal screen screenshotss method and system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080168403A1 (en) * 2007-01-06 2008-07-10 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US20100026649A1 (en) * 2008-07-31 2010-02-04 Canon Kabushiki Kaisha Information processing apparatus and control method thereof
US20110074703A1 (en) * 2009-09-30 2011-03-31 Black Jonathan S Multi-touch surface interaction
CN102591517A (en) * 2010-12-17 2012-07-18 Lg电子株式会社 Mobile terminal and method for controlling the same
CN103186331A (en) * 2011-12-28 2013-07-03 宇龙计算机通信科技(深圳)有限公司 Display method and system for intelligent terminal interface
CN103019563A (en) * 2012-12-11 2013-04-03 广东欧珀移动通信有限公司 Method or device for rapidly suspending or muting video player
CN103106040A (en) * 2013-02-25 2013-05-15 广东欧珀移动通信有限公司 Mobile terminal operation method and device using the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mike Wu et al., "Multi-Finger and Whole Hand Gestural Interaction Techniques for Multi-User Tabletop Displays", UIST '03: Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106155535A (en) * 2015-03-23 2016-11-23 联想(北京)有限公司 Touch screen soft-keyboard input method and device
CN105808141A (en) * 2016-03-04 2016-07-27 联想(北京)有限公司 Information processing method and electronic device
CN108932101A (en) * 2017-05-27 2018-12-04 南宁富桂精密工业有限公司 interface control method, electronic device and computer readable storage medium
CN108932101B (en) * 2017-05-27 2020-08-28 南宁富桂精密工业有限公司 Interface control method, electronic device and computer readable storage medium
CN111399691A (en) * 2020-04-26 2020-07-10 Oppo广东移动通信有限公司 Screen touch detection method, mobile terminal and computer storage medium
CN111399691B (en) * 2020-04-26 2023-10-10 Oppo广东移动通信有限公司 Screen touch detection method, mobile terminal and computer storage medium

Also Published As

Publication number Publication date
CN104346094B (en) 2018-02-27
CN108132743A (en) 2018-06-08
CN108132743B (en) 2020-12-18

Similar Documents

Publication Publication Date Title
CN102819331B (en) Mobile terminal and touch inputting method thereof
US9514311B2 (en) System and method for unlocking screen
US20180203568A1 (en) Method for Enabling Function Module of Terminal, and Terminal Device
CN106775084A (en) A kind of false-touch prevention method of touch-screen, device and mobile terminal
US20130050133A1 (en) Method and apparatus for precluding operations associated with accidental touch inputs
US9778789B2 (en) Touch rejection
AU2017203910B2 (en) Glove touch detection
CN106372544B (en) Temporary secure access via an input object held in place
US20130191786A1 (en) Method of performing a switching operation through a gesture inputted to an electronic device
CN102707861B (en) Electronic equipment and display method thereof
US9298324B1 (en) Capacitive touch with tactile feedback
CN106572207B (en) Device and method for identifying single-hand mode of terminal
EP2703970A1 (en) Apparatus and method for processing input on touch screen
CN107077284B (en) Gripping mode determining device
CN106250876A (en) A kind of fingerprint identification method and terminal
CN104346094A (en) Display processing method and display processing equipment
JP2008165575A (en) Touch panel device
CN105389053A (en) Touch display device, controller, and operation method
CN103543933A (en) Method for selecting files and touch terminal
US20140218336A1 (en) Capacitive touch screen and information processing method therefor
CN103631505B (en) Information processing equipment and character input display method
JP6504937B2 (en) INPUT DETERMINATION DEVICE, CONTROL PROGRAM, ELECTRONIC DEVICE, AND INPUT THRESHOLD CALIBRATING METHOD OF INPUT DETERMINATION DEVICE
CN105320861A (en) Mobile terminal and unlocking method and system therefor
CN114816213A (en) Operation identification method and device, electronic equipment and readable storage medium
CN116708662A (en) Electronic equipment state prompting method and device and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant