CN102402282B - Information processing apparatus and information processing method - Google Patents

Information processing apparatus and information processing method

Info

Publication number
CN102402282B
Authority
CN
China
Prior art keywords
display
touch panel
member
user
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201110260390.2A
Other languages
Chinese (zh)
Other versions
CN102402282A
Inventor
Shunichi Kasahara
Ritsuko Kano
Tomoya Narita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP 2010-199639 (granted as JP 5732784 B2)
Application filed by Sony Corp
Publication of CN102402282A
Application granted
Publication of CN102402282B


Abstract

The present disclosure relates to an information processing apparatus and an information processing method. The information processing apparatus includes a first detecting unit configured to determine whether a user control member is within a first threshold distance of a touch panel, and a second detecting unit configured to determine, when the user control member is not in contact with the touch panel, a direction of movement of the user control member relative to the touch panel. The apparatus also includes a display control unit configured to generate, when the user control member is determined to be within the threshold distance and moving in a predetermined direction, a signal to change a first display object so that the changed first display object appears to move closer to the user control member.

Description

Information processing apparatus and information processing method
Technical field
The present disclosure relates to an information processing apparatus, an information processing method, and a computer program.
Background art
Touch panels make it possible to provide intuitive, easy-to-use user interfaces (hereinafter also referred to as "UIs"), and have therefore long been used in ticket machines for transportation systems, bank ATMs, and the like. Recent touch panels can also detect the movement of a user's finger, enabling operations different from those of conventional button devices. Touch panels have therefore recently come to be used in portable devices such as mobile phones and game machines. For example, JP-T-2010-506302 discloses a device that generates a haptic effect on the device, based on the presence of an object such as a finger near the input area of the device, to provide tactile feedback before the object reaches the input area or region.
However, the information that a conventional touch panel can sense about a finger concerns only the state of a finger in contact with the panel. The user therefore cannot operate the device before the finger touches the touch panel, and cannot tell, before actually making the touch, what processing the touch will affect.
Consider a situation in which a desired object is selected, by an operation on the touch screen, from among a plurality of objects displayed on the display unit of a mobile terminal (a touch panel device of roughly A4 size) or the like. When the object to be selected is located far from the finger performing the touch operation, the finger must be stretched out to operate the object, which may increase the operational burden on the user.
Furthermore, when an object group formed of a plurality of objects is displayed on the display unit, part of the object group may lie outside the display area while the group is being operated. When the desired object is not displayed within the display area, the user must first move the object group so that it is displayed on the display unit and then perform the operation of selecting the desired object, which may also increase the operational burden on the user.
The present disclosure has therefore been made in view of the above, and it is desirable to provide a new and improved information processing apparatus, information processing method, and computer program that reduce the operational burden on the user by changing the display position of the object to be operated in accordance with the position of the operating object.
Summary of the invention
In one exemplary embodiment, the present disclosure is directed to an information processing apparatus. The apparatus includes a first detecting unit configured to determine whether a user control member is within a first threshold distance from a touch panel in a direction perpendicular to the touch panel, and a second detecting unit configured to determine, when the user control member is not in contact with the touch panel, a direction of movement of the user control member relative to the touch panel. The apparatus also includes a display control unit configured to generate, when the user control member is determined to be within the first threshold distance and moving in a direction parallel to the touch panel, a signal to change a first display object so that the changed first display object appears to move closer to the user control member in the direction perpendicular to the touch panel. The first detecting unit is further configured to detect an approach of the user control member to a first region of the touch panel in the direction perpendicular to the touch panel, and the display control unit is further configured to generate, upon detecting the approach, a signal to move the first display object from a second region of the touch panel to the first region. A predetermined position on the display area of the touch panel is set as a reference position, and a region within a preset distance of the reference position is defined as a third region. The first detecting unit is further configured to detect the user control member moving a predetermined distance inside the third region of the touch panel and to detect the user control member moving the predetermined distance outside the third region, and the display control unit is further configured to generate a signal to move the first display object by a first object response distance when the movement inside the third region is detected and by a second object response distance when the movement outside the third region is detected, the first object response distance being different from the second object response distance.
In another exemplary embodiment, the present disclosure is directed to an information processing method. The method includes determining, when a user control member is not in contact with a touch panel, a direction of movement of the user control member relative to the touch panel, and determining whether the user control member is within a threshold distance of the touch panel in the direction perpendicular to the touch panel. The method also includes changing a first display object, when the user control member is determined to be within the threshold distance and moving in a direction parallel to the touch panel, so that the changed first display object appears to move closer to the user control member in the direction perpendicular to the touch panel. The method further includes detecting an approach of the user control member to a first region of the touch panel in the direction perpendicular to the touch panel and, upon detecting the approach, generating a signal to move the first display object from a second region of the touch panel to the first region. A predetermined position on the display area of the touch panel is set as a reference position, and a region within a preset distance of the reference position is defined as a third region; movement of the user control member by a predetermined distance is detected both inside and outside the third region, and a signal is generated to move the first display object by a first object response distance when the movement inside the third region is detected and by a second object response distance when the movement outside the third region is detected, the first object response distance being different from the second object response distance.
In yet another embodiment, the present disclosure is directed to a tangibly embodied non-transitory computer-readable medium storing instructions that, when executed by a processor, perform a method including: determining, when a user control member is not in contact with a touch panel, a direction of movement of the user control member relative to the touch panel; determining whether the user control member is within a threshold distance of the touch panel; and, when the user control member is determined to be within the threshold distance and moving in a predetermined direction, changing a first display object so that the changed first display object appears to move closer to the user control member.
As described above, according to the present disclosure, an information processing apparatus, an information processing method, and a computer program are provided that can reduce the operational burden on the user by changing the display position of an object to be operated in accordance with a change in the position of the operating object.
Brief description of the drawings
Fig. 1 is a block diagram showing an example of the hardware configuration of an information processing apparatus according to an embodiment of the present disclosure;
Fig. 2 is a diagram assisting in explaining an example of the hardware configuration of the information processing apparatus according to the embodiment of Fig. 1;
Fig. 3 is a diagram assisting in explaining the distribution of operational burden when an operation input is performed with one hand;
Fig. 4 is a diagram assisting in explaining an overview of the object display position control performed by the information processing apparatus according to the embodiment of Fig. 1;
Fig. 5 is a block diagram showing the functional configuration of the information processing apparatus according to the embodiment of Fig. 1;
Fig. 6 is a flowchart of the object display position control process performed by the information processing apparatus according to the embodiment of Fig. 1;
Fig. 7 is a diagram assisting in explaining changes in the display positions of an object group and its objects;
Fig. 8 is a diagram assisting in explaining an example of control of the display position of an object group performed by the information processing apparatus according to the embodiment of Fig. 1;
Fig. 9 is a graph showing an example of the relation between the amount of movement of a finger and the amount of movement of an object group;
Fig. 10 is a graph showing another example of the relation between the amount of movement of a finger and the amount of movement of an object group;
Fig. 11 is a diagram assisting in explaining the setting of a reference position in consideration of the operational burden on the device;
Fig. 12 is a diagram assisting in explaining the relation between the width of an object group and the movement range of a finger;
Fig. 13 is a diagram assisting in explaining a contact position correction process; and
Fig. 14 is a diagram assisting in explaining an example of displaying an object group including a plurality of objects.
Detailed description of the invention
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, constituent elements having substantially the same functional configuration are denoted by the same reference numerals, and repeated description of these elements is omitted.
The description will be given in the following order: 1. Overview of the information processing apparatus; 2. Functional configuration of the information processing apparatus; 3. Object display position control process.
1. Overview of the information processing apparatus
An apparatus according to an embodiment of the present disclosure includes: a first detecting unit configured to determine whether a user control member is within a first threshold distance of a touch panel; a second detecting unit configured to determine, when the user control member is not in contact with the touch panel, a direction of movement of the user control member relative to the touch panel; and a display control unit configured to generate, when the user control member is determined to be within the threshold distance and moving in a predetermined direction, a signal to change a first display object so that the changed first display object appears to move closer to the user control member.
The apparatus may further include a touch sensor configured to detect when the user control member is in contact with the touch panel, wherein the display control unit is further configured to generate a signal to select a second display object when the user control member is in contact with the touch panel.
The changed first display object and the selected second display object may be the same object.
At least one of the first detecting unit and the second detecting unit, or the touch sensor, may include a capacitive sensor.
The apparatus may further include a third detecting unit configured to detect, when the user control member is not in contact with the touch panel, horizontal movement of the user control member relative to the touch panel.
The display control unit may be further configured to generate a signal to move the changed first display object in response to the detected horizontal movement.
The apparatus may include a combined detecting unit that performs the functions of the first detecting unit, the second detecting unit, and the third detecting unit.
The display control unit may be further configured to generate the signal to move the changed first display object only when the detected horizontal movement is detected as occurring within a second threshold distance.
The first detecting unit may be further configured to detect an approach of the user control member to a first region of the touch panel, and the display control unit may be further configured to generate, upon detecting the approach of the user control member to the first region, a signal to move the first display object from a second region of the touch panel to the first region.
The first detecting unit may be further configured to detect the user control member moving a predetermined distance inside a third region of the touch panel and to detect the user control member moving the predetermined distance outside the third region, and the display control unit may be further configured to generate a signal to move the first display object by a first object response distance when the movement inside the third region is detected and by a second object response distance when the movement outside the third region is detected, the first object response distance being different from the second object response distance.
The first object response distance may have a first relation to the predetermined distance, the second object response distance may have a second relation to the predetermined distance, and the first relation and the second relation may be linear.
The slope of the linear relation between the first object response distance and the predetermined distance may be greater than the slope of the linear relation between the second object response distance and the predetermined distance.
The display control unit may be further configured to generate a signal to display a third display object and a fourth display object; the first detecting unit may be further configured to detect an approach of the user control member to the third display object and to the fourth display object; and the display control unit may be further configured to change, upon detecting the approach of the user control member to the third display object, the virtual distance between the third display object and the user control member from a first virtual distance to a second virtual distance, and, upon detecting the approach of the user control member to the fourth display object, to change the virtual distance between the fourth display object and the user control member from the first virtual distance to the second virtual distance and to change the virtual distance between the third display object and the user control member from the second virtual distance back to the first virtual distance.
The first virtual distance may be greater than the second virtual distance.
The first virtual distance and the second virtual distance may differ at least in the direction perpendicular to the touch panel.
The display control unit may be further configured to change the first virtual distance and the second virtual distance by changing the appearance of the third display object and the fourth display object between a first size and a second size.
The second size may be greater than the first size.
The first detecting unit may be further configured to detect movement of the user control member from a first position to a second position and the movement speed from the first position to the second position, and the display control unit may be further configured to change the first display object corresponding to the first position when the movement speed is below a threshold value.
A method according to an embodiment of the present disclosure includes: determining, when a user control member is not in contact with a touch panel, a direction of movement of the user control member relative to the touch panel; determining whether the user control member is within a threshold distance of the touch panel; and, when the user control member is determined to be within the threshold distance and moving in a predetermined direction, changing a first display object so that the changed first display object appears to move closer to the user control member.
Exemplary hardware configuration
First, an example of the hardware configuration of an information processing apparatus according to an embodiment of the present disclosure will be described with reference to Figs. 1 and 2. Fig. 1 is a block diagram showing an example of the hardware configuration of the information processing apparatus 100 according to the present embodiment. Fig. 2 is a diagram assisting in explaining an example of the hardware configuration of the information processing apparatus 100 according to the present embodiment.
The information processing apparatus 100 according to the present embodiment is a device provided with a detecting unit that can detect the contact position of an operating object (that is, a user control member) on the display surface of a display device (for example, via a touch sensor) and can also detect the proximity distance between the display surface of the display device and the operating object above the display surface. Devices of many kinds, including devices having a small display device such as portable information terminals and smartphones, can serve as the information processing apparatus 100, irrespective of their functions.
As shown in Fig. 1, the information processing apparatus 100 according to the present embodiment includes a CPU 101, a RAM (random access memory) 102, a nonvolatile memory 103, a display device 104, and a proximity touch sensor 105. The proximity touch sensor 105 may include a touch sensor or may work in combination with a contact sensor.
The CPU 101 functions as an arithmetic processing unit and a control device, and controls the overall operation of the information processing apparatus 100 according to various programs. The CPU 101 may also be a microprocessor. The RAM 102 temporarily stores programs used in the execution of the CPU 101 and parameters that change as appropriate during that execution. These components are interconnected by a host bus formed of a CPU bus or the like. The nonvolatile memory 103 stores programs, operational parameters, and the like used by the CPU 101. A ROM (read only memory) or a flash memory, for example, can be used as the nonvolatile memory 103.
The display device 104 is an example of an output device for outputting information. A liquid crystal display (LCD) device or an OLED (organic light emitting diode) device, for example, can be used as the display device 104. The proximity touch sensor 105 is an example of an input device with which the user inputs information. The proximity touch sensor 105 includes, for example, input means for inputting information and an input control circuit that generates an input signal on the basis of the user's input and outputs the input signal to the CPU 101.
In the information processing apparatus 100 according to the present embodiment, as shown in Fig. 2, the proximity touch sensor 105 is provided in a state of being laminated on the display surface of the display device 104. Thus, when the user brings a user control member (for example, a finger, a stylus, or another implement) close to the display surface, the proximity touch sensor 105 can detect the distance from the display surface to the member.
Concept of changing the display position of a GUI (graphical user interface)
When the user operates a GUI, such as an object displayed in the display area (or region), the information processing apparatus 100 according to the present embodiment dynamically changes the display position of the GUI in accordance with the movement of the finger, to assist the user's operation. For example, as shown in Fig. 3, when the information processing apparatus 100 is held in one hand and a GUI displayed in the display area 200 is operated with the thumb of the holding hand, the operation is easy to perform in a region 200B corresponding to the movable range of the thumb (a low-burden region). In the region 200A on the side opposite the holding hand and in the edge region 200C on the holding-hand side (high-burden regions), however, the finger does not reach easily and operation is difficult.
Thus, when a predetermined position in the display area (or region) is set as an origin (reference position) and the thumb is extended from the origin toward the side opposite the holding hand, the information processing apparatus 100 according to the present embodiment displays the GUI so that the GUI comes toward the holding-hand side and near the extended thumb. Conversely, when the thumb is moved from the origin toward the holding-hand side (that is, toward the edge on the holding-hand side), the display position of the GUI is controlled so that the GUI appears to come toward the side opposite the holding hand and near the thumb.
Suppose, for example, that an object group 210 including a plurality of objects 212 is displayed in the display area 200. As shown in Fig. 4, the objects 212 are arranged and displayed in a grid on the xy plane, for example. First, as shown in state (A), when the finger is placed within the proximity detection region, separated from the display surface by a predetermined distance or less, the object 212a (a display object, for example) located nearest the position of the finger is displayed at a forward position (on the display surface side) in the depth direction (z direction). At this time, the objects 212 other than the object 212a can be displayed as if sinking in the depth direction, so as to increase their separation from the finger.
Next, when the finger position moves in the positive x-axis direction from state (A), the information processing apparatus 100 determines that the user intends to operate an object located on the positive x-axis side of the object 212a, and moves the object group 210 toward the side opposite the movement direction of the finger (that is, by an object response distance in the negative x-axis direction). The object 212 the user wants to operate thereby comes near the finger, so that the user can operate the desired object without moving the finger very far. At this time, the position of each object 212 forming the object group 210 in the depth direction also changes with the movement of the finger and the object group 210. For example, when the object 212 nearest the finger changes from the object 212a to the object 212b, the object 212b is displayed at the most forward position, and the objects 212 other than the object 212b are displayed as if sinking in the depth direction, so as to increase their separation from the finger.
Further, when the finger position moves in the positive x-axis direction from state (B), the object group 210 is moved further in the negative x-axis direction, as shown in state (C). Then, when the object 212 nearest the finger changes from the object 212b to the object 212c, the object 212c is displayed at the most forward position, and the objects 212 other than the object 212c are displayed as if sinking in the depth direction, so as to increase their separation from the finger.
Thereafter, when the user touches the object 212c to be operated with the finger, the user can execute the function associated with the object 212c. In this manner, the information processing apparatus 100 according to the present embodiment can dynamically move the objects 212 in accordance with the movement of the user's finger to enhance operability, and can also display the object currently receiving attention in a manner that is easy to identify visually. The way in which such an information processing apparatus 100 controls the positions of the objects 212 is described in detail below.
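Purely as an illustration of the behaviour sketched in states (A) to (C), the following is a minimal Python sketch of one hover update: the object group shifts opposite to the finger's lateral motion, and the object nearest the finger is raised while the others sink. All names and the gain value are assumptions made for this example and do not come from the patent.

```python
# Minimal sketch (assumed names/values) of one hover update for the Fig. 4 behaviour.

def update_hover(finger_xy, prev_finger_xy, group_offset, object_positions, gain=1.0):
    dx = finger_xy[0] - prev_finger_xy[0]
    dy = finger_xy[1] - prev_finger_xy[1]
    # States (B)/(C): move the object group opposite to the finger's movement.
    group_offset = (group_offset[0] - gain * dx, group_offset[1] - gain * dy)

    # State (A): the object nearest the finger becomes the focused (forward) object.
    def dist2(pos):
        px, py = pos[0] + group_offset[0], pos[1] + group_offset[1]
        return (px - finger_xy[0]) ** 2 + (py - finger_xy[1]) ** 2

    focus_index = min(range(len(object_positions)),
                      key=lambda i: dist2(object_positions[i]))
    # Depth 0 = front (display surface side); the others sink with distance from the focus.
    depths = [0.0 if i == focus_index else dist2(object_positions[i]) ** 0.5
              for i in range(len(object_positions))]
    return group_offset, focus_index, depths
```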
2. Functional configuration of the information processing apparatus
The functional configuration of the information processing apparatus 100 according to the present embodiment will now be described with reference to Fig. 5. Fig. 5 is a block diagram showing the functional configuration of the information processing apparatus 100 according to the present embodiment.
As shown in Fig. 5, the information processing apparatus 100 according to the present embodiment includes an input display unit 110, a positional information obtaining unit 120, a display control unit 130, an execution processing unit 140, and a setting storage unit 150.
The input display unit 110 is a functional unit for displaying information and allowing information to be input. The input display unit 110 includes a detecting unit 112 and a display unit 114. The detecting unit 112 corresponds to the proximity touch sensor 105 in Fig. 1. A capacitive touch panel, for example, can be used for the detecting unit 112, although any suitable technology may be used; for example, the detecting unit 112 may use optical or other technologies to detect the proximity between the operating object and the display surface of the display unit 114. In one example, the detecting unit 112 detects a capacitance value that changes according to the proximity distance between the operating object and the display surface of the display unit 114.
When the operating object comes within a predetermined distance of the display surface, the capacitance detected by the detecting unit 112 increases, and it increases further as the operating object approaches the display surface. When the operating object comes into contact with the display surface, the capacitance detected by the detecting unit 112 reaches its maximum. On the basis of the capacitance values detected by the detecting unit 112 in this manner, the positional information obtaining unit 120, which will be described later, can obtain positional information on the operating object relative to the display surface of the display unit 114. The detecting unit 112 outputs the detected capacitance values to the positional information obtaining unit 120 as its detection result.
The display unit 114 is an output device for displaying information, and corresponds to the display device 104 in Fig. 1. The display unit 114 displays, for example, GUI objects and content associated with those objects. When the display control unit 130 has changed the display form of an object, the display unit 114 displays the changed object on the basis of object display change information notified from the display control unit 130.
The positional information obtaining unit 120 obtains, on the basis of the detection result input from the detecting unit 112, positional information indicating the positional relation between the operating object and the display surface of the display unit 114. The positional information obtaining unit may obtain the positional information from any suitable type of data, such as capacitance data or optical data. As described above, the greater the capacitance value detected by the detecting unit 112, the closer the operating object is to the display surface, and the capacitance value reaches its maximum when the operating object contacts the display surface. The correspondence between capacitance values and proximity distances (or proximity detection regions) is stored in advance in the setting storage unit 150, which will be described later. Referring to the setting storage unit 150, the positional information obtaining unit 120 obtains the position of the finger relative to the display surface in the perpendicular direction (the z direction, for example) on the basis of the capacitance value input from the detecting unit 112.
In addition, the positional information obtaining unit 120 identifies the position of the operating object on the display surface of the display unit 114 (on the xy plane, for example) on the basis of the detection result input from the detecting unit 112. Suppose, for example, that the detecting unit 112 is formed of a capacitive sensor substrate in which a capacitance detecting grid for detecting x and y coordinates is formed. In this case, the detecting unit 112 can identify the position of the operating object on the substrate (on the display surface, for example) from the change in capacitance of each grid cell caused by the approach or contact of the operating object. For example, the coordinates of the position of maximum capacitance can be identified as the coordinates of the position where the finger is closest to the display surface. Alternatively, the coordinates of the center of gravity of a region in which capacitance equal to or higher than a predetermined value is detected may be taken as the coordinates of the position where the finger is closest to the display surface.
In this manner, the positional information obtaining unit 120 can obtain positional information about the operating object relative to the display surface of the display unit 114. The obtained positional information on the operating object is output to the display control unit 130 and the execution processing unit 140.
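As a rough illustration of how the positional information obtaining unit 120 might turn a capacitance reading into a hover position, consider the sketch below. The capacitance-to-distance table stands in for the correspondence held in the setting storage unit 150; its numbers, the threshold, and the function names are all assumptions made for the example.

```python
# Illustrative sketch: derive (x, y) and z of the operating object from a capacitance grid.

CAP_TO_DISTANCE = [(100, 0.0), (80, 5.0), (60, 10.0), (40, 20.0)]  # (capacitance, mm) - assumed

def z_from_capacitance(max_cap):
    # Pick the distance associated with the nearest stored capacitance value.
    return min(CAP_TO_DISTANCE, key=lambda cd: abs(cd[0] - max_cap))[1]

def xy_from_grid(grid, threshold=40):
    # Centre of gravity of cells at or above the threshold (the alternative the text
    # mentions to simply taking the maximum-capacitance cell).
    total = x_sum = y_sum = 0.0
    for y, row in enumerate(grid):
        for x, cap in enumerate(row):
            if cap >= threshold:
                total += cap
                x_sum += cap * x
                y_sum += cap * y
    if total == 0:
        return None  # no operating object detected near the surface
    return (x_sum / total, y_sum / total)

def hover_position(grid):
    max_cap = max(max(row) for row in grid)
    return xy_from_grid(grid), z_from_capacitance(max_cap)
```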
The display control unit 130 controls the display positions of objects displayed on the display unit 114 on the basis of the positional information obtained by the positional information obtaining unit 120. As described above with reference to Figs. 3 and 4, the display control unit 130 controls the display positions of the objects 212 in accordance with the movement of the user's finger so that the user can easily operate the objects 212 displayed on the display unit 114. When the display control unit 130 has determined a change in the display position of an object 212, the display control unit 130 generates an image of the changed object and outputs the image to the display unit 114. In addition, the display control unit 130 performs display control in response to an instruction from the execution processing unit 140, described later, to change the object 212 displayed at the position contacted by the finger. Details of this object display position correction process will be described later.
The execution processing unit 140 executes the function associated with an operation input in response to a predetermined operation input to the information processing apparatus 100. For example, when the detecting unit 112 senses that the user has brought a finger into contact with a particular object 212 displayed on the display unit 114, the execution processing unit 140 recognizes, on the basis of the positional information input from the positional information obtaining unit 120, that the finger has touched the object 212. The execution processing unit 140 then identifies the object 212 that the finger has touched and executes the function associated with that object 212. Incidentally, the selected object 212 may be changed in accordance with the movement speed of the finger selecting the object 212. Details of this object display position correction process will be described later.
The setting storage unit 150 stores, as setting information, information used when calculating the proximity distance between the operating object and the display surface, information used when generating the positional information about the position of the operating object on the display surface, and other information used in the object display position control process. For example, the setting storage unit 150 stores the correspondence between capacitance values and proximity distances. Referring to this correspondence, the positional information obtaining unit 120 obtains the position corresponding to the capacitance value input from the detecting unit 112. In addition, the setting storage unit 150 stores the processing content (function) to be executed in response to each operation input the user performs on an object 212. The setting storage unit 150 also stores the movement speed (threshold value) of the finger used to start the object display position correction process. The setting information stored in the setting storage unit 150 may be stored in advance or may be set by the user.
The information processing apparatus 100 according to the present embodiment may also include, for example, a memory for temporarily storing information needed in the object display position control process and the like.
3. Object display position control process
With the functions described above, the information processing apparatus 100 according to the present embodiment can detect the movement of a finger above the display surface. Using this information, the information processing apparatus 100 controls the display positions of the objects 212 displayed on the display unit 114 in accordance with the movement of the finger, so that operability can be improved. The object display position control process performed by the information processing apparatus 100 according to the present embodiment will be described below with reference to Figs. 6 to 13.
Fig. 6 is a flowchart of the object display position control process performed by the information processing apparatus 100 according to the present embodiment. Fig. 7 is a diagram assisting in explaining changes in the display positions of the object group 210 and the objects 212. Fig. 8 is a diagram assisting in explaining an example of control of the display position of the object group performed by the information processing apparatus 100 according to the present embodiment. Figs. 9 and 10 are graphs showing two exemplary relations between the amount of movement of the finger and the amount of movement of the object group in different regions of the touch panel. Fig. 11 is a diagram assisting in explaining the setting of the reference position in consideration of the operational burden on the device. Fig. 12 is a diagram assisting in explaining the relation between the width of the object group 210 and the movement range of the finger. Fig. 13 is a diagram assisting in explaining the contact position correction process.
S100: Determination of the process start condition
As shown in Fig. 6, the information processing apparatus 100 first determines whether the condition for starting the process of controlling the display positions of the objects 212 is satisfied (S100). This condition can be set as appropriate. For example, the condition may be that the finger is located within the proximity detection region, or that a predetermined time has elapsed since the finger entered the proximity detection region. In addition, when an on-screen keyboard formed by arranging a plurality of keys is displayed on the display unit 114, for example, the process of controlling the display positions of the objects 212 (keys, for example) may be performed during key input.
When an operation input for starting to select an object 212 displayed on the display unit 114 is performed, the information processing apparatus 100 determines that the condition for starting the process of controlling the display positions of the objects 212 is satisfied, and starts the process of step S110. On the other hand, while no condition for starting the process, such as the above operation, has been detected, the process of step S100 is repeated until the operation is detected.
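A toy version of the step S100 check might look like the following; the dwell-time constant and helper names are illustrative assumptions only.

```python
import time

PROXIMITY_DWELL_S = 0.3  # assumed dwell time before the control process starts

def start_condition_met(finger_z_mm, proximity_threshold_mm, entered_at_s):
    """Step S100: true once the finger has stayed in the proximity region long enough."""
    if finger_z_mm is None or finger_z_mm > proximity_threshold_mm:
        return False  # finger not within the proximity detection region
    return (time.monotonic() - entered_at_s) >= PROXIMITY_DWELL_S
```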
S110: Control of the display positions of the objects
When the process of controlling the display positions of the objects 212 starts, the display positions of the objects 212 are changed and the objects 212 are moved in accordance with the proximity position of the finger relative to the display surface (S110). As a result of controlling the display positions of the objects 212, as shown in Fig. 7 for example, the position of the object group 210 changes and the position of each object 212 in the depth direction changes.
Control of the display position of the object group
First, the display control unit 130 moves the object group 210 on the xy plane in the direction opposite to the movement direction of the finger. The object group 210 is thus moved so as to meet the moving finger, and the finger can touch the desired object 212 with only a small movement. This is particularly effective when the operation input is performed with one hand, and the number of objects 212 that can be touched by a finger of the holding hand can be increased compared with the conventional case.
To describe the movement of the object group 210 in more detail, as shown in state (A) of Fig. 8, a reference position o (0, 0) is set for the object group 210. Based on this reference position, a distance df from the reference position o to the finger and a distance dc from the reference position o to the center P of the object group are defined. The reference position o is, for example, the position where the user first places the finger over the object group 210, or a position set in advance. In addition, as will be described later, the reference position o may be set in relation to a region where the operational burden on the user is low.
Next, as in state (B) of Fig. 8, when the user moves the finger in the positive x-axis direction, the object group 210 moves in the negative x-axis direction. The amount of movement df of the finger and the amount of movement dc of the object group 210 at this time can be set as shown in Fig. 9, for example. The amount of movement (df) of the finger and the amount of movement (dc) of the object group 210 are substantially linearly related: when the finger moves in the positive direction, the object group 210 moves in the negative direction, and conversely, when the finger moves in the negative direction, the object group 210 moves in the positive direction. In addition, a limit (object movement amount limit value) is set for the movement of the object group 210 so that no part of the object group 210 goes outside the frame. Thus, even when the finger moves a predetermined distance or more from the reference position o, the object group 210 does not move beyond the object movement amount limit value.
The amount of movement (df) of the finger and the amount of movement (dc) of the object group 210 may also be set as shown in Fig. 10, for example. In Fig. 10, a dead zone is set with the reference position o at its center. While the finger is within the interval separated from the reference position o by a preset distance (the dead zone), the amount of movement of the object group 210 relative to the amount of movement of the finger is smaller than in the case of Fig. 9; in other words, the weight applied to the movement amount of the object group 210 is set small. Within the dead zone, therefore, the object group 210 reacts only slightly even when the finger moves. Setting a dead zone prevents the desired object 212 from passing the finger when the object group 210 would otherwise move greatly in response to finger movement near the reference position o. Incidentally, in the case of Fig. 10 as in Fig. 9, an object movement amount limit value can be set so that no part of the object group 210 goes outside the frame.
Figs. 9 and 10 show exemplary relations between the amount of movement (df) of the finger and the amount of movement (dc) of the object group 210, and the present disclosure is not limited to these examples. For example, the relation between the amount of movement (df) of the finger and the amount of movement (dc) of the object group 210 need not be linear, and may be set so that the amount of movement (dc) of the object group 210 increases exponentially as the amount of movement (df) of the finger increases.
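The df-to-dc mappings of Figs. 9 and 10 could be written out as below; the gains, the dead-zone width, and the movement limit are placeholder assumptions, since the patent fixes only the qualitative shape of the curves.

```python
# Sketch of the finger-movement (df) to group-movement (dc) mappings of Figs. 9 and 10.

DC_LIMIT = 120.0     # object movement amount limit value (keeps the group inside the frame)
GAIN_OUTSIDE = 1.5   # slope used outside the dead zone
GAIN_INSIDE = 0.3    # smaller weight inside the dead zone (Fig. 10)
DEAD_ZONE = 15.0     # half-width of the dead zone around the reference position o

def clamp(value, limit):
    return max(-limit, min(limit, value))

def dc_linear(df):
    """Fig. 9: dc is linear in df, opposite in sign, and clamped to the limit."""
    return clamp(-GAIN_OUTSIDE * df, DC_LIMIT)

def dc_with_dead_zone(df):
    """Fig. 10: low gain inside the dead zone, higher gain outside, still clamped."""
    if abs(df) <= DEAD_ZONE:
        dc = -GAIN_INSIDE * df
    else:
        edge = GAIN_INSIDE * DEAD_ZONE
        dc = -(edge + GAIN_OUTSIDE * (abs(df) - DEAD_ZONE)) * (1 if df > 0 else -1)
    return clamp(dc, DC_LIMIT)
```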
Control of the display position of each object
In addition, the display control unit 130 moves each object 212 in the z direction according to its degree of closeness to the finger. Specifically, as shown in Fig. 7, an object 212 at a position closer to the finger (that is, at a shorter virtual distance) is displayed further toward the front, thereby increasing its response to the approach of the finger (that is, reducing the virtual distance between the finger and the object 212), while an object 212 at a position farther from the finger is displayed further toward the back, thereby reducing its response to the approach of the finger. In this way, the object 212 receiving attention can be presented to the user in an easily identifiable manner. The displacement Z(index) of an object 212 in the z direction can be defined, for example, as in equation 1 below.
Z(index) = fz(d(focus_index, index)) ... (equation 1)
Here, index denotes an identification number for identifying each object 212, and d(focus_index, index) denotes the distance between the object 212 currently receiving attention and the object 212 identified by index. The depth function fz in equation 1 can be set so that an object 212 closer to the object 212 receiving attention is displayed at a position further toward the front.
In addition, the objects 212 may also be changed in size according to their positional relation to the finger. For example, an object 212 at a position closer to the finger is increased in size, and an object 212 at a position farther from the finger is reduced in size. Setting the sizes of the objects 212 in this way expresses the response to the approach of the finger and prevents the objects 212 from falling outside the display area, that is, from extending beyond the frame when the object group 210 is moved to an edge portion of the display area 200.
Specifically, the size Size(index) of an object 212 can be defined, for example, as in equation 2 below. The size function fs in equation 2 is set substantially so that an object 212 closer to the object 212 receiving attention is displayed at a larger size.
Size(index) = fs(d(focus_index, index)) ... (equation 2)
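One concrete choice for the depth function fz of equation 1 and the size function fs of equation 2 is sketched below. The patent only constrains these functions qualitatively (nearer to the focused object means further forward and larger), so the decay constant and size values here are assumptions.

```python
import math

MAX_SINK = 30.0   # assumed maximum sink in the z (depth) direction
BASE_SIZE = 64.0  # assumed size of the focused object, in pixels
MIN_SIZE = 32.0   # assumed smallest size for far-away objects
DECAY = 0.01      # assumed fall-off rate with distance from the focused object

def fz(distance_to_focus):
    """Equation 1: objects nearer the focused object are displayed further forward (less sink)."""
    return MAX_SINK * (1.0 - math.exp(-DECAY * distance_to_focus))

def fs(distance_to_focus):
    """Equation 2: objects nearer the focused object are displayed at a larger size."""
    return MIN_SIZE + (BASE_SIZE - MIN_SIZE) * math.exp(-DECAY * distance_to_focus)

def layout(object_positions, focus_index):
    d = lambda i: math.dist(object_positions[i], object_positions[focus_index])
    return [(fz(d(i)), fs(d(i))) for i in range(len(object_positions))]
```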
Relation between the display position control of the object group and objects and the operational burden
The movement of the object group 210 and the objects 212 in accordance with the movement of the finger has been described above with reference to Figs. 7 to 10. It is desirable, however, to determine the reference position o for the movement of the object group 210 and the relation between the amount of movement of the finger and the amount of movement of the object group 210 shown in Figs. 9 and 10 in accordance with the operational burden illustrated in Fig. 3. As shown in the left part of Fig. 11, the display area 200 of the information processing apparatus 100 is divided into high-burden regions 200A and 200C and a low-burden region 200B according to the state in which the user performs the operation input. In this case, the display control unit 130 moves the object group 210 so that the object group 210 can be operated within the low-burden region 200B of the display area 200, and operability can thereby be improved.
Specifically, as shown in the right part of Fig. 11, the reference position o is set, for example, at the center of the low-burden region 200B in the display area 200 of the information processing apparatus 100. The relation between the amount of movement of the finger and the amount of movement of the object group 210 is then set so that all of the objects 212 in the object group 210 can be touched by finger movement within the low-burden region 200B. A device with a low operational burden can be realized by setting the reference position o, the relation between the amount of movement of the finger and the amount of movement of the object group 210, and the other parameters in this way, in consideration of the region the user can operate easily, based on the shape of the device and the placement of the hand and fingers when the device is operated.
With the parameters set in this way, as shown in Fig. 12, the movement range of the finger can be made smaller than the width of the object group 210 and can be contained within the low-burden region. Thus, when the user intends to operate an object 212 located at the right edge relative to the reference position o and moves the finger to the right, position control is performed so that the object group 210 moves to the left and the object 212 at the right edge can be touched within the low-burden region. Conversely, when the user intends to operate an object 212 located at the left edge relative to the reference position o and moves the finger to the left, position control is performed so that the object group 210 moves to the right and the object 212 at the left edge can be touched within the low-burden region.
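The requirement that finger movement inside the low-burden region 200B reach every object can be recast as a constraint on the mapping gain. The sketch below derives a minimum gain under simplifying assumptions (one-dimensional movement, the linear mapping of Fig. 9, reference position o at the centre of 200B); none of the numbers appear in the patent.

```python
# Sketch: choose the df -> dc gain so that a finger staying inside the low-burden
# region can still reach the edge of the object group (Figs. 11 and 12).

def minimum_gain(group_half_width, region_half_width):
    """With the finger at df (at most region_half_width) and the group shifted by
    gain * df toward it, reaching the group edge needs df + gain * df >= group_half_width,
    i.e. gain >= group_half_width / region_half_width - 1."""
    return max(0.0, group_half_width / region_half_width - 1.0)

if __name__ == "__main__":
    # Example: a 240 px wide object group operated within a 160 px wide low-burden region.
    print(minimum_gain(group_half_width=120.0, region_half_width=80.0))  # -> 0.5
```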
S120: Determination of whether the finger is in contact with the display surface
Returning to the description of Fig. 6, while the display positions of the object group 210 and the objects 212 are controlled in accordance with the movement of the finger in step S110, the execution processing unit 140 determines, on the basis of the positional information, whether the finger has touched the display surface (S120). The execution processing unit 140 executes the function associated with the object 212 touched by the finger; to this end, it determines from the positional information whether the user has brought the finger into contact with the display surface to select an object 212 in the object group 210. The processes of steps S110 and S120 are repeated until the finger touches the display surface.
S130: Determination of whether contact position correction is necessary
Next, when the execution processing unit 140 determines that the finger has touched the display surface, the execution processing unit 140 obtains, on the basis of the positional information, the movement speed of the finger at the time it touched the display surface, and determines whether the movement speed is higher than a predetermined speed (S130). The information processing apparatus 100 according to the present embodiment improves operability by identifying more accurately the object 212 the user intends to operate. When the movement speed of the finger is high, it is difficult for the user to bring the finger into contact with the intended object 212 accurately, and the likelihood of an erroneous operation increases.
Thus, in step S130, the movement speed of the finger is obtained, it is determined whether the obtained movement speed is higher than the predetermined speed, and it is thereby determined whether the object 212 selected by bringing the finger into contact with the display surface needs to be corrected. Specifically, when the movement speed of the finger is higher than the predetermined speed, the information processing apparatus 100 determines that there is a strong possibility of an erroneous operation, determines the object 212 to be operated by correcting the selected object 212, and changes the display position of the objects 212 accordingly.
Incidentally, when the proximity distance between the display surface and the finger in the direction perpendicular to the display surface of the display unit 114 can be obtained, the movement speed of the finger can be obtained by differentiating the proximity distance with respect to time. When this proximity distance cannot be obtained, the execution processing unit 140 can obtain the movement speed of the finger by measuring the time taken for the finger, once detected in a predetermined proximity state, to touch the display surface. Specifically, the execution processing unit 140 can obtain the movement speed of the finger by dividing the distance d from the display surface to the position at which the proximity state is detected by the time taken for the finger to touch the display surface.
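The two ways of estimating the finger's movement speed described above could be sketched as follows; the sampling interface and units are hypothetical.

```python
# Sketch of the two speed estimates described for step S130 (names/units are assumed).

def speed_from_samples(z_samples):
    """When the proximity distance z(t) is available, differentiate it over time.
    z_samples: list of (timestamp_s, distance_mm) tuples, newest last."""
    (t0, z0), (t1, z1) = z_samples[-2], z_samples[-1]
    return abs(z1 - z0) / (t1 - t0) if t1 > t0 else 0.0

def speed_from_approach_time(proximity_distance_mm, approach_time_s):
    """When only the entry distance d of the proximity state is known, divide it by
    the time the finger took to reach the display surface."""
    return proximity_distance_mm / approach_time_s if approach_time_s > 0 else float("inf")
```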
S140: Execution of the process corresponding to the selected object
Suppose, as shown in Fig. 13, for example, that the finger moves from a position above the display surface obliquely toward the display surface, touches the display surface, and touches a particular object 212. If the movement speed of the finger is equal to or lower than the predetermined speed, the user can be considered to have brought the finger into contact after confirming the position of the object 212 to be operated (object 212(b), for example). In this case, therefore, the execution processing unit 140 executes the function associated with the object 212(b) touched by the finger (S140).
S150: contact position correction process
On the other hand, when the movement speed of the finger is higher than the predetermined speed, the user may have brought the finger into contact without confirming the position of the object 212 to be operated, and the finger may have touched the wrong object 212. The execution processing unit 140 therefore corrects, with reference to the setting storage unit 150, the display position of the object group 210 so that the object 212 that was receiving attention before the finger touched the display surface is selected (S150).
The history of the objects 212 that have received attention is stored in a memory (not shown). The execution processing unit 140 determines the object 212 to be selected by referring to this history. For example, the object that was receiving attention immediately before the finger began moving at the predetermined speed or higher can be set as the object 212 to be selected after the correction.
After the object 212 to be selected after the correction is determined, the execution processing unit 140 causes the display control unit 130 to move the object group 210 so that the object 212 in question is located at the position touched by the finger. Suppose, for example, that in the example shown in Fig. 13, the object 212(a) is determined to be the object to be selected after the correction. In this case, if the object 212(b) is displayed at the position touched by the finger, the display control unit 130 moves the object group 210 and corrects the display so that the object 212(a) is located at the position touched by the finger. In this way, the object 212 considered to be the one the user intended to select is placed in a selected state that can be confirmed visually, so that the user can perform the operation input without any sense of incongruity. The execution processing unit 140 then executes the function associated with the object 212(a) to be selected after the correction (S140).
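A compact sketch of the S130 to S150 branch, with the attention history kept as a simple list, is given below; the data structures, callback, and threshold value are assumptions made for the example.

```python
# Sketch of the contact position correction (S130-S150); names and values are assumed.

SPEED_THRESHOLD = 50.0  # mm/s, assumed stand-in for the predetermined speed

def on_touch(touched_object, touch_xy, finger_speed, focus_history, move_group_so_that):
    """Select either the touched object (S140) or, for a fast approach, the object that
    had attention just before contact (S150), after shifting the group under the finger."""
    if finger_speed <= SPEED_THRESHOLD or not focus_history:
        return touched_object                    # S140: trust the touch as it is
    corrected = focus_history[-1]                # S150: last object that had attention
    move_group_so_that(corrected, touch_xy)      # shift the group so it sits under the finger
    return corrected
```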
The object display position control process performed by the information processing apparatus 100 according to the present embodiment has been described above. This process changes the display position of the object group 210 and of the objects 212 forming the object group 210 in accordance with the movement of the finger in the proximity state, so that the user can easily predict, before operating an object 212, what will happen. In addition, the process described above displays the objects 212 so that they approach the finger in the proximity state, so that the object 212 to be operated can be selected easily even when the operation is performed with one hand.
Furthermore, by changing the position of an object 212 in the depth direction according to the position of the finger in the proximity state, the object 212 receiving attention can be identified visually with ease. In addition, by setting the limit value for the amount of movement of the object group 210 on the basis of the relation to the position of the finger in the proximity state and by changing the sizes of the objects 212, the object group 210 can be prevented from extending beyond the display area 200, and a loss of the overall (bird's-eye) view of the objects can thereby be prevented.
Although preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, the present disclosure is not limited to these examples. It will be apparent to those of ordinary skill in the art that various changes and modifications may be conceived within the scope of the technical concept described in the claims, and it should be understood that such changes and modifications naturally fall within the technical scope of the present disclosure.
For example, although the object group 210 is arranged in a grid in the foregoing description, the present disclosure is not limited to such an example. For example, as shown in Fig. 14, an object group may be formed by arranging objects 312 in a circular form. In this case, as in the foregoing embodiment, the display position of the object group 310 and the display positions of the objects 312 in the depth direction are changed according to the position of the finger in the proximity state. When the objects 312 are arranged in a circular form as shown in Fig. 14, the display positions of the objects 312 can be changed, for example, by rotating the objects 312 so that they approach the finger along the direction of arrangement.
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-199639 filed in the Japan Patent Office on September 7, 2010, the entire content of which is hereby incorporated by reference.

Claims (16)

1. An information processing apparatus, comprising:
a first detecting unit configured to determine whether a user control member is within a first threshold distance from a touch panel in a direction perpendicular to the touch panel;
a second detecting unit configured to determine, when the user control member is not in contact with the touch panel, a direction of movement of the user control member relative to the touch panel; and
a display control unit configured to generate, when the user control member is determined to be within the first threshold distance and moving in a direction parallel to the touch panel, a signal to change a first display object so that the changed first display object appears to move closer to the user control member in the direction perpendicular to the touch panel,
wherein the first detecting unit is further configured to detect an approach of the user control member to a first region of the touch panel in the direction perpendicular to the touch panel,
the display control unit is further configured to generate, upon detecting the approach of the user control member to the first region in the direction perpendicular to the touch panel, a signal to move the first display object from a second region of the touch panel to the first region,
wherein a predetermined position on a display area of the touch panel is set as a reference position, and a region within a preset distance of the reference position is defined as a third region,
the first detecting unit is further configured to detect the user control member moving a predetermined distance inside the third region of the touch panel and to detect the user control member moving the predetermined distance outside the third region, and
the display control unit is further configured to generate, upon detecting the movement of the predetermined distance inside the third region, a signal to move the first display object by a first object response distance, and to generate, upon detecting the movement of the predetermined distance outside the third region, a signal to move the first display object by a second object response distance, the first object response distance being different from the second object response distance.
2. The apparatus according to claim 1, further comprising:
a touch sensor configured to detect when the user control member is in contact with the touch panel, wherein
the display control unit is further configured to generate a signal to select a second display object when the user control member is in contact with the touch panel.
3. The apparatus according to claim 2, wherein the changed first display object and the selected second display object are the same object.
4. The apparatus according to claim 2, wherein at least one of the first detecting unit and the second detecting unit, or the touch sensor, includes a capacitive sensor.
Equipment the most according to claim 1, also includes:
3rd detector unit, is configured to not contact with described touch pad when described user controls parts Time, detection user controls component parallel moving horizontally in described touch pad.
Equipment the most according to claim 5, including:
Combine detection unit, is used for performing described first detector unit, described second detector unit and institute State the function of the 3rd detector unit.
Equipment the most according to claim 5, wherein said display control unit is joined further It is set to only when detected moving horizontally is detected as when less than Second Threshold apart from interior generation, raw Become signal with mobile the first altered display object.
Equipment the most according to claim 1, wherein,
Described first object response distance has the first relation with described preset distance,
Described second object response distance has the second relation with described preset distance, and
Described first relation and described second relation are linear.
Equipment the most according to claim 8, wherein said first object response distance is with described The slope of the linear relationship between preset distance is predetermined with described more than described second object response distance The slope of the linear relationship between Ju Li.
Equipment the most according to claim 1, wherein,
Described display control unit be further configured to generate signal with display the 3rd display object and 4th display object,
Described first detector unit is further configured to detect described user and controls parts and described the Three display objects and the 4th show the close of object,
Described display control unit is further configured to detecting that described user controls parts and institute State the 3rd display object close to time, by described 3rd display object and described user control between parts Pseudo range changes into the second pseudo range from the first pseudo range, and is detecting that described user is controlled Parts processed with described 4th display object close to time, will described 4th display object and described user control Pseudo range between parts is changed into described second pseudo range from described first pseudo range and incites somebody to action Described 3rd display object and described user control the described pseudo range between parts from described second Pseudo range changes into described first pseudo range.
11. equipment according to claim 10, wherein said first pseudo range is more than described Second pseudo range.
12. equipment according to claim 10, wherein said first pseudo range and described Two pseudo ranges are at least different on the direction being perpendicular to described touch pad.
13. equipment according to claim 10, wherein said display control unit further by It is configured through changing described 3rd display object and described the between first size and the second size The outward appearance of four display objects changes described first pseudo range and described second pseudo range.
14. equipment according to claim 13, wherein said second size is more than described first Size.
15. equipment according to claim 1, wherein,
Described first detector unit is further configured to detect described user and controls parts from first Put the movement of the second position and from described primary importance to the translational speed of the described second position, with And
Described display control unit is further configured to when described translational speed is less than threshold value change The first display object corresponding with described primary importance.
16. 1 kinds of information processing methods, including:
When user control parts do not contact with touch pad time, determine that described user controls parts relative to institute State the moving direction of touch pad;
Determine that whether described user controls parts on the direction being perpendicular to described touch pad away from described In the threshold distance of touch pad;And
When described user controls in parts are confirmed as the most described threshold distance and being parallel to State the side of touch pad when moving up, change the first display object and make the first altered display object Seem that moving closer to described user on the direction being perpendicular to described touch pad controls parts,
Wherein, detect described user control parts on the direction being perpendicular to described touch pad with described touch Firstth district of template close,
Detecting that described user controls parts on the direction being perpendicular to described touch pad with described One district close to time, generate signal with by described first display object move from the secondth district of described touch pad To described firstth district,
Wherein, the precalculated position on the viewing area of described touch pad is set to reference position, and Described user controlled parts separate the section definition of the distance pre-set with described reference position be 3rd district,
Detect described user and control parts mobile pre-spacing within described 3rd district of described touch pad From and detect described user and control parts mobile described preset distance outside described 3rd district, and
Signal is generated with by described when mobile described preset distance within described 3rd district being detected First display object moves the first object response distance, and moves outside described 3rd district detecting Generate during dynamic described preset distance signal with described first display object is moved the second object response away from From, described first object response distance responds apart from different from described second object.
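The hover behaviour recited in claims 1 and 16, together with the two response distances of claims 8 and 9, can be read as the short Python sketch below. It is only an illustrative interpretation of the claims; the constants, names, and the size of the depth step are assumptions and are not part of the claimed subject matter.

THRESHOLD_HEIGHT = 20.0       # assumed first threshold distance above the pad (mm)
THIRD_REGION_RADIUS = 30.0    # assumed preset distance from the reference position (mm)
SLOPE_INSIDE = 1.0            # response slope inside the third region
SLOPE_OUTSIDE = 0.5           # smaller slope outside it (claim 9)

def update_display(finger_height, moving_parallel, moved_distance,
                   distance_from_reference, obj):
    """Update a display object (a dict) in response to a hovering finger."""
    hovering = 0.0 < finger_height <= THRESHOLD_HEIGHT
    if hovering and moving_parallel:
        # Claims 1 and 16: the object appears to move closer to the finger
        # in the direction perpendicular to the touch pad.
        obj["depth"] = max(obj["depth"] - 1.0, 0.0)

    # Claims 8 and 9: the object's displacement is a linear function of the
    # finger's movement, with a larger slope inside the third region.
    if distance_from_reference <= THIRD_REGION_RADIUS:
        obj["offset"] += SLOPE_INSIDE * moved_distance
    else:
        obj["offset"] += SLOPE_OUTSIDE * moved_distance
    return obj

# Example: a finger hovering 10 mm above the pad and moving 5 mm inside the
# third region lifts the object by one depth step and shifts it by 5 mm.
print(update_display(10.0, True, 5.0, 12.0, {"depth": 3.0, "offset": 0.0}))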
CN201110260390.2A 2010-09-07 2011-08-31 Information processor and information processing method Expired - Fee Related CN102402282B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-199639 2010-09-07
JP2010199639A JP5732784B2 (en) 2010-09-07 2010-09-07 Information processing apparatus, information processing method, and computer program

Publications (2)

Publication Number Publication Date
CN102402282A CN102402282A (en) 2012-04-04
CN102402282B true CN102402282B (en) 2016-12-14


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1997958A (en) * 2004-06-29 2007-07-11 Koninklijke Philips Electronics N.V. Multi-layered display of a graphical user interface
CN101571789A (en) * 2008-04-30 2009-11-04 HTC Corporation Operating method, operating device and storage media for graphic menu bar
CN101616213A (en) * 2008-06-25 2009-12-30 LG Electronics Inc. Providing a haptic effect in a mobile communication terminal
CN101727236A (en) * 2008-10-10 2010-06-09 Sony Corporation Information processing apparatus, information processing method, information processing system and information processing program
CN202433855U (en) * 2010-09-07 2012-09-12 Sony Corporation Information processing apparatus


Similar Documents

Publication Publication Date Title
US10503316B1 (en) Information processor, information processing method, and computer program
CN202433855U (en) Information processing apparatus
US9395917B2 (en) Electronic display with a virtual bezel
JP5721662B2 (en) Input receiving method, input receiving program, and input device
KR101597844B1 (en) Interpreting ambiguous inputs on a touch-screen
US8466934B2 (en) Touchscreen interface
EP2722730B1 (en) Mobile terminal and method for moving cursor thereof
US10168861B2 (en) Menu display device, menu display control method, program and information storage medium
EP2426585A2 (en) Information processing apparatus, information processing method, and computer program.
CN104238887B (en) The icon lookup method and device of conventional application program
US20150253925A1 (en) Display control device, display control method and program
JPWO2009031214A1 (en) Portable terminal device and display control method
CN102197356A (en) Touch panel device operating as if in the equivalent mode even when detected region is smaller than display region of display device
US20200356226A1 (en) Electronic apparatus and display method for touch proximity detection
CN105474164B (en) The ambiguity inputted indirectly is eliminated
JP2010204781A (en) Input device
CN102402282B (en) Information processor and information processing method
KR20150122021A (en) A method for adjusting moving direction of displaying object and a terminal thereof
JP5841023B2 (en) Information processing apparatus, information processing method, program, and information storage medium
KR101573287B1 (en) Apparatus and method for pointing in displaying touch position electronic device
KR20130026646A (en) Touch based mobile terminal and method for controlling soft keyboard in touch type mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20161214
Termination date: 20190831