WO2015090092A1 - Method and apparatus for generating a personalized input panel - Google Patents

Method and apparatus for generating a personalized input panel

Info

Publication number
WO2015090092A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
input panel
information
hand shape
hand
Prior art date
Application number
PCT/CN2014/086846
Other languages
English (en)
French (fr)
Inventor
王铁彬
王轶翔
Original Assignee
百度在线网络技术(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 百度在线网络技术(北京)有限公司
Priority to JP2016559481A (JP6397508B2)
Priority to US14/412,379 (US10379659B2)
Priority to EP14814671.5A (EP3086210A4)
Publication of WO2015090092A1

Classifications

    • G06F3/04886 — interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/0425 — digitisers characterised by opto-electronic transducing means, using a single imaging device (e.g. a video camera) to track objects with respect to an imaged reference surface
    • G06F3/0487 — interaction techniques based on GUIs using specific features provided by the input device
    • G06F3/0488 — interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 — inputting data by handwriting, e.g. gesture or text
    • G06T7/60 — image analysis; analysis of geometric attributes
    • G06V40/107 — recognition of human bodies or body parts; static hand or arm
    • G06V40/11 — hand-related biometrics; hand pose recognition
    • G06T2207/30196 — indexing scheme for image analysis; subject of image: human being, person

Definitions

  • the present invention relates to the field of computer technologies, and in particular, to a method and apparatus for generating a personalized input panel.
  • a method for generating a personalized input panel in a user device comprises the following steps:
  • a. acquiring the user's hand shape information;
  • b. comparing the user's hand shape information with predetermined hand shape information to obtain the user's hand shape feature information;
  • c. generating a personalized input panel suitable for the user based on the hand shape feature information.
  • an input panel generating apparatus for generating a personalized input panel in a user device, wherein the apparatus comprises the following devices:
  • a first obtaining device configured to acquire the user's hand shape information;
  • a comparing device configured to compare the user's hand shape information with the predetermined hand shape information to obtain the user's hand shape feature information;
  • a generating device configured to generate a personalized input panel suitable for the user based on the hand shape feature information.
  • compared with the prior art, the present invention has the following advantages: 1) by comparing the user's hand shape information with standard hand shape information, the user equipment obtains an input panel that meets the user's individual needs, which improves the accuracy of the user's touch operations on the input panel and reduces the possibility of erroneous input caused by accidental touches; 2) the user equipment presents position indication information on the screen, so that when shooting the hand image the user places the palm within the position indication information as far as possible and the distance between the palm and the lens is better controlled, which reduces the influence that the position of the hand at capture time has on the user equipment's analysis of the acquired hand image; 3) the user equipment can adjust the input panel according to the user's hand shape feature information and contact information to generate a personalized input panel that meets the user's needs and better conforms to the user's usage habits.
  • FIG. 1 is a schematic flow chart of a method for generating a personalized input panel in a user equipment according to a preferred embodiment of the present invention
  • FIG. 2 is a schematic flow chart of a method for generating a personalized input panel in a user equipment according to another preferred embodiment of the present invention
  • FIG. 3 is a schematic structural diagram of an input panel generating apparatus for generating a personalized input panel in a user equipment according to a preferred embodiment of the present invention
  • FIG. 4 is a schematic structural diagram of an input panel generating apparatus for generating a personalized input panel in a user equipment according to another preferred embodiment of the present invention.
  • FIG. 1 is a flow chart of a method for generating a personalized input panel in a user equipment according to a preferred embodiment of the present invention.
  • the method of this embodiment is mainly implemented by a user equipment; preferably, the user equipment has a touch input device, for example, its display screen is a touch screen or it provides a virtual keyboard; preferably, the user equipment includes, but is not limited to, PCs, tablets, smart phones, PDAs, IPTV devices, etc.; preferably, the user equipment is a mobile device.
  • the user equipment described above is only an example, and other existing or future user equipment that may be applicable to the present invention is also included in the scope of the present invention and is incorporated herein by reference.
  • the method according to the present embodiment includes step S1, step S2, and step S3.
  • in step S1, the user equipment acquires the user's hand shape information.
  • the hand shape information includes any information capable of reflecting the shape of the hand.
  • the hand shape information includes, but is not limited to, an overall aspect ratio of the hand; a ratio of the finger to the length of the entire hand; a thickness of the finger; a width of the palm; a ratio of the length between the fingers.
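The attributes listed above can be grouped into a simple record. The sketch below (in Python, with field names that are our own illustrative choices, not terms from the patent) shows one way such hand shape information might be held:

```python
from dataclasses import dataclass

@dataclass
class HandShapeInfo:
    """Illustrative container for the hand shape attributes named above.

    Values are plain ratios or millimetre measurements; every field
    name here is an assumption for this sketch, not a patent term.
    """
    aspect_ratio: float          # overall length-to-width ratio of the hand
    finger_length_ratio: float   # finger length / whole-hand length
    finger_thickness_mm: float   # average finger thickness
    palm_width_mm: float         # width of the palm
    finger_length_ratios: tuple  # per-finger length ratios (thumb..little)

# A hypothetical user's measurements.
user_hand = HandShapeInfo(
    aspect_ratio=1.7,
    finger_length_ratio=0.52,
    finger_thickness_mm=16.0,
    palm_width_mm=88.0,
    finger_length_ratios=(0.45, 0.78, 1.0, 0.92, 0.70),
)
print(user_hand.finger_length_ratio)
```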
  • the implementation manner in which the user equipment acquires the user's hand shape information includes but is not limited to:
  • the user equipment directly acquires the hand shape information of the user stored in the user equipment.
  • the user may store the hand shape information in the user equipment in advance, and the user equipment may directly acquire the hand shape information of the user stored in the user equipment when the user uses the user equipment to generate the personalized input panel.
  • the user equipment acquires the hand shape information of the user stored in the network device.
  • for example, a user may store his hand shape information in a network device in advance, and when the user uses the user equipment to generate a personalized input panel, the user equipment obtains the user's hand shape information stored in the network device by accessing that network device.
  • step S1 further includes step S11 and step S12.
  • in step S11, the user equipment acquires a hand shape image taken by the user.
  • the implementation manner of the user equipment acquiring the hand shape image captured by the user includes but is not limited to:
  • the user device extracts a hand-shaped image taken by the user from the image already stored in the user device.
  • the user device extracts a certain hand shape image from its album as a hand shape image taken by the user according to the user's selection.
  • Step S11 further includes step S11-1 and step S11-2.
  • in step S11-1, the user equipment invokes the camera of the user equipment and presents position indication information on the screen.
  • the position indication information is used to indicate a suitable position of the hand on the screen.
  • the position indication information may be expressed as a shape of a hand or a box; preferably, the position indication information appears as a standard hand shape.
  • the camera of the user equipment invoked by the user equipment is a front camera of the user equipment.
  • the user can place the hand in front of the camera and cause the hand displayed on the screen to be in the proper position indicated by the position indication information to perform the shooting operation.
  • in step S11-2, the user equipment obtains the hand shape image according to the user's photographing operation.
  • for example, in step S11-1, the user equipment invokes its front camera and presents on the screen a box indicating the appropriate position of the hand; in step S11-2, the user equipment obtains the hand image in the box according to the user's operation of photographing, through the front camera, the hand placed in the box.
  • in step S12, the user equipment extracts the hand shape information from the hand shape image acquired in step S11.
  • the user equipment can extract the hand shape information from the hand shape image in various ways, for example by analyzing the image and extracting the hand contour to obtain the hand shape information.
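As an illustration of the contour-analysis idea, the toy Python sketch below derives an overall aspect ratio from a binary hand mask; a real implementation would run contour extraction on the captured camera image, and every detail here (mask format, metric names) is an assumption for illustration:

```python
def hand_metrics_from_mask(mask):
    """Toy extraction step: given a binary hand mask (rows of 0/1),
    take the bounding box of the hand pixels and report the overall
    aspect ratio.  This only illustrates the idea of deriving hand
    shape information from an image."""
    points = [(x, y) for y, row in enumerate(mask)
              for x, v in enumerate(row) if v]
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    width = max(xs) - min(xs) + 1
    height = max(ys) - min(ys) + 1
    return {"aspect_ratio": height / width,
            "width_px": width, "height_px": height}

# A crude 6x4 "hand": two long fingers over a wider palm row.
mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
]
print(hand_metrics_from_mask(mask))
```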
  • in step S2, the user equipment compares the user's hand shape information with the predetermined hand shape information to obtain the user's hand shape feature information.
  • the predetermined hand shape information may be the hand shape information of a standard hand shape.
  • the hand shape feature information may include feature information corresponding to the hand shape information.
  • the hand shape feature information includes, but is not limited to: an overall length-to-width ratio feature of the hand; a feature of the ratio of the fingers to the length of the whole hand; a thickness feature of the fingers; a width feature of the palm; a length ratio feature between the fingers; etc.
  • for example, the user equipment compares the user's hand shape information with the predetermined hand shape information and finds that the ratio of the user's fingers to the length of the whole hand is lower than the corresponding ratio in the standard hand shape; the user equipment thus obtains, as the user's hand shape feature information, a lower ratio of the fingers to the length of the whole hand.
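The comparison in step S2 can be sketched as a per-attribute ratio test against the standard hand shape. In the Python sketch below, the attribute keys and the ±5% tolerance are illustrative assumptions, not values from the patent:

```python
def compare_hand_shape(user, standard, tolerance=0.05):
    """Compare the user's attributes against a standard hand shape and
    label each one 'lower', 'higher', or 'typical'.  The dict-based
    representation and the tolerance are assumptions for this sketch."""
    features = {}
    for key, std_value in standard.items():
        ratio = user[key] / std_value
        if ratio < 1.0 - tolerance:
            features[key] = "lower"
        elif ratio > 1.0 + tolerance:
            features[key] = "higher"
        else:
            features[key] = "typical"
    return features

standard = {"finger_length_ratio": 0.55, "palm_width_mm": 85.0}
user = {"finger_length_ratio": 0.48, "palm_width_mm": 86.0}
print(compare_hand_shape(user, standard))
```

Here the user's finger-length ratio comes out "lower" than the standard, matching the example in the text, while the palm width is within tolerance.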
  • in step S3, the user equipment generates, according to the hand shape feature information, a personalized input panel suitable for the user.
  • the implementation manner of the user equipment generating the personalized input panel applicable to the user according to the hand shape feature information includes but is not limited to:
  • the user equipment directly calculates the location of each button in the personalized input panel according to the user's hand shape feature information to generate a personalized input panel suitable for the user.
  • for example, the user equipment first calculates the positions of the hotspot buttons in the input panel according to the overall length-to-width ratio of the user's hand, and then calculates the positions of the other, non-hotspot buttons based on the determined positions of the hotspot buttons, to generate a personalized input panel suitable for the user.
  • the hotspot button is a button that is more easily clicked by the user, such as a character button frequently clicked by the user.
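The hotspot-first layout computation described above might look like the following sketch, where all geometry (panel size, the row-placement formula, the vertical offset) is an invented placeholder rather than the patent's actual method:

```python
def layout_keys(hot_keys, other_keys, hand_aspect_ratio,
                panel_width=120.0, panel_height=60.0):
    """Sketch of the hotspot-first layout step: hotspot keys are spaced
    across a central row whose vertical position scales with the hand's
    length-to-width ratio, then the remaining keys are placed on an
    offset row.  All geometry here is an illustrative assumption."""
    positions = {}
    # Longer hands -> hotspot row placed a bit higher on the panel.
    hot_row_y = panel_height * min(0.6, 0.3 + 0.1 * hand_aspect_ratio)
    step = panel_width / (len(hot_keys) + 1)
    for i, key in enumerate(hot_keys, start=1):
        positions[key] = (i * step, hot_row_y)
    # Non-hotspot keys fill a row offset from the hotspot row.
    step = panel_width / (len(other_keys) + 1)
    for i, key in enumerate(other_keys, start=1):
        positions[key] = (i * step, hot_row_y - 20.0)
    return positions

pos = layout_keys(["e", "a", "o"], ["q", "z"], hand_aspect_ratio=1.8)
print(pos["e"])
```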
  • in addition, the user equipment can also calculate, according to the user's hand shape feature information, the positions of areas the user is likely to touch by accident, and reduce the touch sensitivity of the buttons located in those easily mistouched areas. For example, if the hand shape feature information acquired by the user equipment indicates that the user's little finger is short, the user equipment can calculate the area the little finger tends to brush against, treat it as an area the user easily mistouches, and reduce the touch sensitivity of each button in that area to prevent erroneous input caused by the user's accidental touches.
  • the touch sensitivity indicates how sensitively the user equipment responds to the user's touch operations: the higher the touch sensitivity, the more easily a touch operation performed by the user is detected; the lower the touch sensitivity, the less easily it is detected.
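Reducing the touch sensitivity of buttons inside an easily mistouched area can be sketched as follows; the 0-to-1 sensitivity scale, the rectangular region, and the 0.5 factor are assumptions for illustration:

```python
def adjust_sensitivity(buttons, mistouch_region, factor=0.5):
    """Lower the touch sensitivity of every button falling inside a
    region the user is likely to touch by accident (e.g. under a short
    little finger).  mistouch_region is an (x0, y0, x1, y1) rectangle;
    the representation and the factor are illustrative assumptions."""
    x0, y0, x1, y1 = mistouch_region
    for b in buttons:
        bx, by = b["pos"]
        if x0 <= bx <= x1 and y0 <= by <= y1:
            b["sensitivity"] *= factor
    return buttons

buttons = [
    {"key": "p", "pos": (110, 40), "sensitivity": 1.0},  # near right edge
    {"key": "g", "pos": (60, 30), "sensitivity": 1.0},   # centre of panel
]
adjust_sensitivity(buttons, mistouch_region=(100, 0, 120, 60))
print(buttons[0]["sensitivity"], buttons[1]["sensitivity"])
```

Only the edge button's sensitivity is halved; the centre button is untouched.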
  • the user equipment adjusts the predetermined input panel according to the hand shape feature information to obtain the personalized input panel.
  • the predetermined input panel is adapted to predetermined hand shape information.
  • the operation of adjusting the predetermined input panel may include any adjustment to the layout of the predetermined input panel.
  • the operation of adjusting the predetermined input panel includes, but is not limited to: a) adjusting the positions of the hotspot buttons of the predetermined input panel, for example moving them to the left or to the right; b) adjusting the touch sensitivity of a partial area of the predetermined input panel, for example lowering it so that a heavier touch is required before the user equipment determines that a button is pressed by the user; preferably, the area whose touch sensitivity is adjusted is an area of the predetermined input panel that, according to the hand shape feature information, is more easily mistouched by the user.
  • the user equipment can adopt various manners to adjust the predetermined input panel according to the hand shape feature information, thereby obtaining the personalized input panel.
  • for example, the user equipment determines, from the length ratio feature between the fingers in the hand shape feature information, that the user's little finger and index finger are short; the user equipment then moves the buttons these fingers need to reach, so that their positions are closer to the positions of the buttons easily reached by the middle finger and the ring finger.
  • the user equipment may further determine the position of each button in the personalized input panel by combining the size of the area capable of presenting the personalized input panel.
  • for example, the user equipment is a mobile device, the size of the area on the mobile device capable of presenting the personalized input panel is 12 × 6 cm, and each button in the personalized input panel determined by the user equipment should be located within that area.
  • by the above method, the user equipment obtains an input panel that meets the user's personalized requirements by comparing the user's hand shape information with standard hand shape information, which improves the accuracy of the user's touch operations on the input panel and reduces the possibility of erroneous input caused by accidental touches; moreover, the user equipment presents position indication information on the screen, so that when shooting the hand shape image the user places the palm within the position indication information as far as possible and the distance between the palm and the lens is better controlled, which reduces the influence that the position of the hand at capture time has on the user equipment's analysis of the acquired hand image.
  • FIG. 2 is a schematic flow chart of a method for generating a personalized input panel in a user equipment according to another preferred embodiment of the present invention.
  • the method of this embodiment is mainly implemented by a user equipment; any description of the user equipment given in the embodiment shown in FIG. 1 is incorporated in this embodiment by reference.
  • the method of this embodiment includes step S1, step S2, step S3, and step S4; wherein step S3 further includes step S31. Steps S1 and S2 have been described in detail in the embodiment shown in FIG. 1, and details are not described herein again.
  • Step S4 and step S31 of the present embodiment will be described in detail below.
  • in step S4, the user equipment acquires the user's contact information on the user equipment.
  • the contact information includes any information of contacts on the user equipment.
  • the contact information comprises at least one of the following:
  • for example, the hotspot buttons on a previously presented input panel can be determined from the position information of the contacts on the user equipment combined with the information of the input panel previously presented by the user equipment, e.g. by taking the buttons corresponding to positions with a high touch rate as hotspot buttons.
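Determining hotspot buttons from historical contacts, as described above, amounts to mapping each touch point to the button under it on the previously shown panel and counting hits. In the sketch below, the `button_at` lookup and the toy one-row panel are hypothetical stand-ins:

```python
from collections import Counter

def hot_keys_from_contacts(contact_log, button_at, top_n=3):
    """Map each historical contact to the button under it, count hits,
    and keep the most-touched keys as hotspot buttons.  `button_at` is
    a hypothetical lookup from screen coordinates to a key name."""
    counts = Counter(button_at(x, y) for x, y in contact_log)
    counts.pop(None, None)  # discard contacts that landed on no button
    return [key for key, _ in counts.most_common(top_n)]

def button_at(x, y):
    # Toy panel: a single row of keys, each 10 units wide.
    row = ["a", "b", "c", "d"]
    idx = int(x // 10)
    return row[idx] if 0 <= idx < len(row) and 0 <= y <= 10 else None

log = [(5, 5), (6, 4), (15, 5), (5, 6), (35, 5)]
print(hot_keys_from_contacts(log, button_at, top_n=2))
```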
  • for another example, the user equipment acquires the shape information of the contact of the user's thumb on the user equipment; since the thumb's contact on the screen is generally not a perfect circle, nor close to one, the user equipment can determine from a contact's shape whether it is a thumb contact, and determine from the shape information of the thumb contact whether the user is using the left hand or the right hand.
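The thumb-contact heuristic can be sketched as an elongation test (a thumb print is noticeably non-circular) plus a tilt-based left/right guess; the 1.3 elongation threshold and the tilt sign convention below are assumptions, not values from the patent:

```python
def is_thumb_contact(width, height, threshold=1.3):
    """A thumb contact is noticeably elongated rather than circular;
    the 1.3 elongation threshold is an assumption for this sketch."""
    return max(width, height) / min(width, height) >= threshold

def hand_from_thumb_tilt(tilt_degrees):
    """Guess handedness from the tilt of the thumb contact's major
    axis: a right thumb typically leans one way and a left thumb the
    other.  The sign convention here is an assumption."""
    return "right" if tilt_degrees > 0 else "left"

print(is_thumb_contact(9.0, 14.0))   # elongated -> likely a thumb
print(hand_from_thumb_tilt(25.0))
```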
  • the implementation manner of the user equipment acquiring the contact information of the user on the user equipment includes but is not limited to:
  • the user equipment obtains historical contact information of the user on the user equipment.
  • the user device acquires contact information or the like recorded by the user on a predetermined input panel.
  • the user equipment prompts the user to pre-enter one or more letters on the predetermined input panel, and uses the contact information in the pre-input operation as the contact information of the user on the user equipment.
  • the user equipment prompts the user to make an input randomly on the predetermined input panel, and uses the contact information input by the user for a predetermined period of time as the contact information of the user on the user equipment.
  • in step S31, the user equipment generates a personalized input panel suitable for the user according to the hand shape feature information and the contact information.
  • the implementation manner of the user equipment generating the personalized input panel applicable to the user according to the hand shape feature information and the contact information includes but is not limited to:
  • the user equipment directly calculates the location of each button in the personalized input panel according to the user's hand shape feature information and the contact information to generate a personalized input panel suitable for the user.
  • for example, the user's hand shape feature information indicates that the overall length-to-width ratio of the user's hand is large, and the contact information includes the shape information of a plurality of contacts; the user equipment identifies, among the shapes of the plurality of contacts, a shape that is neither a perfect circle nor close to one as the shape of the contact pressed by the thumb, and determines from that shape that the user is using the right hand; the user equipment then calculates the initial positions of the hotspot buttons in the input panel according to the overall length-to-width ratio of the user's hand, shifts each hotspot button from its initial position to the right, and calculates the positions of the other, non-hotspot buttons based on the shifted hotspot button positions, to generate a personalized input panel suitable for the user.
  • in addition, the user equipment may also calculate, according to the user's hand shape feature information, the areas the user is likely to mistouch, and reduce the touch sensitivity of the buttons in those areas.
  • the user equipment adjusts the predetermined input panel according to the hand shape feature information and the contact information to obtain the personalized input panel.
  • the operation that can be performed on the predetermined input panel has been described in detail in step S3 in FIG. 1, and details are not described herein again.
  • the user equipment can adopt various manners to adjust the predetermined input panel according to the hand shape feature information and the contact information, thereby obtaining the personalized input panel.
  • for example, the hand shape feature information indicates a low ratio of the fingers to the length of the whole hand, and the user equipment determines from the contact information that the user is using the right hand; the user equipment then shifts the positions of the hotspot buttons of the predetermined input panel to the right.
  • for another example, the hand shape feature information indicates a wide palm, and the user equipment determines from the hand shape feature information that the user uses the left hand; the user equipment then reduces the touch sensitivity of the leftmost column of the predetermined input panel, which reduces the possibility of the user accidentally touching the edge area of the input panel.
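The left-hand adjustment above can be sketched as lowering sensitivity along one edge column of the panel; the grid representation and the 0.7 factor are illustrative assumptions:

```python
def desensitize_edge_column(panel, hand, factor=0.7):
    """If the user types with the left hand, lower sensitivity on the
    leftmost column (symmetrically, the rightmost for the right hand),
    reducing accidental edge touches.  Column indexing and the factor
    are assumptions for this sketch."""
    col = 0 if hand == "left" else len(panel[0]) - 1
    for row in panel:
        row[col]["sensitivity"] *= factor
    return panel

# Toy 2x3 panel grid of buttons, all at full sensitivity.
panel = [[{"key": k, "sensitivity": 1.0} for k in row]
         for row in (["q", "w", "e"], ["a", "s", "d"])]
desensitize_edge_column(panel, hand="left")
print(panel[0][0]["sensitivity"], panel[0][2]["sensitivity"])
```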
  • by the above method, the user equipment can adjust the input panel according to the user's hand shape feature information and contact information to generate a personalized input panel that meets the user's needs, so that the generated personalized input panel better conforms to the user's usage habits.
  • FIG. 3 is a schematic structural diagram of an input panel generating apparatus for generating a personalized input panel in a user equipment according to a preferred embodiment of the present invention.
  • the input panel generating device includes a first acquiring device 1, a comparing device 2, and a generating device 3.
  • the input panel generating device is included in the user equipment.
  • the first acquisition device 1 acquires the user's hand shape information.
  • the hand shape information includes any information capable of reflecting the shape of the hand.
  • the hand shape information includes, but is not limited to, an overall aspect ratio of the hand; a ratio of the finger to the length of the entire hand; a thickness of the finger; a width of the palm; a ratio of the length between the fingers.
  • the implementation manner of acquiring the hand shape information of the user by the first acquiring device 1 includes but is not limited to:
  • the first acquisition device 1 directly acquires the hand shape information of the user stored in the user device.
  • for example, the user may store the hand shape information in the user equipment in advance, and when the user uses the user equipment to generate the personalized input panel, the first obtaining device 1 may directly acquire the user's hand shape information stored in the user equipment.
  • the first acquisition device 1 acquires the hand shape information of the user stored in the network device.
  • for example, the user may store the hand shape information in a network device in advance, and when the user uses the user equipment to generate the personalized input panel, the first obtaining device 1 may acquire the user's hand shape information stored in the network device by accessing the network device.
  • the first acquisition device 1 acquires the user's hand shape information according to the image taken by the user.
  • the first obtaining device 1 further includes a sub-acquisition device (not shown) and an extracting device (not shown).
  • the sub-acquisition device acquires a hand-shaped image taken by the user.
  • the implementation manner of the sub-acquisition device acquiring the hand-shaped image captured by the user includes but is not limited to:
  • the sub-acquisition device extracts a hand-shaped image taken by the user from the image already stored in the user device.
  • the sub-acquisition device extracts a certain hand shape image from the album of the user device as a hand-shaped image photographed by the user according to the user's selection.
  • the sub-acquisition device further includes a presentation device (not shown) and an image acquisition device (not shown).
  • the rendering device invokes the camera of the user device and presents location indication information on the screen.
  • the position indication information is used to indicate a suitable position of the hand on the screen.
  • the position indication information may be expressed as a shape of a hand or a box; preferably, the position indication information appears as a standard hand shape.
  • the camera of the user equipment invoked by the presentation device is the front camera of the user equipment.
  • the user can place the hand in front of the camera and cause the hand displayed on the screen to be in the proper position indicated by the position indication information to perform the shooting operation.
  • the image acquisition device obtains the hand shape image according to the photographing operation of the user.
  • for example, the rendering device invokes the front camera and presents on the screen a box indicating the appropriate position of the hand; the image acquisition device obtains the hand image in the box according to the user's operation of photographing, through the front camera, the hand placed in the box.
  • The extracting device extracts the hand shape information from the hand shape image acquired by the sub-acquiring device.
  • The extracting device can extract the hand shape information from the hand shape image in various ways, for example, by analyzing the image and extracting the hand contour to obtain the hand shape information.
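The patent does not specify the contour analysis further. As a hedged illustration only, the sketch below derives two of the hand shape measurements named in this document (the overall aspect ratio of the hand and the palm width) from a binary hand mask; the function name, the mask representation, and the toy data are assumptions, not the patented implementation.

```python
# Hypothetical sketch: derive simple hand shape measurements from a binary
# hand mask (1 = hand pixel). This stands in for the contour analysis the
# extracting device might perform; it is not the patented algorithm.

def hand_shape_info(mask):
    """Return overall aspect ratio and palm width (in pixels) of a hand mask."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for row in mask for c, v in enumerate(row) if v]
    height = max(rows) - min(rows) + 1          # bounding-box height
    width = max(cols) - min(cols) + 1           # bounding-box width
    # Palm width approximated as the widest run of hand pixels in any row.
    palm_width = max(sum(row) for row in mask)
    return {"aspect_ratio": height / width, "palm_width": palm_width}

# A toy 6x4 mask: a narrow "finger" column above a wider "palm" block.
mask = [
    [0, 1, 0, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 1, 1, 0],
]
info = hand_shape_info(mask)   # aspect_ratio 6/4 = 1.5, palm_width 4
```

Real implementations would of course work on camera images rather than hand-built masks, but the measurements they feed into the comparison step are of this kind.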
  • The comparing device 2 compares the user's hand shape information with predetermined hand shape information to obtain the user's hand shape feature information.
  • The predetermined hand shape information may be the hand shape information of a standard hand shape.
  • The hand shape feature information may include feature information corresponding to the hand shape information.
  • Preferably, the hand shape feature information includes, but is not limited to: the overall aspect ratio of the hand; the proportion of the fingers in the total length of the hand; the thickness of the fingers; the width of the palm; the length ratios between the fingers; and so on.
  • For example, the comparing device 2 compares the user's hand shape information with the predetermined hand shape information and finds that the proportion of the user's fingers in the total length of the hand is lower than the corresponding proportion in the standard hand shape; the comparing device 2 then obtains, as the user's hand shape feature information, that the proportion of the fingers in the total length of the hand is low.
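The comparison step above can be sketched as follows. The standard-hand values, the feature names, and the 10% tolerance are illustrative assumptions; the document only states that the user's measurements are compared against a standard hand shape to yield qualitative feature information (e.g. "the finger proportion is low").

```python
# Hedged sketch of comparing device 2: each measurement of the user's hand
# is labelled relative to a standard hand shape. Values are illustrative.

STANDARD = {"finger_to_hand_ratio": 0.45, "palm_width_cm": 8.5}

def hand_shape_features(user, standard=STANDARD, tolerance=0.10):
    """Label each measurement as 'low', 'high', or 'typical' vs. the standard."""
    features = {}
    for name, std_value in standard.items():
        ratio = user[name] / std_value
        if ratio < 1 - tolerance:
            features[name] = "low"
        elif ratio > 1 + tolerance:
            features[name] = "high"
        else:
            features[name] = "typical"
    return features

# A user whose fingers take up less of the hand than in the standard shape.
features = hand_shape_features({"finger_to_hand_ratio": 0.38,
                                "palm_width_cm": 8.6})
```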
  • The generating device 3 generates a personalized input panel suitable for the user according to the hand shape feature information.
  • Implementations by which the generating device 3 generates the personalized input panel suitable for the user according to the hand shape feature information include, but are not limited to:
  • 1) The generating device 3 directly calculates the position of each button in the personalized input panel according to the user's hand shape feature information, so as to generate a personalized input panel suitable for the user.
  • For example, if the user's hand shape feature information indicates a large overall aspect ratio of the hand, the generating device 3 first calculates the positions of the hotspot buttons in the input panel according to that aspect ratio, and then, based on the determined positions of the hotspot buttons, calculates the positions of the other, non-hotspot buttons, thereby generating a personalized input panel suitable for the user.
  • A hotspot button is a button that is more likely to be tapped by the user, such as a character button tapped at a high frequency.
  • Preferably, the generating device 3 can also calculate, according to the user's hand shape feature information, the location of the areas the user is likely to touch by mistake, and reduce the touch sensitivity of the buttons located in those areas. For example, if the hand shape feature information acquired by the comparing device 2 indicates that the user's little finger is short, the generating device 3 can calculate the area that the little finger tends to brush against, treat it as an area the user easily touches by mistake, and reduce the touch sensitivity of the buttons in that area so as to prevent erroneous input caused by accidental touches.
  • The touch sensitivity indicates how sensitive the user equipment is to the user's touch operations: in general, the higher the touch sensitivity, the more easily a touch operation performed by the user is detected; the lower the touch sensitivity, the less easily a touch operation is detected.
  • 2) The generating device 3 includes a first adjusting device (not shown).
  • The first adjusting device adjusts a predetermined input panel according to the hand shape feature information to obtain the personalized input panel.
  • Preferably, the predetermined input panel is adapted to the predetermined hand shape information.
  • The operation of adjusting the predetermined input panel may include any operation that adjusts the layout of the predetermined input panel.
  • Preferably, the operation of adjusting the predetermined input panel includes, but is not limited to: a) adjusting the positions of the hotspot buttons of the predetermined input panel, for example, shifting the hotspot buttons to the left or to the right; b) adjusting the touch sensitivity of part of the predetermined input panel, for example, lowering the touch sensitivity so that a heavier touch is required before the user equipment determines that a button has been pressed; preferably, the area whose touch sensitivity is adjusted is an area of the predetermined input panel, determined on the basis of the hand shape feature information, that is more easily touched by the user by mistake.
  • The first adjusting device can adjust the predetermined input panel according to the hand shape feature information in various ways to obtain the personalized input panel.
  • For example, if the first adjusting device determines, from the length ratios between the fingers in the hand shape feature information, that the user's little finger and index finger are short, the first adjusting device moves the buttons that the little finger and index finger are expected to reach, so that the positions of these buttons are closer to the positions of the buttons reached by the middle finger and ring finger.
  • It should be noted that, in the process of generating the personalized input panel, the generating device 3 may further determine the position of each button in the personalized input panel in combination with the size of the area in which the personalized input panel can be presented.
  • For example, if the user equipment is a mobile device and the area on the mobile device in which the personalized input panel can be presented measures 12×6 cm, every button of the personalized input panel determined by the generating device 3 should be located within that area.
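The area constraint just described can be sketched as a final fitting pass over the generated layout. The rectangle representation and the uniform-scaling strategy are assumptions for illustration; the document only requires that every button lie within the presentable area.

```python
# Hedged sketch: constrain a generated button layout to the presentable
# area (12 x 6 cm in the example above). Buttons are (x, y, w, h)
# rectangles in centimetres; the layout is scaled down uniformly if it
# overflows the area. Names and strategy are illustrative assumptions.

def fit_to_area(buttons, area_w=12.0, area_h=6.0):
    """Scale a button layout down so that all buttons lie within the area."""
    max_x = max(x + w for x, y, w, h in buttons)
    max_y = max(y + h for x, y, w, h in buttons)
    scale = min(1.0, area_w / max_x, area_h / max_y)
    return [(x * scale, y * scale, w * scale, h * scale)
            for x, y, w, h in buttons]

# A layout 16 cm wide is scaled by 12/16 = 0.75 to fit the 12 x 6 cm area.
fitted = fit_to_area([(0, 0, 8, 2), (8, 0, 8, 2), (0, 2, 16, 2)])
```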
  • According to the solution of this embodiment, the user equipment obtains an input panel that meets the user's personalized requirements by comparing the user's hand shape information with standard hand shape information, which improves the accuracy of the user's touch operations on the input panel and reduces the likelihood of input operations triggered by accidental touches; moreover, by presenting position indication information on the screen, the user equipment encourages the user to place the palm within the indicated position when shooting the hand shape image, so that the distance between the palm and the lens can be better controlled, which in turn reduces the influence that the position of the hand during shooting has on the user equipment's analysis of the acquired hand shape image.
  • The input panel generating apparatus of this embodiment includes a first acquiring device 1, a comparing device 2, a generating device 3 and a second acquiring device 4; the generating device 3 further includes a sub-generating device 31.
  • The first acquiring device 1 and the comparing device 2 have been described in detail in the embodiment shown in FIG. 3 and are not described again here.
  • The second acquiring device 4 and the sub-generating device 31 of this embodiment are described in detail below.
  • The second acquiring device 4 acquires the user's contact information on the user equipment.
  • The contact information includes any information about contact points on the user equipment.
  • Preferably, the contact information comprises at least one of the following:
  • 1) Position information of contact points on the user equipment.
  • Preferably, the hotspot buttons of a previously presented input panel can be determined from the position information of contact points on the user equipment combined with information about that input panel; for example, a button corresponding to a frequently touched position is taken as a hotspot button.
  • 2) Shape information of contact points.
  • Preferably, the second acquiring device 4 acquires the shape information of the contact made by the user's thumb on the user equipment; since a thumb's contact on the screen is generally not a perfect circle, or even close to one, the second acquiring device 4 can use the contact shape to determine whether a contact was made by the thumb, and can determine from the shape information of the thumb's contact whether the user is using the left hand or the right hand.
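The thumb-detection idea above rests on the observation that a thumb contact is markedly less circular than a fingertip contact. A minimal sketch follows, using the standard circularity measure 4πA/P² (1.0 for a perfect circle); the 0.85 threshold is an assumption, not a value taken from the patent.

```python
# Hedged sketch: classify a contact as a thumb contact when its shape is
# markedly non-circular. Circularity = 4*pi*area/perimeter^2 is 1.0 for a
# perfect circle and lower for elongated shapes. Threshold is illustrative.
import math

def circularity(area, perimeter):
    return 4 * math.pi * area / (perimeter ** 2)

def is_thumb_contact(area, perimeter, threshold=0.85):
    """Treat markedly non-circular contacts as thumb contacts."""
    return circularity(area, perimeter) < threshold

# Near-circular fingertip contact of radius 5: circularity exactly 1.0.
fingertip = is_thumb_contact(math.pi * 25, 2 * math.pi * 5)   # -> False
# Elongated thumb contact approximated as a 12 x 4 rectangle.
thumb = is_thumb_contact(48, 32)                              # -> True
```

On real touch screens the contact geometry would come from the platform's touch events; how the handedness is then inferred from the thumb contact's shape is left open by the document.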
  • Implementations by which the second acquiring device 4 acquires the user's contact information on the user equipment include, but are not limited to:
  • a) The second acquiring device 4 acquires the user's historical contact information on the user equipment.
  • For example, the second acquiring device 4 acquires contact information it previously recorded for the user on the predetermined input panel.
  • b) The second acquiring device 4 prompts the user to pre-enter one or more letters on the predetermined input panel and uses the contact information from that pre-input operation as the user's contact information on the user equipment.
  • For example, the second acquiring device 4 prompts the user to type freely on the predetermined input panel and uses the contact information entered by the user within a predetermined period of time as the user's contact information on the user equipment.
  • The sub-generating device 31 generates a personalized input panel suitable for the user according to the hand shape feature information and the contact information.
  • Implementations by which the sub-generating device 31 generates the personalized input panel suitable for the user according to the hand shape feature information and the contact information include, but are not limited to:
  • 1) The sub-generating device 31 directly calculates the position of each button in the personalized input panel according to the user's hand shape feature information and contact information, so as to generate a personalized input panel suitable for the user.
  • For example, suppose the user's hand shape feature information indicates a large overall aspect ratio of the hand, and the contact information includes the shape information of a plurality of contacts. The sub-generating device 31 identifies, among the shapes of those contacts, a shape that is not a perfect circle (or close to one) as the shape of the contact pressed by the thumb, and determines from that shape that the user is using the right hand. The sub-generating device 31 then calculates the initial positions of the hotspot buttons in the input panel according to the overall aspect ratio of the user's hand, shifts each hotspot button to the right from its initial position, and finally, based on the shifted hotspot button positions, calculates the positions of the other, non-hotspot buttons, thereby generating a personalized input panel suitable for the user.
  • It should be noted that the sub-generating device 31 can also calculate, according to the user's hand shape feature information, the areas the user is likely to touch by mistake, and reduce the touch sensitivity of the buttons within those areas.
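The generation sequence just described (hotspot positions derived from the hand's aspect ratio, a handedness shift, then the remaining keys) can be sketched as follows. The key-width formula, the 0.5 cm shift, and the key sets are purely illustrative assumptions; the document does not specify these quantities.

```python
# Hedged sketch of sub-generating device 31's flow: place hotspot buttons
# first, shift them toward the active hand, then fill in non-hotspot keys.
# All constants below are illustrative assumptions.

def generate_panel(aspect_ratio, right_handed, hotspots, others, key_h=1.5):
    key_w = 1.0 + 0.2 * aspect_ratio         # wider keys for a larger hand
    shift = 0.5 if right_handed else -0.5    # cm; handedness offset
    layout = {}
    # Initial hotspot positions along one row, then the handedness shift.
    for i, key in enumerate(hotspots):
        layout[key] = (i * key_w + shift, key_h)     # (x, y) in cm
    # Non-hotspot keys are placed on the next row, relative to the hotspots.
    for i, key in enumerate(others):
        layout[key] = (i * key_w, 2 * key_h)
    return layout

layout = generate_panel(aspect_ratio=1.6, right_handed=True,
                        hotspots=["e", "a", "n"], others=["q", "z"])
```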
  • 2) The sub-generating device 31 includes a second adjusting device (not shown).
  • The second adjusting device adjusts the predetermined input panel according to the hand shape feature information and the contact information to obtain the personalized input panel.
  • The operations that can be performed on the predetermined input panel have been described in detail in the description of the generating device 3 with reference to FIG. 3 and are not repeated here.
  • The second adjusting device can adjust the predetermined input panel according to the hand shape feature information and the contact information in various ways to obtain the personalized input panel.
  • For example, if the hand shape feature information indicates that the proportion of the fingers in the total length of the hand is low, and the second adjusting device determines from the contact information that the user is using the right hand, the second adjusting device shifts the hotspot buttons of the predetermined input panel to the right. For a user whose short fingers limit how far the fingers can travel, shifting the hotspot buttons to the right shortens the distance the fingers must move and makes the buttons easier to reach.
  • As another example, if the hand shape feature information indicates a wide palm and the second adjusting device determines from the hand shape feature information that the user is using the left hand, the second adjusting device reduces the touch sensitivity of the leftmost column of the predetermined input panel, thereby reducing the likelihood that the user accidentally touches the edge area of the input panel.
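The two example adjustments above can be sketched together as one pass over a predetermined layout. The data model, the 0.3 cm shift, and the halving of sensitivity are illustrative assumptions standing in for whatever concrete values an implementation would choose.

```python
# Hedged sketch of the second adjusting device's two example adjustments:
# shift hotspot buttons toward the active hand when the finger proportion
# is low, and damp the edge column nearest a left-handed user's palm.

def adjust_panel(panel, hand, finger_ratio_low):
    """panel: {key: {"x": cm, "col": int, "hotspot": bool, "sensitivity": 0..1}}"""
    adjusted = {k: dict(v) for k, v in panel.items()}
    dx = 0.3 if hand == "right" else -0.3            # cm; illustrative
    leftmost = min(v["col"] for v in panel.values())
    for spec in adjusted.values():
        if finger_ratio_low and spec["hotspot"]:
            spec["x"] += dx                  # move hotspots toward the thumb
        if hand == "left" and spec["col"] == leftmost:
            spec["sensitivity"] *= 0.5       # damp the column the palm brushes
    return adjusted

panel = {"a": {"x": 1.0, "col": 0, "hotspot": True,  "sensitivity": 1.0},
         "l": {"x": 9.0, "col": 9, "hotspot": False, "sensitivity": 1.0}}
right = adjust_panel(panel, hand="right", finger_ratio_low=True)
left = adjust_panel(panel, hand="left", finger_ratio_low=False)
```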
  • According to the solution of this embodiment, the input panel generating apparatus can adjust the input panel in combination with the user's hand shape feature information and contact information to generate a personalized input panel that meets the user's requirements, so that the generated personalized input panel better matches the user's usage habits.
  • It should be noted that the present invention can be implemented in software and/or in a combination of software and hardware.
  • For example, the various devices of the present invention can be implemented using an application-specific integrated circuit (ASIC) or any other similar hardware device.
  • In one embodiment, the software program of the present invention may be executed by a processor to implement the steps or functions described above.
  • Likewise, the software program of the present invention (including related data structures) may be stored in a computer-readable recording medium such as a RAM, a magnetic or optical drive, or a floppy disk and similar devices.
  • In addition, some of the steps or functions of the present invention may be implemented in hardware, for example, as a circuit that cooperates with a processor to perform the various steps or functions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

 The present invention provides a method for generating a personalized input panel in a user equipment, the method comprising the following steps: a. acquiring hand shape information of a user; b. comparing the user's hand shape information with predetermined hand shape information to obtain hand shape feature information of the user; c. generating, according to the hand shape feature information, a personalized input panel suitable for the user. According to the solution of the present invention, a user can be provided with a personalized input panel that matches the user's hand shape features and usage habits.

Description

Method and Apparatus for Generating a Personalized Input Panel

Technical Field
The present invention relates to the field of computer technology, and in particular to a method and apparatus for generating a personalized input panel.
Background
In the prior art, the available styles of input panels on user equipment, and in particular on touch-screen devices, are usually fixed. Different users, however, often differ in how they use an input panel, which makes it difficult for such fixed input panels to adapt well to a user's input habits.
Summary of the Invention
The object of the present invention is to provide a method and apparatus for generating a personalized input panel.
According to one aspect of the present invention, a method for generating a personalized input panel in a user equipment is provided, the method comprising the following steps:
a. acquiring hand shape information of a user;
b. comparing the user's hand shape information with predetermined hand shape information to obtain hand shape feature information of the user;
c. generating, according to the hand shape feature information, a personalized input panel suitable for the user.
According to another aspect of the present invention, an input panel generating apparatus for generating a personalized input panel in a user equipment is also provided, the apparatus comprising:
a first acquiring device for acquiring hand shape information of a user;
a comparing device for comparing the user's hand shape information with predetermined hand shape information to obtain hand shape feature information of the user;
a generating device for generating, according to the hand shape feature information, a personalized input panel suitable for the user.
Compared with the prior art, the present invention has the following advantages: 1) by comparing the user's hand shape information with standard hand shape information, the user equipment obtains an input panel that meets the user's personalized requirements, which improves the accuracy of the user's touch operations on the input panel and reduces the likelihood of input operations triggered by accidental touches; 2) by presenting position indication information on the screen, the user equipment encourages the user to place the palm within the indicated position when shooting the hand shape image, so that the distance between the palm and the lens can be better controlled, which reduces the influence that the position of the hand during shooting has on the user equipment's analysis of the acquired hand shape image; 3) the user equipment can adjust the input panel in combination with the user's hand shape feature information and contact information to generate a personalized input panel that meets the user's requirements, so that the generated personalized input panel better matches the user's usage habits.
Brief Description of the Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading the following detailed description of non-limiting embodiments made with reference to the accompanying drawings:
FIG. 1 is a flow chart of a method for generating a personalized input panel in a user equipment according to a preferred embodiment of the present invention;
FIG. 2 is a flow chart of a method for generating a personalized input panel in a user equipment according to another preferred embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an input panel generating apparatus for generating a personalized input panel in a user equipment according to a preferred embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an input panel generating apparatus for generating a personalized input panel in a user equipment according to another preferred embodiment of the present invention.
The same or similar reference numerals in the drawings denote the same or similar components.
Detailed Description
The present invention is described in further detail below with reference to the accompanying drawings.
FIG. 1 is a flow chart of a method for generating a personalized input panel in a user equipment according to a preferred embodiment of the present invention.
The method of this embodiment is mainly implemented by a user equipment. Preferably, the user equipment has a touch input device; for example, its display is a touch screen and it has a virtual keyboard. Preferably, the user equipment includes, but is not limited to, a PC, a tablet computer, a smartphone, a PDA, an IPTV and the like; preferably, the user equipment is a mobile device.
It should be noted that the user equipment is mentioned only by way of example; other existing or future user equipment, if applicable to the present invention, shall also fall within the scope of the present invention and is hereby incorporated by reference.
The method according to this embodiment includes step S1, step S2 and step S3.
In step S1, the user equipment acquires hand shape information of a user.
The hand shape information includes any information capable of reflecting the shape of a hand. Preferably, the hand shape information includes, but is not limited to: the overall aspect ratio of the hand; the proportion of the fingers in the total length of the hand; the thickness of the fingers; the width of the palm; the length ratios between the fingers; and so on.
Implementations by which the user equipment acquires the user's hand shape information include, but are not limited to:
1) The user equipment directly acquires the user's hand shape information stored in the user equipment.
For example, the user may store his or her hand shape information in the user equipment in advance; the user equipment can then directly acquire that stored hand shape information when the user uses the user equipment to generate a personalized input panel.
It should be noted that the above example is merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of directly acquiring the user's hand shape information stored in the user equipment shall fall within the scope of the present invention.
2) The user equipment acquires the user's hand shape information stored in a network device.
For example, the user may store his or her hand shape information in a network device in advance; when the user uses the user equipment to generate a personalized input panel, the user equipment can acquire that user's hand shape information by accessing the network device.
It should be noted that the above example is merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of acquiring the user's hand shape information from a network device shall fall within the scope of the present invention.
3) The user equipment acquires the user's hand shape information from an image captured by the user. In this implementation, step S1 further includes step S11 and step S12.
In step S11, the user equipment acquires a hand shape image captured by the user.
Implementations by which the user equipment acquires the hand shape image captured by the user include, but are not limited to:
a) The user equipment extracts a hand shape image captured by the user from images already stored on the user equipment.
For example, according to the user's selection, the user equipment extracts a hand shape image from its album as the hand shape image captured by the user.
b) Step S11 further includes step S11-1 and step S11-2.
In step S11-1, the user equipment invokes its camera and presents position indication information on the screen. The position indication information indicates a suitable position for the hand on the screen. For example, the position indication information may appear as the shape of a hand or as a box; preferably, it appears as the shape of a standard hand.
Preferably, the camera invoked by the user equipment is the front camera of the user equipment.
The user can then place a hand in front of the camera, align the hand displayed on the screen with the suitable position indicated by the position indication information, and perform the shooting operation.
Then, in step S11-2, the user equipment obtains the hand shape image according to the user's shooting operation.
For example, in step S11-1, the user equipment invokes its front camera and presents on the screen a box indicating the suitable position of the hand; in step S11-2, the user equipment obtains the hand shape image within the box according to the user's operation of shooting, through the front camera, the image of the hand placed in the box.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of acquiring the hand shape image captured by the user shall fall within the scope of the present invention.
In step S12, the user equipment extracts the hand shape information from the hand shape image acquired in step S11.
The user equipment can extract the hand shape information from the hand shape image in various ways, for example, by analyzing the image and extracting the hand contour to obtain the hand shape information.
It should be noted that the above example is merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of extracting the hand shape information from the hand shape image shall fall within the scope of the present invention.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of acquiring the user's hand shape information shall fall within the scope of the present invention.
In step S2, the user equipment compares the user's hand shape information with predetermined hand shape information to obtain the user's hand shape feature information. The predetermined hand shape information may be the hand shape information of a standard hand shape.
The hand shape feature information may include feature information corresponding to the hand shape information. Preferably, the hand shape feature information includes, but is not limited to: the overall aspect ratio of the hand; the proportion of the fingers in the total length of the hand; the thickness of the fingers; the width of the palm; the length ratios between the fingers; and so on.
For example, the user equipment compares the user's hand shape information with the predetermined hand shape information and finds that the proportion of the user's fingers in the total length of the hand is lower than the corresponding proportion in the standard hand shape; the user equipment then obtains, as the user's hand shape feature information, that the proportion of the fingers in the total length of the hand is low.
It should be noted that the above example is merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of comparing the user's hand shape information with the predetermined hand shape information to obtain the user's hand shape feature information shall fall within the scope of the present invention.
In step S3, the user equipment generates a personalized input panel suitable for the user according to the hand shape feature information.
Implementations by which the user equipment generates the personalized input panel suitable for the user according to the hand shape feature information include, but are not limited to:
1) The user equipment directly calculates the position of each button in the personalized input panel according to the user's hand shape feature information, so as to generate a personalized input panel suitable for the user.
For example, if the user's hand shape feature information indicates a large overall aspect ratio of the hand, the user equipment first calculates the positions of the hotspot buttons in the input panel according to that aspect ratio, and then, based on the determined positions of the hotspot buttons, calculates the positions of the other, non-hotspot buttons, thereby generating a personalized input panel suitable for the user. A hotspot button is a button that is more likely to be tapped by the user, such as a character button tapped at a high frequency.
Preferably, the user equipment can also calculate, according to the user's hand shape feature information, the location of the areas the user is likely to touch by mistake, and reduce the touch sensitivity of the buttons located in those areas. For example, if the acquired hand shape feature information indicates that the user's little finger is short, the user equipment can calculate the area that the little finger tends to brush against, treat it as an area the user easily touches by mistake, and reduce the touch sensitivity of the buttons in that area so as to prevent erroneous input caused by accidental touches. The touch sensitivity indicates how sensitive the user equipment is to the user's touch operations: in general, the higher the touch sensitivity, the more easily a touch operation is detected; the lower the touch sensitivity, the less easily a touch operation is detected.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of directly calculating the position of each button in the personalized input panel according to the user's hand shape feature information to generate a personalized input panel suitable for the user shall fall within the scope of the present invention.
2) The user equipment adjusts a predetermined input panel according to the hand shape feature information to obtain the personalized input panel. Preferably, the predetermined input panel is adapted to the predetermined hand shape information.
The operation of adjusting the predetermined input panel may include any operation that adjusts the layout of the predetermined input panel. Preferably, the operation of adjusting the predetermined input panel includes, but is not limited to: a) adjusting the positions of the hotspot buttons of the predetermined input panel, for example, shifting the hotspot buttons to the left or to the right; b) adjusting the touch sensitivity of part of the predetermined input panel, for example, lowering the touch sensitivity so that a heavier touch is required before the user equipment determines that a button has been pressed; preferably, the area whose touch sensitivity is adjusted is an area of the predetermined input panel, determined on the basis of the hand shape feature information, that is more easily touched by the user by mistake.
The user equipment can adjust the predetermined input panel according to the hand shape feature information in various ways to obtain the personalized input panel.
For example, if the user equipment determines, from the length ratios between the fingers in the hand shape feature information, that the user's little finger and index finger are short, the user equipment moves the buttons that the little finger and index finger are expected to reach, so that the positions of these buttons are closer to the positions of the buttons reached by the middle finger and ring finger.
It should be noted that, in the process of generating the personalized input panel, the user equipment may further determine the position of each button in the personalized input panel in combination with the size of the area in which the personalized input panel can be presented. For example, if the user equipment is a mobile device and the area on the mobile device in which the personalized input panel can be presented measures 12×6 cm, every button of the personalized input panel determined by the user equipment should be located within that area.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of adjusting a predetermined input panel according to the hand shape feature information to obtain the personalized input panel shall fall within the scope of the present invention.
According to the solution of this embodiment, the user equipment obtains an input panel that meets the user's personalized requirements by comparing the user's hand shape information with standard hand shape information, which improves the accuracy of the user's touch operations on the input panel and reduces the likelihood of input operations triggered by accidental touches; moreover, by presenting position indication information on the screen, the user equipment encourages the user to place the palm within the indicated position when shooting the hand shape image, so that the distance between the palm and the lens can be better controlled, which in turn reduces the influence that the position of the hand during shooting has on the user equipment's analysis of the acquired hand shape image.
FIG. 2 is a flow chart of a method for generating a personalized input panel in a user equipment according to another preferred embodiment of the present invention. The method of this embodiment is mainly implemented by a user equipment; any description of the user equipment made in the embodiment shown in FIG. 1 is incorporated herein by reference. The method of this embodiment includes step S1, step S2, step S3 and step S4, wherein step S3 further includes step S31. Steps S1 and S2 have been described in detail in the embodiment shown in FIG. 1 and are not described again here.
Steps S4 and S31 of this embodiment are described in detail below.
In step S4, the user equipment acquires the user's contact information on the user equipment.
The contact information includes any information about contact points on the user equipment. Preferably, the contact information includes at least one of the following:
1) Position information of contact points on the user equipment.
Preferably, the hotspot buttons of a previously presented input panel can be determined from the position information of contact points on the user equipment combined with information about that input panel; for example, a button corresponding to a frequently touched position is taken as a hotspot button.
2) Shape information of contact points.
Preferably, the user equipment acquires the shape information of the contact made by the user's thumb on the user equipment; since a thumb's contact on the screen is generally not a perfect circle, or even close to one, the user equipment can use the contact shape to determine whether a contact was made by the thumb, and can determine from the shape information of the thumb's contact whether the user is using the left hand or the right hand.
Implementations by which the user equipment acquires the user's contact information on the user equipment include, but are not limited to:
a) The user equipment acquires the user's historical contact information on the user equipment.
For example, the user equipment acquires contact information it previously recorded for the user on the predetermined input panel.
b) The user equipment prompts the user to pre-enter one or more letters on the predetermined input panel and uses the contact information from that pre-input operation as the user's contact information on the user equipment.
For example, the user equipment prompts the user to type freely on the predetermined input panel and uses the contact information entered by the user within a predetermined period of time as the user's contact information on the user equipment.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of acquiring the user's contact information on the user equipment shall fall within the scope of the present invention.
In step S31, the user equipment generates a personalized input panel suitable for the user according to the hand shape feature information and the contact information.
Implementations by which the user equipment generates the personalized input panel suitable for the user according to the hand shape feature information and the contact information include, but are not limited to:
1) The user equipment directly calculates the position of each button in the personalized input panel according to the user's hand shape feature information and contact information, so as to generate a personalized input panel suitable for the user.
For example, suppose the user's hand shape feature information indicates a large overall aspect ratio of the hand, and the contact information includes the shape information of a plurality of contacts. The user equipment identifies, among the shapes of those contacts, a shape that is not a perfect circle (or close to one) as the shape of the contact pressed by the thumb, and determines from that shape that the user is using the right hand. The user equipment then calculates the initial positions of the hotspot buttons in the input panel according to the overall aspect ratio of the user's hand, shifts each hotspot button to the right from its initial position, and finally, based on the shifted hotspot button positions, calculates the positions of the other, non-hotspot buttons, thereby generating a personalized input panel suitable for the user.
It should be noted that the user equipment can also calculate, according to the user's hand shape feature information, the areas the user is likely to touch by mistake, and reduce the touch sensitivity of the buttons within those areas.
2) The user equipment adjusts the predetermined input panel according to the hand shape feature information and the contact information to obtain the personalized input panel. The operations that can be performed on the predetermined input panel have been described in detail in step S3 with reference to FIG. 1 and are not repeated here.
The user equipment can adjust the predetermined input panel according to the hand shape feature information and the contact information in various ways to obtain the personalized input panel.
For example, if the hand shape feature information indicates that the proportion of the fingers in the total length of the hand is low, and the user equipment determines from the contact information that the user is using the right hand, the user equipment shifts the hotspot buttons of the predetermined input panel to the right. For a user whose short fingers limit how far the fingers can travel, shifting the hotspot buttons to the right shortens the distance the fingers must move and makes the buttons easier to reach.
As another example, if the hand shape feature information indicates a wide palm and the user equipment determines from the hand shape feature information that the user is using the left hand, the user equipment reduces the touch sensitivity of the leftmost column of the predetermined input panel, thereby reducing the likelihood that the user accidentally touches the edge area of the input panel.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of generating a personalized input panel suitable for the user according to the hand shape feature information and the contact information shall fall within the scope of the present invention.
According to the solution of this embodiment, the user equipment can adjust the input panel in combination with the user's hand shape feature information and contact information to generate a personalized input panel that meets the user's requirements, so that the generated personalized input panel better matches the user's usage habits.
FIG. 3 is a schematic structural diagram of an input panel generating apparatus for generating a personalized input panel in a user equipment according to a preferred embodiment of the present invention. The input panel generating apparatus includes a first acquiring device 1, a comparing device 2 and a generating device 3. Preferably, the input panel generating apparatus is included in a user equipment.
The first acquiring device 1 acquires hand shape information of a user.
The hand shape information includes any information capable of reflecting the shape of a hand. Preferably, the hand shape information includes, but is not limited to: the overall aspect ratio of the hand; the proportion of the fingers in the total length of the hand; the thickness of the fingers; the width of the palm; the length ratios between the fingers; and so on.
Implementations by which the first acquiring device 1 acquires the user's hand shape information include, but are not limited to:
1) The first acquiring device 1 directly acquires the user's hand shape information stored in the user equipment.
For example, the user may store his or her hand shape information in the user equipment in advance; the first acquiring device 1 can then directly acquire that stored hand shape information when the user uses the user equipment to generate a personalized input panel.
It should be noted that the above example is merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of directly acquiring the user's hand shape information stored in the user equipment shall fall within the scope of the present invention.
2) The first acquiring device 1 acquires the user's hand shape information stored in a network device.
For example, the user may store his or her hand shape information in a network device in advance; when the user uses the user equipment to generate a personalized input panel, the first acquiring device 1 can acquire that user's hand shape information by accessing the network device.
It should be noted that the above example is merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of acquiring the user's hand shape information from a network device shall fall within the scope of the present invention.
3) The first acquiring device 1 acquires the user's hand shape information from an image captured by the user. In this implementation, the first acquiring device 1 further includes a sub-acquiring device (not shown) and an extracting device (not shown).
The sub-acquiring device acquires a hand shape image captured by the user.
Implementations by which the sub-acquiring device acquires the hand shape image captured by the user include, but are not limited to:
a) The sub-acquiring device extracts a hand shape image captured by the user from images already stored on the user equipment.
For example, according to the user's selection, the sub-acquiring device extracts a hand shape image from the album of the user equipment as the hand shape image captured by the user.
b) The sub-acquiring device further includes a presenting device (not shown) and an image acquiring device (not shown).
The presenting device invokes the camera of the user equipment and presents position indication information on the screen. The position indication information indicates a suitable position for the hand on the screen. For example, the position indication information may appear as the shape of a hand or as a box; preferably, it appears as the shape of a standard hand.
Preferably, the camera of the user equipment invoked by the presenting device is the front camera of the user equipment.
The user can then place a hand in front of the camera, align the hand displayed on the screen with the suitable position indicated by the position indication information, and perform the shooting operation.
The image acquiring device then obtains the hand shape image according to the user's shooting operation.
For example, the presenting device invokes the front camera and presents on the screen a box indicating the suitable position of the hand; the image acquiring device obtains the hand shape image within the box according to the user's operation of shooting, through the front camera, the image of the hand placed in the box.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of acquiring the hand shape image captured by the user shall fall within the scope of the present invention.
The extracting device extracts the hand shape information from the hand shape image acquired by the sub-acquiring device.
The extracting device can extract the hand shape information from the hand shape image in various ways, for example, by analyzing the image and extracting the hand contour to obtain the hand shape information.
It should be noted that the above example is merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of extracting the hand shape information from the hand shape image shall fall within the scope of the present invention.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of acquiring the user's hand shape information shall fall within the scope of the present invention.
The comparing device 2 compares the user's hand shape information with predetermined hand shape information to obtain the user's hand shape feature information. The predetermined hand shape information may be the hand shape information of a standard hand shape.
The hand shape feature information may include feature information corresponding to the hand shape information. Preferably, the hand shape feature information includes, but is not limited to: the overall aspect ratio of the hand; the proportion of the fingers in the total length of the hand; the thickness of the fingers; the width of the palm; the length ratios between the fingers; and so on.
For example, the comparing device 2 compares the user's hand shape information with the predetermined hand shape information and finds that the proportion of the user's fingers in the total length of the hand is lower than the corresponding proportion in the standard hand shape; the comparing device 2 then obtains, as the user's hand shape feature information, that the proportion of the fingers in the total length of the hand is low.
It should be noted that the above example is merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of comparing the user's hand shape information with the predetermined hand shape information to obtain the user's hand shape feature information shall fall within the scope of the present invention.
The generating device 3 generates a personalized input panel suitable for the user according to the hand shape feature information.
Implementations by which the generating device 3 generates the personalized input panel suitable for the user according to the hand shape feature information include, but are not limited to:
1) The generating device 3 directly calculates the position of each button in the personalized input panel according to the user's hand shape feature information, so as to generate a personalized input panel suitable for the user.
For example, if the user's hand shape feature information indicates a large overall aspect ratio of the hand, the generating device 3 first calculates the positions of the hotspot buttons in the input panel according to that aspect ratio, and then, based on the determined positions of the hotspot buttons, calculates the positions of the other, non-hotspot buttons, thereby generating a personalized input panel suitable for the user. A hotspot button is a button that is more likely to be tapped by the user, such as a character button tapped at a high frequency.
Preferably, the generating device 3 can also calculate, according to the user's hand shape feature information, the location of the areas the user is likely to touch by mistake, and reduce the touch sensitivity of the buttons located in those areas. For example, if the hand shape feature information acquired by the comparing device 2 indicates that the user's little finger is short, the generating device 3 can calculate the area that the little finger tends to brush against, treat it as an area the user easily touches by mistake, and reduce the touch sensitivity of the buttons in that area so as to prevent erroneous input caused by accidental touches. The touch sensitivity indicates how sensitive the user equipment is to the user's touch operations: in general, the higher the touch sensitivity, the more easily a touch operation is detected; the lower the touch sensitivity, the less easily a touch operation is detected.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of directly calculating the position of each button in the personalized input panel according to the user's hand shape feature information to generate a personalized input panel suitable for the user shall fall within the scope of the present invention.
2) The generating device 3 includes a first adjusting device (not shown). The first adjusting device adjusts a predetermined input panel according to the hand shape feature information to obtain the personalized input panel. Preferably, the predetermined input panel is adapted to the predetermined hand shape information.
The operation of adjusting the predetermined input panel may include any operation that adjusts the layout of the predetermined input panel. Preferably, the operation of adjusting the predetermined input panel includes, but is not limited to: a) adjusting the positions of the hotspot buttons of the predetermined input panel, for example, shifting the hotspot buttons to the left or to the right; b) adjusting the touch sensitivity of part of the predetermined input panel, for example, lowering the touch sensitivity so that a heavier touch is required before the user equipment determines that a button has been pressed; preferably, the area whose touch sensitivity is adjusted is an area of the predetermined input panel, determined on the basis of the hand shape feature information, that is more easily touched by the user by mistake.
The first adjusting device can adjust the predetermined input panel according to the hand shape feature information in various ways to obtain the personalized input panel.
For example, if the first adjusting device determines, from the length ratios between the fingers in the hand shape feature information, that the user's little finger and index finger are short, the first adjusting device moves the buttons that the little finger and index finger are expected to reach, so that the positions of these buttons are closer to the positions of the buttons reached by the middle finger and ring finger.
It should be noted that, in the process of generating the personalized input panel, the generating device 3 may further determine the position of each button in the personalized input panel in combination with the size of the area in which the personalized input panel can be presented. For example, if the user equipment is a mobile device and the area on the mobile device in which the personalized input panel can be presented measures 12×6 cm, every button of the personalized input panel determined by the generating device 3 should be located within that area.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of adjusting a predetermined input panel according to the hand shape feature information to obtain the personalized input panel shall fall within the scope of the present invention.
According to the solution of this embodiment, the user equipment obtains an input panel that meets the user's personalized requirements by comparing the user's hand shape information with standard hand shape information, which improves the accuracy of the user's touch operations on the input panel and reduces the likelihood of input operations triggered by accidental touches; moreover, by presenting position indication information on the screen, the user equipment encourages the user to place the palm within the indicated position when shooting the hand shape image, so that the distance between the palm and the lens can be better controlled, which in turn reduces the influence that the position of the hand during shooting has on the user equipment's analysis of the acquired hand shape image.
FIG. 4 is a schematic structural diagram of an input panel generating apparatus for generating a personalized input panel in a user equipment according to another preferred embodiment of the present invention. The input panel generating apparatus of this embodiment includes a first acquiring device 1, a comparing device 2, a generating device 3 and a second acquiring device 4; the generating device 3 further includes a sub-generating device 31. The first acquiring device 1 and the comparing device 2 have been described in detail in the embodiment shown in FIG. 3 and are not described again here.
The second acquiring device 4 and the sub-generating device 31 of this embodiment are described in detail below.
The second acquiring device 4 acquires the user's contact information on the user equipment.
The contact information includes any information about contact points on the user equipment. Preferably, the contact information includes at least one of the following:
1) Position information of contact points on the user equipment.
Preferably, the hotspot buttons of a previously presented input panel can be determined from the position information of contact points on the user equipment combined with information about that input panel; for example, a button corresponding to a frequently touched position is taken as a hotspot button.
2) Shape information of contact points.
Preferably, the second acquiring device 4 acquires the shape information of the contact made by the user's thumb on the user equipment; since a thumb's contact on the screen is generally not a perfect circle, or even close to one, the second acquiring device 4 can use the contact shape to determine whether a contact was made by the thumb, and can determine from the shape information of the thumb's contact whether the user is using the left hand or the right hand.
Implementations by which the second acquiring device 4 acquires the user's contact information on the user equipment include, but are not limited to:
a) The second acquiring device 4 acquires the user's historical contact information on the user equipment.
For example, the second acquiring device 4 acquires contact information it previously recorded for the user on the predetermined input panel.
b) The second acquiring device 4 prompts the user to pre-enter one or more letters on the predetermined input panel and uses the contact information from that pre-input operation as the user's contact information on the user equipment.
For example, the second acquiring device 4 prompts the user to type freely on the predetermined input panel and uses the contact information entered by the user within a predetermined period of time as the user's contact information on the user equipment.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of acquiring the user's contact information on the user equipment shall fall within the scope of the present invention.
The sub-generating device 31 generates a personalized input panel suitable for the user according to the hand shape feature information and the contact information.
Implementations by which the sub-generating device 31 generates the personalized input panel suitable for the user according to the hand shape feature information and the contact information include, but are not limited to:
1) The sub-generating device 31 directly calculates the position of each button in the personalized input panel according to the user's hand shape feature information and contact information, so as to generate a personalized input panel suitable for the user.
For example, suppose the user's hand shape feature information indicates a large overall aspect ratio of the hand, and the contact information includes the shape information of a plurality of contacts. The sub-generating device 31 identifies, among the shapes of those contacts, a shape that is not a perfect circle (or close to one) as the shape of the contact pressed by the thumb, and determines from that shape that the user is using the right hand. The sub-generating device 31 then calculates the initial positions of the hotspot buttons in the input panel according to the overall aspect ratio of the user's hand, shifts each hotspot button to the right from its initial position, and finally, based on the shifted hotspot button positions, calculates the positions of the other, non-hotspot buttons, thereby generating a personalized input panel suitable for the user.
It should be noted that the sub-generating device 31 can also calculate, according to the user's hand shape feature information, the areas the user is likely to touch by mistake, and reduce the touch sensitivity of the buttons within those areas.
2) The sub-generating device 31 includes a second adjusting device (not shown). The second adjusting device adjusts the predetermined input panel according to the hand shape feature information and the contact information to obtain the personalized input panel. The operations that can be performed on the predetermined input panel have been described in detail in the description of the generating device 3 with reference to FIG. 3 and are not repeated here.
The second adjusting device can adjust the predetermined input panel according to the hand shape feature information and the contact information in various ways to obtain the personalized input panel.
For example, if the hand shape feature information indicates that the proportion of the fingers in the total length of the hand is low, and the second adjusting device determines from the contact information that the user is using the right hand, the second adjusting device shifts the hotspot buttons of the predetermined input panel to the right. For a user whose short fingers limit how far the fingers can travel, shifting the hotspot buttons to the right shortens the distance the fingers must move and makes the buttons easier to reach.
As another example, if the hand shape feature information indicates a wide palm and the second adjusting device determines from the hand shape feature information that the user is using the left hand, the second adjusting device reduces the touch sensitivity of the leftmost column of the predetermined input panel, thereby reducing the likelihood that the user accidentally touches the edge area of the input panel.
It should be noted that the above examples are merely intended to better illustrate the technical solution of the present invention rather than to limit it; those skilled in the art should understand that any implementation of generating a personalized input panel suitable for the user according to the hand shape feature information and the contact information shall fall within the scope of the present invention.
According to the solution of this embodiment, the input panel generating apparatus can adjust the input panel in combination with the user's hand shape feature information and contact information to generate a personalized input panel that meets the user's requirements, so that the generated personalized input panel better matches the user's usage habits.
It should be noted that the present invention can be implemented in software and/or in a combination of software and hardware; for example, the various devices of the present invention can be implemented using an application-specific integrated circuit (ASIC) or any other similar hardware device. In one embodiment, the software program of the present invention may be executed by a processor to implement the steps or functions described above. Likewise, the software program of the present invention (including related data structures) may be stored in a computer-readable recording medium such as a RAM, a magnetic or optical drive, or a floppy disk and similar devices. In addition, some of the steps or functions of the present invention may be implemented in hardware, for example, as a circuit that cooperates with a processor to perform the various steps or functions.
It is obvious to those skilled in the art that the present invention is not limited to the details of the above exemplary embodiments, and that the present invention can be implemented in other specific forms without departing from its spirit or essential characteristics. The embodiments should therefore be regarded in every respect as exemplary and non-limiting, and the scope of the present invention is defined by the appended claims rather than by the above description; it is therefore intended that all changes falling within the meaning and range of equivalency of the claims be embraced within the present invention. No reference numeral in the claims shall be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprise" does not exclude other units or steps, and the singular does not exclude the plural. A plurality of units or devices recited in a system claim may also be implemented by a single unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not denote any particular order.

Claims (19)

  1. A method for generating a personalized input panel in a user equipment, wherein the method comprises the following steps:
    a. acquiring hand shape information of a user;
    b. comparing the user's hand shape information with predetermined hand shape information to obtain hand shape feature information of the user;
    c. generating, according to the hand shape feature information, a personalized input panel suitable for the user.
  2. The method according to claim 1, wherein step c comprises the following step:
    - adjusting a predetermined input panel according to the hand shape feature information to obtain the personalized input panel.
  3. The method according to claim 1, wherein the method further comprises the following step:
    - acquiring contact information of the user on the user equipment;
    wherein step c comprises the following step:
    c'. generating, according to the hand shape feature information and the contact information, a personalized input panel suitable for the user.
  4. The method according to claim 3, wherein step c' comprises the following step:
    - adjusting a predetermined input panel according to the hand shape feature information and the contact information to obtain the personalized input panel.
  5. The method according to claim 3 or 4, wherein the contact information comprises at least one of the following:
    - position information of contact points on the user equipment;
    - shape information of contact points.
  6. The method according to claim 2 or 4, wherein the operation of adjusting the predetermined input panel comprises:
    - adjusting the positions of hotspot buttons of the predetermined input panel;
    - adjusting the touch sensitivity of part of the predetermined input panel.
  7. The method according to any one of claims 1 to 6, wherein step a comprises the following steps:
    a1. acquiring a hand shape image captured by the user;
    a2. extracting the hand shape information from the hand shape image.
  8. The method according to claim 7, wherein step a1 comprises the following steps:
    - invoking a camera of the user equipment and presenting position indication information on a screen, the position indication information indicating a suitable position for the hand on the screen;
    - obtaining the hand shape image according to a shooting operation of the user.
  9. An input panel generating apparatus for generating a personalized input panel in a user equipment, wherein the input panel generating apparatus comprises:
    a first acquiring device for acquiring hand shape information of a user;
    a comparing device for comparing the user's hand shape information with predetermined hand shape information to obtain hand shape feature information of the user;
    a generating device for generating, according to the hand shape feature information, a personalized input panel suitable for the user.
  10. The input panel generating apparatus according to claim 9, wherein the generating device comprises:
    a first adjusting device for adjusting a predetermined input panel according to the hand shape feature information to obtain the personalized input panel.
  11. The input panel generating apparatus according to claim 9, wherein the input panel generating apparatus further comprises:
    a second acquiring device for acquiring contact information of the user on the user equipment;
    wherein the generating device comprises:
    a sub-generating device for generating, according to the hand shape feature information and the contact information, a personalized input panel suitable for the user.
  12. The input panel generating apparatus according to claim 11, wherein the sub-generating device comprises:
    a second adjusting device for adjusting a predetermined input panel according to the hand shape feature information and the contact information to obtain the personalized input panel.
  13. The input panel generating apparatus according to claim 11 or 12, wherein the contact information comprises at least one of the following:
    - position information of contact points on the user equipment;
    - shape information of contact points.
  14. The input panel generating apparatus according to claim 10 or 12, wherein the operation of adjusting the predetermined input panel comprises:
    - adjusting the positions of hotspot buttons of the predetermined input panel;
    - adjusting the touch sensitivity of part of the predetermined input panel.
  15. The input panel generating apparatus according to any one of claims 9 to 14, wherein the first acquiring device comprises:
    a sub-acquiring device for acquiring a hand shape image captured by the user;
    an extracting device for extracting the hand shape information from the hand shape image.
  16. The input panel generating apparatus according to claim 15, wherein the sub-acquiring device comprises:
    a presenting device for invoking a camera of the user equipment and presenting position indication information on a screen, the position indication information indicating a suitable position for the hand on the screen;
    an image acquiring device for obtaining the hand shape image according to a shooting operation of the user.
  17. A computer-readable storage medium comprising computer instructions which, when executed, perform the method according to any one of claims 1 to 8.
  18. A computer program product which, when run, performs the method according to any one of claims 1 to 8.
  19. A computer device comprising a memory and a processor, the memory storing computer code, the processor being configured to execute the computer code so as to perform the method according to any one of claims 1 to 8.
PCT/CN2014/086846 2013-12-17 2014-09-18 一种用于生成个性化输入面板的方法和装置 WO2015090092A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2016559481A JP6397508B2 (ja) 2013-12-17 2014-09-18 個人用入力パネルを生成する方法および装置
US14/412,379 US10379659B2 (en) 2013-12-17 2014-09-18 Method and apparatus for generating a personalized input panel
EP14814671.5A EP3086210A4 (en) 2013-12-17 2014-09-18 Method and device for generating individualized input panel

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310692810.3 2013-12-17
CN201310692810.3A CN103699882A (zh) 2013-12-17 2013-12-17 一种用于生成个性化输入面板的方法和装置

Publications (1)

Publication Number Publication Date
WO2015090092A1 true WO2015090092A1 (zh) 2015-06-25

Family

ID=50361405

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/086846 WO2015090092A1 (zh) 2013-12-17 2014-09-18 一种用于生成个性化输入面板的方法和装置

Country Status (5)

Country Link
US (1) US10379659B2 (zh)
EP (1) EP3086210A4 (zh)
JP (1) JP6397508B2 (zh)
CN (1) CN103699882A (zh)
WO (1) WO2015090092A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103699882A (zh) * 2013-12-17 2014-04-02 百度在线网络技术(北京)有限公司 一种用于生成个性化输入面板的方法和装置
US9626020B2 (en) * 2014-09-12 2017-04-18 Microsoft Corporation Handedness detection from touch input
CN104391646B (zh) 2014-11-19 2017-12-26 百度在线网络技术(北京)有限公司 调整对象属性信息的方法及装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090237361A1 (en) * 2008-03-18 2009-09-24 Microsoft Corporation Virtual keyboard based activation and dismissal
CN102467330A (zh) * 2010-11-16 2012-05-23 吉易高科股份有限公司 一种虚拟键盘装置及其操作方法
CN102736829A (zh) * 2011-04-03 2012-10-17 苏州达方电子有限公司 具有虚拟键盘的触控装置及其形成虚拟键盘的方法
CN103329070A (zh) * 2010-11-24 2013-09-25 日本电气株式会社 输入装置以及输入装置的控制方法
CN103699882A (zh) * 2013-12-17 2014-04-02 百度在线网络技术(北京)有限公司 一种用于生成个性化输入面板的方法和装置

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0201074D0 (en) 2002-01-18 2002-03-06 3G Lab Ltd Graphic user interface for data processing device
JP4071550B2 (ja) * 2002-06-05 2008-04-02 一好 小谷 仮想キー片手入力装置における仮想キー配列方法
JP4786483B2 (ja) * 2006-09-14 2011-10-05 富士通株式会社 生体認証装置の生体誘導制御方法及び生体認証装置
WO2009147901A1 (ja) * 2008-06-02 2009-12-10 シャープ株式会社 入力装置、入力方法、プログラムおよび記録媒体
JP5172485B2 (ja) * 2008-06-10 2013-03-27 シャープ株式会社 入力装置及び入力装置の制御方法
US8100425B2 (en) * 2008-08-01 2012-01-24 Maurice Raynor Bicycle having independent rear wheel steering
CN101727535A (zh) * 2008-10-30 2010-06-09 北大方正集团有限公司 一种跨系统患者交叉索引方法及其系统
US8502787B2 (en) 2008-11-26 2013-08-06 Panasonic Corporation System and method for differentiating between intended and unintended user input on a touchpad
US8368658B2 (en) * 2008-12-02 2013-02-05 At&T Mobility Ii Llc Automatic soft key adaptation with left-right hand edge sensing
US8300023B2 (en) * 2009-04-10 2012-10-30 Qualcomm Incorporated Virtual keypad generator with learning capabilities
US8493346B2 (en) * 2009-12-31 2013-07-23 International Business Machines Corporation Morphing touchscreen keyboard interface
JP5646896B2 (ja) 2010-07-21 2014-12-24 Kddi株式会社 Mobile terminal and key display method
US8718334B2 (en) 2011-05-05 2014-05-06 Honeywell International Inc. System for biometric hand analysis
US9448724B2 (en) * 2011-07-11 2016-09-20 International Business Machines Corporation Dynamically customizable touch screen keyboard for adapting to user physiology
US8750852B2 (en) * 2011-10-27 2014-06-10 Qualcomm Incorporated Controlling access to a mobile device
JP6006487B2 (ja) 2011-12-19 2016-10-12 ミネベア株式会社 Input device
US20130300668A1 (en) 2012-01-17 2013-11-14 Microsoft Corporation Grip-Based Device Adaptations
US10168784B2 (en) * 2012-09-20 2019-01-01 Sony Corporation Information processing apparatus and method, and program
JP2014137627A (ja) * 2013-01-15 2014-07-28 Sony Corp Input device, output device, and storage medium
JP6219100B2 (ja) * 2013-08-29 2017-10-25 シャープ株式会社 Image display device capable of displaying a software keyboard, and control method thereof

Non-Patent Citations (1)

Title
See also references of EP3086210A4 *

Also Published As

Publication number Publication date
CN103699882A (zh) 2014-04-02
US10379659B2 (en) 2019-08-13
EP3086210A4 (en) 2017-08-23
EP3086210A1 (en) 2016-10-26
US20160266698A1 (en) 2016-09-15
JP6397508B2 (ja) 2018-09-26
JP2017504917A (ja) 2017-02-09

Similar Documents

Publication Publication Date Title
US9965039B2 (en) Device and method for displaying user interface of virtual input device based on motion recognition
US20150268789A1 (en) Method for preventing accidentally triggering edge swipe gesture and gesture triggering
US20140165013A1 (en) Electronic device and page zooming method thereof
US20130152002A1 (en) Data collection and analysis for adaptive user interfaces
CN105843440B (zh) Registration of electronic displays
BR112013006616B1 Apparatus and method for detecting an object based on proximity to the input surface, an associated information item, and the distance of the object
US11630576B2 (en) Electronic device and method for processing letter input in electronic device
EP3709147B1 (en) Method and apparatus for determining fingerprint collection region
TW201621612 Column interface for navigating in a user interface
US20150242118A1 (en) Method and device for inputting
TW201227460A (en) Method and apparatus for providing different user interface effects for different implementation characteristics of a touch event
EP3015997A1 (en) Method and device for facilitating selection of blocks of information
KR20140078629 User interface for editing a value in place
US20150286356A1 (en) Method, apparatus, and terminal device for controlling display of application interface
US20170371525A1 (en) Method and device for setting identity image
WO2017032078A1 (zh) Interface control method and mobile terminal
WO2017032020A1 (zh) Image processing method and electronic terminal
US10152496B2 (en) User interface device, search method, and program
WO2015090092A1 (zh) Method and apparatus for generating a personalized input panel
US20130229427A1 (en) Animated transition from an application window to another application window
US10394442B2 (en) Adjustment of user interface elements based on user accuracy and content consumption
US11144178B2 (en) Method for providing contents for mobile terminal on the basis of user touch and hold time
US9148537B1 (en) Facial cues as commands
US20180239440A1 (en) Information processing apparatus, information processing method, and program
WO2015067166A1 (zh) Touch input method and device

Legal Events

Date Code Title Description
ENP Entry into the national phase (ref document number: 2016559481; country: JP; kind code: A)
REEP Request for entry into the european phase (ref document number: 2014814671; country: EP)
WWE WIPO information: entry into national phase (ref document number: 2014814671; country: EP)
WWE WIPO information: entry into national phase (ref document number: 14412379; country: US)
121 EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number: 14814671; country: EP; kind code: A1)
NENP Non-entry into the national phase (ref country code: DE)