WO2018098668A1 - Electronic device and soft keyboard display method thereof - Google Patents

Electronic device and soft keyboard display method thereof - Download PDF

Info

Publication number
WO2018098668A1
WO2018098668A1 (PCT/CN2016/107939)
Authority
WO
WIPO (PCT)
Prior art keywords
user
image
keyboard
layout information
electronic device
Prior art date
Application number
PCT/CN2016/107939
Other languages
English (en)
French (fr)
Inventor
李金鑫
Original Assignee
深圳市柔宇科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市柔宇科技有限公司 filed Critical 深圳市柔宇科技有限公司
Priority to EP16922948.1A priority Critical patent/EP3550417A1/en
Priority to CN201680038938.XA priority patent/CN107995969A/zh
Priority to US16/330,185 priority patent/US20190227707A1/en
Priority to PCT/CN2016/107939 priority patent/WO2018098668A1/zh
Publication of WO2018098668A1 publication Critical patent/WO2018098668A1/zh

Classifications

    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 18/22 - Pattern recognition; analysing; matching criteria, e.g. proximity measures
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/04886 - Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06V 10/34 - Image preprocessing; smoothing or thinning of the pattern; morphological operations; skeletonisation
    • G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06V 40/1365 - Fingerprints or palmprints; matching; classification
    • G06V 40/166 - Human faces; detection, localisation or normalisation using acquisition arrangements
    • G06V 40/19 - Eye characteristics, e.g. of the iris; sensors therefor
    • G06V 40/197 - Eye characteristics, e.g. of the iris; matching; classification
    • G06V 40/70 - Multimodal biometrics, e.g. combining information from different biometric modalities
    • G06V 40/172 - Human faces; classification, e.g. identification
    • G06V 40/178 - Human faces; estimating age from a face image; using age information for improving recognition

Definitions

  • the present invention relates to an electronic device, and more particularly to an electronic device having a soft keyboard input function and a soft keyboard display method thereof.
  • the existing electronic devices with touch screens generally support handwriting input and provide a soft keyboard (virtual keyboard) for the user to click input.
  • the soft keyboard is the most used input method.
  • With an existing soft keyboard, the user usually has to adjust some of its settings through menu options after the keyboard has been called up, and the degree of adjustment is limited, for example switching a nine-key keypad to a full keyboard; the positions and sizes of the keys of the soft keyboard are usually fixed.
  • the embodiment of the invention discloses an electronic device and a soft keyboard display method, which can automatically identify the category of the user to display a soft keyboard having a layout corresponding to the user category, so as to facilitate the use of the user of the category.
  • the electronic device disclosed in the embodiment of the invention includes a display screen, a processor, and a biometrics acquiring unit.
  • the biometric acquiring unit is configured to acquire a biometric image of a current user of the electronic device.
  • the processor includes: an analysis module, configured to analyze the biometric image acquired by the biometric acquiring unit using image processing technology and determine the category of the current user according to the analysis result; a keyboard layout determining module, configured to determine corresponding keyboard layout information according to the category of the current user; and a display control module, configured to, after receiving a soft keyboard retrieval request, control the display screen to display a soft keyboard having the corresponding layout according to the corresponding keyboard layout information.
  • the method for displaying a soft keyboard disclosed in an embodiment of the present invention is applied to an electronic device and includes the steps of: acquiring a biometric image of the current user of the electronic device; analyzing the acquired biometric image using image processing technology and determining the category of the current user according to the analysis result; determining corresponding keyboard layout information according to the category of the current user; and, after receiving a soft keyboard retrieval request, controlling the display screen to display a soft keyboard having the corresponding layout according to the corresponding keyboard layout information.
  • the electronic device and the soft keyboard display method of the present invention can first determine the category of the current user, and then display a soft keyboard that conforms to the user category for use by the user, which is more user-friendly.
  • FIG. 1 is a block diagram showing the structure of an electronic device according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a soft keyboard displayed on a display screen of an electronic device according to an embodiment of the invention.
  • FIG. 3 is a schematic diagram of an input box displayed on a display screen of an electronic device according to an embodiment of the invention.
  • FIG. 4 is a schematic diagram showing the steps of an image processing technique in an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a correspondence table between user categories and keyboard layout information according to an embodiment of the present invention.
  • FIG. 6 is a flowchart of a method for displaying a soft keyboard according to an embodiment of the present invention.
  • Figure 7 is a sub-flow diagram of an embodiment of step S605 of Figure 6.
  • FIG. 1 is a structural block diagram of an electronic device 100 according to an embodiment of the invention.
  • the electronic device 100 includes a display screen 10, a biometric acquisition unit 20, a processor 30, and a memory 40.
  • the biometric acquiring unit 20 is configured to acquire a biometric image of the current user of the electronic device 100.
  • the current user of the electronic device 100 refers to a user who currently operates the electronic device 100.
  • the processor 30 includes an analysis module 31, a keyboard layout determination module 32, and a display control module 33.
  • the analysis module 31 is configured to analyze the biometric image acquired by the biometrics acquiring unit 20 by using an image processing technology, and determine a category of the current user according to the image analysis result.
  • the keyboard layout determining module 32 is configured to determine corresponding keyboard layout information according to a category of the current user. Specifically, the memory 40 stores a correspondence table between the user category and the keyboard layout information. The keyboard layout determining module 32 determines keyboard layout information corresponding to the user category according to the correspondence table between the user category and the keyboard layout information.
  • the display control module 33 is configured to, after receiving a soft keyboard retrieval request, control the display screen 10 to display a soft keyboard (virtual keyboard) R1 having the corresponding layout according to the keyboard layout information determined by the keyboard layout determining module 32.
  • the layout of the soft keyboard R1 is defined in the keyboard layout information and includes the size of the soft keyboard R1, the full-keyboard display mode, the nine-key display mode, the style of the keyboard, the arrangement positions of the keys A1 of the soft keyboard R1, and so on.
  • the size of the soft keyboard R1 includes the overall size of the keyboard and the size of the keys A1 it contains.
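  • As an illustration only, the keyboard layout information described above (overall size, key size, full-keyboard or nine-key mode, skin style, key positions) could be held in a small data structure such as the following Python sketch; the field names and example values are assumptions of this sketch, not terms defined by the patent.

      from dataclasses import dataclass, field
      from typing import Dict, Tuple

      @dataclass
      class KeyboardLayoutInfo:
          # Overall keyboard size and per-key size, e.g. in pixels (illustrative values).
          keyboard_size: Tuple[int, int] = (1080, 400)
          key_size: Tuple[int, int] = (96, 96)
          # "full" for a full QWERTY keyboard, "nine_key" for a T9-style keypad.
          mode: str = "full"
          # Skin/interface style of the keyboard, e.g. "minimalist", "cartoon", "romantic".
          style: str = "default"
          # Optional explicit key positions: key label -> (x, y) of the key's top-left corner.
          key_positions: Dict[str, Tuple[int, int]] = field(default_factory=dict)

      # Example: a medium-size full keyboard, like the entry recorded for authenticated user A in table T1.
      layout_for_user_a = KeyboardLayoutInfo(keyboard_size=(1080, 360), mode="full")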
  • the category of the current user can be determined first, and then the soft keyboard R1 conforming to the user category is displayed for use by the user, which is more user-friendly.
  • the biometric image of the user may be a facial image of the user, an iris image of the eye, a fingerprint image, or the like.
  • the biometrics acquiring unit 20 may include a camera module 21 for capturing a facial image of the user, an iris image of the eye, and the like.
  • the biometric acquisition unit 20 may further include a fingerprint recognition module 22 for acquiring a fingerprint image of the user in response to a user's finger touch.
  • the camera module 21 can be a camera.
  • the fingerprint identification module 22 can be a fingerprint identification chip.
  • the biometric acquisition unit 20 acquires the biometric image of the user after the user unlocks the display screen 10 and saves it in the memory 40.
  • the analysis module 31 acquires the biometric image of the user from the memory 40, and analyzes the biometric image acquired by the biometrics acquiring unit 20 by using an image processing technology, and determines a category of the current user according to the analysis result.
  • the keyboard layout determining module 32 determines the keyboard layout information of the corresponding soft keyboard R1 according to the category of the current user.
  • the determined keyboard layout information of the soft keyboard R1 is stored in the memory 40.
  • after receiving the soft keyboard retrieval request, the display control module 33 acquires the corresponding keyboard layout information from the memory 40 and controls the display screen 10 to display a soft keyboard R1 having the corresponding layout.
  • the biometric acquisition unit 20 acquires the biometric image of the user immediately after the user inputs the soft keyboard retrieval request.
  • the analysis module 31 immediately analyzes the biometric image acquired by the biometric acquiring unit 20 using image processing technology and determines the category of the current user according to the analysis result.
  • the keyboard layout determining module 32 determines the keyboard layout information of the corresponding soft keyboard R1 according to the category of the current user.
  • the display control module 33 controls the display screen 10 to display the soft keyboard R1 having that layout according to the keyboard layout information. Because the biometric acquiring unit 20 acquires the image quickly and the modules of the processor 30 process it quickly, the interval between the user inputting the soft keyboard retrieval request and the display of the soft keyboard R1 is short and does not affect the user's use.
  • the display screen 10 is further configured to display an input box K1 of a certain application software or system software, and the soft keyboard retrieval request is generated after the user clicks on the input box K1.
  • FIG. 4 is a schematic diagram of a step of analyzing the biometric image acquired by the biometric acquiring unit 20 by the analyzing module 31 by using an image processing technology.
  • the image processing technique includes three main steps of image preprocessing S1, feature extraction S2, and feature matching S3.
  • the image preprocessing S1 includes background separation, image enhancement, image binarization, and image thinning.
  • Background separation separates the image region from the background, which avoids extracting features in regions without valid information, speeds up subsequent processing, and improves the accuracy of image feature extraction and matching.
  • the purpose of image enhancement is to improve image quality and restore the original structure of the image.
  • image binarization converts the image from a grayscale image into a binary image.
  • image thinning converts a clear but uneven binary image into a dot-and-line image whose lines are only one pixel wide.
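  • The following Python sketch illustrates these four preprocessing sub-steps with common, assumed operators (an Otsu mask standing in for background separation, histogram equalization for enhancement, Otsu thresholding for binarization, and morphological skeletonization for thinning); the patent does not prescribe these particular operators.

      import cv2
      import numpy as np
      from skimage.morphology import skeletonize

      def preprocess(gray: np.ndarray) -> np.ndarray:
          """Reduce an 8-bit grayscale biometric image to a one-pixel-wide binary skeleton."""
          # Background separation: a simple Otsu mask stands in for real foreground segmentation.
          _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          # Image enhancement: improve contrast so the ridge/edge structure is recovered.
          enhanced = cv2.equalizeHist(gray)
          # Binarization: convert the enhanced grayscale image into a binary image.
          _, binary = cv2.threshold(enhanced, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
          binary = cv2.bitwise_and(binary, mask)
          # Thinning: reduce the binary pattern to lines one pixel wide.
          skeleton = skeletonize(binary > 0)
          return skeleton.astype(np.uint8)  # 0/1 image with one-pixel-wide lines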
  • Feature extraction S2 includes extraction of overall features and/or local detail features, yielding extracted features that include the overall features and/or the local detail features.
  • the overall features include features such as orientation maps and singular points, and the local detail features include endpoints and bifurcation points of the image.
  • For example, for a fingerprint image, the overall features include singular points such as the core point and delta point of the fingerprint, and the local detail features include endpoints and bifurcations such as islands, termination points, isolated points, closed loops, and bifurcation points.
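  • For a fingerprint skeleton, the local detail features (termination and bifurcation points) can be located with the classic crossing-number test; the sketch below applies it to the one-pixel-wide skeleton produced by the preprocessing sketch above and is offered only as an illustrative assumption, not as the method required by the patent.

      import numpy as np

      def minutiae(skeleton: np.ndarray):
          """Locate termination and bifurcation points in a 0/1 fingerprint skeleton.

          Crossing-number test: for each ridge pixel, count the 0->1 transitions among its
          8 neighbours walked in a circle; 1 transition marks a ridge ending, 3 or more a bifurcation.
          """
          endings, bifurcations = [], []
          rows, cols = skeleton.shape
          for y in range(1, rows - 1):
              for x in range(1, cols - 1):
                  if skeleton[y, x] != 1:
                      continue
                  n = [skeleton[y - 1, x], skeleton[y - 1, x + 1], skeleton[y, x + 1],
                       skeleton[y + 1, x + 1], skeleton[y + 1, x], skeleton[y + 1, x - 1],
                       skeleton[y, x - 1], skeleton[y - 1, x - 1]]
                  crossings = sum(abs(int(n[i]) - int(n[(i + 1) % 8])) for i in range(8)) // 2
                  if crossings == 1:
                      endings.append((x, y))
                  elif crossings >= 3:
                      bifurcations.append((x, y))
          return endings, bifurcations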
  • the memory 40 further pre-stores feature templates of different user categories.
  • feature matching S3 compares the extracted features obtained in the feature extraction step with the pre-stored feature templates and determines the matching feature template according to the comparison result.
  • the analysis module 31 determines a user category corresponding to the feature template according to the feature template matched by the extracted feature.
  • that is, the analysis module 31 performs image preprocessing such as background separation, image enhancement, image binarization, and image thinning on the biometric image; extracts overall features and/or local detail features from the preprocessed biometric image to obtain the extracted features; compares the extracted features with the pre-stored feature templates; determines the matching feature template according to the comparison result; and determines the user category corresponding to that feature template.
  • the user category includes, but is not limited to, a user's authentication identity, the user's age, gender, and the like.
  • the feature template may include an authentication user feature template, a feature template of a different age group, a gender feature template, and the like.
  • the feature template of the different age groups includes a plurality of feature templates corresponding to different age stages, such as a child, an adult or an elderly feature template, and the gender feature template includes a male feature template and a female feature template.
  • the layout of the soft keyboard includes, but is not limited to, the size of the soft keyboard, the position layout of the keys of the soft keyboard, and the display interface style of the soft keyboard.
  • comparing the extracted features with the pre-stored feature templates, determining the matching feature template according to the comparison result, and determining the user category corresponding to the feature template specifically includes: the analysis module 31 first compares the features extracted from the biometric image with the authenticated-user feature templates stored in the memory 40 and determines from the comparison result whether the current user is an authenticated user; if not, it further compares the extracted features with at least one of the feature templates for different age groups and the gender feature templates, and determines at least one of the user's age and gender to determine the user's category.
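  • A minimal sketch of this two-stage matching follows; the feature vectors, cosine similarity, and fixed threshold are assumptions made purely for illustration, since the patent does not specify a similarity measure.

      from typing import Dict, Optional, Sequence

      def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
          dot = sum(x * y for x, y in zip(a, b))
          na = sum(x * x for x in a) ** 0.5
          nb = sum(x * x for x in b) ** 0.5
          return dot / (na * nb) if na and nb else 0.0

      def classify_user(extracted: Sequence[float],
                        auth_templates: Dict[str, Sequence[float]],
                        age_templates: Dict[str, Sequence[float]],
                        gender_templates: Dict[str, Sequence[float]],
                        threshold: float = 0.9) -> Dict[str, Optional[str]]:
          """Stage 1: authenticated-user templates; stage 2: age-group and gender templates."""
          for user_id, template in auth_templates.items():
              if cosine_similarity(extracted, template) >= threshold:
                  return {"authenticated_user": user_id, "age_group": None, "gender": None}

          def best(templates: Dict[str, Sequence[float]]) -> Optional[str]:
              # Pick the non-authenticated-user template most similar to the extracted features.
              if not templates:
                  return None
              return max(templates, key=lambda k: cosine_similarity(extracted, templates[k]))

          return {"authenticated_user": None,
                  "age_group": best(age_templates),
                  "gender": best(gender_templates)}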
  • the user category includes an authenticated user and a non-authenticated user.
  • the authenticated user is the holder of the electronic device 100 or an authorized user who retains the authentication information in the electronic device 100.
  • For example, a user who has stored identity authentication information such as fingerprint authentication information, facial image information, age information, and gender information in the electronic device 100 becomes an authenticated user.
  • the non-authenticated user refers to an unauthorized user of the electronic device 100, that is, any identity authentication information of the user is not stored in the electronic device 100.
  • Each authenticated user recorded in the correspondence table T1 either has corresponding keyboard layout information or does not.
  • Non-authenticated users include users of different ages and genders, and each category of non-authenticated users corresponds to corresponding keyboard layout information in advance.
  • For example, the correspondence table T1 records the identity information A1 of authenticated user A together with the corresponding keyboard layout information: full keyboard, medium size.
  • the correspondence table T1 also records the keyboard layout information corresponding to each age stage of non-authenticated users and the keyboard layout information corresponding to the gender of non-authenticated users. For example, male corresponds to a large-size, minimalist-style interface, and female corresponds to a medium-size, romantic-style interface.
  • the keyboard layout determining module 32 determining the corresponding keyboard layout information according to the category of the current user includes: determining whether the current user is an authenticated user; when the current user is an authenticated user, determining from the correspondence table whether the current user has corresponding keyboard layout information, and if so, taking the keyboard layout information corresponding to that authenticated user in the table as the keyboard layout information corresponding to the user category; if not, treating the current user as a non-authenticated user, that is, determining the current user's age, gender, and the like, and determining the keyboard layout information corresponding to the current user's age and/or gender.
  • If the current user is a non-authenticated user, the keyboard layout determining module 32 likewise determines the current user's age, gender, and the like, and takes the keyboard layout information corresponding to the current user's age and/or gender as the keyboard layout information corresponding to the user category.
  • As described above, whether the current user is an authenticated user is the result obtained by the analysis module 31 after comparing the features extracted from the biometric image with the authenticated-user feature templates stored in the memory 40.
  • In some embodiments, age has a higher priority than gender.
  • For example, if the keyboard layout determining module 32 determines that the current user is 0-10 years old and male, it preferentially selects the keyboard layout information corresponding to the age: small size, cartoon-style interface. Here, "style" refers to the style of the keyboard's skin interface.
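  • A minimal sketch of correspondence table T1 and the lookup rules just described (authenticated-user entries first, then age before gender); the example entries follow FIG. 5, while the Python representation, keys, and default layout are assumptions of this sketch.

      # Table T1, following the examples in FIG. 5; keys and layout fields are assumptions of this sketch.
      T1 = {
          "authenticated": {
              "A1": {"mode": "full", "size": "medium"},      # authenticated user A (identity information A1)
          },
          "age": {
              "0-10": {"size": "small", "style": "cartoon"},
          },
          "gender": {
              "male":   {"size": "large",  "style": "minimalist"},
              "female": {"size": "medium", "style": "romantic"},
          },
      }
      DEFAULT_LAYOUT = {"mode": "full", "size": "medium", "style": "default"}

      def layout_for(category: dict, table: dict = T1) -> dict:
          """Map a user category (see the matching sketch above) to keyboard layout information."""
          user_id = category.get("authenticated_user")
          if user_id is not None:
              if user_id in table["authenticated"]:
                  return table["authenticated"][user_id]
              # Authenticated user without a stored layout: fall through and treat as non-authenticated.
          age, gender = category.get("age_group"), category.get("gender")
          if age in table["age"]:          # age has priority over gender
              return table["age"][age]
          if gender in table["gender"]:
              return table["gender"][gender]
          return DEFAULT_LAYOUT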
  • FIG. 5 is merely an illustrative example.
  • the keyboard layout information corresponding to age and gender may obviously include other information, and may also correspond to other keyboard layout information.
  • the processor 30 further includes a setting module 34, which is configured to determine adjusted keyboard layout information according to an authenticated user's adjustment of the soft keyboard and to store the keyboard layout information and the authenticated user's information correspondingly in the correspondence table T1.
  • Obviously, when the authenticated user has not adjusted the soft keyboard, the keyboard layout information corresponding to that user's information in the correspondence table T1 is the default keyboard layout of the electronic device 100.
  • the setting module 34 also, in response to operations of an authenticated user, sets the correspondence between users of different age stages and keyboard layout information, as well as the correspondence between users of different genders and keyboard layout information, and stores them in the correspondence table T1.
  • Obviously, in some embodiments, the correspondence between users of different ages or genders and keyboard layout information may also be set by default in the electronic device 100 before it leaves the factory.
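  • A minimal sketch of how the setting module's behaviour could be persisted, assuming a JSON file as the backing store for table T1; the file name and format are assumptions of this sketch, not part of the patent.

      import json
      from pathlib import Path
      from typing import Optional

      TABLE_PATH = Path("t1_correspondence_table.json")  # assumed storage location

      def save_adjusted_layout(table: dict, user_id: str, adjusted_layout: dict,
                               path: Path = TABLE_PATH) -> None:
          """Record an authenticated user's adjusted keyboard layout against their identity in table T1."""
          table.setdefault("authenticated", {})[user_id] = adjusted_layout
          path.write_text(json.dumps(table, ensure_ascii=False, indent=2))

      def load_table(path: Path = TABLE_PATH, default: Optional[dict] = None) -> dict:
          """Load T1 from storage, falling back to the factory-default correspondence table."""
          if path.exists():
              return json.loads(path.read_text())
          return default or {}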
  • the processor 30 can be a processing chip such as a central processing unit, a microcontroller, a microprocessor, a single chip microcomputer, or a digital signal processor.
  • in some embodiments, the analysis module 31, the keyboard layout determining module 32, the display control module 33, and the setting module 34 are program instructions invoked and executed by the processor 30.
  • the analysis module 31, the keyboard layout determining module 32, the display control module 33, and the setting module 34 may be hardware circuits or firmware in the processor 30.
  • the memory 40 can be a storage device such as a flash memory, a solid state memory, a hard disk, or the like.
  • the display screen 10 can be a touch display screen.
  • the electronic device 100 can be a mobile phone, a tablet computer, a notebook computer, or the like.
  • FIG. 6 illustrates a method for displaying a soft keyboard according to an embodiment of the present invention.
  • the method is applied to the aforementioned electronic device 100.
  • the method includes the steps of:
  • the biometric image of the current user of the electronic device 100 is acquired (S601).
  • the biometric image may be a user's facial image, an eye iris image, a fingerprint image, or the like.
  • the step S601 includes: after the user unlocks the display screen 10, the biometric image of the user is acquired and saved in the memory 40.
  • the step S601 includes: acquiring the biometric image of the user in real time when the user inputs the soft keyboard retrieval request.
  • the acquired biometric image is analyzed by image processing technology, and the category of the current user is determined according to the analysis result (S603).
  • the step S603 includes: performing image preprocessing such as background separation, image enhancement, image binarization, and image thinning on the biometric image; extracting overall features and/or local detail features from the preprocessed biometric image through feature extraction to obtain extracted features; comparing the extracted features with the pre-stored feature templates; determining the matching feature template according to the comparison result; and determining the user category corresponding to the feature template.
  • comparing the extracted features with the pre-stored feature templates, determining the matching feature template according to the comparison result, and determining the user category corresponding to the feature template includes: first comparing the extracted features with the authenticated-user feature templates to determine whether the user is an authenticated user; if not, further comparing the extracted features with at least one of the non-authenticated-user templates, such as the feature templates for different age groups and the gender feature templates, and determining at least one of the user's age and gender to determine the user's category.
  • Corresponding keyboard layout information is determined according to the category of the current user (S605). Specifically, the keyboard layout information corresponding to the user category is determined according to a correspondence table between the user category and the keyboard layout information.
  • After receiving a soft keyboard retrieval request, the display screen 10 is controlled to display a soft keyboard R1 having the corresponding layout according to the corresponding keyboard layout information (S607).
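  • Tying steps S601-S607 together, the following sketch composes the helper functions from the earlier sketches in this section (preprocess, minutiae, classify_user, layout_for); it is illustrative only, and the templates dictionary and the display object with a render() method are assumptions of this sketch.

      def show_soft_keyboard_for_current_user(biometric_image, templates, table, display):
          """Sketch of steps S601-S607, composing the helper sketches above."""
          # S601: a biometric image of the current user has already been acquired (the argument).
          # S603: analyse the image and determine the user category.
          skeleton = preprocess(biometric_image)
          endings, bifurcations = minutiae(skeleton)
          features = [float(len(endings)), float(len(bifurcations))]   # toy feature vector
          category = classify_user(features, templates["auth"], templates["age"], templates["gender"])
          # S605: map the user category to keyboard layout information via table T1.
          layout = layout_for(category, table)
          # S607: on the soft keyboard retrieval request, show a keyboard with that layout.
          display.render(layout)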
  • the correspondence table T1 records the identity information of each authenticated user and the keyboard layout information corresponding to that authenticated user.
  • the method further includes the step of: the processor 30 determines adjusted keyboard layout information according to an authenticated user's adjustment of the soft keyboard, and stores the correspondence between the keyboard layout information and the authenticated user's information in the correspondence table T1.
  • the method further includes the step of: the processor 30, in response to operations of an authenticated user, sets the correspondence between users of different age stages and keyboard layout information, as well as the correspondence between users of different genders and keyboard layout information, and stores them in the correspondence table T1.
  • each authenticated user includes corresponding keyboard layout information or does not include corresponding keyboard layout information.
  • Non-authenticated users include users of different ages and genders, and each category of non-authenticated users corresponds to corresponding keyboard layout information.
  • the step S605 specifically includes the steps of:
  • It is determined whether the current user is an authenticated user (S6051). If yes, step S6053 is performed; if no, step S6057 is performed.
  • It is determined according to the correspondence table T1 whether the current user has corresponding keyboard layout information (S6053). If yes, step S6055 is performed; if no, step S6057 is performed.
  • the keyboard layout information corresponding to the authenticated user is determined to be the keyboard layout information corresponding to the current user category (S6055).
  • the age and/or gender of the current user is determined, and the keyboard layout information corresponding to the age and/or gender of the current user is determined as the keyboard layout information corresponding to the current user category according to the correspondence table T1 (S6057).
  • the electronic device 100 and the soft keyboard display method of the present invention can automatically determine the category of the user and display a soft keyboard that conforms to the user category.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ophthalmology & Optometry (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Collating Specific Patterns (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention discloses a soft keyboard display method applied to an electronic device. The method includes: acquiring a biometric image of the current user of the electronic device; analyzing the acquired biometric image by image processing technology and determining the category of the current user according to the analysis result; determining corresponding keyboard layout information according to the category of the current user; and, after receiving a soft keyboard call-up request, controlling the display screen to display a soft keyboard having the corresponding layout according to the keyboard layout information. The present invention also discloses the electronic device. The electronic device and the soft keyboard display method of the present invention can display a soft keyboard that conforms to the category of the current user according to that category.

Description

Electronic Device and Soft Keyboard Display Method Thereof
Technical Field
The present invention relates to an electronic device, and in particular to an electronic device having a soft keyboard input function and a soft keyboard display method thereof.
Background
At present, electronic devices with touch screens, such as mobile phones and tablet computers, are very common; because interaction through a touch screen is more direct, they greatly facilitate people's lives. To make typing convenient, existing electronic devices with touch screens generally support handwriting input and provide a soft keyboard (virtual keyboard) for the user to tap on, the soft keyboard being the most widely used input method. With an existing soft keyboard, the user usually has to adjust some of its settings through menu options after the keyboard has been called up, and the degree of adjustment is limited, for example switching a nine-key keypad to a full keyboard; the positions and sizes of the keys of the soft keyboard are usually fixed.
Summary
Embodiments of the present invention disclose an electronic device and a soft keyboard display method that can automatically identify the category of the user and display a soft keyboard with a layout corresponding to that category, so as to make the keyboard convenient for users of that category.
The electronic device disclosed in an embodiment of the present invention includes a display screen, a processor, and a biometric acquiring unit. The biometric acquiring unit is configured to acquire a biometric image of the current user of the electronic device. The processor includes: an analysis module, configured to analyze the biometric image acquired by the biometric acquiring unit using image processing technology and determine the category of the current user according to the analysis result; a keyboard layout determining module, configured to determine corresponding keyboard layout information according to the category of the current user; and a display control module, configured to, after receiving a soft keyboard call-up request, control the display screen to display a soft keyboard having the corresponding layout according to the corresponding keyboard layout information.
The soft keyboard display method disclosed in an embodiment of the present invention is applied to an electronic device and includes the steps of: acquiring a biometric image of the current user of the electronic device; analyzing the acquired biometric image using image processing technology and determining the category of the current user according to the analysis result; determining corresponding keyboard layout information according to the category of the current user; and, after receiving a soft keyboard call-up request, controlling the display screen to display a soft keyboard having the corresponding layout according to the corresponding keyboard layout information.
The electronic device and soft keyboard display method of the present invention can first determine the category of the current user and then display a soft keyboard that conforms to that category for the user to use, which is more user-friendly.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention more clearly, the drawings required by the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a structural block diagram of an electronic device according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a soft keyboard displayed on the display screen of an electronic device according to an embodiment of the present invention.
FIG. 3 is a schematic diagram of an input box displayed on the display screen of an electronic device according to an embodiment of the present invention.
FIG. 4 is a schematic diagram of the steps of an image processing technique in an embodiment of the present invention.
FIG. 5 is a schematic diagram of a correspondence table between user categories and keyboard layout information in an embodiment of the present invention.
FIG. 6 is a flowchart of a soft keyboard display method according to an embodiment of the present invention.
FIG. 7 is a sub-flowchart of step S605 of FIG. 6 in an embodiment.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention rather than all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
Referring to FIG. 1, which is a structural block diagram of an electronic device 100 according to an embodiment of the present invention, the electronic device 100 includes a display screen 10, a biometric acquiring unit 20, a processor 30, and a memory 40.
The biometric acquiring unit 20 is configured to acquire a biometric image of the current user of the electronic device 100, the current user being the person currently operating the electronic device 100.
The processor 30 includes an analysis module 31, a keyboard layout determining module 32, and a display control module 33.
The analysis module 31 is configured to analyze the biometric image acquired by the biometric acquiring unit 20 using image processing technology and determine the category of the current user according to the image analysis result.
The keyboard layout determining module 32 is configured to determine corresponding keyboard layout information according to the category of the current user. Specifically, the memory 40 stores a correspondence table between user categories and keyboard layout information, and the keyboard layout determining module 32 determines the keyboard layout information corresponding to the user category according to this table.
Referring also to FIG. 2, the display control module 33 is configured to, after receiving a soft keyboard call-up request, control the display screen 10 to display a soft keyboard (virtual keyboard) R1 having the corresponding layout according to the keyboard layout information determined by the keyboard layout determining module 32. The layout of the soft keyboard R1 is defined in the keyboard layout information and includes the size of the soft keyboard R1, the full-keyboard display mode, the nine-key display mode, the style of the keyboard, the arrangement positions of the keys A1 of the soft keyboard R1, and so on. The size of the soft keyboard R1 includes the overall size of the keyboard and the size of the keys A1 it contains.
Thus, in the present invention, the category of the current user can first be determined and a soft keyboard R1 that conforms to that category can then be displayed for the user to use, which is more user-friendly.
The biometric image of the user may be a facial image, an eye iris image, a fingerprint image, or the like. As shown in FIG. 1, the biometric acquiring unit 20 may include a camera module 21 for capturing the user's facial image, eye iris image, and the like. The biometric acquiring unit 20 may further include a fingerprint recognition module 22 for acquiring a fingerprint image of the user in response to the user's finger touch. The camera module 21 may be a camera, and the fingerprint recognition module 22 may be a fingerprint recognition chip.
In some embodiments, the biometric acquiring unit 20 acquires the biometric image of the user after the user unlocks the display screen 10 and saves it in the memory 40. The analysis module 31 obtains the biometric image from the memory 40, analyzes it using image processing technology, and determines the category of the current user according to the analysis result. The keyboard layout determining module 32 determines the keyboard layout information of the corresponding soft keyboard R1 according to the category of the current user and stores the determined keyboard layout information in the memory 40. After receiving the soft keyboard call-up request, the display control module 33 obtains the corresponding keyboard layout information from the memory 40 and controls the display screen 10 to display the soft keyboard R1 having the corresponding layout.
In other embodiments, the biometric acquiring unit 20 acquires the biometric image of the user immediately after the user inputs a soft keyboard call-up request. The analysis module 31 immediately analyzes the biometric image acquired by the biometric acquiring unit 20 using image processing technology and determines the category of the current user according to the analysis result; the keyboard layout determining module 32 determines the keyboard layout information of the corresponding soft keyboard R1 according to the category of the current user; and the display control module 33 controls the display screen 10 to display the soft keyboard R1 having that layout according to the keyboard layout information. Because the biometric acquiring unit 20 acquires the image quickly and the modules of the processor 30 process it quickly, the interval between the user inputting the soft keyboard call-up request and the display of the soft keyboard R1 is short and does not affect the user's use.
Referring also to FIG. 3, the display screen 10 is further configured to display an input box K1 of a certain application or system software, and the soft keyboard call-up request is generated after the user taps the input box K1.
Referring also to FIG. 4, which is a schematic diagram of the steps by which the analysis module 31 analyzes the biometric image acquired by the biometric acquiring unit 20 using image processing technology, the image processing technology includes three main steps: image preprocessing S1, feature extraction S2, and feature matching S3.
Image preprocessing S1 includes background separation, image enhancement, image binarization, and image thinning. Background separation separates the image region from the background, which avoids extracting features in regions without valid information, speeds up subsequent processing, and improves the accuracy of image feature extraction and matching. The purpose of image enhancement is to improve image quality and restore the original structure of the image. Image binarization converts the image from a grayscale image into a binary image. Image thinning converts a clear but uneven binary image into a dot-and-line image whose lines are only one pixel wide.
Feature extraction S2 includes extraction of overall features and/or local detail features, yielding extracted features that include the overall features and/or local detail features. The overall features include features such as orientation maps and singular points, and the local detail features include endpoints and bifurcation points of the image. For a fingerprint image, for example, the overall features include singular points such as the core point and delta point of the fingerprint, and the local detail features include endpoints and bifurcations such as islands, termination points, isolated points, closed loops, and bifurcation points.
The memory 40 further pre-stores feature templates for different user categories. Feature matching S3 compares the features extracted in the feature extraction step with the pre-stored feature templates and determines the matching feature template according to the comparison result. The analysis module 31 determines the user category corresponding to the feature template that the extracted features match.
That is, the analysis module 31 performs image preprocessing such as background separation, image enhancement, image binarization, and image thinning on the biometric image; extracts overall features and/or local detail features from the preprocessed biometric image to obtain the extracted features; compares the extracted features with the pre-stored feature templates; determines the matching feature template according to the comparison result; and determines the user category corresponding to that feature template.
The user category includes, but is not limited to, the user's authenticated identity, the user's age, gender, and the like. The feature templates may include authenticated-user feature templates, feature templates for people of different age groups, and gender feature templates. The feature templates for different age groups include multiple templates corresponding to different age stages, such as child, adult, or elderly templates, and the gender feature templates include a male template and a female template. The layout of the soft keyboard includes, but is not limited to, the size of the soft keyboard, the position layout of its keys, and the style of its display interface.
In some embodiments, "comparing the extracted features with the pre-stored feature templates, determining the matching feature template according to the comparison result, and determining the user category corresponding to the feature template" specifically includes: the analysis module 31 first compares the features extracted from the biometric image with the authenticated-user feature templates stored in the memory 40 and determines from the comparison result whether the user is a particular authenticated user; if not, it further compares the extracted features with at least one of the feature templates for different age groups and the gender feature templates, and determines at least one of the user's age and gender to determine the user's category.
Referring also to FIG. 5, which is a schematic diagram of the correspondence table T1 between user categories and keyboard layout information, the table T1 defines correspondences between multiple user categories and keyboard layout information. Specifically, as shown in FIG. 5, the user categories include authenticated users and non-authenticated users. An authenticated user is the holder of the electronic device 100 or an authorized user whose authentication information is retained in the electronic device 100; for example, a user who has stored identity authentication information such as fingerprint authentication information, facial image information, age information, and gender information in the electronic device 100 becomes an authenticated user. A non-authenticated user is an unauthorized user of the electronic device 100, that is, a user none of whose identity authentication information is stored in the electronic device 100. Each authenticated user recorded in the correspondence table T1 either has corresponding keyboard layout information or does not. Non-authenticated users include users of different ages and genders, and each category of non-authenticated user is associated in advance with corresponding keyboard layout information.
For example, as shown in FIG. 5, the correspondence table T1 records the identity information A1 of authenticated user A together with the corresponding keyboard layout information: full keyboard, medium size. As also shown in FIG. 5, the table T1 records the keyboard layout information corresponding to each age stage of non-authenticated users and the keyboard layout information corresponding to the gender of non-authenticated users; for example, male corresponds to a large-size, minimalist-style interface, and female corresponds to a medium-size, romantic-style interface.
The keyboard layout determining module 32 determining the corresponding keyboard layout information according to the category of the current user includes: determining whether the current user is an authenticated user; when the current user is an authenticated user, determining from the correspondence table whether that user has corresponding keyboard layout information, and if so, taking the keyboard layout information corresponding to that authenticated user in the table as the keyboard layout information corresponding to the user category; if not, treating the current user as a non-authenticated user, that is, determining the current user's age, gender, and the like, and determining the keyboard layout information corresponding to the current user's age and/or gender. If the current user is a non-authenticated user, the keyboard layout determining module 32 likewise determines the current user's age, gender, and the like, and takes the keyboard layout information corresponding to the current user's age and/or gender as the keyboard layout information corresponding to the user category.
As described above, whether the current user is an authenticated user is the result obtained by the analysis module 31 after comparing the features extracted from the biometric image with the authenticated-user feature templates stored in the memory 40.
In some embodiments, age has a higher priority than gender. For example, if the keyboard layout determining module 32 determines that the current user is 0-10 years old and male, it preferentially selects the keyboard layout information corresponding to the age: small size, cartoon-style interface. "Style" here refers to the style of the keyboard's skin interface.
Obviously, FIG. 5 is only an illustrative example; the keyboard layout information corresponding to age and gender may clearly include other information or correspond to other keyboard layout information.
The processor 30 further includes a setting module 34, which is configured to determine adjusted keyboard layout information according to an authenticated user's adjustment of the soft keyboard and to store the keyboard layout information and the authenticated user's information correspondingly in the correspondence table T1. Obviously, when the authenticated user has not adjusted the soft keyboard, the keyboard layout information corresponding to that user's information in the table T1 is the default keyboard layout of the electronic device 100.
The setting module 34 also, in response to operations of an authenticated user, sets the correspondence between users of different age stages and keyboard layout information, as well as the correspondence between users of different genders and keyboard layout information, and stores them in the correspondence table T1. Obviously, in some embodiments, the correspondence between users of different ages or genders and keyboard layout information may also be set by default in the electronic device 100 before it leaves the factory.
The processor 30 may be a processing chip such as a central processing unit, a microcontroller, a microprocessor, a single-chip microcomputer, or a digital signal processor. In some embodiments, the analysis module 31, the keyboard layout determining module 32, the display control module 33, and the setting module 34 are program instructions invoked and executed by the processor 30. In other embodiments, these modules may be hardware circuits or firmware in the processor 30.
The memory 40 may be a storage device such as a flash memory, a solid-state memory, or a hard disk. The display screen 10 may be a touch display screen. The electronic device 100 may be a mobile phone, a tablet computer, a notebook computer, or the like.
Referring to FIG. 6, a soft keyboard display method according to an embodiment of the present invention is applied to the electronic device 100 described above and includes the following steps:
A biometric image of the current user of the electronic device 100 is acquired (S601). The biometric image may be a facial image, an eye iris image, a fingerprint image, or the like of the user. In some embodiments, step S601 includes acquiring the biometric image after the user unlocks the display screen 10 and saving it in the memory 40. In other embodiments, step S601 includes acquiring the biometric image in real time when the user inputs a soft keyboard call-up request.
The acquired biometric image is analyzed using image processing technology, and the category of the current user is determined according to the analysis result (S603). In some embodiments, step S603 includes: performing image preprocessing such as background separation, image enhancement, image binarization, and image thinning on the biometric image; extracting overall features and/or local detail features from the preprocessed image to obtain extracted features; comparing the extracted features with pre-stored feature templates; determining the matching feature template according to the comparison result; and determining the user category corresponding to that template. Here, "comparing the extracted features with the pre-stored feature templates, determining the matching feature template according to the comparison result, and determining the user category corresponding to the feature template" specifically includes: first comparing the extracted features with the authenticated-user feature templates to determine whether the user is a particular authenticated user; if not, further comparing the extracted features with at least one of the non-authenticated-user templates, such as the feature templates for different age groups and the gender feature templates, and determining at least one of the user's age and gender to determine the user's category.
Corresponding keyboard layout information is determined according to the category of the current user (S605). Specifically, the keyboard layout information corresponding to the user category is determined according to the correspondence table between user categories and keyboard layout information.
After a soft keyboard call-up request is received, the display screen 10 is controlled to display a soft keyboard R1 having the corresponding layout according to the corresponding keyboard layout information (S607). The correspondence table T1 records the identity information of each authenticated user and the keyboard layout information corresponding to that user.
In some embodiments, the method further includes: the processor 30 determines adjusted keyboard layout information according to an authenticated user's adjustment of the soft keyboard and stores the correspondence between the keyboard layout information and the authenticated user's information in the correspondence table T1.
In some embodiments, the method further includes: the processor 30, in response to operations of an authenticated user, sets the correspondence between users of different age stages and keyboard layout information, as well as the correspondence between users of different genders and keyboard layout information, and stores them in the correspondence table T1.
Referring to FIG. 7, which is a sub-flowchart of step S605 in an embodiment, each authenticated user either has corresponding keyboard layout information or does not, non-authenticated users include users of different ages and genders, and each category of non-authenticated user has corresponding keyboard layout information. Step S605 specifically includes the following steps:
It is determined whether the current user is an authenticated user (S6051). If yes, step S6053 is performed; if no, step S6057 is performed.
It is determined according to the correspondence table T1 whether the current user has corresponding keyboard layout information (S6053). If yes, step S6055 is performed; if no, step S6057 is performed.
The keyboard layout information corresponding to the authenticated user is taken as the keyboard layout information corresponding to the current user category (S6055).
The current user's age and/or gender is determined, and the keyboard layout information corresponding to the current user's age and/or gender is determined from the correspondence table T1 as the keyboard layout information corresponding to the current user category (S6057).
Thus, the electronic device 100 and the soft keyboard display method of the present invention can automatically determine the category of the user and display a soft keyboard that conforms to that category.
The above are preferred embodiments of the present invention. It should be noted that a person of ordinary skill in the art may make several improvements and refinements without departing from the principles of the present invention, and such improvements and refinements are also regarded as falling within the scope of protection of the present invention.

Claims (20)

  1. An electronic device, comprising a display screen and a processor, wherein the electronic device further comprises a biometric acquiring unit configured to acquire a biometric image of the current user of the electronic device, and the processor comprises:
    an analysis module, configured to analyze the biometric image acquired by the biometric acquiring unit using image processing technology and determine the category of the current user according to the analysis result;
    a keyboard layout determining module, configured to determine corresponding keyboard layout information according to the category of the current user; and
    a display control module, configured to, after receiving a soft keyboard call-up request, control the display screen to display a soft keyboard having the corresponding layout according to the keyboard layout information.
  2. The electronic device of claim 1, wherein the biometric image of the user comprises a facial image, an eye iris image, and a fingerprint image of the user, and the biometric acquiring unit comprises a camera module and a fingerprint recognition module, the camera module being configured to capture the user's facial image and eye iris image, and the fingerprint recognition module being configured to acquire the user's fingerprint image in response to the user's finger touch.
  3. The electronic device of claim 1, wherein the biometric acquiring unit acquires the biometric image of the current user after the user unlocks the display screen or when the soft keyboard call-up request is input.
  4. The electronic device of any one of claims 1 to 3, wherein the electronic device further comprises a memory storing feature templates corresponding to different user categories, and the analysis module performs image preprocessing comprising background separation, image enhancement, image binarization, and image thinning on the biometric image; extracts overall features and/or local detail features from the preprocessed biometric image through feature extraction to obtain extracted features; compares the extracted features with the pre-stored feature templates; determines the matching feature template according to the comparison result; and determines the user category corresponding to the feature template.
  5. The electronic device of claim 4, wherein the feature templates comprise authenticated-user feature templates, feature templates for people of different age groups, and gender feature templates, and the analysis module first compares the features extracted from the biometric image with the authenticated-user feature templates stored in the memory to determine whether the user is a particular authenticated user; if not, it further compares the extracted features with at least one of the feature templates for different age groups and the gender feature templates, identifies the user's age and/or gender according to the comparison result, and determines the user's category according to the user's age and/or gender.
  6. The electronic device of claim 4, wherein the memory further stores a correspondence table between user categories and keyboard layout information, and the keyboard layout determining module determines the keyboard layout information corresponding to the user category according to the correspondence table.
  7. The electronic device of claim 6, wherein the users comprise authenticated users and non-authenticated users, the correspondence table records keyboard layout information corresponding to authenticated users as well as keyboard layout information corresponding to users of different ages and genders, and when the current user is an authenticated user, the keyboard layout determining module determines from the correspondence table whether the current user has corresponding keyboard layout information; if so, it determines from the correspondence table the keyboard layout information corresponding to the authenticated user as the keyboard layout information corresponding to the current user's category; if not, or if the current user's category is non-authenticated user, it determines the current user's age and/or gender and determines the keyboard layout information corresponding to the current user's age and/or gender as the keyboard layout information corresponding to the current user's category.
  8. The electronic device of claim 7, wherein the authenticated user is the holder of the electronic device or an authorized user whose authentication information is retained in the electronic device, the authentication information comprising identity information including fingerprint authentication information, facial image information, age information, and gender information.
  9. The electronic device of claim 7, wherein the processor further comprises a setting module configured to determine adjusted keyboard layout information according to the authenticated user's adjustment of the soft keyboard and to store the keyboard layout information and the authenticated user's information correspondingly in the correspondence table.
  10. The electronic device of any one of claims 1 to 3, wherein the keyboard layout information comprises the size of the soft keyboard, a full-keyboard display mode, a nine-key display mode, the style of the keyboard, and the arrangement positions of the keys of the soft keyboard.
  11. The electronic device of any one of claims 1 to 3, wherein the display screen is further configured to display an input box, and the soft keyboard call-up request is generated after the input box is tapped.
  12. A soft keyboard display method, applied to an electronic device, comprising the steps of:
    acquiring a biometric image of the current user of the electronic device;
    analyzing the acquired biometric image using image processing technology and determining the category of the current user according to the analysis result;
    determining corresponding keyboard layout information according to the category of the current user; and
    after receiving a soft keyboard call-up request, controlling the display screen to display a soft keyboard having the corresponding layout according to the keyboard layout information.
  13. The soft keyboard display method of claim 12, wherein the step of acquiring a biometric image of the current user of the electronic device comprises:
    acquiring a facial image, an eye iris image, or a fingerprint image of the user.
  14. The soft keyboard display method of claim 12, wherein the step of acquiring a biometric image of the current user of the electronic device comprises:
    acquiring the biometric image of the current user after the display screen is unlocked or when the soft keyboard call-up request is input.
  15. The soft keyboard display method of any one of claims 12 to 14, wherein the electronic device stores feature templates corresponding to different user categories, and the step of analyzing the acquired biometric image using image processing technology and determining the category of the current user according to the analysis result comprises:
    performing image preprocessing comprising background separation, image enhancement, image binarization, and image thinning on the biometric image;
    extracting overall features and/or local detail features from the preprocessed biometric image through feature extraction to obtain extracted features; and
    comparing the extracted features with the pre-stored feature templates, determining the matching feature template according to the comparison result, and determining the user category corresponding to the feature template.
  16. The soft keyboard display method of claim 15, wherein the feature templates comprise authenticated-user feature templates, feature templates for people of different age groups, and gender feature templates, and the step of comparing the extracted features with the pre-stored feature templates, determining the matching feature template according to the comparison result, and determining the user category corresponding to the feature template comprises:
    comparing the extracted features with the authenticated-user feature templates to determine whether the user is a particular authenticated user; and
    if not, further comparing the extracted features with at least one of the feature templates for different age groups and the gender feature templates, identifying the user's age and/or gender according to the comparison result, and determining the user's category according to the user's age and/or gender.
  17. The soft keyboard display method of any one of claims 12 to 14, wherein the electronic device further stores a correspondence table between user categories and keyboard layout information, and the step of determining corresponding keyboard layout information according to the category of the current user comprises:
    determining the keyboard layout information corresponding to the user category according to the correspondence table between user categories and keyboard layout information.
  18. The soft keyboard display method of claim 17, wherein the users comprise authenticated users and non-authenticated users, the correspondence table records keyboard layout information corresponding to authenticated users as well as keyboard layout information corresponding to users of different ages and genders, and the step of determining the keyboard layout information corresponding to the user category according to the correspondence table comprises:
    when the current user is an authenticated user, determining from the correspondence table whether the current user has corresponding keyboard layout information;
    if so, determining from the correspondence table the keyboard layout information corresponding to the authenticated user as the keyboard layout information corresponding to the current user's category; and
    if not, or if the current user's category is non-authenticated user, determining the current user's age and/or gender and determining the keyboard layout information corresponding to the current user's age and/or gender as the keyboard layout information corresponding to the current user's category.
  19. The soft keyboard display method of claim 18, wherein the authenticated user is the holder of the electronic device or an authorized user whose authentication information is retained in the electronic device, the authentication information comprising identity information including fingerprint authentication information, facial image information, age information, and gender information; and the non-authenticated user is an unauthorized user of the electronic device.
  20. The soft keyboard display method of claim 18, further comprising:
    determining adjusted keyboard layout information according to the authenticated user's adjustment of the soft keyboard, and storing the keyboard layout information and the authenticated user's information correspondingly in the correspondence table.
PCT/CN2016/107939 2016-11-30 2016-11-30 Electronic device and soft keyboard display method thereof WO2018098668A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP16922948.1A EP3550417A1 (en) 2016-11-30 2016-11-30 Electronic device and soft keyboard display method
CN201680038938.XA CN107995969A (zh) 2016-11-30 2016-11-30 电子装置及其软键盘显示方法
US16/330,185 US20190227707A1 (en) 2016-11-30 2016-11-30 Electronic device and soft keyboard display method thereof
PCT/CN2016/107939 WO2018098668A1 (zh) 2016-11-30 2016-11-30 电子装置及其软键盘显示方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/107939 WO2018098668A1 (zh) 2016-11-30 2016-11-30 电子装置及其软键盘显示方法

Publications (1)

Publication Number Publication Date
WO2018098668A1 true WO2018098668A1 (zh) 2018-06-07

Family

ID=62029691

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/107939 WO2018098668A1 (zh) 2016-11-30 2016-11-30 电子装置及其软键盘显示方法

Country Status (4)

Country Link
US (1) US20190227707A1 (zh)
EP (1) EP3550417A1 (zh)
CN (1) CN107995969A (zh)
WO (1) WO2018098668A1 (zh)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD802616S1 (en) * 2016-11-22 2017-11-14 Verifone, Inc. Display screen or portion thereof with a graphical user interface
CN109074171B (zh) 2017-05-16 2021-03-30 华为技术有限公司 Input method and electronic device
USD877752S1 (en) * 2018-03-16 2020-03-10 Magic Leap, Inc. Display panel or portion thereof with graphical user interface
CN111027362B (zh) * 2019-04-27 2020-12-08 深圳市智码广告通有限公司 Graded display platform for handheld mobile terminals
CN112965781A (zh) * 2021-04-08 2021-06-15 高天惟 Page processing method and apparatus
CN114420010A (zh) * 2021-12-30 2022-04-29 联想(北京)有限公司 Control method and apparatus, and electronic device
CN114510194A (zh) * 2022-01-30 2022-05-17 维沃移动通信有限公司 Input method and apparatus, electronic device, and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080018501A1 (en) * 2006-07-19 2008-01-24 International Business Machines Corporation Identification of key information of keyboard
CN103995673A (zh) * 2014-06-09 2014-08-20 联想(北京)有限公司 Keyboard layout control method and electronic device
CN105117127A (zh) * 2015-08-26 2015-12-02 成都秋雷科技有限责任公司 Method for switching input methods based on different user groups
CN105653116A (zh) * 2015-07-31 2016-06-08 宇龙计算机通信科技(深圳)有限公司 Soft keyboard layout adjustment method and apparatus, and electronic device

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9692964B2 (en) * 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US20090183098A1 (en) * 2008-01-14 2009-07-16 Dell Products, Lp Configurable Keyboard
EP2353125A4 (en) * 2008-11-03 2013-06-12 Veritrix Inc USER AUTHENTICATION FOR SOCIAL NETWORKS
US10096198B2 (en) * 2011-11-29 2018-10-09 Igt Anonymous biometric player tracking
KR20130083195A (ko) * 2012-01-12 2013-07-22 삼성전자주식회사 Method and apparatus for changing a keyboard layout using touch
CN104428785B (zh) * 2012-05-04 2017-08-08 罗文有限公司 Icon password setting apparatus and icon password setting method using keywords of icons
JP2014137627A (ja) * 2013-01-15 2014-07-28 Sony Corp Input device, output device, and storage medium
US9746938B2 (en) * 2014-12-15 2017-08-29 At&T Intellectual Property I, L.P. Exclusive view keyboard system and method
CN112597469A (zh) * 2015-03-31 2021-04-02 华为技术有限公司 Mobile terminal privacy protection method, protection apparatus, and mobile terminal
CN105574386A (zh) * 2015-06-16 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Terminal mode management method and apparatus
CN106020350A (zh) * 2016-05-16 2016-10-12 京东方科技集团股份有限公司 Wearable device and information processing method
KR102564267B1 (ko) * 2016-12-01 2023-08-07 삼성전자주식회사 Electronic device and operation method thereof
US10607230B2 (en) * 2016-12-02 2020-03-31 Bank Of America Corporation Augmented reality dynamic authentication for electronic transactions
US11625473B2 (en) * 2018-02-14 2023-04-11 Samsung Electronics Co., Ltd. Method and apparatus with selective combined authentication


Also Published As

Publication number Publication date
EP3550417A1 (en) 2019-10-09
CN107995969A (zh) 2018-05-04
US20190227707A1 (en) 2019-07-25

Similar Documents

Publication Publication Date Title
WO2018098668A1 (zh) Electronic device and soft keyboard display method thereof
US11310223B2 (en) Identity authentication method and apparatus
Peixoto et al. Face liveness detection under bad illumination conditions
WO2017202196A1 (en) Method and device for fingerprint unlocking and user terminal
US8856541B1 (en) Liveness detection
KR101957615B1 (ko) Fingerprint authentication method and system, and terminal supporting a fingerprint authentication function
KR101326221B1 (ko) Facial feature detection
CN107223254B (zh) Method, user apparatus, and storage medium for hidden-setting processing
US9122913B2 (en) Method for logging a user in to a mobile device
US20140250523A1 (en) Continuous Authentication, and Methods, Systems, and Software Therefor
US20120051605A1 (en) Method and apparatus of a gesture based biometric system
WO2017005020A1 (zh) Mobile terminal and method for implementing automatic call answering thereof
US11042727B2 (en) Facial recognition using time-variant user characteristics
WO2017113407A1 (zh) Gesture recognition method and apparatus, and electronic device
US20200275271A1 (en) Authentication of a user based on analyzing touch interactions with a device
WO2017219450A1 (zh) Information processing method and apparatus, and mobile terminal
WO2020206734A1 (zh) Handwritten-password-based identity authentication method and apparatus
CN107247936A (zh) Image recognition method and apparatus
US20230353563A1 (en) Systems and methods for passive continuous session authentication
CN110647732B (zh) Voice interaction method, system, medium, and device based on biometric features
TWI584146B (zh) Integrated login system and method based on face recognition
EP3420496A1 (en) Method and system for controlling an electronic device
CN113077262A (zh) Catering settlement method, apparatus and system, machine-readable medium, and device
US11120285B2 (en) Intelligent terminal
US11120284B2 (en) Startup authentication method for intelligent terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16922948

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2016922948

Country of ref document: EP

Effective date: 20190701