WO2016206347A1 - Program icon sorting method and device - Google Patents

Program icon sorting method and device

Info

Publication number
WO2016206347A1
WO2016206347A1 PCT/CN2015/100228 CN2015100228W WO2016206347A1 WO 2016206347 A1 WO2016206347 A1 WO 2016206347A1 CN 2015100228 W CN2015100228 W CN 2015100228W WO 2016206347 A1 WO2016206347 A1 WO 2016206347A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
emotional state
iris
usage
program
Prior art date
Application number
PCT/CN2015/100228
Other languages
English (en)
French (fr)
Inventor
黎广
Original Assignee
宇龙计算机通信科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 宇龙计算机通信科技(深圳)有限公司 filed Critical 宇龙计算机通信科技(深圳)有限公司
Publication of WO2016206347A1 publication Critical patent/WO2016206347A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • the present invention relates to the field of electronic technologies, and in particular, to a method and apparatus for ordering program icons.
  • At present, the application icons of an electronic terminal are arranged in order of installation time: the icon of an application installed earlier is placed in an earlier position and page, while the icon of an application installed later is placed in a later position and page.
  • However, people now install a great many applications, and the applications commonly used in different emotional states are usually different, so it is very difficult to quickly find the desired application icon among so many icons. The existing way of arranging application icons therefore costs the user a large amount of time in looking for an application's icon, degrading the user experience.
  • The technical problem to be solved by the embodiments of the present invention is to provide a program icon sorting method and device that can intelligently place the user's commonly used application icons at the front according to the number of times or the duration for which the user uses each application in different emotional states, so that the user does not spend a long time looking for the application icon to be used.
  • an embodiment of the present invention provides a method for sorting program icons, where the method includes:
  • acquiring iris feature information of a user; determining the current emotional state of the user according to the acquired iris feature information; acquiring a program usage record corresponding to that emotional state, where the program usage record includes the user's usage information for each application in that emotional state; and sorting and displaying the display icons of the applications according to the program usage record corresponding to the emotional state.
  • an embodiment of the present invention provides a program icon sorting apparatus, where the apparatus includes:
  • an iris acquisition module configured to acquire iris feature information of a user;
  • a state determining module configured to determine, according to the acquired iris feature information, a current emotional state of the user
  • a record obtaining module configured to acquire a program usage record corresponding to the emotional state, where the program usage record includes usage information of the user for each application in the emotional state;
  • a sorting module configured to sort and display the display icons of the application according to the program usage record corresponding to the emotional state.
  • The embodiment of the present invention acquires the iris feature information of the user, determines the current emotional state of the user according to the acquired iris feature information, and acquires a program usage record corresponding to that emotional state, where the program usage record includes the user's usage information for each application in that emotional state; the display icons of the applications are then sorted and displayed according to the program usage record corresponding to the emotional state. In this way, according to the number of times or the duration for which the user uses each application in different emotional states, the application icons commonly used by the user in a given emotional state are intelligently placed at the front, reducing the time the user spends looking for application icons.
  • FIG. 1 is a schematic flow chart of a method for sorting program icons in an embodiment of the present invention
  • FIG. 2 is a schematic flow chart of a method for setting an emotional state in an embodiment of the present invention
  • FIG. 3 is a schematic flow chart of a program usage recording method in an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of a program icon sorting apparatus according to an embodiment of the present invention.
  • FIG. 5 is a structural diagram of the sorting module of FIG. 4 in the embodiment of the present invention.
  • FIG. 1 is a schematic flowchart of a method for sorting program icons in an embodiment of the present invention.
  • the method flow may be implemented by a program icon sorting device, and the program icon sorting device may be a user terminal or a software program running on a user terminal.
  • the user terminal may include a mobile phone, a notebook computer, a tablet computer, a car computer, a POS (Point Of Sales) machine, and the like.
  • the method as shown in the figure includes at least:
  • Step S101 Acquire iris feature information of the user.
  • Specifically, when it is detected that the user enters or stays on the system desktop, the user's eye image may be acquired by the terminal camera or another imaging device.
  • Entering the system desktop includes entering the system desktop at power-on, when waking from standby, when a program moves to run in the background, or when a program exits.
  • the terminal may prompt the user to align the eye to the camera for eye image acquisition. If the collected eye image is complete and the detail information is clear, the iris feature information is collected; If the image of the eye is incomplete or the detail information is not clear enough, the terminal needs to acquire the eye image of the user again until the eye image of the user is complete and the detail information is clear.
  • After obtaining a clear eye image, the terminal needs to preprocess it. First, the positions of the boundary between the iris and the pupil, the boundary between the iris and the sclera, the boundary between the iris and the upper eyelid, and the boundary between the iris and the lower eyelid must be detected in the eye image to determine the positions of the iris and the pupil. The terminal can then extract the user's iris feature information with reference to the positions of the iris and the pupil, the iris feature information including a pupil feature parameter and an iris feature parameter. Finally, the iris feature information is normalized, that is, adjusted to the fixed size preset by the iris recognition system, to ensure accurate recognition; the normalized iris feature information may also be enhanced, for example by adjusting brightness, contrast and smoothness, to improve the recognition rate.
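  • A minimal sketch of the preprocessing described above, assuming OpenCV is available and that simple Hough-circle detection is acceptable for locating the pupil and iris boundaries; the patent does not name a specific algorithm, so the functions, parameters and return shape below are illustrative only.

```python
import cv2

def extract_iris_features(eye_image_bgr):
    """Locate pupil/iris boundaries, then normalize the image to a fixed size."""
    gray = cv2.cvtColor(eye_image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)

    # Two roughly concentric circles: the smaller approximates the pupil
    # boundary, the larger the iris/sclera boundary.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=100, param2=30, minRadius=5, maxRadius=150)
    if circles is None or circles.shape[1] < 2:
        return None  # incomplete or unclear image: caller should capture again

    radii = sorted(circles[0, :, 2])
    pupil_radius, iris_radius = float(radii[0]), float(radii[-1])

    # "Normalization": resize to the fixed size expected by the recognizer,
    # then enhance contrast to improve the recognition rate.
    normalized = cv2.resize(gray, (512, 64))   # (width, height), assumed size
    enhanced = cv2.equalizeHist(normalized)

    return {"pupil_radius": pupil_radius,
            "iris_radius": iris_radius,
            "normalized_image": enhanced}
```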
  • Step S102: determining the current emotional state of the user according to the acquired iris feature information.
  • Specifically, before determining the user's current emotional state, the terminal has already set different iris feature ranges and an emotional state corresponding to each iris feature range.
  • After the user's iris feature information is acquired, it can be compared with each of the iris feature ranges; if the comparison shows that the user's iris feature information falls within a certain iris feature range, it indicates that the user is in the emotional state corresponding to that range, and that emotional state can then be obtained.
  • In a specific implementation, eye feature parameters of the user, such as a pupil feature parameter, an iris feature parameter and an eye-socket feature parameter, can be extracted from the eye image of the user acquired in step S101; each of these parameters may be a diameter, a radius or an area. The terminal can therefore determine the user's current emotional state by one of several calculations, such as the ratio of the pupil feature parameter to the iris feature parameter or the ratio of the pupil feature parameter to the eye-socket feature parameter.
  • the current emotional state of the user may be determined according to a ratio of a pupil feature parameter to an iris feature parameter in the iris feature information.
  • For example, the four preset iris feature ranges and their corresponding emotional states are: 0 to 0.03 corresponding to emotional state one, 0.03 to 0.06 to emotional state two, 0.06 to 0.09 to emotional state three, and 0.09 to 0.12 to emotional state four. If the ratio of the user's pupil feature parameter to the iris feature parameter acquired in step S101 is 0.1, then 0.1 is compared with the four set iris feature ranges and found to belong to emotional state four, so the user's current emotional state is determined to be emotional state four.
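  • A minimal sketch of this range lookup, using the example thresholds above; the boundary handling and state labels are illustrative and would be calibrated per user.

```python
# Example iris feature ranges and their emotional states (from the text).
EMOTION_RANGES = [
    (0.00, 0.03, "emotional state one"),
    (0.03, 0.06, "emotional state two"),
    (0.06, 0.09, "emotional state three"),
    (0.09, 0.12, "emotional state four"),
]

def classify_emotion(pupil_feature, iris_feature):
    """Map the pupil/iris feature ratio onto one of the preset ranges."""
    ratio = pupil_feature / iris_feature
    for low, high, state in EMOTION_RANGES:
        if low <= ratio <= high:
            return state
    return None  # outside all calibrated ranges: no state determined

print(classify_emotion(0.1, 1.0))  # ratio 0.1 -> "emotional state four"
```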
  • Step S103 Obtain a program usage record corresponding to the emotional state, where the program usage record includes usage information of the user for each application in the emotional state.
  • Specifically, the program usage record is the usage information, recorded by the terminal while each application is used, about the user's use of each application in different emotional states; the usage information includes the usage duration, the number of uses, or the number of operations. Therefore, the program usage record corresponding to a certain emotional state includes the usage duration, number of uses, or number of operations of each application on the terminal by the user in that emotional state.
  • For example, the program usage record of emotional state four may be the usage durations of various applications, such as WeChat 16 hours, video player 8 hours, XX game 12 hours, browser 10 hours and reading software 0 hours, or may be the number of uses or operations of various applications, such as WeChat 60 times, video player 10 times, XX game 25 times, browser 34 times and reading software 0 times.
  • Further, the program usage record is not fixed once recorded within a preset time; it is continuously updated, and the duration, number of uses or number of operations of each application is recorded promptly every time the user uses it.
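  • As an illustration, the per-emotion usage record described above could be kept in a structure like the following minimal Python sketch; the field names and nesting are assumptions, not something the embodiment prescribes.

```python
from collections import defaultdict

# usage_record[emotional_state][app_name] -> {"duration_h": ..., "uses": ...}
usage_record = defaultdict(
    lambda: defaultdict(lambda: {"duration_h": 0.0, "uses": 0}))

def record_usage(state, app, duration_h=0.0, uses=0):
    """Continuously update the record; it is never frozen after a preset time."""
    entry = usage_record[state][app]
    entry["duration_h"] += duration_h
    entry["uses"] += uses

# Mirror of the example: durations logged for emotional state four.
for app, hours in [("WeChat", 16), ("video player", 8),
                   ("XX game", 12), ("browser", 10), ("reading software", 0)]:
    record_usage("emotional state four", app, duration_h=hours)
```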
  • Step S104 sorting and displaying the display icons of the application according to the program usage record corresponding to the emotional state.
  • Specifically, the terminal may sort and display the display icons of the applications according to the usage information of each application in that emotional state acquired in step S103. Since the usage information includes the usage duration, the number of uses or the number of operations, the display icons of the applications can be sorted in descending order of the duration, number of uses or number of operations of the respective applications in that emotional state.
  • Continuing the example of step S103, if sorting is performed by usage duration, the order of the applications is: WeChat 16 hours, XX game 12 hours, browser 10 hours, video player 8 hours, reading software 0 hours, so the system desktop displays the WeChat icon first, followed by the XX game, the browser and the video player, and finally the reading software. If sorting is performed by the number of uses or operations, the order is: WeChat 60 times, browser 34 times, XX game 25 times, video player 10 times, reading software 0 times, so the application icons on the system desktop are, in order, WeChat, browser, XX game, video player and reading software.
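  • A minimal sketch of this sort, reusing the usage-record shape assumed above; the key name is an assumption.

```python
def sort_icons(usage_for_state, key="duration_h"):
    """Order application names by how heavily each was used in this state."""
    return sorted(usage_for_state,
                  key=lambda app: usage_for_state[app][key],
                  reverse=True)

# With the example durations this yields
# ['WeChat', 'XX game', 'browser', 'video player', 'reading software'].
```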
  • Optionally, the total usage duration of all applications in the emotional state may be calculated as the sum of the usage durations of the respective applications; the share of each application's usage duration is then obtained as the ratio of that application's usage duration to the total usage duration of all applications in that emotional state, and the icons of the respective applications are arranged according to the size of these shares.
  • For example, if the usage durations recorded for an emotional state are 16 hours for WeChat, 8 hours for the video player, 12 hours for the XX game, 10 hours for the browser, 0 hours for the reading software and 0 hours for all other applications, the total usage duration of all applications in that emotional state is the 36-hour sum of the individual durations, and the usage-duration shares are: WeChat 4/9, video player 2/9, XX game 1/3, browser 5/18, reading software 0, and 0 for all other applications.
  • The display icons of the applications are then sorted in descending order of these usage-duration shares.
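  • A minimal sketch of this share computation; since each share is a duration divided by the same total, sorting by share gives the same order as sorting by duration.

```python
def duration_shares(usage_for_state):
    """Each application's fraction of the total usage duration in this state."""
    total = sum(v["duration_h"] for v in usage_for_state.values())
    if total == 0:
        return {app: 0.0 for app in usage_for_state}
    return {app: v["duration_h"] / total for app, v in usage_for_state.items()}

# For the 16/8/12/10/0-hour example the shares are 4/9, 2/9, 1/3, 5/18 and 0,
# and sorting the shares in descending order gives the same icon order.
```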
  • The embodiment of the present invention acquires the iris feature information of the user, determines the current emotional state of the user according to the acquired iris feature information, and acquires a program usage record corresponding to that emotional state, where the program usage record includes the user's usage information for each application in that emotional state; the display icons of the applications are then sorted and displayed according to the program usage record corresponding to the emotional state. In this way, according to the number of times or the duration for which the user uses each application in different emotional states, the application icons commonly used by the user in a given emotional state are intelligently placed at the front, reducing the time the user spends looking for application icons.
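  • Putting steps S101 to S104 together, the desktop refresh could be orchestrated as in the following sketch; the helper callables stand for the pieces sketched above and are not an interface defined by the patent.

```python
def refresh_desktop(capture_eye_image, extract_iris_features,
                    classify_emotion, usage_record, render_icons):
    image = capture_eye_image()                        # S101: acquire eye image
    features = extract_iris_features(image)            # S101: iris feature info
    if features is None:
        return                                         # unclear image: retry later
    state = classify_emotion(features["pupil_radius"],
                             features["iris_radius"])  # S102: emotional state
    per_app = usage_record.get(state, {})              # S103: program usage record
    ordered = sorted(per_app,
                     key=lambda a: per_app[a]["duration_h"],
                     reverse=True)                     # S104: sort by usage
    render_icons(ordered)                              # S104: display
```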
  • FIG. 2 is a schematic flow chart of a method for setting an emotional state in an embodiment of the present invention. The method as shown includes:
  • step S201 the iris feature information of the user is collected according to a preset time interval.
  • Specifically, a person's emotional state can be reflected in the iris feature information, but each person's emotional states, and the iris feature ranges corresponding to different emotional states, differ. Therefore, before determining the user's current emotional state, the terminal needs to collect the user's iris feature information at a preset time interval; for example, the user's iris feature information can be collected every five minutes.
  • Step S202 the iris feature information collected in a predetermined time period is divided into a preset number of iris feature ranges.
  • a time period may be preset, and the iris feature information collected in a predetermined time period is divided into a preset number of iris feature ranges.
  • For example, the user's iris feature information is collected every five minutes and, within one week, the collected iris feature information spans a total range of 0 to 0.12; if the preset number of iris feature ranges is 4, the iris feature information can be divided into four iris feature ranges: 0 to 0.03, 0.03 to 0.06, 0.06 to 0.09, and 0.09 to 0.12.
  • Step S203 setting an emotional state corresponding to each of the iris feature ranges.
  • Specifically, a corresponding emotional state needs to be set for each iris feature range divided in step S202, so that the corresponding program usage record can be obtained according to the emotional state. Step S102 also determines the user's current emotional state according to the iris feature ranges and their corresponding emotional states.
  • Following the example of step S202, the terminal may set the emotional states corresponding to the four iris feature ranges as: 0 to 0.03 corresponding to emotional state one, 0.03 to 0.06 to emotional state two, 0.06 to 0.09 to emotional state three, and 0.09 to 0.12 to emotional state four.
  • In the embodiment of the present invention, the user's iris feature information is collected at a preset time interval, the iris feature information collected within a predetermined time period is divided into a preset number of iris feature ranges, and the emotional state corresponding to each iris feature range is set. This implements the setting of emotional states, so that the terminal can identify the user's different emotional states, intelligently place the application icons commonly used by the user in a given emotional state at the front, and reduce the time the user spends looking for application icons.
  • FIG. 3 is a schematic flow chart of a program usage recording method in an embodiment of the present invention. The method as shown includes:
  • Step S301 acquiring iris feature information of the user during the running of the target application.
  • Specifically, while the target application is running, the terminal may acquire the user's eye image periodically, at a preset time interval, through the terminal camera or another imaging device, or may acquire the user's eye image once when it detects that the user has entered the target application. After a clear eye image is obtained, the positions of the boundary between the iris and the pupil, the boundary between the iris and the sclera, the boundary between the iris and the upper eyelid, and the boundary between the iris and the lower eyelid must be detected in the eye image to determine the positions of the iris and the pupil; the terminal can then extract the user's iris feature information with reference to the positions of the iris and the pupil, the iris feature information including a pupil feature parameter and an iris feature parameter.
  • Step S302 Determine, according to the acquired iris feature information, a current emotional state of the user.
  • Specifically, the iris feature information acquired in step S301 may be compared with the iris feature ranges, and their corresponding emotional states, already set in step S203; if the comparison shows that the user's iris feature information falls within a certain iris feature range, it indicates that the user is in the emotional state corresponding to that range, and that emotional state can be obtained.
  • Thus, when step S303 is executed, recording is performed under the emotional state determined in this step.
  • For the specific implementation, refer to step S102.
  • Step S303 recording usage information of the user to the target application in the emotional state.
  • the terminal needs to record usage information of the user on the target application in different emotional states, and the usage information includes usage duration, usage times, or operation times.
  • In a specific implementation, after steps S301 and S302 periodically acquire and determine the user's current emotional state at the preset time interval, the terminal increases both the total usage duration of the applications corresponding to that emotional state and the usage duration of the target application corresponding to that emotional state by the length of one time interval, or increments by one the number of uses or the number of operations of the target application corresponding to that emotional state.
  • In the embodiment of the present invention, the user's iris feature information is acquired while the target application is running, the user's current emotional state is determined according to the acquired iris feature information, and the usage information of the target application in the user's current emotional state is recorded. This records the user's program usage under different emotions, so that the terminal can, according to the program usage record, intelligently place the application icons commonly used by the user in a given emotional state at the front and reduce the time the user spends looking for application icons.
  • FIG. 4 shows a program icon sorting apparatus, which may be a user terminal or a software program running on a user terminal; the user terminal may include a mobile phone, a notebook computer, a tablet computer, an on-board computer, a POS (Point of Sales) machine, and the like.
  • the device as shown in the figure includes at least:
  • the iris acquisition module 410 is configured to acquire iris feature information of the user.
  • the iris acquisition module 410 is specifically configured to acquire an eye image of the user through a terminal camera or other imaging device when detecting that the user enters or stays on the system desktop.
  • Entering the system desktop includes entering the system desktop at power-on, when waking from standby, when a program moves to run in the background, or when a program exits.
  • the iris acquisition module 410 may prompt the user to align the eye to the camera for eye image acquisition. If the collected eye image is complete and the detail information is clear, the iris feature information is collected; If the image of the eye is incomplete or the detail information is not clear enough, the iris acquisition module 410 needs to acquire the eye image of the user again until the eye image of the user is complete and the detail information is clear.
  • After obtaining a clear eye image, the iris acquisition module 410 needs to preprocess it. First, the positions of the boundary between the iris and the pupil, the boundary between the iris and the sclera, the boundary between the iris and the upper eyelid, and the boundary between the iris and the lower eyelid must be detected in the eye image to determine the positions of the iris and the pupil. The iris acquisition module 410 can then extract the user's iris feature information with reference to the positions of the iris and the pupil, the iris feature information including a pupil feature parameter and an iris feature parameter. Finally, the iris feature information is normalized, that is, adjusted to the fixed size preset by the iris recognition system, to ensure accurate recognition; the normalized iris feature information may also be enhanced, for example by adjusting brightness, contrast and smoothness, to improve the recognition rate.
  • the iris acquisition module 410 is further configured to collect iris feature information of the user according to a preset time interval.
  • Specifically, a person's emotional state can be reflected in the iris feature information, but each person's emotional states, and the iris feature ranges corresponding to different emotional states, differ. Therefore, before the user's current emotional state is determined, the iris acquisition module 410 needs to collect the user's iris feature information at a preset time interval; for example, the user's iris feature information can be collected every five minutes.
  • the iris acquisition module 410 is further configured to acquire iris feature information of the user during the running of the target application.
  • Specifically, while the target application is running, the iris acquisition module 410 may acquire the user's eye image periodically, at a preset time interval, through the terminal camera or another imaging device, or may acquire the user's eye image when it detects that the user has entered the target application. After a clear eye image is obtained, the positions of the boundary between the iris and the pupil, the boundary between the iris and the sclera, the boundary between the iris and the upper eyelid, and the boundary between the iris and the lower eyelid must be detected in the eye image to determine the positions of the iris and the pupil; the iris acquisition module 410 can then extract the user's iris feature information with reference to the positions of the iris and the pupil, the iris feature information including a pupil feature parameter and an iris feature parameter.
  • the state determining module 420 is configured to determine the current emotional state of the user according to the acquired iris feature information.
  • Specifically, before the state determining module 420 determines the user's current emotional state, the terminal has already set different iris feature ranges and an emotional state corresponding to each iris feature range. In this way, after the user's iris feature information is acquired, it can be compared with each of the iris feature ranges; if the comparison shows that the user's iris feature information falls within a certain iris feature range, it indicates that the user is in the emotional state corresponding to that range, and that emotional state can then be obtained.
  • In a specific implementation, eye feature parameters of the user, such as a pupil feature parameter, an iris feature parameter and an eye-socket feature parameter, can be extracted from the eye image of the user acquired by the iris acquisition module 410; each of these parameters may be a diameter, a radius or an area. The terminal can therefore determine the user's current emotional state by one of several calculations, such as the ratio of the pupil feature parameter to the iris feature parameter or the ratio of the pupil feature parameter to the eye-socket feature parameter.
  • the state determining module 420 may determine the current emotional state of the user according to a ratio of the pupil feature parameter to the iris feature parameter in the iris feature information.
  • For example, the four preset iris feature ranges and their corresponding emotional states are: 0 to 0.03 corresponding to emotional state one, 0.03 to 0.06 to emotional state two, 0.06 to 0.09 to emotional state three, and 0.09 to 0.12 to emotional state four. If the ratio of the user's pupil feature parameter to the iris feature parameter acquired in step S101 is 0.1, then 0.1 is compared with the four set iris feature ranges and found to belong to emotional state four, so the user's current emotional state is determined to be emotional state four.
  • the record obtaining module 430 is configured to acquire a program usage record corresponding to the emotional state, where the program usage record includes usage information of the user for each application in the emotional state.
  • Specifically, the program usage record is the usage information, recorded by the terminal while each application is used, about the user's use of each application in different emotional states; the usage information includes the usage duration, the number of uses, or the number of operations. Therefore, the program usage record corresponding to a certain emotional state acquired by the record obtaining module 430 includes the usage duration, number of uses, or number of operations of each application on the terminal by the user in that emotional state.
  • For example, the program usage record of an emotional state may be the usage durations of various applications, such as WeChat 16 hours, video player 8 hours, XX game 12 hours, browser 10 hours and reading software 0 hours, or may be the number of uses or operations of various applications, such as WeChat 60 times, video player 10 times, XX game 25 times, browser 34 times and reading software 0 times.
  • Further, the program usage record is not fixed once recorded within a preset time; it is continuously updated, and the duration, number of uses or number of operations of each application is recorded promptly every time the user uses it.
  • the sorting module 440 is configured to sort and display the display icons of the application according to the program usage record corresponding to the emotional state.
  • Specifically, the sorting module 440 may sort and display the display icons of the applications according to the usage information of each application in that emotional state acquired by the record obtaining module 430. Since the usage information in the record obtaining module 430 includes the usage duration, the number of uses or the number of operations, the sorting module 440 is specifically configured to sort the display icons of the applications in descending order of the duration, number of uses or number of operations of the respective applications by the user in that emotional state.
  • Continuing the example of the record obtaining module 430, if sorting is performed by usage duration, the order of the applications is: WeChat 16 hours, XX game 12 hours, browser 10 hours, video player 8 hours, reading software 0 hours, so the system desktop displays the WeChat icon first, followed by the XX game, the browser and the video player, and finally the reading software. If sorting is performed by the number of uses or operations, the order is: WeChat 60 times, browser 34 times, XX game 25 times, video player 10 times, reading software 0 times, so the application icons on the system desktop are, in order, WeChat, browser, XX game, video player and reading software.
  • the sorting module 440 includes a ratio determining unit 441 and a proportion sorting unit 442, as shown in FIG. 5.
  • The ratio determining unit 441 is configured to determine the usage-duration share of each application in that emotional state according to the total usage duration of all applications in the program usage record corresponding to the emotional state and the usage duration of each application.
  • Specifically, the ratio determining unit 441 may calculate the total usage duration of all applications in the emotional state as the sum of the usage durations of the respective applications, and then obtain the share of each application's usage duration as the ratio of that application's usage duration to the total usage duration of all applications in that emotional state.
  • For example, if the usage durations recorded for an emotional state are 16 hours for WeChat, 8 hours for the video player, 12 hours for the XX game, 10 hours for the browser, 0 hours for the reading software and 0 hours for all other applications, the total usage duration of all applications in that emotional state is the 36-hour sum of the individual durations, and the usage-duration shares are: WeChat 4/9, video player 2/9, XX game 1/3, browser 5/18, reading software 0, and 0 for all other applications.
  • the proportion sorting unit 442 is configured to sort and display the display icons of the respective applications according to the usage duration ratio of the respective applications.
  • Specifically, the proportion sorting unit 442 may sort the display icons of the applications in descending order of the usage-duration shares obtained by the ratio determining unit 441.
  • the device further includes:
  • the usage record module 450 is configured to record usage information of the user on the target application in the emotional state.
  • the usage record module 450 needs to record usage information of the user on the target application in different emotional states, and the usage information includes usage duration, usage times, or operation times. .
  • In a specific implementation, after the iris acquisition module 410 and the state determining module 420 periodically acquire and determine the user's current emotional state at the preset time interval, the usage record module 450 increases both the total usage duration of the applications corresponding to that emotional state and the usage duration of the target application corresponding to that emotional state by the length of one time interval, or increments by one the number of uses or the number of operations of the target application corresponding to that emotional state.
  • the device further includes:
  • the range dividing module 460 is configured to divide the iris feature information collected in a predetermined time period into a preset number of iris feature ranges.
  • Specifically, based on the iris feature information collected by the iris acquisition module 410, a time period may be preset, and the range dividing module 460 may divide the iris feature information collected within that predetermined time period into a preset number of iris feature ranges. For example, the user's iris feature information is collected every five minutes and, within one week, the collected iris feature information spans a total range of 0 to 0.12; if the preset number of iris feature ranges is 4, the iris feature information can be divided into four iris feature ranges: 0 to 0.03, 0.03 to 0.06, 0.06 to 0.09, and 0.09 to 0.12.
  • the state setting module 470 is configured to set an emotional state corresponding to each of the iris feature ranges.
  • Specifically, a corresponding emotional state needs to be set for each iris feature range divided by the range dividing module 460, so that the corresponding program usage record can be obtained according to the emotional state.
  • The state determining module 420 also determines the user's current emotional state according to the iris feature ranges and their corresponding emotional states.
  • Following the example of the range dividing module 460, the terminal may set the emotional states corresponding to the four iris feature ranges as: 0 to 0.03 corresponding to emotional state one, 0.03 to 0.06 to emotional state two, 0.06 to 0.09 to emotional state three, and 0.09 to 0.12 to emotional state four.
  • The embodiment of the present invention acquires the iris feature information of the user, determines the current emotional state of the user according to the acquired iris feature information, and acquires a program usage record corresponding to that emotional state, where the program usage record includes the user's usage information for each application in that emotional state; the display icons of the applications are then sorted and displayed according to the program usage record corresponding to the emotional state. In this way, according to the number of times or the duration for which the user uses each application in different emotional states, the application icons commonly used by the user in a given emotional state are intelligently placed at the front, reducing the time the user spends looking for application icons.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A program icon sorting method, the method comprising: acquiring iris feature information of a user (S101); determining the current emotional state of the user according to the acquired iris feature information (S102); acquiring a program usage record corresponding to that emotional state, the program usage record comprising the user's usage information for each application in that emotional state (S103); and sorting and displaying the display icons of the applications according to the program usage record corresponding to the emotional state (S104). A program icon sorting device is also disclosed. According to the number of times or the duration for which the user uses each application in different emotional states, the application icons commonly used by the user in a given emotional state can be intelligently placed at the front, preventing the user from spending a long time looking for the application icon to be used.

Description

Program icon sorting method and device
This application claims priority to Chinese patent application No. 201510354906.8, entitled "Program icon sorting method and device", filed with the Chinese Patent Office on June 24, 2015, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of electronic technologies, and in particular to a program icon sorting method and device.
Background
With the development of electronic technologies, electronic terminals are used ever more widely, and a large number of applications of all kinds have appeared to let people use electronic terminals for work, daily life and entertainment. At present, the application icons of an electronic terminal are arranged in order of installation time: the icon of an application installed earlier is placed in an earlier position and page, while the icon of an application installed later is placed in a later position and page. However, people now install a great many applications, and the applications they commonly use usually differ between emotional states, so it is difficult to quickly find the desired application icon among so many icons. The existing way of arranging application icons therefore costs the user a great deal of time in looking for an application's icon, degrading the user experience.
Summary of the Invention
The technical problem to be solved by the embodiments of the present invention is to provide a program icon sorting method and device that can intelligently place the user's commonly used application icons at the front according to the number of times or the duration for which the user uses each application in different emotional states, so that the user does not spend a long time looking for the application icon to be used.
To solve the above technical problem, an embodiment of the present invention provides a program icon sorting method, the method comprising:
acquiring iris feature information of a user;
determining the current emotional state of the user according to the acquired iris feature information;
acquiring a program usage record corresponding to that emotional state, the program usage record comprising the user's usage information for each application in that emotional state;
sorting and displaying the display icons of the applications according to the program usage record corresponding to the emotional state.
Correspondingly, an embodiment of the present invention provides a program icon sorting device, the device comprising:
an iris acquisition module, configured to acquire iris feature information of a user;
a state determining module, configured to determine the current emotional state of the user according to the acquired iris feature information;
a record obtaining module, configured to acquire a program usage record corresponding to that emotional state, the program usage record comprising the user's usage information for each application in that emotional state;
a sorting module, configured to sort and display the display icons of the applications according to the program usage record corresponding to the emotional state.
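A minimal sketch of how these four modules could cooperate, with illustrative class and method names; the embodiment itself does not prescribe any code.

```python
class ProgramIconSorter:
    """Wires the four modules of the device summary into one entry point."""

    def __init__(self, iris_acquirer, state_determiner, record_store, icon_view):
        self.iris_acquirer = iris_acquirer        # iris acquisition module
        self.state_determiner = state_determiner  # state determining module
        self.record_store = record_store          # record obtaining module
        self.icon_view = icon_view                # sorting (and display) module

    def refresh_desktop(self):
        features = self.iris_acquirer.acquire()             # iris feature info
        state = self.state_determiner.determine(features)   # current emotional state
        usage = self.record_store.usage_for_state(state)    # {app: duration}
        ordered = sorted(usage, key=usage.get, reverse=True)
        self.icon_view.display(ordered)                     # most-used icons first
```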
In the embodiments of the present invention, the user's iris feature information is acquired, the user's current emotional state is determined according to the acquired iris feature information, and a program usage record corresponding to that emotional state is acquired, the program usage record comprising the user's usage information for each application in that emotional state; the display icons of the applications are then sorted and displayed according to the program usage record corresponding to the emotional state. In this way, according to the number of times or the duration for which the user uses each application in different emotional states, the application icons commonly used by the user in a given emotional state are intelligently placed at the front, reducing the time the user spends looking for application icons.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Clearly, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art may obtain other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of a program icon sorting method in an embodiment of the present invention;
FIG. 2 is a schematic flowchart of an emotional state setting method in an embodiment of the present invention;
FIG. 3 is a schematic flowchart of a program usage recording method in an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a program icon sorting device in an embodiment of the present invention;
FIG. 5 is a structural diagram of the sorting module of FIG. 4 in an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will now be described clearly and completely with reference to the accompanying drawings. Clearly, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
FIG. 1 is a schematic flowchart of a program icon sorting method in an embodiment of the present invention. The method may be carried out by a program icon sorting device, which may be a user terminal or a software program running on a user terminal; the user terminal may include a mobile phone, a notebook computer, a tablet computer, an on-board computer, a POS (Point of Sales) machine, and the like. As shown in the figure, the method comprises at least:
Step S101: acquiring iris feature information of a user.
Specifically, when it is detected that the user enters or stays on the system desktop, the user's eye image may be acquired by the terminal camera or another imaging device. Entering the system desktop includes entering the system desktop at power-on, when waking from standby, when a program moves to run in the background, or when a program exits. In a specific implementation, the terminal may prompt the user to aim the eye at the camera for eye image capture; if the captured eye image is complete and its details are clear, the iris feature information is collected; if the captured eye image is incomplete or its details are not clear enough, the terminal needs to capture the user's eye image again until the eye image is complete and its details are clear. After a clear eye image is obtained, the terminal needs to preprocess it. First, the positions of the boundary between the iris and the pupil, the boundary between the iris and the sclera, the boundary between the iris and the upper eyelid, and the boundary between the iris and the lower eyelid must be detected in the eye image to determine the positions of the iris and the pupil; the terminal can then extract the user's iris feature information with reference to the positions of the iris and the pupil, the iris feature information including a pupil feature parameter and an iris feature parameter. Finally, the iris feature information is normalized, that is, adjusted to the fixed size preset by the iris recognition system, to ensure accurate recognition of the iris feature information; the normalized iris feature information may also be enhanced, for example by adjusting brightness, contrast and smoothness, to improve the recognition rate.
Step S102: determining the current emotional state of the user according to the acquired iris feature information.
Specifically, before the user's current emotional state is determined, the terminal has already set different iris feature ranges and an emotional state corresponding to each iris feature range. After the user's iris feature information is acquired, it can be compared with each iris feature range; if the comparison shows that the user's iris feature information falls within a certain iris feature range, it indicates that the user is in the emotional state corresponding to that range, and that emotional state can then be obtained. In a specific implementation, eye feature parameters of the user, such as a pupil feature parameter, an iris feature parameter and an eye-socket feature parameter, can be extracted from the user's eye image acquired in step S101; each of these parameters may be a diameter, a radius or an area. The terminal can therefore determine the user's current emotional state by one of several calculations, such as the ratio of the pupil feature parameter to the iris feature parameter or the ratio of the pupil feature parameter to the eye-socket feature parameter.
In an optional embodiment, the user's current emotional state may be determined according to the ratio of the pupil feature parameter to the iris feature parameter in the iris feature information. For example, the four preset iris feature ranges and their corresponding emotional states are: 0 to 0.03 corresponding to emotional state one, 0.03 to 0.06 to emotional state two, 0.06 to 0.09 to emotional state three, and 0.09 to 0.12 to emotional state four. If the ratio of the user's pupil feature parameter to the iris feature parameter acquired in step S101 is 0.1, then 0.1 is compared with the four set iris feature ranges and found to belong to emotional state four, so the user's current emotional state is determined to be emotional state four.
Step S103: acquiring a program usage record corresponding to that emotional state, the program usage record comprising the user's usage information for each application in that emotional state.
Specifically, the program usage record is the usage information, recorded by the terminal while each application is used, about the user's use of each application in different emotional states; the usage information includes the usage duration, the number of uses, or the number of operations. The program usage record corresponding to a certain emotional state therefore includes the usage duration, number of uses, or number of operations of each application on the terminal by the user in that emotional state. For example, the program usage record of emotional state four may be the usage durations of various applications, such as WeChat 16 hours, video player 8 hours, XX game 12 hours, browser 10 hours and reading software 0 hours, or may be the number of uses or operations of various applications, such as WeChat 60 times, video player 10 times, XX game 25 times, browser 34 times and reading software 0 times. Further, the program usage record is not fixed once recorded within a preset time; it is continuously updated, and the duration, number of uses or number of operations of each application is recorded promptly every time the user uses it.
Step S104: sorting and displaying the display icons of the applications according to the program usage record corresponding to the emotional state.
Specifically, the terminal may sort and display the display icons of the applications according to the usage information of each application in the emotional state acquired in step S103. Since the usage information includes the usage duration, the number of uses or the number of operations, the display icons of the applications can be sorted in descending order of the duration, number of uses or number of operations of the respective applications by the user in that emotional state. Continuing the example of step S103, if sorting is performed by usage duration, the order of the applications is: WeChat 16 hours, XX game 12 hours, browser 10 hours, video player 8 hours, reading software 0 hours, so the system desktop displays the WeChat icon first, followed by the XX game, the browser and the video player, and finally the reading software. If sorting is performed by the number of uses or operations, the order is: WeChat 60 times, browser 34 times, XX game 25 times, video player 10 times, reading software 0 times, so the application icons on the system desktop are, in order, WeChat, browser, XX game, video player and reading software.
Optionally, the total usage duration of all applications in the emotional state may be calculated as the sum of the usage durations of the respective applications; the share of each application's usage duration is then obtained as the ratio of that application's usage duration to the total usage duration of all applications in that emotional state, and the icons of the respective applications are arranged according to the size of these shares.
Specifically, the total usage duration of all applications in the emotional state is calculated as the sum of the usage durations of the respective applications, and the share of each application's usage duration is obtained as the ratio of that application's usage duration to the total. For example, if the usage durations recorded for an emotional state are 16 hours for WeChat, 8 hours for the video player, 12 hours for the XX game, 10 hours for the browser, 0 hours for the reading software and 0 hours for all other applications, the total usage duration of all applications in that emotional state is the 36-hour sum of the individual durations, and the usage-duration shares are: WeChat 4/9, video player 2/9, XX game 1/3, browser 5/18, reading software 0, and 0 for all other applications. The display icons of the applications are then sorted in descending order of these shares.
In the embodiment of the present invention, the user's iris feature information is acquired, the user's current emotional state is determined according to the acquired iris feature information, and a program usage record corresponding to that emotional state is acquired, the program usage record comprising the user's usage information for each application in that emotional state; the display icons of the applications are then sorted and displayed according to the program usage record corresponding to the emotional state. In this way, according to the number of times or the duration for which the user uses each application in different emotional states, the application icons commonly used by the user in a given emotional state are intelligently placed at the front, reducing the time the user spends looking for application icons.
FIG. 2 is a schematic flowchart of an emotional state setting method in an embodiment of the present invention. As shown in the figure, the method comprises:
Step S201: collecting the user's iris feature information at a preset time interval.
Specifically, a person's emotional state can be reflected in the iris feature information, but each person's emotional states, and the iris feature ranges corresponding to different emotional states, differ. Therefore, before the user's current emotional state is determined, the terminal needs to collect the user's iris feature information at a preset time interval; for example, the user's iris feature information may be collected every five minutes.
Step S202: dividing the iris feature information collected within a predetermined time period into a preset number of iris feature ranges.
Specifically, based on the iris feature information collected in step S201, a time period may be preset, and the iris feature information collected within that predetermined time period is divided into a preset number of iris feature ranges. For example, the user's iris feature information is collected every five minutes and, within one week, the collected iris feature information spans a total range of 0 to 0.12; if the preset number of iris feature ranges is 4, the iris feature information can be divided into four iris feature ranges: 0 to 0.03, 0.03 to 0.06, 0.06 to 0.09, and 0.09 to 0.12.
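A minimal sketch of the range division in step S202, assuming the ranges are of equal width; the embodiment only specifies dividing the collected information into a preset number of ranges, so the helper below is illustrative.

```python
def divide_feature_ranges(samples, n_ranges=4):
    """Split the span of collected ratio samples into n equal-width ranges."""
    low, high = min(samples), max(samples)
    width = (high - low) / n_ranges
    return [(low + i * width, low + (i + 1) * width) for i in range(n_ranges)]

# Samples spanning 0 to 0.12 with n_ranges=4 reproduce the four example
# ranges 0-0.03, 0.03-0.06, 0.06-0.09 and 0.09-0.12.
```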
Step S203: setting the emotional state corresponding to each iris feature range.
Specifically, a corresponding emotional state needs to be set for each iris feature range divided in step S202, so that the corresponding program usage record can be obtained according to the emotional state. Step S102 also determines the user's current emotional state according to the iris feature ranges and their corresponding emotional states. Following the example of step S202, the terminal may set the emotional states corresponding to the four iris feature ranges as: 0 to 0.03 corresponding to emotional state one, 0.03 to 0.06 to emotional state two, 0.06 to 0.09 to emotional state three, and 0.09 to 0.12 to emotional state four.
In the embodiment of the present invention, the user's iris feature information is collected at a preset time interval, the iris feature information collected within a predetermined time period is divided into a preset number of iris feature ranges, and the emotional state corresponding to each iris feature range is set. This implements the setting of emotional states, so that the terminal can identify the user's different emotional states, intelligently place the application icons commonly used by the user in a given emotional state at the front, and reduce the time the user spends looking for application icons.
FIG. 3 is a schematic flowchart of a program usage recording method in an embodiment of the present invention. As shown in the figure, the method comprises:
Step S301: acquiring the user's iris feature information while a target application is running.
Specifically, while the target application is running, the terminal may acquire the user's eye image periodically, at a preset time interval, through the terminal camera or another imaging device, or may acquire the user's eye image once when it detects that the user has entered the target application. After a clear eye image is obtained, the positions of the boundary between the iris and the pupil, the boundary between the iris and the sclera, the boundary between the iris and the upper eyelid, and the boundary between the iris and the lower eyelid must be detected in the eye image to determine the positions of the iris and the pupil; the terminal can then extract the user's iris feature information with reference to the positions of the iris and the pupil, the iris feature information including a pupil feature parameter and an iris feature parameter.
Step S302: determining the current emotional state of the user according to the acquired iris feature information.
Specifically, the iris feature information acquired in step S301 may be compared with the iris feature ranges, and their corresponding emotional states, already set in step S203; if the comparison shows that the user's iris feature information falls within a certain iris feature range, it indicates that the user is in the emotional state corresponding to that range, and that emotional state can be obtained. Thus, when step S303 is executed, recording is performed under the emotional state determined in this step. For the specific implementation, refer to step S102.
Step S303: recording the user's usage information for the target application in that emotional state.
Specifically, while the target application is used, the terminal needs to record the user's usage information for the target application in the different emotional states; the usage information includes the usage duration, the number of uses, or the number of operations. In a specific implementation, after steps S301 and S302 periodically acquire and determine the user's current emotional state at the preset time interval, the terminal increases both the total usage duration of the applications corresponding to that emotional state and the usage duration of the target application corresponding to that emotional state by the length of one time interval, or increments by one the number of uses or the number of operations of the target application corresponding to that emotional state.
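A minimal sketch of step S303, assuming a fixed five-minute sampling interval and the nested record shape used earlier in this text; both are assumptions rather than requirements of the embodiment.

```python
SAMPLE_INTERVAL_H = 5 / 60.0   # five minutes, expressed in hours (assumed)

def on_sampling_tick(usage_record, state, target_app):
    """Each tick adds one time interval to the target app's duration in this state."""
    entry = usage_record.setdefault(state, {}).setdefault(
        target_app, {"duration_h": 0.0, "uses": 0})
    entry["duration_h"] += SAMPLE_INTERVAL_H

def on_app_entered(usage_record, state, target_app):
    """Alternatively, count uses (or operations) instead of durations."""
    entry = usage_record.setdefault(state, {}).setdefault(
        target_app, {"duration_h": 0.0, "uses": 0})
    entry["uses"] += 1
```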
In the embodiment of the present invention, the user's iris feature information is acquired while the target application is running, the user's current emotional state is determined according to the acquired iris feature information, and the usage information of the target application in the user's current emotional state is recorded. This records the user's program usage under different emotions, so that the terminal can, according to the program usage record, intelligently place the application icons commonly used by the user in a given emotional state at the front and reduce the time the user spends looking for application icons.
FIG. 4 is a schematic structural diagram of a program icon sorting device in an embodiment of the present invention. The program icon sorting device may be a user terminal or a software program running on a user terminal; the user terminal may include a mobile phone, a notebook computer, a tablet computer, an on-board computer, a POS (Point of Sales) machine, and the like. As shown in the figure, the device comprises at least:
an iris acquisition module 410, configured to acquire iris feature information of a user.
Specifically, the iris acquisition module 410 is configured to acquire the user's eye image through the terminal camera or another imaging device when it is detected that the user enters or stays on the system desktop. Entering the system desktop includes entering the system desktop at power-on, when waking from standby, when a program moves to run in the background, or when a program exits. In a specific implementation, the iris acquisition module 410 may prompt the user to aim the eye at the camera for eye image capture; if the captured eye image is complete and its details are clear, the iris feature information is collected; if the captured eye image is incomplete or its details are not clear enough, the iris acquisition module 410 needs to capture the user's eye image again until the eye image is complete and its details are clear. After a clear eye image is obtained, the iris acquisition module 410 needs to preprocess it. First, the positions of the boundary between the iris and the pupil, the boundary between the iris and the sclera, the boundary between the iris and the upper eyelid, and the boundary between the iris and the lower eyelid must be detected in the eye image to determine the positions of the iris and the pupil; the iris acquisition module 410 can then extract the user's iris feature information with reference to the positions of the iris and the pupil, the iris feature information including a pupil feature parameter and an iris feature parameter. Finally, the iris feature information is normalized, that is, adjusted to the fixed size preset by the iris recognition system, to ensure accurate recognition of the iris feature information; the normalized iris feature information may also be enhanced, for example by adjusting brightness, contrast and smoothness, to improve the recognition rate.
Further, the iris acquisition module 410 is also configured to collect the user's iris feature information at a preset time interval.
Specifically, a person's emotional state can be reflected in the iris feature information, but each person's emotional states, and the iris feature ranges corresponding to different emotional states, differ. Therefore, before the user's current emotional state is determined, the iris acquisition module 410 needs to collect the user's iris feature information at a preset time interval; for example, the user's iris feature information may be collected every five minutes.
Further, the iris acquisition module 410 is also configured to acquire the user's iris feature information while a target application is running.
Specifically, while the target application is running, the iris acquisition module 410 may acquire the user's eye image periodically, at a preset time interval, through the terminal camera or another imaging device, or may acquire the user's eye image when it detects that the user has entered the target application. After a clear eye image is obtained, the positions of the boundary between the iris and the pupil, the boundary between the iris and the sclera, the boundary between the iris and the upper eyelid, and the boundary between the iris and the lower eyelid must be detected in the eye image to determine the positions of the iris and the pupil; the iris acquisition module 410 can then extract the user's iris feature information with reference to the positions of the iris and the pupil, the iris feature information including a pupil feature parameter and an iris feature parameter.
a state determining module 420, configured to determine the current emotional state of the user according to the acquired iris feature information.
Specifically, before the state determining module 420 determines the user's current emotional state, the terminal has already set different iris feature ranges and an emotional state corresponding to each iris feature range. In this way, after the user's iris feature information is acquired, it can be compared with each iris feature range; if the comparison shows that the user's iris feature information falls within a certain iris feature range, it indicates that the user is in the emotional state corresponding to that range, and that emotional state can then be obtained. In a specific implementation, eye feature parameters of the user, such as a pupil feature parameter, an iris feature parameter and an eye-socket feature parameter, can be extracted from the user's eye image acquired by the iris acquisition module 410; each of these parameters may be a diameter, a radius or an area. The terminal can therefore determine the user's current emotional state by one of several calculations, such as the ratio of the pupil feature parameter to the iris feature parameter or the ratio of the pupil feature parameter to the eye-socket feature parameter.
In an optional embodiment, the state determining module 420 may determine the user's current emotional state according to the ratio of the pupil feature parameter to the iris feature parameter in the iris feature information. For example, the four preset iris feature ranges and their corresponding emotional states are: 0 to 0.03 corresponding to emotional state one, 0.03 to 0.06 to emotional state two, 0.06 to 0.09 to emotional state three, and 0.09 to 0.12 to emotional state four. If the ratio of the user's pupil feature parameter to the iris feature parameter acquired in step S101 is 0.1, then 0.1 is compared with the four set iris feature ranges and found to belong to emotional state four, so the user's current emotional state is determined to be emotional state four.
a record obtaining module 430, configured to acquire a program usage record corresponding to that emotional state, the program usage record comprising the user's usage information for each application in that emotional state.
Specifically, the program usage record is the usage information, recorded by the terminal while each application is used, about the user's use of each application in different emotional states; the usage information includes the usage duration, the number of uses, or the number of operations. The program usage record corresponding to a certain emotional state acquired by the record obtaining module 430 therefore includes the usage duration, number of uses, or number of operations of each application on the terminal by the user in that emotional state. For example, the program usage record of an emotional state may be the usage durations of various applications, such as WeChat 16 hours, video player 8 hours, XX game 12 hours, browser 10 hours and reading software 0 hours, or may be the number of uses or operations of various applications, such as WeChat 60 times, video player 10 times, XX game 25 times, browser 34 times and reading software 0 times. Further, the program usage record is not fixed once recorded within a preset time; it is continuously updated, and the duration, number of uses or number of operations of each application is recorded promptly every time the user uses it.
a sorting module 440, configured to sort and display the display icons of the applications according to the program usage record corresponding to the emotional state.
Specifically, the sorting module 440 may sort and display the display icons of the applications according to the usage information of each application in the emotional state acquired by the record obtaining module 430. Since the usage information in the record obtaining module 430 includes the usage duration, the number of uses or the number of operations, the sorting module 440 is specifically configured to sort the display icons of the applications in descending order of the duration, number of uses or number of operations of the respective applications by the user in that emotional state. Continuing the example of the record obtaining module 430, if sorting is performed by usage duration, the order of the applications is: WeChat 16 hours, XX game 12 hours, browser 10 hours, video player 8 hours, reading software 0 hours, so the system desktop displays the WeChat icon first, followed by the XX game, the browser and the video player, and finally the reading software. If sorting is performed by the number of uses or operations, the order is: WeChat 60 times, browser 34 times, XX game 25 times, video player 10 times, reading software 0 times, so the application icons on the system desktop are, in order, WeChat, browser, XX game, video player and reading software.
Optionally, the sorting module 440 comprises a ratio determining unit 441 and a proportion sorting unit 442, as shown in FIG. 5.
The ratio determining unit 441 is configured to determine the usage-duration share of each application in that emotional state according to the total usage duration of all applications in the program usage record corresponding to the emotional state and the usage duration of each application.
Specifically, the ratio determining unit 441 may calculate the total usage duration of all applications in the emotional state as the sum of the usage durations of the respective applications, and then obtain the share of each application's usage duration as the ratio of that application's usage duration to the total usage duration of all applications in that emotional state. For example, if the usage durations recorded for an emotional state are 16 hours for WeChat, 8 hours for the video player, 12 hours for the XX game, 10 hours for the browser, 0 hours for the reading software and 0 hours for all other applications, the total usage duration of all applications in that emotional state is the 36-hour sum of the individual durations, and the usage-duration shares are: WeChat 4/9, video player 2/9, XX game 1/3, browser 5/18, reading software 0, and 0 for all other applications.
The proportion sorting unit 442 is configured to sort and display the display icons of the respective applications according to their usage-duration shares.
Specifically, the proportion sorting unit 442 may sort the display icons of the applications in descending order of the usage-duration shares obtained by the ratio determining unit 441.
Further, the device also comprises:
a usage record module 450, configured to record the user's usage information for the target application in that emotional state.
Specifically, while the target application is used, the usage record module 450 needs to record the user's usage information for the target application in the different emotional states; the usage information includes the usage duration, the number of uses, or the number of operations. In a specific implementation, after the iris acquisition module 410 and the state determining module 420 periodically acquire and determine the user's current emotional state at the preset time interval, the usage record module 450 increases both the total usage duration of the applications corresponding to that emotional state and the usage duration of the target application corresponding to that emotional state by the length of one time interval, or increments by one the number of uses or the number of operations of the target application corresponding to that emotional state.
Further, the device also comprises:
a range dividing module 460, configured to divide the iris feature information collected within a predetermined time period into a preset number of iris feature ranges.
Specifically, based on the iris feature information collected by the iris acquisition module 410, a time period may be preset, and the range dividing module 460 may divide the iris feature information collected within that predetermined time period into a preset number of iris feature ranges. For example, the user's iris feature information is collected every five minutes and, within one week, the collected iris feature information spans a total range of 0 to 0.12; if the preset number of iris feature ranges is 4, the iris feature information can be divided into four iris feature ranges: 0 to 0.03, 0.03 to 0.06, 0.06 to 0.09, and 0.09 to 0.12.
a state setting module 470, configured to set the emotional state corresponding to each iris feature range.
Specifically, a corresponding emotional state needs to be set for each iris feature range divided by the range dividing module 460, so that the corresponding program usage record can be obtained according to the emotional state. The state determining module 420 also determines the user's current emotional state according to the iris feature ranges and their corresponding emotional states. Following the example of the range dividing module 460, the terminal may set the emotional states corresponding to the four iris feature ranges as: 0 to 0.03 corresponding to emotional state one, 0.03 to 0.06 to emotional state two, 0.06 to 0.09 to emotional state three, and 0.09 to 0.12 to emotional state four.
In the embodiment of the present invention, the user's iris feature information is acquired, the user's current emotional state is determined according to the acquired iris feature information, and a program usage record corresponding to that emotional state is acquired, the program usage record comprising the user's usage information for each application in that emotional state; the display icons of the applications are then sorted and displayed according to the program usage record corresponding to the emotional state. In this way, according to the number of times or the duration for which the user uses each application in different emotional states, the application icons commonly used by the user in a given emotional state are intelligently placed at the front, reducing the time the user spends looking for application icons.
A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.

Claims (16)

  1. A program icon sorting method, characterized in that the method comprises:
    acquiring iris feature information of a user;
    determining the current emotional state of the user according to the acquired iris feature information;
    acquiring a program usage record corresponding to that emotional state, the program usage record comprising the user's usage information for each application in that emotional state;
    sorting and displaying the display icons of the applications according to the program usage record corresponding to the emotional state.
  2. The program icon sorting method according to claim 1, characterized in that the iris feature information comprises a pupil feature parameter and an iris feature parameter;
    the determining the current emotional state of the user according to the acquired iris feature information comprises:
    determining the current emotional state of the user according to the ratio of the pupil feature parameter to the iris feature parameter.
  3. The program icon sorting method according to claim 1, characterized in that, before the acquiring iris feature information of a user, the method further comprises:
    acquiring the iris feature information of the user while a target application is running;
    determining the current emotional state of the user according to the acquired iris feature information;
    recording the user's usage information for the target application in that emotional state.
  4. The program icon sorting method according to claim 1, characterized in that the usage information comprises a usage duration, a number of uses, or a number of operations.
  5. The program icon sorting method according to claim 4, characterized in that the sorting and displaying the display icons of the applications according to the program usage record corresponding to the emotional state comprises:
    sorting and displaying the display icons of the respective applications in descending order of the usage duration, number of uses, or number of operations of each application by the user in that emotional state.
  6. The program icon sorting method according to claim 4, characterized in that the acquiring a program usage record corresponding to that emotional state comprises:
    acquiring the total usage duration of all applications in the program usage record corresponding to that emotional state, and the usage duration of each application;
    and the sorting and displaying the display icons of the applications according to the program usage record corresponding to the emotional state comprises:
    determining the usage-duration share of each application in that emotional state according to the total usage duration of all applications in the program usage record corresponding to the emotional state and the usage duration of each application;
    sorting and displaying the display icons of the respective applications according to their usage-duration shares.
  7. The program icon sorting method according to claim 1, characterized in that, before the acquiring iris feature information of a user, the method further comprises:
    collecting the iris feature information of the user at a preset time interval;
    dividing the iris feature information collected within a predetermined time period into a preset number of iris feature ranges;
    setting the emotional state corresponding to each iris feature range.
  8. The program icon sorting method according to claim 1, characterized in that the acquiring iris feature information of a user comprises:
    acquiring the iris feature information of the user when it is detected that the user enters the system desktop.
  9. A program icon sorting device, characterized in that the device comprises:
    an iris acquisition module, configured to acquire iris feature information of a user;
    a state determining module, configured to determine the current emotional state of the user according to the acquired iris feature information;
    a record obtaining module, configured to acquire a program usage record corresponding to that emotional state, the program usage record comprising the user's usage information for each application in that emotional state;
    a sorting module, configured to sort and display the display icons of the applications according to the program usage record corresponding to the emotional state.
  10. The program icon sorting device according to claim 9, characterized in that the iris feature information comprises a pupil feature parameter and an iris feature parameter;
    the state determining module is configured to:
    determine the current emotional state of the user according to the ratio of the pupil feature parameter to the iris feature parameter.
  11. The program icon sorting device according to claim 9, characterized in that
    the iris acquisition module is configured to acquire the iris feature information of the user while a target application is running;
    the state determining module is configured to determine the current emotional state of the user according to the acquired iris feature information;
    and the device further comprises:
    a usage record module, configured to record the user's usage information for the target application in that emotional state.
  12. The program icon sorting device according to claim 9, characterized in that the usage information comprises a usage duration, a number of uses, or a number of operations.
  13. The program icon sorting device according to claim 12, characterized in that the sorting module is configured to:
    sort and display the display icons of the respective applications in descending order of the usage duration, number of uses, or number of operations of each application by the user in that emotional state.
  14. The program icon sorting device according to claim 12, characterized in that the record obtaining module is configured to:
    acquire the total usage duration of all applications in the program usage record corresponding to that emotional state, and the usage duration of each application;
    and the sorting module comprises:
    a ratio determining unit, configured to determine the usage-duration share of each application in that emotional state according to the total usage duration of all applications in the program usage record corresponding to the emotional state and the usage duration of each application;
    a proportion sorting unit, configured to sort and display the display icons of the respective applications according to their usage-duration shares.
  15. The program icon sorting device according to claim 9, characterized in that
    the iris acquisition module is configured to collect the iris feature information of the user at a preset time interval;
    and the device further comprises:
    a range dividing module, configured to divide the iris feature information collected within a predetermined time period into a preset number of iris feature ranges;
    a state setting module, configured to set the emotional state corresponding to each iris feature range.
  16. The program icon sorting device according to claim 9, characterized in that the iris acquisition module is configured to:
    acquire the iris feature information of the user when it is detected that the user enters the system desktop.
PCT/CN2015/100228 2015-06-24 2015-12-31 一种程序图标排序方法和装置 WO2016206347A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510354906.8 2015-06-24
CN201510354906.8A CN105573613B (zh) 2015-06-24 2015-06-24 一种程序图标排序方法和装置

Publications (1)

Publication Number Publication Date
WO2016206347A1 true WO2016206347A1 (zh) 2016-12-29

Family

ID=55883817

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/100228 WO2016206347A1 (zh) 2015-06-24 2015-12-31 一种程序图标排序方法和装置

Country Status (2)

Country Link
CN (1) CN105573613B (zh)
WO (1) WO2016206347A1 (zh)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106970743A (zh) * 2017-03-27 2017-07-21 宇龙计算机通信科技(深圳)有限公司 一种图标排序方法、装置及移动终端
CN107422944A (zh) * 2017-06-09 2017-12-01 广东乐心医疗电子股份有限公司 一种自动调整菜单显示模式的方法与装置以及可穿戴设备
CN107239195A (zh) * 2017-06-12 2017-10-10 河南职业技术学院 基于计算机的桌面图标管理方法及桌面图标管理装置
CN109522109A (zh) * 2018-11-01 2019-03-26 Oppo广东移动通信有限公司 应用运行的管控方法、装置、存储介质及电子设备
CN111241140A (zh) * 2018-11-12 2020-06-05 奇酷互联网络科技(深圳)有限公司 智能终端及其数据排序方法、具有存储功能的装置
CN110837294B (zh) * 2019-10-14 2023-12-12 成都西山居世游科技有限公司 一种基于眼球追踪的面部表情控制方法及系统
CN111596835A (zh) * 2020-04-03 2020-08-28 维沃移动通信有限公司 一种显示控制方法及电子设备
CN111708939B (zh) * 2020-05-29 2024-04-16 平安科技(深圳)有限公司 基于情绪识别的推送方法、装置、计算机设备及存储介质
CN112035044B (zh) * 2020-09-01 2022-04-15 上海松鼠课堂人工智能科技有限公司 显示界面控制方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070239847A1 (en) * 2006-04-05 2007-10-11 Sony Corporation Recording apparatus, reproducing apparatus, recording and reproducing apparatus, recording method, reproducing method, recording and reproducing method and recording medium
CN102722664A (zh) * 2012-05-21 2012-10-10 北京百纳威尔科技有限公司 一种解锁方法及设备
CN104407771A (zh) * 2014-11-10 2015-03-11 深圳市金立通信设备有限公司 一种终端
CN104461235A (zh) * 2014-11-10 2015-03-25 深圳市金立通信设备有限公司 一种应用图标处理方法


Also Published As

Publication number Publication date
CN105573613B (zh) 2019-03-22
CN105573613A (zh) 2016-05-11

Similar Documents

Publication Publication Date Title
WO2016206347A1 (zh) 一种程序图标排序方法和装置
US10365714B2 (en) System and method for dynamic content delivery based on gaze analytics
CN109976506B (zh) 一种电子设备的唤醒方法、存储介质及机器人
US20160062456A1 (en) Method and apparatus for live user recognition
CN104731316B (zh) 基于眼睛跟踪在设备上呈现信息的系统及方法
US11176944B2 (en) Transcription summary presentation
KR20150003591A (ko) 스마트 글라스
CN105488957A (zh) 疲劳驾驶检测方法及装置
US9223455B2 (en) User preference analysis method and device
EP3053014B1 (en) Method of recognizing multi-gaze and apparatus therefor
WO2018133681A1 (zh) 搜索结果排序方法、装置、服务器及存储介质
JP2007219161A (ja) プレゼンテーション評価装置及びプレゼンテーション評価方法
CN111868686B (zh) 常用应用程序的导出方法和使用该方法的导出设备
CN111401238B (zh) 一种视频中人物特写片段的检测方法及装置
CN111818385B (zh) 视频处理方法、视频处理装置及终端设备
CN108920368A (zh) 数据测试的方法、装置及电子设备
CN107870856A (zh) 视频播放启动时长测试方法、装置及电子终端
Urh et al. TaskyApp: inferring task engagement via smartphone sensing
CN110476180A (zh) 用于提供基于文本阅读的奖励型广告服务的方法及用于实行该方法的用户终端
CN109962983B (zh) 一种点击率统计方法及装置
CN108304076B (zh) 电子装置、视频播放应用的管理方法及相关产品
US8750565B2 (en) Adjusting display format in electronic device
CN115512829A (zh) 疾病诊断相关分组的获取方法、装置及介质
CN111241284B (zh) 文章内容识别方法、装置及计算机存储介质
US20210097984A1 (en) Query disambiguation using environmental audio

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15896230

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.05.2018)

122 Ep: pct application non-entry in european phase

Ref document number: 15896230

Country of ref document: EP

Kind code of ref document: A1