CN111443813B - Application program management operation method and device, electronic equipment and storage medium


Info

Publication number
CN111443813B
Authority
CN
China
Prior art keywords
face
image
application program
percentage
management operation
Prior art date
Legal status
Active
Application number
CN202010270910.7A
Other languages
Chinese (zh)
Other versions
CN111443813A (en)
Inventor
黄奥
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202010270910.7A
Publication of CN111443813A
Application granted
Publication of CN111443813B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 - Detection; Localisation; Normalisation
    • G06V 40/166 - Detection; Localisation; Normalisation using acquisition arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/172 - Classification, e.g. identification

Abstract

The invention discloses an application program management operation method and device, an electronic device, and a storage medium. The method comprises: acquiring a real-time first acquired image in the case that an association relationship has been established between a first face part and a first application program, wherein the first acquired image comprises the first face part; and performing a management operation on the first application program in the case that a preset object exists in the area where the first face part is located, wherein the preset object comprises a finger, a palm or a pen, and the management operation comprises opening or deleting. The embodiments of the invention provide a new way to manage and operate application programs, thereby enriching the available management operation modes.

Description

Application program management operation method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an application management operation method and apparatus, an electronic device, and a storage medium.
Background
Electronic devices (such as mobile phones) have become daily necessities, and users install a wide variety of applications on them. If a user wants to manage an application (for example, start or delete it), the user has to find the icon of that application on the desktop of the electronic device and then control the application by touching the icon.
However, this gives the user only a single way to manage and control applications, so enriching the management operation modes of applications has become a technical problem that urgently needs to be solved.
Disclosure of Invention
The main purpose of the present invention is to provide an application program management operation method and device, an electronic device, and a storage medium, so as to solve the technical problem that the management operation mode of an application program is single.
In order to achieve the above object, the present invention provides an application management operation method applied to an electronic device, the method including:
acquiring a real-time first acquired image in the case that an association relationship has been established between a first face part and a first application program, wherein the first acquired image comprises the first face part;
performing a management operation on the first application program in the case that a preset object exists in the area where the first face part is located;
wherein the preset object comprises a finger, a palm or a pen; the management operation includes opening or deleting.
In addition, in order to achieve the above object, the present invention further provides an application management operating apparatus applied to an electronic device, the apparatus including:
a first image acquisition module, configured to acquire a real-time first acquired image in the case that an association relationship has been established between a first face part and a first application program, wherein the first acquired image comprises the first face part;
a management operation module, configured to perform a management operation on the first application program in the case that a preset object exists in the area where the first face part is located;
wherein the preset object comprises a finger, a palm or a pen; the management operation includes opening or deleting.
The invention further provides an electronic device, which includes a processor, a memory and a computer program stored in the memory and capable of running on the processor, and when the computer program is executed by the processor, the steps of the application program management operation method are realized.
The invention also proposes a computer-readable storage medium on which a computer program is stored, which computer program, when executed by a processor, implements the steps of the application management operation method.
With the application program management operation method and device, electronic device, and storage medium described above, once an association relationship has been established between the first face part and the first application program, the user can select the first face part on his or her own face with a preset object (such as a finger), and a management operation is then performed on the first application program associated with that face part. The user can thus manage an application program by selecting a part of his or her own face, without operating the application icon, which provides a new way to manage application programs and enriches the available management operation modes.
Drawings
Fig. 1 is a flowchart illustrating an application management operation method according to a first embodiment of the present invention;
fig. 2 is a schematic diagram illustrating an application icon displayed on a first face image according to a first embodiment of the present invention;
FIG. 3 is a diagram illustrating a user touching the nose according to a first embodiment of the present invention;
fig. 4 is a schematic diagram illustrating displaying an application icon on a second face image according to the first embodiment of the present invention;
FIG. 5 is a diagram illustrating a user touching an ear according to a first embodiment of the present invention;
fig. 6 is a schematic diagram illustrating a user making a mouth-covering gesture toward an electronic device according to the first embodiment of the invention;
fig. 7 is a schematic diagram of a front camera of an electronic device collecting a face of a user according to a first embodiment of the present invention;
FIG. 8 is a flowchart illustrating a method for managing and operating application programs according to a second embodiment of the present invention;
fig. 9 is a schematic structural diagram of an application management operating device according to a third embodiment of the present invention.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The electronic equipment in the embodiment of the invention can be a mobile terminal or a non-mobile terminal. For example, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), or the like, and the non-mobile terminal may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like, which is not limited herein.
"first" and "second" in the embodiments of the present invention are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
As shown in fig. 1, a first embodiment of the present invention provides an application management operation method, which is applied to an electronic device, where the electronic device includes a front-facing camera. The application program management operation method comprises the following steps:
Step 101: acquiring a real-time first acquired image in the case that an association relationship has been established between a first face part and a first application program, wherein the first acquired image comprises the first face part.
The real-time first acquired image is captured with the front-facing camera of the electronic device.
Step 102: performing a management operation on the first application program in the case that a preset object exists in the area where the first face part is located; wherein the preset object includes a finger, a palm or a pen (such as a stylus for writing on an electronic device or an ordinary pen for writing on paper); the management operation includes opening or deleting.
If the preset object is present in the area where the first face part is located, the user has touched or selected the first face part with a finger or a stylus. The management operation on the first application program includes opening or deleting the first application program.
In this embodiment of the present invention, in the case that the first face part is associated with the first application program, the user may select the first face part on his or her own face with a predetermined object (such as a finger), and a management operation is then performed on the first application program associated with that face part. The user can thus manage an application program by selecting a part of his or her own face, without operating the application icon, which provides a new way to manage application programs and enriches the available management operation modes. In addition, because the user selects different parts of the face, the electronic device can be operated without touching it, which not only makes using the device more enjoyable but also helps avoid the facial stiffness that can result from prolonged use of the electronic device.
In one or more embodiments of the present invention, in a case that the first face part and the first application program have an association relationship, before acquiring the real-time first acquired image, the application program management operation method further includes:
displaying a first face image, wherein the first face image comprises a first face part;
receiving a first input of a user; wherein the first input may include a slide input that moves an icon of the first application program, or an input of first clicking the icon of the first application program and then clicking the area on the first face image where the first face part is located;
in response to the first input, displaying the icon of the first application program in the area where the first face part is located, so as to establish the association relationship between the first face part and the first application program.
In this embodiment of the invention, because the first face image is displayed, the user can place the icon of the first application program in the area of the first face part on the first face image, so the association relationship between the first face part and the first application program is established simply and conveniently.
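As an illustration of how such an association relationship could be stored, the following minimal Python sketch keeps a mapping from face parts to application identifiers; the face-part names, application identifiers, and function names are hypothetical and are not taken from the patent.

```python
# Minimal sketch (hypothetical names): record which application icon is
# displayed on which face part once the user finishes the first input.

face_part_to_app = {}  # e.g. {"nose": "com.example.gallery"}

def establish_association(face_part: str, app_id: str) -> None:
    """Record that the icon of `app_id` is now displayed on `face_part`."""
    face_part_to_app[face_part] = app_id

def associated_app(face_part: str):
    """Return the application associated with `face_part`, or None."""
    return face_part_to_app.get(face_part)

# Example: the user drags the icon of a gallery application onto the nose area.
establish_association("nose", "com.example.gallery")
assert associated_app("nose") == "com.example.gallery"
```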
In one or more embodiments of the present invention, after the association relationship between the first face part and the first application program is established, the application program management operation method further includes: using the first face image as the background image of a first display interface (such as the main interface), so that the first face image is displayed on the first display interface, and displaying the icon of the first application program in the area where the first face part is located on the displayed first face image. The user can then refer to the arrangement of the icons on the first display interface and conveniently select the first face part on his or her own face with the preset object, without having to memorize the association relationship between the first face part and the first application program.
For example, referring to fig. 2, a first face image is displayed on the main interface of the electronic device, and the face parts of the first face image display application icons A to P. The face parts of the first face image may include at least one of the left side of the head, the right side of the head, the left forehead, the right forehead, the left eyebrow, the right eyebrow, the nose, the left cheek, the right cheek, the philtrum, the left ear, the right ear, the mouth, and the chin.
Referring to fig. 3, when the user's finger a touches the nose, the electronic device performs a management operation on the application program corresponding to the application icon G displayed in the nose region of the first face image in fig. 2.
In one or more embodiments of the present invention, before performing the management operation on the first application program in the case that a preset object exists in the area where the first face part is located, the application program management operation method further includes:
acquiring, from at least one predetermined image, a target image that matches the first acquired image, wherein the preset object is located in the area of a different face part in each predetermined image; for example, the preset object is in the area of the left forehead in a first predetermined image and in the area of the right forehead in a second predetermined image;
taking the face part of the target image where the preset object is located as the first face part.
As one example, the predetermined images may be acquired in advance. Specifically, when the preset object touches a certain face part on the user's face, the electronic device records the image captured at that moment as a predetermined image. In the same way, a predetermined image is obtained for the preset object touching each face part.
As another example, the predetermined image is an image in a predetermined picture library.
In this embodiment of the invention, each predetermined image is matched against the first acquired image. If the first acquired image matches a target image among the predetermined images (for example, the similarity between the first acquired image and the target image is greater than a preset similarity threshold, or is the highest among all the predetermined images), then the preset object is located at the same face part in both images, from which it follows that the preset object exists in the area where the first face part is located.
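As a sketch of this matching step, assuming the acquired image and the predetermined images are available as equal-sized grayscale NumPy arrays and that a simple normalized correlation stands in for the similarity measure; the helper names and the threshold value are illustrative assumptions, not part of the patent.

```python
import numpy as np

def similarity(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Normalized correlation between two equal-sized grayscale images."""
    a = (img_a - img_a.mean()) / (img_a.std() + 1e-8)
    b = (img_b - img_b.mean()) / (img_b.std() + 1e-8)
    return float((a * b).mean())

def find_target_image(first_acquired: np.ndarray,
                      predetermined: dict,
                      threshold: float = 0.8):
    """Return the face part of the best-matching predetermined image, or None.

    `predetermined` maps a face-part name to the image that was recorded when
    the preset object touched that part (the pre-recorded images above)."""
    best_part, best_score = None, -1.0
    for face_part, image in predetermined.items():
        score = similarity(first_acquired, image)
        if score > best_score:
            best_part, best_score = face_part, score
    return best_part if best_score >= threshold else None
```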
In one or more embodiments of the present invention, Artificial Intelligence (AI) may be used to recognize that a preset object exists in the area where the first face part is located.
In one or more embodiments of the present invention, before performing the management operation on the first application program in the case that a preset object exists in the area where the first face part is located, the application program management operation method further includes:
acquiring a first coordinate range of the first face part in a first coordinate system of the first face image, and acquiring a second coordinate range of the preset object in a second coordinate system of the first acquired image;
determining that the preset object exists in the area where the first face part is located in the case that the first coordinate range and the second coordinate range at least partially overlap; wherein the origin of the first coordinate system and the origin of the second coordinate system are located at the same face part.
For example, the first coordinate system of the first face image is established with the tip of the nose in the first face image as its origin, and the second coordinate system of the first acquired image is established with the tip of the nose in the first acquired image as its origin. When the user's finger touches the left eye, the first coordinate range of the left eye in the first coordinate system of the first face image and the second coordinate range of the finger in the second coordinate system of the first acquired image are acquired; if the two coordinate ranges partially overlap, it is determined that the user's finger is present in the area where the left eye is located.
In this embodiment of the invention, whether the preset object is in the area where the first face part is located can be determined accurately by checking whether the first coordinate range of the first face part in the first coordinate system of the first face image at least partially overlaps the second coordinate range of the preset object in the second coordinate system of the first acquired image.
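The overlap test itself can be sketched as an axis-aligned rectangle intersection, assuming each coordinate range is represented as (x_min, y_min, x_max, y_max) in coordinate systems that share the same origin (for example, the tip of the nose); this representation is an assumption made only for illustration.

```python
from typing import Tuple

CoordRange = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)

def ranges_overlap(first: CoordRange, second: CoordRange) -> bool:
    """True if the two coordinate ranges at least partially overlap."""
    ax0, ay0, ax1, ay1 = first
    bx0, by0, bx1, by1 = second
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

# Example: first coordinate range of the left eye on the first face image and
# second coordinate range of the user's finger on the first acquired image.
left_eye = (-60.0, 20.0, -20.0, 45.0)
finger = (-40.0, 30.0, -10.0, 70.0)
print(ranges_overlap(left_eye, finger))  # True: the preset object is on the left eye
```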
In addition, when the first collected image is collected, if the front camera is far away from the face, the face area is a small area on the first collected image. In this way, in the case where the first captured image is captured when the predetermined object touches the first face part, the coordinate range of the predetermined object on the first captured image and the coordinate range of the first face part on the first face image may not coincide. For example, if the user's finger touches the left eye, the left eye coordinate range on the first face image and the user's finger coordinate range on the first captured image may not overlap.
In order to solve the above problem, in one or more embodiments of the present invention, before acquiring the second coordinate range of the preset object in the second coordinate system of the first captured image, the application management operation method further includes:
calculating a first percentage of the area of the face region on the first acquired image to the total area of the first acquired image;
in the case that the first percentage is smaller than a predetermined percentage, cropping the first acquired image so that the percentage of the face region area in the total area of the cropped first acquired image is greater than or equal to the predetermined percentage;
and establishing the second coordinate system on the cropped first acquired image.
In this embodiment of the present invention, if the first percentage is less than the predetermined percentage (for example, 70%), the face region occupies only a small part of the first captured image. The first captured image is therefore cropped so that the percentage of the face region area in the total area of the cropped first captured image is greater than or equal to the predetermined percentage. This increases the proportion of the face region in the cropped first captured image and avoids the problem that the coordinate range of the preset object on the first captured image and the coordinate range of the first face part on the first face image do not coincide.
In one or more embodiments of the present invention, the difference between the first percentage, i.e., the percentage of the face region area in the total area of the cropped first captured image, and a second percentage is within a predetermined numerical range, where the second percentage is the percentage of the face region area in the total area of the first face image.
In this embodiment of the invention, because the difference between the first percentage (the proportion of the face region in the cropped first captured image) and the second percentage (the proportion of the face region in the first face image) is small, it can be further ensured that the coordinate ranges on the first captured image and the first face image correspond to each other.
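Under simple assumptions (the face region is given as a bounding box in pixel coordinates, the crop is centred on that box, and the crop size is chosen so the face occupies roughly the target percentage of the cropped image), the cropping step might be sketched as follows; the 70% figure and the helper names are illustrative only.

```python
import numpy as np

def face_area_percentage(face_box, image_shape) -> float:
    """Percentage of the image area covered by the face bounding box."""
    x0, y0, x1, y1 = face_box
    h, w = image_shape[:2]
    return 100.0 * ((x1 - x0) * (y1 - y0)) / (w * h)

def crop_to_match(image: np.ndarray, face_box, target_pct: float) -> np.ndarray:
    """Crop `image`, centred on the face box, so that the face region makes up
    roughly `target_pct` percent of the cropped image."""
    x0, y0, x1, y1 = face_box
    face_w, face_h = x1 - x0, y1 - y0
    scale = (100.0 / target_pct) ** 0.5   # face_area / crop_area == target_pct / 100
    crop_w, crop_h = int(face_w * scale), int(face_h * scale)
    cx, cy = (x0 + x1) // 2, (y0 + y1) // 2
    h, w = image.shape[:2]
    left, top = max(0, cx - crop_w // 2), max(0, cy - crop_h // 2)
    return image[top:min(h, top + crop_h), left:min(w, left + crop_w)]

# Usage: only crop when the face is too small on the first acquired image.
# if face_area_percentage(face_box, acquired.shape) < 70.0:
#     acquired = crop_to_match(acquired, face_box, target_pct=70.0)
```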
In one or more embodiments of the present invention, before performing the management operation on the first application, the application management operation method further includes:
acquiring M predetermined coordinate ranges of M face parts on the first face image, wherein the M face parts correspond to the M predetermined coordinate ranges one to one;
acquiring, from the M predetermined coordinate ranges, a target coordinate range corresponding to the first face part;
the performing of the management operation on the first application program then specifically includes:
performing the management operation on the application program corresponding to the application icon displayed within the target coordinate range on the first face image.
The one-to-one correspondence relationship between the M face parts and the M predetermined coordinate ranges may be as shown in table 1:
TABLE 1
[Table 1 is reproduced in the original publication as an image: it lists each of the M face parts against its predetermined coordinate range on the first face image, for example the nose against (X13, Y13)-(X14, Y14).]
When the preset object exists in the area where the first face part is located, the one-to-one correspondence between the M face parts and the M predetermined coordinate ranges shown in Table 1 can be queried to obtain the target coordinate range corresponding to the target face part. For example, if the target face part is the nose, the target coordinate range found for the nose in Table 1 is (X13, Y13)-(X14, Y14) on the first face image. The application icon displayed at (X13, Y13)-(X14, Y14) on the first face image is then found, and the management operation is performed on the application program corresponding to that icon.
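The lookup of Table 1 can be sketched with a plain dictionary; the coordinate values below are arbitrary placeholders standing in for the symbolic ranges in the table (which the original publishes only as an image), and the icon names are hypothetical.

```python
# Predetermined coordinate ranges of face parts on the first face image,
# given as (x_min, y_min, x_max, y_max); numbers are placeholders for the
# symbolic ranges such as (X13, Y13)-(X14, Y14) in Table 1.
PART_RANGES = {
    "left_eye":  (40, 120, 90, 150),
    "right_eye": (150, 120, 200, 150),
    "nose":      (100, 150, 140, 200),   # (X13, Y13)-(X14, Y14)
    "mouth":     (95, 220, 145, 255),
}

# Application icon displayed inside each coordinate range (compare fig. 2).
ICON_AT_RANGE = {
    (100, 150, 140, 200): "icon_G",  # the nose region shows application icon G
}

def icon_for_face_part(face_part: str):
    """Look up the target coordinate range for `face_part` and return the icon
    (and hence the application) displayed within that range, if any."""
    target_range = PART_RANGES.get(face_part)
    return ICON_AT_RANGE.get(target_range) if target_range else None

print(icon_for_face_part("nose"))  # icon_G: manage the corresponding application
```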
In one or more embodiments of the present invention, in a case where the first face image is displayed on the first display interface, the application management operation method further includes:
receiving a second input of the user;
in response to the second input, switching from the first display interface to a second display interface, wherein a second face image is displayed on the second display interface, at least one face part of the second face image displays an application icon, and the application icons displayed on the first face image are different from those displayed on the second face image.
The first face image and the second face image may be the same face image, or two different photographs of the same user's face.
For example, the first display interface is the interface shown in fig. 2, and the second display interface is the interface shown in fig. 4. The second input includes at least one of a gesture input, a touch input on a displayed function icon, a voice input, and a press of a physical key. After switching to the second display interface, as shown in fig. 5, if the user touches the left ear with finger a, the electronic device performs a management operation on the application program corresponding to the application icon X in fig. 4.
In this embodiment of the invention, the application icons can be distributed over a plurality of face images, and these face images can be switched and displayed as the user requires. Because the icons are spread across several face images rather than crowded onto one, the user can easily find the icon to be operated, which facilitates the operation.
In one or more embodiments of the invention, in the case that it is recognized from the first acquired image that the user has made a predetermined gesture toward the electronic device, the first display interface is switched to a second display interface, wherein a second face image is displayed on the second display interface, at least one face part of the second face image displays an application icon, and the application icons displayed on the first face image are different from those displayed on the second face image.
For example, referring to fig. 6, when the user makes a mouth-covering gesture toward the electronic device, the electronic device switches from the first display interface to the second display interface.
In this embodiment of the invention, the first display interface can be switched to the second display interface when the user makes the predetermined gesture toward the electronic device. The user can therefore switch the displayed application icons with a touch-free (air) gesture, which provides an additional way to switch application icons.
In one or more embodiments of the present invention, before performing the management operation on the first application program in the case that a preset object exists in the area where the first face part is located, the application program management operation method further includes:
matching the first acquired image against a pre-stored identity verification face image; and, in the case that the first acquired image matches the pre-stored identity verification face image, determining that the preset object exists in the area where the first face part is located.
For example, referring to fig. 7, a front camera of the electronic device collects a face of a user to obtain a first collected image, and matches the first collected image with an authentication face image to authenticate the identity of the user.
In this embodiment of the invention, the user's identity is verified first, and only after the verification passes can the user select a face part with the predetermined object and thereby manage and operate an application program. This ensures that the applications are controlled by the owner of the electronic device.
In one or more embodiments of the present invention, after matching the first captured image with a pre-stored authentication face image, the application management operation method further includes:
performing a screen locking operation in the case that the first acquired image does not match the pre-stored identity verification face image.
In this embodiment of the invention, if the first acquired image does not match the identity verification face image, the user fails identity verification, and the screen locking operation prevents information on the electronic device from being leaked.
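A sketch of how this verification gate might wrap the rest of the flow; the callables passed in (the match test, the follow-on processing, and the screen lock) are assumed placeholders rather than APIs defined by the patent.

```python
from typing import Callable

def guarded_management(first_acquired,
                       verification_face,
                       matches: Callable[[object, object], bool],
                       proceed: Callable[[], None],
                       lock_screen: Callable[[], None]) -> None:
    """If the acquired image matches the stored identity verification face
    image, continue with preset-object detection and the management operation;
    otherwise lock the screen to protect the information on the device."""
    if matches(first_acquired, verification_face):
        proceed()      # e.g. detect the preset object, then open/delete the app
    else:
        lock_screen()  # verification failed

# Usage (illustrative):
# guarded_management(acquired_img, stored_img,
#                    matches=lambda a, b: similarity(a, b) > 0.9,
#                    proceed=handle_face_part_selection,
#                    lock_screen=device.lock)
```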
As shown in fig. 8, a second embodiment of the present invention provides another application management operation method, which is applied to an electronic device including a front-facing camera. The application program management operation method comprises the following steps:
Step 201: pre-recording, as predetermined images, photographs of the user touching each face part and a photograph of the user covering the mouth.
Step 202: for the face image on the desktop, dividing a coordinate range for each area of the face image in which an application icon can be placed.
Step 203: moving each application icon onto a face part of the face image on the desktop.
Step 204: the front-facing camera captures the state of the user's face, and face recognition verification is performed.
Step 205: determining whether face verification passes, specifically by matching the captured face image of the user against a pre-stored identity verification face image. If verification fails, step 206 is executed; if it passes, step 207 is executed.
Step 206: locking the screen.
Step 207: the front-facing camera captures the state of the user's face.
Step 208: determining whether the user is touching the face. If the user is not touching the face, no processing is performed, i.e., step 209; if the user is touching the face, step 210 is executed.
Step 209: no processing is performed.
Step 210: determining whether the user has made a mouth-covering gesture. Specifically, the photograph of the user covering the mouth recorded in step 201 is matched against the captured face image of the user; if they match, the user has made the mouth-covering gesture and step 211 is executed; if they do not match, the user has not made the gesture and step 212 is executed.
Step 211: switching to the next interface, that is, the face image currently displayed on the desktop is switched to another face image, where the application icons displayed on the face image before switching are different from those displayed after switching.
Step 212: determining the target face part touched by the user's finger. Specifically, the captured face image of the user is matched against the predetermined images, recorded in step 201, of the user touching each face part, and the target face part is determined from the predetermined image that matches the captured face image.
Step 213: among the coordinate ranges of the face parts divided in step 202, finding the target coordinate range of the target face part on the face image.
Step 214: starting the application program corresponding to the application icon displayed within the target coordinate range on the face image.
In this embodiment of the invention, application icons are displayed on the face parts of face images. The user can select a target face part on his or her own face with a preset object (such as a finger) to start the application program corresponding to the icon displayed on that face part, without having to tap the icon, which provides a new way to start application programs and enriches the available starting modes. In addition, face verification is performed before any application program is started: only after the verification passes can the user select a target face part to start an application program, and the screen is locked if the verification fails, preventing the user's private information from being stolen.
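Pulling the steps of this second embodiment together, a highly simplified control flow might look like the sketch below; every injected callable (identity check, touch detection, gesture check, launch, and so on) is an illustrative placeholder, not an API defined by the patent.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional, Tuple

@dataclass
class FaceLauncher:
    """Simplified flow of steps 204-214; all callables are placeholders."""
    verify_identity: Callable[[object], bool]                   # step 205
    face_is_touched: Callable[[object], bool]                   # step 208
    is_mouth_covered: Callable[[object], bool]                  # step 210
    detect_touched_part: Callable[[object], Optional[str]]      # step 212
    part_ranges: Dict[str, Tuple[int, int, int, int]]           # step 202 / Table 1
    launch_app_at: Callable[[Tuple[int, int, int, int]], None]  # step 214
    switch_interface: Callable[[], None]                        # step 211
    lock_screen: Callable[[], None]                             # step 206

    def handle_frame(self, frame) -> None:
        if not self.verify_identity(frame):        # step 205
            self.lock_screen()                     # step 206
            return
        if not self.face_is_touched(frame):        # step 208
            return                                 # step 209: no processing
        if self.is_mouth_covered(frame):           # step 210
            self.switch_interface()                # step 211
            return
        part = self.detect_touched_part(frame)     # step 212
        if part is not None and part in self.part_ranges:
            self.launch_app_at(self.part_ranges[part])  # steps 213-214
```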
As shown in fig. 9, a third embodiment of the present invention provides an application management operating device 300, which is applied to an electronic device, and the application management operating device 300 includes:
a first image acquisition module 301, configured to acquire a real-time first acquired image in the case that an association relationship has been established between a first face part and a first application program, wherein the first acquired image includes the first face part;
a management operation module 302, configured to perform a management operation on the first application program in the case that a preset object exists in the area where the first face part is located;
wherein the preset object comprises a finger, a palm or a pen; management operations include opening or deleting.
In this embodiment of the present invention, in the case that the first face part is associated with the first application program, the user may select the first face part on his or her own face with a predetermined object (such as a finger), and a management operation is then performed on the first application program associated with that face part. The user can thus manage an application program by selecting a part of his or her own face, without operating the application icon, which provides a new way to manage application programs and enriches the available management operation modes. In addition, because the user selects different parts of the face, the electronic device can be operated without touching it, which not only makes using the device more enjoyable but also helps avoid the facial stiffness that can result from prolonged use of the electronic device.
In one or more embodiments of the present invention, the application management operation device 300 further includes:
a face display module, configured to display a first face image, wherein the first face image includes the first face part;
a first input receiving module, configured to receive a first input of a user;
and a first input response module, configured to display, in response to the first input, an icon of the first application program in the area where the first face part is located, so as to establish the association relationship between the first face part and the first application program.
In this embodiment of the invention, because the first face image is displayed, the user can place the icon of the first application program in the area of the first face part on the first face image, so the association relationship between the first face part and the first application program is established simply and conveniently.
In one or more embodiments of the present invention, the application management operation device 300 further includes:
a coordinate range acquisition module, configured to acquire a first coordinate range of the first face part in a first coordinate system of the first face image, and a second coordinate range of the preset object in a second coordinate system of the first acquired image;
an object determination module, configured to determine that the preset object exists in the area where the first face part is located in the case that the first coordinate range and the second coordinate range at least partially overlap;
wherein the origin of the first coordinate system and the origin of the second coordinate system are located at the same face part.
In this embodiment of the invention, whether the preset object is in the area where the first face part is located can be determined accurately by checking whether the first coordinate range of the first face part in the first coordinate system of the first face image at least partially overlaps the second coordinate range of the preset object in the second coordinate system of the first acquired image.
In one or more embodiments of the present invention, the application management operation device 300 further includes:
a percentage calculation module, configured to calculate a first percentage of the face region area on the first acquired image in the total area of the first acquired image;
an image cropping module, configured to crop the first acquired image in the case that the first percentage is smaller than a predetermined percentage, so that the first percentage of the cropped first acquired image is greater than or equal to the predetermined percentage;
and a coordinate system establishing module, configured to establish the second coordinate system on the cropped first acquired image.
In this embodiment of the present invention, if the first percentage is less than the predetermined percentage (for example, 70%), the face region occupies only a small part of the first captured image. The first captured image is therefore cropped so that the percentage of the face region area in the total area of the cropped first captured image is greater than or equal to the predetermined percentage. This increases the proportion of the face region in the cropped first captured image and avoids the problem that the coordinate range of the preset object on the first captured image and the coordinate range of the first face part on the first face image do not coincide.
In one or more embodiments of the invention, the difference between the first percentage of the cropped first captured image and a second percentage is within a predetermined range of values, where the second percentage is the percentage of the face region area on the first face image in the total area of the first face image.
In this embodiment of the invention, because the difference between the first percentage (the proportion of the face region in the cropped first captured image) and the second percentage (the proportion of the face region in the first face image) is small, it can be further ensured that the coordinate ranges on the first captured image and the first face image correspond to each other.
An embodiment of the present invention provides an electronic device, which includes a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements the steps of any one of the application management operation methods described above.
The electronic device provided by the embodiment of the present invention can implement each process of the above application program management operation method embodiments and achieve the same technical effect, which is not repeated here to avoid repetition.
As shown in fig. 10, the electronic device 400 includes, but is not limited to: radio frequency unit 401, network module 402, audio output unit 403, input unit 404, sensor 405, display unit 406, user input unit 407, interface unit 408, memory 409, processor 410, power supply 411, and an image capture device (such as a front-facing camera, which is not labeled in fig. 10). Those skilled in the art will appreciate that the electronic device configuration shown in fig. 10 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The front-facing camera is configured to capture a real-time first acquired image in the case that an association relationship has been established between the first face part and the first application program, wherein the first acquired image includes the first face part;
the processor 410 is configured to perform a management operation on the first application program in the case that a preset object exists in the area where the first face part is located;
wherein the preset object comprises a finger, a palm or a pen; management operations include opening or deleting.
In one or more embodiments of the invention, the display unit 406 is configured to display a first face image, where the first face image includes a first face portion;
a user input unit 407 for receiving a first input by a user;
the display unit 406 is further configured to display, in response to the first input, an icon of the first application program in the area where the first face part is located, so as to establish the association relationship between the first face part and the first application program.
In one or more embodiments of the present invention, in the case where a preset object exists in the area where the first face part is located,
the processor 410 is further configured to acquire a first coordinate range of the first face part in a first coordinate system of the first face image, and a second coordinate range of the preset object in a second coordinate system of the first captured image;
the processor 410 is further configured to determine that the preset object exists in the area where the first face part is located in the case that the first coordinate range and the second coordinate range at least partially coincide;
wherein the origin of the first coordinate system and the origin of the second coordinate system are located at the same face part.
In one or more embodiments of the invention, the processor 410 is further configured to calculate a first percentage of the area of the face region on the first captured image to the total area of the first captured image;
the processor 410 is further configured to crop the first captured image in the case that the first percentage is less than the predetermined percentage, so that the first percentage of the cropped first captured image is greater than or equal to the predetermined percentage, and to establish the second coordinate system on the cropped first captured image.
In one or more embodiments of the invention, the difference between the first percentage of the cropped first captured image and a second percentage is within a predetermined range of values;
and the second percentage is the percentage of the area of the face region on the first face image in the total area of the first face image.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 401 may be used for receiving and sending signals during a process of sending and receiving information or a call, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 410; in addition, the uplink data is transmitted to the base station. Typically, radio unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio unit 401 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 402, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 403 may convert audio data received by the radio frequency unit 401 or the network module 402 or stored in the memory 409 into an audio signal and output as sound. Also, the audio output unit 403 may also provide audio output related to a specific function performed by the electronic apparatus 400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 404 is used to receive audio or video signals. The input unit 404 may include a Graphics Processing Unit (GPU) 4041 and a microphone 4042. The graphics processor 4041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 406. The image frames processed by the graphics processor 4041 may be stored in the memory 409 (or another storage medium) or transmitted via the radio frequency unit 401 or the network module 402. The microphone 4042 may receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 401.
The electronic device 400 also includes at least one sensor 405, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 4061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 4061 and/or the backlight when the electronic device 400 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 406 is used to display information input by the user or information provided to the user. The Display unit 406 may include a Display panel 4061, and the Display panel 4061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 407 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. Touch panel 4071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 4071 using a finger, a stylus, or any suitable object or attachment). The touch panel 4071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 410, receives a command from the processor 410, and executes the command. In addition, the touch panel 4071 can be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 4071, the user input unit 407 may include other input devices 4072. Specifically, the other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 4071 can be overlaid on the display panel 4061, and when the touch panel 4071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 410 to determine the type of the touch event, and then the processor 410 provides a corresponding visual output on the display panel 4061 according to the type of the touch event. Although in fig. 10, the touch panel 4071 and the display panel 4061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 4071 and the display panel 4061 may be integrated to implement the input and output functions of the electronic device, and this is not limited herein.
The interface unit 408 is an interface for connecting an external device to the electronic apparatus 400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 408 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 400 or may be used to transmit data between the electronic apparatus 400 and an external device.
The memory 409 may be used to store software programs as well as various data. The memory 409 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 409 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 410 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 409 and calling data stored in the memory 409, thereby performing overall monitoring of the electronic device. Processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 410.
The electronic device 400 may further comprise a power supply 411 (e.g. a battery) for supplying power to various components, and preferably, the power supply 411 may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 400 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides an electronic device, which includes a processor 410, a memory 409, and a computer program that is stored in the memory 409 and can be run on the processor 410. When executed by the processor 410, the computer program implements each process of the above application program management operation method embodiments and can achieve the same technical effect, which is not repeated here to avoid repetition.
The electronic device provided by the embodiment of the invention captures a real-time first acquired image in the case that an association relationship has been established between a first face part and a first application program, wherein the first acquired image includes the first face part, and performs a management operation on the first application program in the case that a preset object exists in the area where the first face part is located, wherein the preset object includes a finger, a palm or a pen, and the management operation includes opening or deleting. A new way of controlling application programs is therefore provided, enriching the available control modes.
The embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above application program management operation method embodiments and can achieve the same technical effect, which is not repeated here to avoid repetition. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An application program management operation method applied to an electronic device, characterized by comprising the following steps:
acquiring a real-time first captured image under the condition that a first face part and a first application program have established an association relationship, wherein the first captured image comprises the first face part;
performing a management operation on the first application program under the condition that a preset object exists in the area where the first face part is located;
wherein the preset object comprises a finger, a palm or a pen; the management operation comprises opening or deleting;
wherein, under the condition that the first face part and the first application program have established the association relationship, before the acquiring of the real-time first captured image, the method further comprises:
displaying a first face image, wherein the first face image comprises the first face part;
receiving a first input of a user;
responding to the first input, displaying an icon of the first application program in an area where the first face part is located, so as to establish the association relationship between the first face part and the first application program;
after establishing the association relationship between the first face part and the first application program, the method further comprises:
displaying a first face image on a first display interface, wherein icons of different first application programs are displayed in the areas where different first face parts in the first face image are located, and the first display interface is a main interface of the electronic device.
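As a reading aid for claim 1, the sketch below models establishing the association in response to the first input and placing the icon in the face-part area of the first display interface. FacePart, HomeInterface, and the example application identifier are assumed names and are not part of the claim language.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class FacePart:
    name: str
    region: Tuple[int, int, int, int]   # x, y, w, h in the first face image

@dataclass
class HomeInterface:
    """First display interface: the main interface showing the first face image."""
    icon_positions: Dict[str, Tuple[int, int]] = field(default_factory=dict)

    def place_icon(self, app_id: str, region: Tuple[int, int, int, int]) -> None:
        x, y, w, h = region
        # Render the icon at the centre of the face-part region (illustrative choice).
        self.icon_positions[app_id] = (x + w // 2, y + h // 2)

def on_first_input(face_part: FacePart, app_id: str,
                   associations: Dict[str, str], home: HomeInterface) -> None:
    """Respond to the first input: show the application icon in the face-part
    area and record the association between the face part and the application."""
    associations[face_part.name] = app_id
    home.place_icon(app_id, face_part.region)

# Example: associate a hypothetical gallery application with the 'forehead' part.
assoc: Dict[str, str] = {}
home = HomeInterface()
on_first_input(FacePart("forehead", (90, 40, 120, 60)), "com.example.gallery", assoc, home)
print(assoc, home.icon_positions)
```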
2. The application program management operation method according to claim 1, wherein, under the condition that the preset object exists in the area where the first face part is located, before the management operation is performed on the first application program, the method further comprises:
acquiring a first coordinate range of the first face part in a first coordinate system of the first face image, and acquiring a second coordinate range of the preset object in a second coordinate system of the first captured image;
determining that the preset object exists in the area where the first face part is located under the condition that the first coordinate range and the second coordinate range at least partially overlap;
wherein the origin of the first coordinate system and the origin of the second coordinate system are located at the same face part.
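The overlap test of claim 2 can be pictured as a comparison of two axis-aligned coordinate ranges expressed relative to a shared origin. The sketch below is a minimal illustration under that assumption; Range2D and the numeric values are chosen arbitrarily.

```python
from dataclasses import dataclass

@dataclass
class Range2D:
    """Axis-aligned coordinate range: (x_min, y_min) to (x_max, y_max)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def ranges_overlap(a: Range2D, b: Range2D) -> bool:
    """True if the two coordinate ranges at least partially overlap."""
    return not (a.x_max < b.x_min or b.x_max < a.x_min or
                a.y_max < b.y_min or b.y_max < a.y_min)

# First coordinate range of the face part (first coordinate system) and second
# coordinate range of the preset object (second coordinate system). Because both
# systems share an origin at the same face part, the ranges are directly comparable.
face_part_range = Range2D(10, 20, 60, 70)      # e.g. a forehead region (assumed values)
preset_object_range = Range2D(55, 65, 80, 90)  # e.g. a fingertip bounding box (assumed values)

if ranges_overlap(face_part_range, preset_object_range):
    print("preset object is in the area of the face part -> perform management operation")
```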
3. The application program management operation method according to claim 2, wherein, before the acquiring of the second coordinate range of the preset object in the second coordinate system of the first captured image, the method further comprises:
calculating a first percentage of the face region area in the first captured image to the total area of the first captured image;
in the case that the first percentage is less than a predetermined percentage, cropping the first captured image so that the first percentage of the cropped first captured image is greater than or equal to the predetermined percentage;
and establishing the second coordinate system for the cropped first captured image.
4. The application program management operation method according to claim 3, wherein a difference between the first percentage of the cropped first captured image and a second percentage is within a predetermined numerical range;
wherein the second percentage is a percentage of the face region area in the first face image to the total area of the first face image.
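Claims 3 and 4 amount to a percentage check, a crop, and a bounded difference. The sketch below illustrates one possible reading, assuming the crop stays centred on the face region (so only the cropped size matters for the percentage) and using arbitrary threshold values.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Box:
    """Face region bounding box inside the first captured image."""
    x: int
    y: int
    w: int
    h: int

    def area(self) -> int:
        return self.w * self.h

def face_area_percentage(face: Box, image_w: int, image_h: int) -> float:
    """First percentage: face region area over total image area, in percent."""
    return 100.0 * face.area() / (image_w * image_h)

def crop_to_meet_percentage(face: Box, image_w: int, image_h: int,
                            predetermined_pct: float) -> Tuple[int, int]:
    """Shrink the image around the face until the face-area percentage reaches
    the predetermined percentage; returns the cropped width and height."""
    w, h = image_w, image_h
    while (face_area_percentage(face, w, h) < predetermined_pct
           and w > face.w and h > face.h):
        w = max(face.w, int(w * 0.9))   # crop 10% per step (illustrative step size)
        h = max(face.h, int(h * 0.9))
    return w, h

# Example values (all assumed): a 200x200 face region in a 1000x800 capture.
face_box = Box(300, 200, 200, 200)
img_w, img_h = 1000, 800
PREDETERMINED_PCT = 30.0

first_pct = face_area_percentage(face_box, img_w, img_h)
if first_pct < PREDETERMINED_PCT:
    img_w, img_h = crop_to_meet_percentage(face_box, img_w, img_h, PREDETERMINED_PCT)
    first_pct = face_area_percentage(face_box, img_w, img_h)

# Claim 4 reading: the cropped first percentage should differ from the second
# percentage (face share of the first face image) by no more than a predetermined range.
second_pct = 35.0             # assumed value for the first face image
PREDETERMINED_RANGE = 10.0
print(f"first={first_pct:.1f}%, second={second_pct:.1f}%, "
      f"within range: {abs(first_pct - second_pct) <= PREDETERMINED_RANGE}")
```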
5. An application program management operation device applied to an electronic device, characterized by comprising:
the first image acquisition module is used for acquiring a real-time first captured image under the condition that a first face part and a first application program have established an association relationship, wherein the first captured image comprises the first face part;
the management operation module is used for performing a management operation on the first application program under the condition that a preset object exists in the area where the first face part is located;
wherein the preset object comprises a finger, a palm or a pen; the management operation comprises opening or deleting;
further comprising:
the face display module is used for displaying a first face image, and the first face image comprises the first face part;
the first input receiving module is used for receiving a first input of a user;
the first input response module is used for responding to the first input and displaying an icon of the first application program in an area where the first face part is located, so as to establish the association relationship between the first face part and the first application program;
further comprising:
the face image display module is used for displaying a first face image on a first display interface, wherein icons of different first application programs are displayed in the areas where different first face parts in the first face image are located, and the first display interface is a main interface of the electronic device.
6. The application program management operation device according to claim 5, further comprising:
a coordinate range obtaining module, configured to obtain a first coordinate range of the first face portion in a first coordinate system of the first face image, and obtain a second coordinate range of the preset object in a second coordinate system of the first captured image;
the object determining module is used for determining that the preset object exists in the area where the first face part is located under the condition that the first coordinate range and the second coordinate range at least partially overlap;
wherein the origin of the first coordinate system and the origin of the second coordinate system are located at the same face part.
7. The application program management operation device according to claim 6, further comprising:
a percentage calculation module for calculating a first percentage of a face region area on the first captured image to a total area of the first captured image;
an image cropping module for cropping the first captured image if the first percentage is less than a predetermined percentage and making the first percentage of the cropped first captured image greater than or equal to the predetermined percentage;
and the coordinate system establishing module is used for establishing the second coordinate system for the cropped first captured image.
8. The application program management operation device according to claim 7, wherein a difference between the first percentage of the cropped first captured image and a second percentage is within a predetermined numerical range;
wherein the second percentage is a percentage of the face region area in the first face image to the total area of the first face image.
9. An electronic device comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the application program management operation method according to any one of claims 1 to 4.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the application program management operation method according to any one of claims 1 to 4.
CN202010270910.7A 2020-04-08 2020-04-08 Application program management operation method and device, electronic equipment and storage medium Active CN111443813B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010270910.7A CN111443813B (en) 2020-04-08 2020-04-08 Application program management operation method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010270910.7A CN111443813B (en) 2020-04-08 2020-04-08 Application program management operation method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111443813A CN111443813A (en) 2020-07-24
CN111443813B true CN111443813B (en) 2022-08-16

Family

ID=71654064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010270910.7A Active CN111443813B (en) 2020-04-08 2020-04-08 Application program management operation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111443813B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014011511A (en) * 2012-06-27 2014-01-20 Kyocera Corp Electronic device, control method, and control program
CN109863466A (en) * 2016-10-26 2019-06-07 哈曼贝克自动系统股份有限公司 Combined type eyes and gesture tracking
CN109952753A (en) * 2016-10-25 2019-06-28 苹果公司 For managing the user interface of the access to the voucher used in operation

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120081282A1 (en) * 2008-05-17 2012-04-05 Chin David H Access of an application of an electronic device based on a facial gesture
US8509479B2 (en) * 2009-05-29 2013-08-13 Microsoft Corporation Virtual object
US20130104089A1 (en) * 2011-10-20 2013-04-25 Fuji Xerox Co., Ltd. Gesture-based methods for interacting with instant messaging and event-based communication applications
US10423214B2 (en) * 2012-11-20 2019-09-24 Samsung Electronics Company, Ltd Delegating processing from wearable electronic device
CN103336575B * 2013-06-27 2016-06-29 深圳先进技术研究院 Intelligent glasses system for human-computer interaction and interaction method thereof
CN203482367U (en) * 2013-08-12 2014-03-12 联想(北京)有限公司 Earphone
JP6374514B2 (en) * 2013-09-24 2018-08-15 ベイジン ズブン インフォメーション テクノロジー カンパニー リミテッド Non-contact type palm print authentication method, apparatus and portable terminal
TWI540462B (en) * 2014-11-17 2016-07-01 緯創資通股份有限公司 Gesture recognition method and electronic apparatus using the same
CN106909289B * 2017-03-31 2019-12-03 维沃移动通信有限公司 Operation method for an application control and mobile terminal
CN107422859B (en) * 2017-07-26 2020-04-03 广东美的制冷设备有限公司 Gesture-based regulation and control method and device, computer-readable storage medium and air conditioner
CN108089891B (en) * 2017-11-30 2021-01-26 维沃移动通信有限公司 Application program starting method and mobile terminal
US10831316B2 (en) * 2018-07-26 2020-11-10 At&T Intellectual Property I, L.P. Surface interface

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014011511A (en) * 2012-06-27 2014-01-20 Kyocera Corp Electronic device, control method, and control program
CN109952753A (en) * 2016-10-25 2019-06-28 苹果公司 For managing the user interface of the access to the voucher used in operation
CN109863466A (en) * 2016-10-26 2019-06-07 哈曼贝克自动系统股份有限公司 Combined type eyes and gesture tracking

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Face location by template matching with a quadratic discriminant function; F. Weber et al.; Proceedings International Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, in conjunction with ICCV'99 (Cat. No. PR00378); 2002-08-06; 1-4 *

Also Published As

Publication number Publication date
CN111443813A (en) 2020-07-24

Similar Documents

Publication Publication Date Title
CN108459797B (en) Control method of folding screen and mobile terminal
CN108471498B (en) Shooting preview method and terminal
CN109078319B (en) Game interface display method and terminal
CN110213440B (en) Image sharing method and terminal
CN109542306B (en) Screen capturing method and terminal equipment
CN107784089B (en) Multimedia data storage method, processing method and mobile terminal
CN109241775B (en) Privacy protection method and terminal
CN109933273B (en) Information processing method and terminal equipment
CN108052819B (en) Face recognition method, mobile terminal and computer readable storage medium
CN111339515A (en) Application program starting method and electronic equipment
CN110855921B (en) Video recording control method and electronic equipment
CN109523253B (en) Payment method and device
CN111124179A (en) Information processing method and electronic equipment
CN108710806B (en) Terminal unlocking method and mobile terminal
CN111405181B (en) Focusing method and electronic equipment
CN111078002A (en) Suspended gesture recognition method and terminal equipment
CN111522477A (en) Application starting method and electronic equipment
CN109669656B (en) Information display method and terminal equipment
CN110830747A (en) Video call method, wearable device and storage medium
CN109166164B (en) Expression picture generation method and terminal
CN110928407A (en) Information display method and device
CN107809515B (en) Display control method and mobile terminal
CN110007821B (en) Operation method and terminal equipment
CN109739430B (en) Display method and mobile terminal
CN108563940B (en) Control method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant