CN111913637B - Component operation control method, device and storage medium

Info

Publication number
CN111913637B
CN111913637B
Authority
CN
China
Prior art keywords
component
target
hierarchical
hierarchy
view
Prior art date
Legal status
Active
Application number
CN202010888940.4A
Other languages
Chinese (zh)
Other versions
CN111913637A (en)
Inventor
邓俊俊
Current Assignee
Ping An International Smart City Technology Co Ltd
Original Assignee
Ping An International Smart City Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An International Smart City Technology Co Ltd
Priority to CN202010888940.4A
Publication of CN111913637A
Application granted
Publication of CN111913637B
Status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions

Abstract

The present invention relates to the field of data processing technologies, and in particular to a component operation control method, a device, and a storage medium. The method is applied to a computer device and includes: creating, for a target component, a view DOM and a hierarchy component, where the view DOM is used for view display, the hierarchy component is an empty element with the same position and size as the view DOM, and the hierarchy component sits at the lowest level in a hidden state; activating the hierarchy component of the target component, keeping the view DOM still, displaying the hierarchy component, and adjusting its level to the highest level; receiving a drag instruction and performing a drag operation on the target component, so that the hierarchy component and the target component move synchronously; and, when the drag operation ends, keeping the view DOM unchanged, hiding the hierarchy component, and adjusting its level to the lowest level. By adopting the embodiments of the application, the convenience and flexibility of drag operations can be improved.

Description

Component operation control method, device and storage medium
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a method and an apparatus for controlling component operation, and a storage medium.
Background
With the wide adoption of computer devices (such as mobile phones and tablet computers), the applications they support keep growing in number and capability, and the devices themselves have become increasingly diverse and personalized, to the point where they are indispensable in users' daily lives. For such devices the drag function is extremely common: it ranges from dragging a desktop shortcut into order or into the recycle bin, to chart library tools, self-service visual dashboard building tools, skill casting in various games, and more. From the first computers with graphical displays to today's rich applications, the drag function has accompanied computing for decades with ever higher frequency of use, and it spares users many complex calculations and operations.
Disclosure of Invention
The embodiment of the application provides a method and a device for controlling component operation and a storage medium, which can improve the convenience and the flexibility of drag operation.
In a first aspect, an embodiment of the present application provides a method for controlling operation of a component, where the method includes:
creating, for a target component, a view DOM and a hierarchy component, wherein the view DOM is used for view display, the hierarchy component is an empty element with the same position and size as the view DOM, and the hierarchy component is at the lowest hierarchy and is in a hidden state;
activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the level of the hierarchical component to the highest level;
receiving a dragging instruction, and performing dragging operation on the target component so as to control the hierarchical component and the target component to perform synchronous movement operation;
at the end of the drag operation, the view DOM is left unchanged and the hierarchical component is hidden, and the level of the hierarchical component is adjusted to the lowest level.
In a second aspect, an embodiment of the present application provides a device for controlling operation of a component, the device including: a creation unit, a component control unit and a dragging unit, wherein,
the creating unit is used for creating, for a target component, a view DOM and a hierarchy component, wherein the view DOM is used for view display, the hierarchy component is an empty element with the same position and size as the view DOM, and the hierarchy component is at the lowest hierarchy and is in a hidden state;
The component control unit is used for activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component and adjusting the level of the hierarchical component to be the highest level;
the dragging unit is used for receiving a dragging instruction, and carrying out dragging operation on the target component so as to control the hierarchical component and the target component to execute synchronous movement operation;
the component control unit is further configured to, when the drag operation is finished, keep the view DOM unchanged and hide the hierarchical component, and adjust the level of the hierarchical component to the lowest level.
In a third aspect, embodiments of the present application provide a computer device comprising a processor, a memory, a communication interface, and one or more computer programs, wherein the one or more computer programs are stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the first aspect of embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program causes a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
By implementing the embodiment of the application, the following beneficial effects are achieved:
It can be seen that, in the component operation control method, device and storage medium described in the embodiments of the present application, a view DOM and a hierarchy component are created for a target component: the view DOM is used for view display, while the hierarchy component is an empty element with the same position and size as the view DOM that sits at the lowest level in a hidden state. The hierarchy component of the target component is activated, the view DOM is kept still, the hierarchy component is displayed, and its level is raised to the highest level. A drag instruction is received and a drag operation is performed on the target component, so that the hierarchy component and the target component move synchronously. When the drag operation ends, the view DOM is kept unchanged, the hierarchy component is hidden, and its level is returned to the lowest level. After this separation of hierarchy and view (view DOM plus hierarchy component), components can be dragged even when they overlap, and the visual blocking error that otherwise occurs when a component overlapping a lower-level component is dragged is avoided.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for controlling operation of a component according to an embodiment of the present application;
FIG. 2 is a flow chart of another method for controlling operation of a component according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application;
FIG. 4A is a functional block diagram of a component operation control device according to an embodiment of the present application;
FIG. 4B is a functional block diagram of another component operation control device according to an embodiment of the present application;
fig. 4C is a functional unit block diagram of another component operation control device provided in an embodiment of the present application.
Detailed Description
In order to make the present application solution better understood by those skilled in the art, the following description will clearly and completely describe the technical solution in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terms first, second and the like in the description and in the claims of the present application and in the above-described figures, are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The computer device according to the embodiments of the present application may include various handheld devices, desktop devices, vehicle-mounted devices, and wearable devices (smart watches, smart bracelets, wireless headphones, augmented reality/virtual reality devices, smart glasses) that have wireless communication functions, computing devices or other processing devices connected to a wireless modem, and various forms of user equipment (UE), mobile stations (MS), terminal devices, and so on. For convenience of description, the above-mentioned devices are collectively referred to as computer devices.
In the related art, one approach to avoid overlap is that the pixel areas occupied by two components are simply not allowed to overlap: when one component is moved onto the position of another, the other component must move out of the way, a typical example being Windows desktop icons.
In addition, in a level-raising design, the two components are allowed to overlap; to drag the lower component, a visible part of it must first be clicked, its level is then raised to the highest so that it can be dragged, and its level is restored when the drag ends.
Both approaches can handle dragging when components overlap, but each has an obvious drawback:
the first approach forbids overlap by definition; only a small fraction of requirements can accept a strictly non-overlapping layout, so it clearly cannot satisfy most needs;
in the second approach, dragging the bottom-layer component temporarily places it at the highest level, where it occludes what used to be above it, and the original level relationship is only restored after the drag completes; if the two levels have a strict upper-lower relationship, that relationship is broken during the drag, producing a temporary visual blocking error. In demanding environments, such as government visual project reporting or efficient real-time drawing and animation, this temporary visual blocking error is often fatal, because it severely disrupts the continuity of the process and of the visuals.
For the temporary visual blocking error caused by dragging overlapping components, the embodiments of the present application use the effect of separating the component view from the component level to resolve it, specifically through the following steps:
creating, for a target component, a view DOM and a hierarchy component, wherein the view DOM is used for view display, the hierarchy component is an empty element with the same position and size as the view DOM, and the hierarchy component is at the lowest hierarchy and is in a hidden state;
activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the level of the hierarchical component to the highest level;
receiving a dragging instruction, and performing dragging operation on the target component so as to control the hierarchical component and the target component to perform synchronous movement operation;
at the end of the drag operation, the view DOM is left unchanged and the hierarchical component is hidden, and the level of the hierarchical component is adjusted to the lowest level.
After the hierarchy is separated from the view (view DOM and hierarchy component), components can be dragged even when they overlap, and the visual blocking error that otherwise occurs when a component overlapping a lower-level component is dragged is avoided.
The embodiments of the present application are described in detail below.
Referring to fig. 1, fig. 1 is a flow chart of a method for controlling operation of a component according to an embodiment of the present application, as shown in the drawing, the method for controlling operation of a component includes:
101. A view DOM for view presentation and a hierarchy component are created for a target component; the hierarchy component is an empty element with the same position and size as the view DOM, sits at the lowest hierarchy, and is in a hidden state.
The target component may be a component of any application, for example a desktop icon, or a control in a certain application or interface.
In a specific implementation, the target component may be set by system default or by the user. The computer device may create, for the target component, a view DOM (view component) for view presentation, and a hierarchy component that is an empty element (div) with the same position and size as the view DOM; the hierarchy component is at the lowest hierarchy and in a hidden state.
Optionally, there is an identifier mapping relationship between the identifier of the view DOM and the identifier of the hierarchical component. The identifier of the view DOM may be at least one of the following: a type, a number, a function identifier, and the like, which are not limited herein; the identifier of the hierarchical component may likewise be at least one of the following: a type, a label, a function identifier, and the like, which are not limited herein. The identifier of the view DOM and the identifier of the hierarchical component may be the same identifier, or the two identifiers may form an identifier pair; in either case the identifier mapping relationship between them allows the two to be functionally bound.
In the embodiment of the application, a mapping relationship can thus be established between the view DOM and the hierarchy component, for example the identifier mapping relationship described above.
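The patent does not publish source code, so the following is only a minimal browser TypeScript sketch of step 101: a visible view DOM and an empty hierarchy component (div) with the same position and size are created, the hierarchy component starts hidden at the lowest level, and the two are bound through an identifier mapping. All names (`ComponentPair`, `pairs`, `createComponent`) and CSS values are illustrative assumptions.

```typescript
// Illustrative sketch of step 101 (assumed names; not the patent's own code).
interface ComponentPair {
  viewDom: HTMLElement;   // visible view DOM used for rendering
  layer: HTMLDivElement;  // empty "hierarchy component" used only for level control
}

// Identifier mapping: one shared id binds the view DOM and the hierarchy component.
const pairs = new Map<string, ComponentPair>();

function createComponent(id: string, rect: DOMRect, parent: HTMLElement): ComponentPair {
  // View DOM: the element the user actually sees.
  const viewDom = document.createElement('div');
  viewDom.dataset.componentId = id;
  Object.assign(viewDom.style, {
    position: 'absolute',
    left: `${rect.left}px`,
    top: `${rect.top}px`,
    width: `${rect.width}px`,
    height: `${rect.height}px`,
  });

  // Hierarchy component: an empty div with the same position and size,
  // hidden and placed at the lowest level until the component is activated.
  const layer = document.createElement('div');
  layer.dataset.componentId = id;
  Object.assign(layer.style, {
    position: 'absolute',
    left: `${rect.left}px`,
    top: `${rect.top}px`,
    width: `${rect.width}px`,
    height: `${rect.height}px`,
    zIndex: '0',       // lowest hierarchy
    display: 'none',   // hidden state
  });

  parent.append(viewDom, layer);
  pairs.set(id, { viewDom, layer });
  return { viewDom, layer };
}
```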
102. Activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the level of the hierarchical component to the highest level.
In a specific implementation, the computer device selects the target component when the target component is clicked. After the target component is selected, the corresponding hierarchical component can be activated, the view DOM is kept still, only the hierarchical component is displayed, and the level of the hierarchical component is adjusted to the highest level.
Optionally, in the step 102, activating the hierarchical component of the target component may include the following steps:
21. determining the gazing duration for which the target component is gazed at according to an eyeball tracking technology;
22. and when the gazing duration is longer than a preset duration, activating the hierarchical component of the target component.
The preset duration may be set by the user or by system default, and may be, for example, 1 s, 2 s, 3.3 s, or the like, which is not limited herein. In a specific implementation, the computer device may determine, according to the eyeball tracking technology, the gaze point of the user with respect to the target component and the gazing duration for which the gaze point stays in the area where the target component is located. When the gazing duration is greater than the preset duration, the target component is selected and its hierarchical component is activated; otherwise, when the gazing duration is less than or equal to the preset duration, the user is considered not to intend to select the target component, so the step of activating the hierarchical component of the target component need not be performed. In this way a contactless operation can be implemented through the eyeball tracking technology, and operation convenience is improved.
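As a hedged illustration of steps 21-22 (browsers expose no standard eye-tracking API, so `onGazeSample` below is a hypothetical hook for whatever gaze source the device provides), the dwell time of the gaze point inside the target component's area is accumulated and the hierarchical component is activated once it exceeds the preset duration:

```typescript
// Hedged sketch of steps 21-22: `onGazeSample` is a hypothetical eye-tracking hook;
// only the dwell-time bookkeeping is shown here.
type GazeSample = { x: number; y: number; timestampMs: number };

function activateOnGaze(
  target: HTMLElement,
  presetDurationMs: number,
  onGazeSample: (cb: (s: GazeSample) => void) => void,
  activate: () => void,
): void {
  let dwellMs = 0;
  let lastTs: number | null = null;
  let fired = false;

  onGazeSample((s) => {
    const r = target.getBoundingClientRect();
    const inside = s.x >= r.left && s.x <= r.right && s.y >= r.top && s.y <= r.bottom;
    if (inside && lastTs !== null) dwellMs += s.timestampMs - lastTs;
    if (!inside) dwellMs = 0;            // gaze left the component: restart the count
    lastTs = s.timestampMs;
    if (!fired && dwellMs > presetDurationMs) {
      fired = true;
      activate();                        // gazing duration exceeded the preset duration
    }
  });
}
```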
Optionally, in the step 102, activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the level of the hierarchical component to the highest level may be implemented as follows:
and when the click on the target component is detected, starting the activation state of the view DOM, starting the hierarchical state of the associated hierarchical component to be the highest hierarchy through the identification mapping relation, displaying the hierarchical component, and keeping the view DOM motionless.
In a specific implementation, when the computer device detects a click on the target component, it starts the activation state of the view DOM, starts the state of the associated hierarchical component through the identifier mapping relationship and adjusts its level to the highest level, displays the hierarchical component, and keeps the view DOM still, thereby preparing the conditions for the subsequent drag operation.
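Continuing the sketch above (names remain assumptions), activation on click looks roughly as follows: the view DOM's activation state is started, the bound hierarchy component is found through the identifier mapping, displayed, and raised to the highest level, while the view DOM itself is left untouched:

```typescript
// Sketch of step 102 under the same assumed structure.
let highestZ = 1;

function activateComponent(id: string): void {
  const pair = pairs.get(id);                   // identifier mapping from step 101
  if (!pair) return;
  pair.viewDom.dataset.active = 'true';         // activation state of the view DOM
  pair.layer.style.display = 'block';           // display the hierarchy component
  pair.layer.style.zIndex = String(++highestZ); // adjust its level to the highest
  // The view DOM itself is deliberately left untouched (position, size, level).
}

document.addEventListener('click', (e) => {
  const id = (e.target as HTMLElement).dataset?.componentId;
  if (id) activateComponent(id);
});
```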
103. And receiving a dragging instruction, and performing dragging operation on the target component so as to control the hierarchical component and the target component to execute synchronous movement operation.
The computer device can drag the target component, so that the hierarchical component and the target component can synchronously move, and the specific drag operation can be realized through a mouse or touch operation.
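A sketch of the drag itself (step 103) under the same assumed structure: pointer events are used here as one possible realization of mouse or touch input, and the hierarchy component and the view DOM of the target component are moved by the same delta so that they stay synchronized.

```typescript
// Sketch of step 103: both elements are assumed to be absolutely positioned
// inside the same parent, so the same delta keeps them synchronized.
function enableDrag(id: string): void {
  const pair = pairs.get(id);
  if (!pair) return;

  pair.layer.addEventListener('pointerdown', (down: PointerEvent) => {
    pair.layer.setPointerCapture(down.pointerId);
    const startX = down.clientX;
    const startY = down.clientY;
    const origin = { left: pair.layer.offsetLeft, top: pair.layer.offsetTop };

    const onMove = (move: PointerEvent) => {
      const dx = move.clientX - startX;
      const dy = move.clientY - startY;
      for (const el of [pair.layer, pair.viewDom]) {
        el.style.left = `${origin.left + dx}px`;   // synchronous movement
        el.style.top = `${origin.top + dy}px`;
      }
    };
    const onUp = () => {
      pair.layer.removeEventListener('pointermove', onMove);
      pair.layer.removeEventListener('pointerup', onUp);
    };
    pair.layer.addEventListener('pointermove', onMove);
    pair.layer.addEventListener('pointerup', onUp);
  });
}
```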
Optionally, in step 103, receiving the drag instruction may include:
31. acquiring a target touch parameter;
32. when the target touch parameters meet preset conditions, determining target dragging parameters according to the target touch parameters;
33. and generating the dragging instruction according to the target dragging parameter.
In this embodiment of the present application, the target touch parameter may be at least one of the following: a touch trajectory on the touch display screen, a touch force, a touch area, a touch duration, the number of touches, and the like, which are not limited herein; the touch trajectory in turn includes the track length, the track direction, and so on. The preset condition may be set by the user or by system default.
In a specific implementation, the computer device may also store preset touch parameters in advance, where a preset touch parameter may likewise be at least one of the following: a touch trajectory on the touch display screen, a touch force, a touch area, a touch duration, the number of touches, and the like. A mapping relationship between touch parameters and drag parameters may also be stored in the computer device in advance. A drag parameter may be at least one of the following: a drag direction, a drag rate, a drag trajectory, a drag duration, and the like, which are not limited herein.
Specifically, the computer device may obtain the target touch parameter through the touch display screen, and further may match the target touch parameter with a preset touch parameter, and confirm that the target touch parameter meets the preset condition when the target touch parameter is successfully matched with the preset touch parameter, or confirm that the target touch parameter does not meet the preset condition when the target touch parameter fails to be matched with the preset touch parameter. Further, when the target touch parameter meets the preset condition, the computer device may determine a target dragging parameter corresponding to the target touch parameter according to a mapping relation between the touch parameter and the dragging parameter, and generate a dragging instruction according to the target dragging parameter, so that the computer device may generate a corresponding dragging instruction following the touch operation to control the target component to move.
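The optional flow in steps 31-33 can be sketched as follows; the parameter fields, the preset thresholds, and the mapping rule from touch parameters to drag parameters are all assumptions used only to show the shape of the logic:

```typescript
// Sketch of steps 31-33 (all fields, thresholds, and the mapping rule are assumed).
interface TouchParams { trackLength: number; direction: number; durationMs: number }
interface DragParams  { direction: number; rate: number; durationMs: number }
interface DragInstruction { componentId: string; params: DragParams }

const presetTouch: TouchParams = { trackLength: 30, direction: 0, durationMs: 100 };

// Assumed mapping relation between touch parameters and drag parameters.
function toDragParams(t: TouchParams): DragParams {
  return {
    direction: t.direction,
    rate: t.trackLength / Math.max(t.durationMs, 1),
    durationMs: t.durationMs,
  };
}

function buildDragInstruction(componentId: string, t: TouchParams): DragInstruction | null {
  // "Meets the preset condition" is modelled here as exceeding the preset thresholds.
  const matches = t.trackLength >= presetTouch.trackLength && t.durationMs >= presetTouch.durationMs;
  if (!matches) return null;   // condition not met: no drag instruction is generated
  return { componentId, params: toDragParams(t) };
}
```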
104. At the end of the drag operation, the view DOM is left unchanged and the hierarchical component is hidden, and the level of the hierarchical component is adjusted to the lowest level.
In a specific implementation, the computer device may automatically deselect the target component when the drag operation ends. The view DOM remains unchanged, the hierarchical component is hidden, and its level is adjusted to the lowest level. For the temporary visual blocking error caused by dragging overlapping components, the effect of separating the component view from the component level thus resolves the visual blocking error.
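Step 104 then mirrors the activation sketch (again with assumed names): the view DOM is left exactly as it is, the hierarchy component is hidden, and its level is returned to the lowest value.

```typescript
// Sketch of step 104 under the same assumed structure.
function deactivateComponent(id: string): void {
  const pair = pairs.get(id);
  if (!pair) return;
  delete pair.viewDom.dataset.active;   // cancel the activation state via the id mapping
  pair.layer.style.display = 'none';    // hide the hierarchy component
  pair.layer.style.zIndex = '0';        // adjust its level back to the lowest
  // The view DOM is unchanged, so no visual jump or blocking occurs.
}
```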
The embodiment of the application changes the idea that dragging and hierarchy must be bound to each other. After the hierarchy is separated, a component can be dragged even when components overlap, and the visual blocking error that otherwise occurs when a component overlapping a lower-level component is dragged is avoided. As a result, dragging the bottom component no longer requires raising and then restoring its level, so no visual blocking appears; when dragging is used for real-time visual interaction, the occlusion between upper and lower components caused by level operations is eliminated, the user experiences no visual blocking when using the function, the operation is smoother, and the original level relationship does not need to be changed.
Optionally, the method further comprises the following steps:
a1, detecting whether a drag event aiming at the target component occurs in a preset time period;
a2, the step of keeping the view DOM unchanged and hiding the hierarchical component is executed when a drag event aiming at the target component does not occur within the preset time period.
The preset time period may be set by the user or default by the system, and the preset time period may be 1s, 2s, 3s, 4s, or the like, which is not limited herein.
In a specific implementation, the computer device may detect whether a drag event for the target component occurs within a preset time period. If no drag event for the target component occurs within the preset time period, the user is considered to no longer be dragging the target component, and the step of keeping the view DOM unchanged and hiding the hierarchical component may be performed; otherwise, the user is considered to be continuing the drag, and the hierarchical component and the target component can be controlled to move synchronously. In this way, the operation flexibility for the target component can be improved.
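Steps A1-A2 can be approximated with an idle timer, reusing the `deactivateComponent` sketch above; the preset period and the wiring are assumptions:

```typescript
// Sketch of steps A1-A2: the timer fires after the preset period unless a drag
// event for the component resets it.
function armIdleTimer(id: string, presetPeriodMs: number): () => void {
  let timer = window.setTimeout(() => deactivateComponent(id), presetPeriodMs);
  // Call the returned function from the drag handler whenever a drag event occurs.
  return () => {
    clearTimeout(timer);
    timer = window.setTimeout(() => deactivateComponent(id), presetPeriodMs);
  };
}
```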
By way of illustration, a computer device can create a new component in two steps: first, create a view DOM for the component for view presentation; second, create a hierarchy component, which is an empty div with the same size as the view DOM, place it at the lowest level, and hide it. At the same time, a binding id is assigned to the two parts (view DOM and hierarchy component) so that they correspond to each other through a mapping relationship.
When the new component is activated, the view DOM is kept still, the hierarchy component is displayed, and the hierarchy component's own level is changed to the highest level among all components. When the component is dragged, the view DOM stays put while the hierarchy component is what is actually dragged and follows the changing drag position. When the new component is deselected, the view DOM is unchanged, the hierarchy component is hidden, and its level is adjusted so that it sits at the lowest level among all component levels.
This is how the effect of separating hierarchy and view emerges. On its own, however, the state of the view DOM never changes: the user drags the component but sees no change. The view DOM must therefore respond along with the hierarchy component. An id mapping between the view DOM and the hierarchy component is established; when the component is clicked, the activation state of the view DOM is started, the state of the associated hierarchy component is started through the id mapping, and the associated drag events are bound; when the hierarchy component is dragged, mouse or screen monitoring changes the state of the view DOM, achieving a real-time state response; and when the selection is cancelled, the activation states of both the view DOM and the hierarchy component are cancelled through the id mapping. This resolves the temporary visual blocking caused by dragging the bottom one of two overlapping components.
Optionally, before the step 101, the method may further include the following steps:
b1, acquiring a first face image;
b2, dividing the first face image into a plurality of areas;
b3, determining the distribution density of the characteristic points of each of the plurality of areas to obtain a distribution density set of the characteristic points, wherein each area corresponds to one distribution density of the characteristic points;
B4, determining a target average value and a target mean square error corresponding to the characteristic point distribution density set;
b5, determining a target image enhancement algorithm corresponding to the target average value according to a mapping relation between the preset average value and the image enhancement algorithm;
b6, determining a target fine adjustment coefficient corresponding to the target mean square error according to a mapping relation between the preset mean square error and the fine adjustment coefficient;
b7, adjusting algorithm control parameters of the target image enhancement algorithm according to the target fine adjustment coefficient to obtain target algorithm control parameters;
b8, carrying out image enhancement processing on the first face image according to the target algorithm control parameters and the target image enhancement algorithm to obtain a second face image;
b9, matching the second face image with a preset face template;
and B10, executing a step 101 when the second face image is successfully matched with the preset face template.
In this embodiment of the present application, a preset face template may be stored in the computer device in advance. The image enhancement algorithm may be at least one of the following: gamma correction, histogram equalization, image sharpening, wavelet transformation, gray stretching, and the like, which are not limited herein. Each image enhancement algorithm corresponds to an algorithm control parameter, and the algorithm control parameter is used to control the degree of image enhancement. The mapping relationship between the preset average value and the image enhancement algorithm and the mapping relationship between the preset mean square error and the fine-tuning coefficient may both be stored in the computer device in advance.
In a specific implementation, the computer device may acquire the first face image and divide it into a plurality of areas. The plurality of areas may be understood as more than three areas; the size of each area falls within a preset area range, the areas may be the same size or different sizes, and the preset area range may be set by the user or by system default. The computer device may determine the feature point distribution density of each of the plurality of areas to obtain a feature point distribution density set, which contains one feature point distribution density per area: the number of feature points of each area and the corresponding area size may be determined, and the ratio between the number of feature points and the corresponding area size is taken as that area's feature point distribution density. The computer device may then determine a target average value and a target mean square error corresponding to the feature point distribution density set, where the target average value = the total number of feature points / the number of areas corresponding to the feature point distribution density set, and the target mean square error corresponding to the feature point distribution density set may be determined based on the target average value and the feature point distribution density set. The average value reflects the overall characteristic of the image, and the mean square error reflects the correlation among the areas; combining the overall characteristic of the image with the correlation among the areas makes it possible to select a suitable image enhancement algorithm and the corresponding algorithm control parameter, which improves the image enhancement effect, that is, the quality of the face image.
Further, the computer device may determine the target image enhancement algorithm corresponding to the target average value according to the mapping relationship between the preset average value and the image enhancement algorithm, and determine the target fine-tuning coefficient corresponding to the target mean square error according to the mapping relationship between the preset mean square error and the fine-tuning coefficient. The computer device may then adjust the algorithm control parameter of the target image enhancement algorithm according to the target fine-tuning coefficient to obtain the target algorithm control parameter, and perform image enhancement processing on the first face image according to the target algorithm control parameter and the target image enhancement algorithm to obtain a second face image. Because the second face image has been enhanced, the computer device may match it against the preset face template; when the second face image is successfully matched with the preset face template, step 101 is executed, and otherwise the user may be prompted to input another face image. In this way, face recognition efficiency can be improved, face unlocking can be implemented, and the subsequent operations are performed only when unlocking succeeds, which helps improve the security of the computer device.
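As a hedged sketch of steps B3-B7 (the feature point extraction and the actual enhancement algorithms are out of scope here, and the mapping tables and thresholds below are invented for illustration), the per-region density set, the target average value, and the target mean square error are computed following the definitions in the description above, and the enhancement algorithm and fine-tuning coefficient are then looked up:

```typescript
// Illustrative sketch of steps B3-B7 (mapping tables and thresholds are assumptions).
interface Region { featurePointCount: number; area: number }

function analyseRegions(regions: Region[]): { mean: number; mse: number } {
  // B3: feature point distribution density of each region = point count / region area.
  const densities = regions.map((r) => r.featurePointCount / r.area);
  // B4: target average value = total number of feature points / number of regions,
  //     and target mean square error computed from the density set and that average,
  //     following the definitions given in the description above.
  const totalPoints = regions.reduce((sum, r) => sum + r.featurePointCount, 0);
  const mean = totalPoints / regions.length;
  const mse = densities.reduce((sum, d) => sum + (d - mean) ** 2, 0) / densities.length;
  return { mean, mse };
}

// B5/B6: preset mappings; the entries and thresholds here are purely illustrative.
const algorithmByMean: Array<[limit: number, algorithm: string]> = [
  [50, 'histogram equalization'],
  [200, 'gamma correction'],
  [Infinity, 'image sharpening'],
];
const coefficientByMse: Array<[limit: number, coefficient: number]> = [
  [10, 0.9],
  [100, 1.0],
  [Infinity, 1.1],
];

function selectEnhancement(regions: Region[]) {
  const { mean, mse } = analyseRegions(regions);
  const targetAlgorithm = algorithmByMean.find(([limit]) => mean <= limit)![1];
  const fineTuning = coefficientByMse.find(([limit]) => mse <= limit)![1];
  // B7: the fine-tuning coefficient scales the algorithm's control parameter.
  return { targetAlgorithm, fineTuning };
}
```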
It can be seen that, in the component operation control method described in the embodiments of the present application, a view DOM and a hierarchy component are created for a target component: the view DOM is used for view display, while the hierarchy component is an empty element with the same position and size as the view DOM that sits at the lowest level in a hidden state. The hierarchy component of the target component is activated, the view DOM is kept still, the hierarchy component is displayed, and its level is raised to the highest level. A drag instruction is received and a drag operation is performed on the target component, so that the hierarchy component and the target component move synchronously. When the drag operation ends, the view DOM is kept unchanged, the hierarchy component is hidden, and its level is returned to the lowest level.
In accordance with the embodiment shown in fig. 1, please refer to fig. 2, fig. 2 is a flow chart of a method for controlling operation of a component according to an embodiment of the present application, as shown in the drawing, the method for controlling operation of a component includes:
201. A view DOM for view presentation and a hierarchy component are created for a target component; the hierarchy component is an empty element with the same position and size as the view DOM, sits at the lowest hierarchy, and is in a hidden state.
202. The gazing duration for which the target component is gazed at is determined according to the eyeball tracking technology.
203. When the gazing duration is longer than the preset duration, the hierarchical component of the target component is activated, the view DOM is kept still, the hierarchical component is displayed, and the level of the hierarchical component is adjusted to the highest level.
204. And receiving a dragging instruction, and performing dragging operation on the target component so as to control the hierarchical component and the target component to execute synchronous movement operation.
205. At the end of the drag operation, the view DOM is left unchanged and the hierarchical component is hidden, and the level of the hierarchical component is adjusted to the lowest level.
The specific descriptions of the steps 201 to 205 may refer to the corresponding steps described in fig. 1, and are not repeated herein.
It can be seen that, in the component operation control method described in this embodiment of the present application, once the hierarchy is separated from the view (view DOM and hierarchical component), components can be dragged even when they overlap, and the visual blocking error that otherwise occurs when a component overlapping a lower-level component is dragged is avoided.
In accordance with the above embodiments, referring to fig. 3, fig. 3 is a schematic structural diagram of a computer device provided in an embodiment of the present application, and as shown in the fig. 3, the computer device includes a processor, a memory, a communication interface, and one or more computer programs, where the one or more computer programs are stored in the memory and configured to be executed by the processor, and in the embodiment of the present application, the programs include instructions for performing the following steps:
creating, for a target component, a view DOM and a hierarchy component, wherein the view DOM is used for view display, the hierarchy component is an empty element with the same position and size as the view DOM, and the hierarchy component is at the lowest hierarchy and is in a hidden state;
activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the level of the hierarchical component to the highest level;
receiving a dragging instruction, and performing dragging operation on the target component so as to control the hierarchical component and the target component to perform synchronous movement operation;
at the end of the drag operation, the view DOM is left unchanged and the hierarchical component is hidden, and the level of the hierarchical component is adjusted to the lowest level.
Optionally, in the receiving drag instruction, the program includes instructions for performing the steps of:
acquiring a target touch parameter;
when the target touch parameters meet preset conditions, determining target dragging parameters according to the target touch parameters;
and generating the dragging instruction according to the target dragging parameter.
Optionally, in said activating said hierarchical component of said target component, the above program comprises instructions for:
determining the gazing duration for which the target component is gazed at according to an eyeball tracking technology;
and when the gazing duration is longer than a preset duration, activating the hierarchical component of the target component.
Optionally, an identifier mapping relationship is formed between the identifier of the view DOM and the identifier of the hierarchical component.
Optionally, in the activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the level of the hierarchical component to the highest level, the program includes instructions for performing the steps of:
and when the click on the target component is detected, starting the activation state of the view DOM, starting the hierarchical state of the associated hierarchical component to be the highest hierarchy through the identification mapping relation, displaying the hierarchical component, and keeping the view DOM motionless.
Optionally, the above program further comprises instructions for performing the steps of:
detecting whether a drag event aiming at the target component occurs in a preset time period;
and executing the step of keeping the view DOM unchanged and hiding the hierarchical component when a drag event for the target component does not occur within the preset time period.
Optionally, before the creating the view DOM and the hierarchical component for the target component, the program further comprises instructions for:
acquiring a first face image;
dividing the first face image into a plurality of areas;
determining the distribution density of the characteristic points of each region in the plurality of regions to obtain a distribution density set of the characteristic points, wherein each region corresponds to one distribution density of the characteristic points;
determining a target average value and a target mean square error corresponding to the characteristic point distribution density set;
according to a mapping relation between a preset average value and an image enhancement algorithm, determining a target image enhancement algorithm corresponding to the target average value;
determining a target fine tuning coefficient corresponding to the target mean square error according to a mapping relation between the preset mean square error and the fine tuning coefficient;
adjusting algorithm control parameters of the target image enhancement algorithm according to the target fine adjustment coefficient to obtain target algorithm control parameters;
Performing image enhancement processing on the first face image according to the target algorithm control parameters and the target image enhancement algorithm to obtain a second face image;
matching the second face image with a preset face template;
and when the second face image is successfully matched with the preset face template, executing the step of creating the view DOM and the hierarchical component aiming at the target component.
The foregoing description of the embodiments of the present application has been presented primarily in terms of a method-side implementation. It will be appreciated that the computer device, in order to carry out the functions described above, comprises corresponding hardware structures and/or software modules that perform the respective functions. Those of skill in the art will readily appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application may divide the functional units of the computer device according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
Fig. 4A is a functional unit block diagram of the component operation control apparatus 400 related in the embodiment of the present application. The assembly operation control device 400 is applied to a computer apparatus, and the device includes: a creation unit 401, a component control unit 402, and a drag unit 403, wherein,
the creating unit 401 is configured to create, for a target component, a view DOM, and a hierarchy component, where the view DOM is used for view exhibition, the hierarchy component is an empty element with the same position and size as the view DOM, and the hierarchy component is at the lowest hierarchy and is in a hidden state;
the component control unit 402 is configured to activate the hierarchical component of the target component, keep the view DOM motionless, display the hierarchical component, and adjust the level of the hierarchical component to the highest level;
The drag unit 403 is configured to receive a drag instruction, and perform a drag operation on the target component, so as to control the hierarchical component and the target component to perform a synchronous movement operation;
the component control unit 402 is further configured to, at the end of the drag operation, keep the view DOM unchanged and hide the hierarchical component, and adjust the level of the hierarchical component to the lowest level.
It can be seen that the component operation control device described in the embodiments of the present application creates, for a target component, a view DOM and a hierarchy component: the view DOM is used for view display, while the hierarchy component is an empty element with the same position and size as the view DOM that sits at the lowest level in a hidden state. The device activates the hierarchy component of the target component, keeps the view DOM still, displays the hierarchy component, and raises its level to the highest level. A drag operation is performed on the target component so that the hierarchy component and the target component move synchronously. When the drag operation ends, the view DOM is kept unchanged, the hierarchy component is hidden, and its level is returned to the lowest level.
Optionally, in the aspect of receiving a drag instruction and performing a drag operation on the target component, the drag unit 403 is specifically configured to:
acquiring a target touch parameter;
when the target touch parameters meet preset conditions, determining target dragging parameters according to the target touch parameters;
and generating the dragging instruction according to the target dragging parameter.
Optionally, in terms of the activating the hierarchical component of the target component, the component control unit 402 is specifically configured to:
determining the gazing duration for which the target component is gazed at according to an eyeball tracking technology;
and when the gazing duration is longer than a preset duration, activating the hierarchical component of the target component.
Optionally, an identifier mapping relationship is formed between the identifier of the view DOM and the identifier of the hierarchical component.
Optionally, in the aspect that the hierarchical component of the target component is activated, the view DOM is kept still, the hierarchical component is displayed, and the level of the hierarchical component is adjusted to the highest level, the component control unit 402 is specifically configured to:
and when the click on the target component is detected, starting the activation state of the view DOM, starting the hierarchical state of the associated hierarchical component to be the highest hierarchy through the identification mapping relation, displaying the hierarchical component, and keeping the view DOM motionless.
Alternatively, as shown in fig. 4B, fig. 4B is a further modified structure of the component operation control device shown in fig. 4A, which may further include a detection unit 404 as compared with fig. 4A, specifically as follows:
the detecting unit 404 is configured to detect whether a drag event for the target component occurs within a preset period of time;
the step of keeping the view DOM unchanged and hiding the hierarchical component is performed by the component control unit 402 in the case that a drag event for the target component does not occur within the preset period of time.
Alternatively, as shown in fig. 4C, fig. 4C is a further modified structure of the component operation control device shown in fig. 4A, which may further include, in comparison with fig. 4A: an acquisition unit 405, a division unit 406, a determination unit 407, an adjustment unit 408, an image enhancement processing unit 409, a matching unit 410, specifically as follows:
the acquiring unit 405 is configured to acquire a first face image;
the dividing unit 406 is configured to divide the first face image into a plurality of areas;
the determining unit 407 is configured to determine a feature point distribution density of each of the plurality of regions, to obtain a feature point distribution density set, where each region corresponds to a feature point distribution density; determining a target average value and a target mean square error corresponding to the characteristic point distribution density set; according to a mapping relation between a preset average value and an image enhancement algorithm, determining a target image enhancement algorithm corresponding to the target average value; determining a target fine tuning coefficient corresponding to the target mean square error according to a mapping relation between the preset mean square error and the fine tuning coefficient;
The adjusting unit 408 is configured to adjust the algorithm control parameter of the target image enhancement algorithm according to the target fine tuning coefficient to obtain a target algorithm control parameter;
the image enhancement processing unit 409 is configured to perform image enhancement processing on the first face image according to the target algorithm control parameter and the target image enhancement algorithm, so as to obtain a second face image;
the matching unit 410 is configured to match the second face image with a preset face template;
and when the second face image is successfully matched with the preset face template, the creating unit 401 executes the step of creating the view DOM and the hierarchical component for the target component.
It may be understood that the functions of each program module of the component operation control device of the present embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the relevant description of the foregoing method embodiment, which is not repeated herein.
The present application also provides a computer storage medium storing a computer program for electronic data exchange, where the computer program causes a computer to execute some or all of the steps of any one of the methods described in the method embodiments, where the computer includes a computer device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the methods described in the method embodiments above. The computer program product may be a software installation package, said computer comprising a computer device.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of units is merely a division by logical function, and there may be other division manners in actual implementation, such as combining or integrating multiple units or components into another system, or omitting or not performing some features. In addition, the coupling, direct coupling or communication connection shown or discussed between components may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units described above are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the various embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the various methods of the above embodiments may be implemented by a program instructing associated hardware, and the program may be stored in a computer-readable memory, which may include a flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The embodiments of the present application have been described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (8)

1. A method of controlling operation of a component, the method comprising:
creating a view DOM and a hierarchical component for a target component, wherein the view DOM is used for view display, the hierarchical component is an empty element with the same position and size as the view DOM, and the hierarchical component is at the lowest hierarchy and in a hidden state;
determining, according to an eye tracking technology, a gazing duration for which the target component is gazed at; when the gazing duration is longer than a preset duration, activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the hierarchy of the hierarchical component to the highest hierarchy;
acquiring a target touch parameter, determining a target dragging parameter according to the target touch parameter when the target touch parameter meets a preset condition, generating a dragging instruction according to the target dragging parameter, and performing a dragging operation on the target component so as to control the hierarchical component and the target component to execute a synchronous moving operation;
when the drag operation ends, keeping the view DOM motionless, hiding the hierarchical component, and adjusting the hierarchy of the hierarchical component to the lowest hierarchy.
2. The method of claim 1, wherein an identification mapping relation exists between an identification of the view DOM and an identification of the hierarchical component.
3. The method of claim 2, wherein the activating the hierarchical component of the target component, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the hierarchy of the hierarchical component to the highest hierarchy comprises:
when a click on the target component is detected, enabling an activation state of the view DOM, setting, through the identification mapping relation, the hierarchy state of the associated hierarchical component to the highest hierarchy, displaying the hierarchical component, and keeping the view DOM motionless.
4. The method according to claim 1, wherein the method further comprises:
detecting whether a drag event aiming at the target component occurs in a preset time period;
and when no drag event for the target component occurs within the preset time period, executing the step of keeping the view DOM motionless and hiding the hierarchical component.
5. The method of claim 1, wherein before the creating the view DOM and the hierarchical component for the target component, the method further comprises:
acquiring a first face image;
dividing the first face image into a plurality of areas;
determining a feature point distribution density of each region in the plurality of regions to obtain a feature point distribution density set, wherein each region corresponds to one feature point distribution density;
determining a target average value and a target mean square error corresponding to the feature point distribution density set;
determining, according to a preset mapping relation between average values and image enhancement algorithms, a target image enhancement algorithm corresponding to the target average value;
determining, according to a preset mapping relation between mean square errors and fine tuning coefficients, a target fine tuning coefficient corresponding to the target mean square error;
adjusting algorithm control parameters of the target image enhancement algorithm according to the target fine tuning coefficient to obtain target algorithm control parameters;
performing image enhancement processing on the first face image according to the target algorithm control parameters and the target image enhancement algorithm to obtain a second face image;
matching the second face image with a preset face template;
and when the second face image is successfully matched with the preset face template, executing the step of creating the view DOM and the hierarchical component aiming at the target component.
6. A component operation control device, characterized in that the device comprises: a creating unit, a component control unit and a dragging unit, wherein,
the creating unit is used for creating a view DOM and a hierarchical component for a target component, wherein the view DOM is used for view display, the hierarchical component is an empty element with the same position and size as the view DOM, and the hierarchical component is at the lowest hierarchy and in a hidden state;
the component control unit is used for determining, according to an eye tracking technology, a gazing duration for which the target component is gazed at, activating the hierarchical component of the target component when the gazing duration is longer than a preset duration, keeping the view DOM motionless, displaying the hierarchical component, and adjusting the hierarchy of the hierarchical component to the highest hierarchy;
the dragging unit is used for acquiring a target touch parameter, determining a target dragging parameter according to the target touch parameter when the target touch parameter meets a preset condition, generating a dragging instruction according to the target dragging parameter, and performing a dragging operation on the target component so as to control the hierarchical component and the target component to execute a synchronous moving operation;
the component control unit is further configured to, when the drag operation ends, keep the view DOM motionless, hide the hierarchical component, and adjust the hierarchy of the hierarchical component to the lowest hierarchy.
7. A computer device, comprising a processor and a memory, wherein the memory is configured to store one or more computer programs, the one or more computer programs are configured to be executed by the processor, and the computer programs comprise instructions for performing the steps in the method of any one of claims 1-5.
8. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any of claims 1-5.
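To help picture the hierarchical-component mechanism recited in claims 1 and 6, the following is a minimal browser-side TypeScript sketch: an empty overlay element matching the view DOM's position and size is created in a hidden, lowest-hierarchy state, raised and shown on activation, moved in step with the target component during a drag, and hidden again afterwards. The style properties, z-index values and function names are illustrative assumptions, not the claimed implementation; the gaze-duration trigger and touch-parameter handling are omitted.

// Minimal illustrative sketch of a hidden overlay ("hierarchical component") paired with a view element.
// All ids, z-index values and helper names are hypothetical.
function createHierarchicalOverlay(viewDom: HTMLElement): HTMLElement {
  const overlay = document.createElement('div');            // empty element, no content
  const rect = viewDom.getBoundingClientRect();
  Object.assign(overlay.style, {
    position: 'absolute',
    left: `${rect.left + window.scrollX}px`,                 // same position as the view DOM
    top: `${rect.top + window.scrollY}px`,
    width: `${rect.width}px`,                                 // same size as the view DOM
    height: `${rect.height}px`,
    zIndex: '-1',                                             // lowest hierarchy
    visibility: 'hidden',                                     // hidden state
  });
  document.body.appendChild(overlay);
  return overlay;
}

// Activation: display the overlay and raise it to the highest hierarchy; the view DOM is not touched.
function activateOverlay(overlay: HTMLElement): void {
  overlay.style.visibility = 'visible';
  overlay.style.zIndex = '9999';
}

// Synchronous movement: apply the same offset to the overlay and the target component.
function moveSynchronously(overlay: HTMLElement, target: HTMLElement, dx: number, dy: number): void {
  for (const el of [overlay, target]) {
    el.style.transform = `translate(${dx}px, ${dy}px)`;
  }
}

// End of drag: hide the overlay and return it to the lowest hierarchy; the view DOM stays in place.
function deactivateOverlay(overlay: HTMLElement): void {
  overlay.style.visibility = 'hidden';
  overlay.style.zIndex = '-1';
}

In such a sketch, activateOverlay would be called once the gazing duration exceeds the preset duration, moveSynchronously would be called from the drag handler, and deactivateOverlay would be called when the drag operation ends.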
CN202010888940.4A 2020-08-28 2020-08-28 Component operation control method, device and storage medium Active CN111913637B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010888940.4A CN111913637B (en) 2020-08-28 2020-08-28 Component operation control method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010888940.4A CN111913637B (en) 2020-08-28 2020-08-28 Component operation control method, device and storage medium

Publications (2)

Publication Number Publication Date
CN111913637A CN111913637A (en) 2020-11-10
CN111913637B true CN111913637B (en) 2024-01-02

Family

ID=73267664

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010888940.4A Active CN111913637B (en) 2020-08-28 2020-08-28 Component operation control method, device and storage medium

Country Status (1)

Country Link
CN (1) CN111913637B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112307402B (en) * 2020-11-11 2024-01-26 北京指掌易科技有限公司 Webpage component creation method and device, electronic equipment and readable storage medium
CN112462991A (en) * 2020-11-27 2021-03-09 广州视源电子科技股份有限公司 Control method of intelligent interactive tablet, storage medium and related equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017041632A1 (en) * 2015-09-07 2017-03-16 阿里巴巴集团控股有限公司 Method and apparatus for transferring data in display page
CN107885437A (en) * 2017-11-29 2018-04-06 广州视源电子科技股份有限公司 Multielement exchange method, device, equipment and storage medium
CN109116983A (en) * 2018-07-23 2019-01-01 Oppo广东移动通信有限公司 Method for controlling mobile terminal, device, mobile terminal and computer-readable medium
CN110471656A (en) * 2018-05-10 2019-11-19 北京京东尚科信息技术有限公司 The method of adjustment and device of component level
CN111258569A (en) * 2020-01-09 2020-06-09 卓望数码技术(深圳)有限公司 Webpage component editing method, device, equipment and computer readable storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017041632A1 (en) * 2015-09-07 2017-03-16 阿里巴巴集团控股有限公司 Method and apparatus for transferring data in display page
CN107885437A (en) * 2017-11-29 2018-04-06 广州视源电子科技股份有限公司 Multielement exchange method, device, equipment and storage medium
CN110471656A (en) * 2018-05-10 2019-11-19 北京京东尚科信息技术有限公司 The method of adjustment and device of component level
CN109116983A (en) * 2018-07-23 2019-01-01 Oppo广东移动通信有限公司 Method for controlling mobile terminal, device, mobile terminal and computer-readable medium
CN111258569A (en) * 2020-01-09 2020-06-09 卓望数码技术(深圳)有限公司 Webpage component editing method, device, equipment and computer readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of a Web-based Visual Form Generation Engine; 宋奕爽; 刘绍华; 软件 (Software), No. 12; full text *

Also Published As

Publication number Publication date
CN111913637A (en) 2020-11-10

Similar Documents

Publication Publication Date Title
KR102301599B1 (en) Implementation of biometric authentication
KR102185854B1 (en) Implementation of biometric authentication
CN107977141B (en) Interaction control method and device, electronic equipment and storage medium
US20170344202A1 (en) Methods and apparatuses for window display, and methods and apparatuses for touch-operating an application
KR20200043356A (en) User termincal device for supporting user interaxion and methods thereof
RU2599536C2 (en) User interface interaction method and apparatus applied in touchscreen device, and touchscreen device
KR102103866B1 (en) Implementation of biometric authentication
US8982061B2 (en) Angular contact geometry
CN107704177A (en) interface display method, device and terminal
CN111913637B (en) Component operation control method, device and storage medium
US20040113888A1 (en) Cursor locator for multi-monitor systems
US20130234957A1 (en) Information processing apparatus and information processing method
CN110347317A (en) A kind of windows switching method, device, storage medium and interactive intelligent tablet computer
KR102127387B1 (en) Mobile terminal and screen scroll method thereof
US20150063785A1 (en) Method of overlappingly displaying visual object on video, storage medium, and electronic device
CN109388472B (en) Wallpaper display method and device and electronic equipment
CN113655929A (en) Interface display adaptation processing method and device and electronic equipment
CN115047976A (en) Multi-level AR display method and device based on user interaction and electronic equipment
CN107239222A (en) The control method and terminal device of a kind of touch-screen
CN105094503B (en) Information processing method and deformable electronic equipment
CN113946250A (en) Folder display method and device and electronic equipment
CN107437269A (en) A kind of method and device for handling picture
KR20150090698A (en) Method and apparatus of managing objects on wallpaper
CN112947805A (en) Icon sorting method and device
CN111079119A (en) Verification method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant