CN113986067A - Object control method, device, equipment and storage medium - Google Patents


Info

Publication number
CN113986067A
Authority
CN
China
Prior art keywords
display object
input
parameter
moving
complexity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111277576.9A
Other languages
Chinese (zh)
Inventor
何军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202111277576.9A priority Critical patent/CN113986067A/en
Publication of CN113986067A publication Critical patent/CN113986067A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Abstract

The application discloses an object control method, device, equipment and storage medium, belonging to the technical field of communication. The object control method includes: when a first input to a display object is received, acquiring a parameter of the display object, where the parameter includes at least one of the size, content complexity, and color complexity of the display object, and the first input is used to move the display object; moving the display object based on the parameter; and, when the end of the first input is detected, continuing to move the display object based on the parameter.

Description

Object control method, device, equipment and storage medium
Technical Field
The application belongs to the technical field of terminal display, and particularly relates to an object control method, device, equipment and storage medium.
Background
With the continuous development of the mobile internet and computer technology, electronic devices provide movable controls on the screen for users to operate, satisfying users' demand for interaction.
In the related art, a user can move a control only by long-pressing and dragging it; this single movement mode results in a poor user experience.
Disclosure of Invention
The embodiments of the application aim to provide an object control method, device, equipment and storage medium that can solve the problems in the related art of a single control movement mode and poor user experience.
In a first aspect, an embodiment of the present application provides an object control method. The method includes: when a first input to a display object is received, acquiring a parameter of the display object, where the parameter includes at least one of the size, content complexity, and color complexity of the display object, and the first input is used to move the display object; moving the display object based on the parameter; and, when the end of the first input is detected, continuing to move the display object based on the parameter.
In a second aspect, an embodiment of the present application provides an object control apparatus, including: an acquisition module, configured to acquire a parameter of a display object when a first input to the display object is received, where the parameter includes at least one of the size, content complexity, and color complexity of the display object, and the first input is used to move the display object; and a moving module, configured to move the display object based on the parameter and, when the end of the first input is detected, continue to move the display object based on the parameter.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the object control method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the object control method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the object control method according to the first aspect.
In the embodiments of the application, the electronic device can receive a first input for moving a display object, acquire at least one of the size, content complexity, and color complexity of the display object, control the display object to move through the first input, and, when the first input ends, continue to move the display object based on the at least one of the size, content complexity, and color complexity. Thus, upon receiving an operation in which a user drags a control, the electronic device can obtain the control's size, content complexity, and color complexity, and after detecting that the user has released the control, continue to move it based on those parameters. This provides an innovative control movement mode that can meet users' diversified and personalized needs and effectively improves the user experience.
Drawings
Fig. 1 is a schematic flowchart of an object control method provided in an embodiment of the present application;
FIG. 2 is one of schematic diagrams of an example of an object control interface provided by an embodiment of the present application;
FIG. 3 is a second schematic diagram of an example of an object control interface provided by an embodiment of the present application;
FIG. 4 is a third schematic diagram of an example of an object control interface provided by an embodiment of the present application;
fig. 5 is a second schematic flowchart of an object control method according to an embodiment of the present application;
fig. 6 is a third schematic flowchart of an object control method according to an embodiment of the present application;
FIG. 7 is a fourth illustration of an example of an object control interface provided by an embodiment of the present application;
FIG. 8 is a fifth illustration of an example of an object control interface provided by an embodiment of the present application;
fig. 9 is a schematic structural diagram of an object control apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 11 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in those embodiments. Obviously, the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein fall within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequence or chronological order. It should be understood that terms so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. The terms "first", "second", and the like are generally used in a generic sense and do not limit the number of objects; for example, a first object can be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
As described in the background, in the related art a user can move a control only by long-pressing and dragging it; this single movement mode results in a poor user experience.
In view of these problems, embodiments of the present application provide an object control method in which the electronic device can receive a first input for moving a display object, acquire at least one of the size, content complexity, and color complexity of the display object, control the display object to move through the first input, and, when the first input ends, continue to move the display object based on the at least one of the size, content complexity, and color complexity. Thus, upon receiving an operation in which a user drags a control, the electronic device can obtain the control's size, content complexity, and color complexity, and after detecting that the user has released the control, continue to move it based on those parameters.
The object control method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Fig. 1 is a schematic flowchart of an object control method provided in an embodiment of the present application. The method may be executed by an electronic device; this executing entity does not constitute a limitation of the present application.
As shown in fig. 1, the object control method provided in the embodiment of the present application may include step 110 and step 120.
In step 110, in a case that a first input to the display object is received, a parameter of the display object is obtained, where the parameter may include at least one of a size, a content complexity, and a color complexity of the display object.
The display object can be any movable interface element displayed on the interface of the electronic device, such as a control, an icon, a page, or page content.
The first input is used to move the display object, and the first input may be a slide input, a drag input, or the like of the user on the display object for moving the display object.
In one example, the display object may be a page, which may be a hover window, as shown in fig. 2, and the first input may be an input by a user dragging hover window 201 in direction 1. In another example, the display object may be an application icon, and as shown in fig. 3, the first input may be an input in which the user long-presses the icon 301 and slides the icon 301 in the direction 2.
In one embodiment, the parameters of the display object may be used to characterize the quality of the display object.
In step 120, the display object is moved based on the parameter, and when the end of the first input is detected, the display object continues to be moved based on the parameter.
The electronic device can control the display object to move by receiving the first input, and detects that the first input is ended when the user releases the display object.
As a specific example, the first input may be an input that a user presses and drags the floating window 201 shown in fig. 2 for a long time, during the dragging process of the user, the floating window 201 may move correspondingly along with a finger of the user, and in a case that the user releases the floating window 201, the electronic device detects that the first input that the user moves the floating window 201 is ended, at this time, the floating window 201 may continue to be moved based on at least one of the size, the content complexity, and the color complexity of the floating window 201. After the user moves the icon 301 shown in fig. 3 from the position a to the position B through the first input, in a case where the user releases the icon 301, the electronic device detects that the first input of the user to move the icon 301 is ended, at which time the icon 301 may be continuously moved based on at least one of the size, content complexity, and color complexity of the icon 301. Since the size, content complexity, and color complexity of the floating window 201 and the icon 301 are different, the moving effect of the two will be different.
According to the object control method provided by the embodiments of the application, the electronic device can receive a first input for moving a display object, obtain at least one of the size, content complexity, and color complexity of the display object, and, when the first input ends, continue to move the display object based on that parameter. Thus, upon receiving an operation in which a user drags a control, the electronic device can obtain the control's size, content complexity, and color complexity, and after detecting that the user has released the control, continue to move it based on those parameters. This provides an innovative control movement mode that can meet users' diversified and personalized needs and effectively improves the user experience.
In some embodiments of the present application, the size of the display object may be a display area of the display object, and the content complexity of the display object may be determined based on at least one of the number, category, or number of categories of interface elements included in the display object.
In one embodiment, the content complexity of the display object may be the number of interface elements it contains. For example, if the display object contains 3 controls and 2 pages, its complexity may be 5.
In another embodiment, the content complexity of the display object may be the sum of the number of interface elements it contains and the number of their categories. For example, if the display object contains 3 controls and 2 pages, its complexity may be 7.
In another embodiment, the content complexity of the display object may be the sum of the preset complexities of all interface elements in the display object, where different categories of interface elements have different preset complexities. For example, if the display object contains 3 controls and 2 pages, the preset complexity of a control is 1, and the preset complexity of a page is 3, then the complexity of the display object may be 3 × 1 + 2 × 3 = 9.
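The three content-complexity schemes above can be sketched as follows. This is a hypothetical illustration: the category names and preset complexities mirror the "3 controls and 2 pages" examples in the text and are not values fixed by the application.

```python
# Sketch of the three content-complexity schemes described above.
# The category names and preset complexities mirror the text's
# examples (3 controls, 2 pages) and are otherwise assumptions.

def complexity_by_count(elements):
    """Scheme 1: total number of interface elements."""
    return sum(elements.values())

def complexity_count_plus_categories(elements):
    """Scheme 2: element count plus number of categories."""
    return sum(elements.values()) + len(elements)

def complexity_weighted(elements, preset):
    """Scheme 3: sum of per-category preset complexities."""
    return sum(n * preset[kind] for kind, n in elements.items())

elements = {"control": 3, "page": 2}
preset = {"control": 1, "page": 3}

print(complexity_by_count(elements))               # 5
print(complexity_count_plus_categories(elements))  # 7
print(complexity_weighted(elements, preset))       # 3*1 + 2*3 = 9
```

All three schemes agree on ordering for objects that differ only in element count; they diverge when objects mix many categories, which is why the weighted scheme is the most expressive of the three.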
In some embodiments of the present application, the display object may include a page, the page may include at least one control, and the content complexity may include a number of the at least one control.
When the electronic device receives a first input in which the user moves a page, the whole page can move with the user's operation, while the content in the page remains unchanged.
Illustratively, as shown in fig. 4, a page may be a control panel 401, and the control panel 401 may include 6 controls, so the content complexity of the control panel 401 may be 6.
In the embodiments of the application, the display object may include a page. Therefore, when the end of the first input moving the page is detected, the inertial movement distance of the page can be determined from the number of controls in the page, so that the page continues to move based on that distance. This enhances the novelty and interest of the page's movement and improves its visual effect.
Optionally, in one embodiment, the color complexity may be the number of colors corresponding to the display object; alternatively, the color complexity may be the product of that number of colors and a color value, where the color value may be the average RGB value of all pixels of the display object.
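The two color-complexity definitions can be sketched as follows. The pixel data is invented for illustration, and taking the color value as the mean of per-pixel RGB averages is an interpretation, since the application does not fix the exact averaging.

```python
# Sketch of the two color-complexity definitions above. The pixel
# list is invented for illustration; a real implementation would
# sample the display object's rendered pixels.

def color_complexity_count(pixels):
    """Definition 1: number of distinct colors."""
    return len(set(pixels))

def color_complexity_weighted(pixels):
    """Definition 2: distinct-color count times the color value,
    taken here as the mean RGB value over all pixels (assumed)."""
    mean_rgb = sum(sum(p) / 3 for p in pixels) / len(pixels)
    return color_complexity_count(pixels) * mean_rgb

pixels = [(255, 0, 0), (255, 0, 0), (0, 0, 255), (255, 255, 255)]
print(color_complexity_count(pixels))     # 3 distinct colors
print(color_complexity_weighted(pixels))  # 3 * 127.5 = 382.5
```

The weighted definition distinguishes, for example, a dark icon from a bright one even when both use the same number of colors.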
In some embodiments of the present application, the moving the display object based on the parameter in step 120 may include the following steps: determining a first distance based on the parameter; the display object is moved a first distance in a first direction, the first direction being determined based on the input parameters of the first input.
The input parameter may include an input direction corresponding to the first input, and the first direction may be an input direction of the first input.
Illustratively, as shown in fig. 2, the first input is an input that the user drags the floating window 201 in the direction 1, and then the first direction may be the direction 1.
Illustratively, as shown in fig. 3, the first input is an input by the user sliding the icon 301 in direction 2, and the first direction may be direction 2.
The first distance may be a moving distance of the display object in a case where the first input is ended.
It should be noted that the first distance may be used to represent the inertial movement distance of the display object.
Illustratively, as shown in fig. 3, in a case where the user moves the icon 301 from the position a to the position B in the direction 2 through the first input and releases the icon 301, the electronic device detects that the first input of the user moving the icon 301 is ended, at this time, a first distance may be determined based on at least one of the size, the content complexity, and the color complexity of the icon 301, and the icon 301 continues to be moved from the position B to the position C in the direction 2 based on the first distance.
In this embodiment, when detecting that the first input moving the display object has ended, the electronic device can determine the inertial movement distance of the display object, that is, the first distance, based on the size, content complexity, and color complexity of the display object. Compared with determining the inertial movement distance solely from the movement speed of the display object, determining it from display parameters such as size, content complexity, and color complexity is more innovative and interesting, and can effectively improve the user's interaction experience.
In some embodiments of the present application, in order to improve the moving effect of the display object, fig. 5 is a flowchart of another object control method provided in the embodiments of the present application, and the determining the first distance based on the parameter may specifically include steps 510 to 530 shown in fig. 5.
The quality of the display object is determined based on the parameters, step 510.
In some embodiments of the present application, the quality may have a positive correlation with the parameter, i.e., the larger the size of the display object, or the higher the content complexity, or the higher the color complexity, the higher the quality; the smaller the size of the display object, or the lower the content complexity, or the lower the color complexity, the smaller the quality.
Alternatively, in one embodiment, where the parameters include size, content complexity, and color complexity, the quality of the display object may be the sum of the size L, content complexity P, and color complexity Q, i.e., m = L + P + Q; alternatively, the quality of the display object may be their product, i.e., m = L × P × Q; or the quality of the display object may be m = a·L + b·P + c·Q, where a, b, and c are preset weights for the size, content complexity, and color complexity, respectively.
It should be noted that the above are only exemplary formulas for determining the quality from the size, content complexity, and color complexity; the quality of the display object may also be calculated from these parameters using other formulas, which are not described here again.
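The example quality formulas can be written out directly. The sample parameter values and the weights a, b, c below are illustrative assumptions, not values specified by the application:

```python
# The three example quality (mass) formulas given above. The sample
# parameter values and weights are illustrative assumptions.

def mass_sum(L, P, Q):
    return L + P + Q               # m = L + P + Q

def mass_product(L, P, Q):
    return L * P * Q               # m = L * P * Q

def mass_weighted(L, P, Q, a, b, c):
    return a * L + b * P + c * Q   # m = a*L + b*P + c*Q

L, P, Q = 4.0, 5.0, 2.0  # size, content complexity, color complexity
print(mass_sum(L, P, Q))                      # 11.0
print(mass_product(L, P, Q))                  # 40.0
print(mass_weighted(L, P, Q, 0.5, 0.3, 0.2))  # ≈ 3.9
```

All three are monotonically increasing in each parameter, which is what the positive correlation in step 510 requires.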
In step 520, an initial velocity and acceleration of the display object are determined based on the input parameters of the first input.
The input parameters can also include the input force and the input speed. The initial speed and acceleration of the display object can be positively correlated with the input speed: the faster the input speed, the faster the initial speed and acceleration of the display object when it moves. The acceleration of the display object can be positively correlated with the input force: the greater the input force on the display object, the greater its acceleration when it moves.
Illustratively, F = ma, where F is the input force of the first input and a is the acceleration of the display object.
Based on the mass, the initial velocity, and the acceleration, a first distance is determined, step 530.
Wherein the first distance may be positively correlated with the mass, the initial velocity, and the acceleration.
It should be noted that, given that the first distance is positively correlated with the mass, the initial velocity, and the acceleration, the first distance may be calculated by combining a displacement calculation formula from the related art, or a variant of such a formula.
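Steps 510 through 530 can be sketched by deriving the acceleration from the input force via F = m·a and plugging it into the standard displacement formula s = v₀t + ½at². Making the coast time proportional to the mass is an assumption chosen so that the distance is positively correlated with all three quantities; the application leaves the exact formula to the related art.

```python
# Sketch of steps 510-530: derive acceleration from the input force
# (F = m*a) and apply the displacement formula s = v0*t + a*t^2/2.
# Making the coast time proportional to the mass is an assumption
# that keeps s positively correlated with mass, v0, and acceleration.

def first_distance(mass, v0, force, k=0.5):
    accel = force / mass                 # a = F / m
    t = k * mass                         # assumed: heavier objects coast longer
    return v0 * t + 0.5 * accel * t * t  # s = v0*t + (1/2)*a*t^2

# e.g. mass 2.0, initial speed 100 px/s, input force 8.0:
print(first_distance(2.0, 100.0, 8.0))   # 100*1.0 + 0.5*4.0*1.0 = 102.0
```

Substituting t = k·m gives s = v₀·k·m + ½·F·k²·m, which indeed grows with the mass, the initial velocity, and the input force (and hence the acceleration), as step 530 requires.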
In the embodiments of the application, the display object in the interface can be given a concept of quality (mass), and based on this quality parameter and the parameters of the first input, the inertial movement of an object in nature can be simulated. This makes the inertial movement distance and movement effect of the display object in the interface better conform to the physical laws of nature, effectively improving the movement effect and the user's interaction experience, and increasing innovation and diversity.
Because the screen size of the electronic device is limited, when the electronic device receives a first input moving a display object to the edge of the screen, it can simulate, once the first input ends, the elastic force or resistance that the screen border exerts on the display object, so that the display object slides a certain distance and slowly stops, providing a rebound effect.
In some embodiments of the present application, in order to provide a rebound effect of a displayed object, fig. 6 is a flowchart of another object control method provided in embodiments of the present application, and as shown in fig. 6, step 120 may specifically include step 610 and step 620.
Step 610, in the case that the end of the first input is detected, continuing to move the display object along a first direction based on the parameters, wherein the first direction is determined based on the input parameters of the first input;
and step 620, in the case that the display position of the display object is located at the edge position of the screen, determining a second direction based on the first direction, and moving the display object in the second direction based on the parameter.
The first direction is the same as the input direction corresponding to the first input, the second direction is different from the first direction, and the second direction can be set according to specific requirements.
For example, the second direction may be opposite to the first direction, or the second direction may be at a predetermined angle with respect to the first direction.
Illustratively, the second direction may be at a 90 degree angle to the first direction. As shown in fig. 7, the electronic device may receive a first input by the user moving the icon 701 to the edge of the screen along direction 3, and move the icon 701 along direction 4 when the first input is ended, where direction 3 is at a 90 degree angle to direction 4.
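The two options for the second direction, reversing the first direction or rotating it by a preset angle such as the 90 degrees of Fig. 7, can be sketched on a 2-D direction vector. Representing directions as unit vectors is an implementation assumption.

```python
import math

# Sketch of deriving the second (rebound) direction from the first.
# Directions are 2-D unit vectors here, an implementation assumption.

def opposite_direction(d):
    """Option 1: second direction opposite to the first."""
    return (-d[0], -d[1])

def rotated_direction(d, degrees):
    """Option 2: second direction at a preset angle to the first."""
    r = math.radians(degrees)
    return (d[0] * math.cos(r) - d[1] * math.sin(r),
            d[0] * math.sin(r) + d[1] * math.cos(r))

d1 = (1.0, 0.0)                    # first direction: toward +x
print(opposite_direction(d1))      # reversed: (-1.0, -0.0)
d2 = rotated_direction(d1, 90)     # 90-degree rebound, as in Fig. 7
print(d2)                          # ≈ (0.0, 1.0)
```

A real implementation would also clamp the rebound distance so the object stays on screen; that detail is outside the sketch.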
In the embodiments of the application, the display interface of the electronic device can simulate the resistance or elastic force that an object in nature experiences when it touches another object. When the user moves a display object to the edge of the screen, the elastic effect of the screen border on the display object is simulated, so that the display object moves along the simulated direction, namely the second direction. This provides a rebound effect for the display object and increases the diversity of movement modes.
In some embodiments of the present application, in order to provide a bottom-out rebound effect for page content, step 120 may specifically include: continuing to move the display object based on the parameter, and, when target content in the page content is displayed at a target position, scrolling the page content in a third direction based on the parameter.
The target content may include content located at a head position or an end position in the page content, and the target position may be an edge position. The third direction is opposite to the first direction, and the first direction is determined based on the input parameter of the first input, and may be specifically the input direction of the first input.
Illustratively, the page content may be a settings list, icon list, song list, video list, album list, document, web page, or the like that can be scrolled vertically within the page, or an application control center, background application list, or the like that can be slid horizontally within the interface.
In one example, if the page content is a document, the target content may be the first line or the last line of words of the document. If the user slides the document upward through the first input, the electronic device may provide a bottom-touch bounce effect of sliding downward based on the size, content complexity, and color complexity of the document when sliding the last line of characters to the top position of the interface.
In another example, as shown in fig. 8, the first direction may be direction 5, the third direction is direction 6, the page content may be a background application list in the interface 801, the target content may be APP1 arranged at a head end position of the background application list, and the target position may be a screen edge. The electronic device may receive a first input that the user moves the APP1 in direction 5, and when the APP1 moves to the edge of the screen, the electronic device may simulate a bounce effect provided by the edge of the screen to the APP1 based on at least one of the size, content complexity, and color complexity of the list of background applications, so that the list of background applications may be scrolled in direction 6.
In the embodiments of the application, when the user scrolls the page content to its top or bottom in the display interface of the electronic device, a rebound effect can be provided based on the parameters of the page content, improving the user's interaction experience and meeting users' diversified and personalized needs.
In the object control method provided in the embodiments of the present application, the executing entity may be an object control device, or a control module in the object control device for executing the object control method. In the embodiments of the present application, the object control device provided herein is described by taking, as an example, the object control device performing the object control method. The object control device is described in detail below.
Fig. 9 is a schematic structural diagram of an object control device provided in the present application.
As shown in fig. 9, an embodiment of the present application provides an object control apparatus 900, where the object control apparatus 900 includes: an acquisition module 910 and a moving module 920.
The obtaining module 910 is configured to obtain a parameter of the display object when a first input to the display object is received, where the parameter includes at least one of the size, content complexity, and color complexity of the display object, and the first input is used to move the display object. The moving module 920 is configured to move the display object based on the parameter, and to continue moving the display object based on the parameter when it is detected that the first input has ended.
According to the object control device provided by the embodiment of the application, the electronic device can receive a first input for moving the display object, acquire at least one of the size, content complexity, and color complexity of the display object, control the display object to move through the first input, and continue to move the display object based on at least one of those parameters when the first input ends. Thus, when the electronic device receives an operation in which a user drags a control, it can obtain the size, content complexity, and color complexity of the control, and after detecting that the user has released the control, continue to move the control based on those parameters. This provides an innovative way of moving controls, can meet the user's diverse and personalized needs, and effectively improves the user experience.
In some embodiments of the present application, the moving module 920 includes: a determining unit for determining a first distance based on the parameter; and a moving unit for moving the display object by a first distance in a first direction, the first direction being determined based on the input parameter of the first input.
In this embodiment, when it is detected that the user's first input for moving the display object has ended, the electronic device may determine the inertial movement distance of the display object, that is, the first distance, based on the size, content complexity, and color complexity of the display object. Compared with determining the inertial movement distance based only on the moving speed of the display object, determining it from display parameters such as size, content complexity, and color complexity is more innovative and interesting, and can effectively improve the user's interaction experience.
In some embodiments of the present application, the determining unit is specifically configured to: determine a mass of the display object based on the parameter; determine an initial velocity and an acceleration of the display object based on the input parameters of the first input; and determine the first distance based on the mass, the initial velocity, and the acceleration.
In the embodiment of the application, a concept of mass can be assigned to the display object in the interface, and based on the mass parameter and the parameters of the first input, the inertial movement of an object in nature is simulated, so that the inertial movement distance and movement effect of the display object in the interface better conform to the physical laws of nature. This effectively improves the movement effect of the display object and the user's interaction experience, and adds innovation and diversity.
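As a non-limiting illustration, the mass-and-kinematics idea above can be sketched as follows. The mapping from size and complexity to mass, and the friction constant, are illustrative assumptions, not values disclosed in this application: with a constant friction force F, the deceleration is a = F/m, so the coasting distance is d = v0^2 / (2a) = m * v0^2 / (2F), meaning a "heavier" (larger, more complex) object glides farther for the same fling speed.

```python
# Illustrative sketch only: the coefficients below are assumptions,
# not values from the patent.

def object_mass(size_px: float, content_complexity: int, color_complexity: int) -> float:
    """Assign a notional 'mass' to a display object from its display parameters."""
    return 1.0 + 0.001 * size_px + 0.1 * content_complexity + 0.05 * color_complexity

def inertial_distance(mass: float, v0: float, friction_force: float = 2000.0) -> float:
    """Coasting distance under a constant friction force F:
    deceleration a = F / m, hence d = v0^2 / (2a) = m * v0^2 / (2F)."""
    if v0 <= 0:
        return 0.0
    return mass * v0 ** 2 / (2.0 * friction_force)
```

Under this sketch, a large, content-rich page acquires a larger mass and therefore a longer inertial movement distance than a small, simple control flung at the same speed.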
In some embodiments of the present application, the display object comprises a page, the page comprises at least one control, and the content complexity comprises a number of the at least one control.
In the embodiment of the application, the display object may include a page. Therefore, when it is detected that the first input moving the page has ended, the inertial movement distance of the page may be determined according to the number of controls in the page, so that the page continues to move based on that distance. This enhances the innovation and interest of the page's movement and improves its visual effect.
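The "content complexity equals the number of controls" notion can be illustrated with a minimal sketch; the `View` structure below is a hypothetical stand-in for a real UI toolkit's view tree, not an API from this application:

```python
# Hypothetical view-tree model; a real toolkit would supply its own hierarchy.
from dataclasses import dataclass, field

@dataclass
class View:
    name: str
    children: list = field(default_factory=list)

def control_count(root: View) -> int:
    """Count every control in the page's view tree, including the root itself."""
    return 1 + sum(control_count(child) for child in root.children)
```

A page containing a button and a two-item list would then have a content complexity of 5 (counting the page container itself), and that count could feed into the mass parameter described above.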
In some embodiments of the present application, the moving module 920 is specifically configured to: in a case where it is detected that the first input is ended, continuing to move the display object in a first direction based on the parameter, the first direction being determined based on the input parameter of the first input; in a case where the display position of the display object is located at the screen edge position, a second direction is determined based on the first direction, and the display object is moved in the second direction based on the parameter.
In the embodiment of the application, the display interface of the electronic device can simulate the resistance or elastic force an object in nature experiences when it touches another object. When the user moves the display object to the screen edge, the elastic effect of the screen border on the display object is simulated, so that the display object moves in the simulated direction of motion, that is, the second direction. This provides a rebound effect for the display object and increases the diversity of movement modes.
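One way to sketch this edge-bounce is to reverse the velocity component normal to the edge that was hit and damp the speed with a restitution factor, as in elastic collisions; the restitution value below is an illustrative assumption:

```python
# Hedged sketch: reverse and damp the velocity component normal to a hit edge.
# restitution = 0.6 is an illustrative assumption, not a disclosed value.

def bounce(velocity, position, screen_w, screen_h, restitution=0.6):
    """Return the post-bounce velocity (vx, vy) for an object at `position`."""
    vx, vy = velocity
    x, y = position
    if x <= 0 or x >= screen_w:   # hit the left or right border
        vx = -vx * restitution
    if y <= 0 or y >= screen_h:   # hit the top or bottom border
        vy = -vy * restitution
    return (vx, vy)
```

The second direction is thus derived from the first: an object flung rightward into the right edge rebounds leftward at a fraction of its speed, which is the rebound effect described above.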
In some embodiments of the present application, the moving module 920 is specifically configured to: and continuously moving the display object based on the parameter, and in the case that the target content in the page content is displayed at the target position, scrolling and displaying the page content in a third direction based on the parameter, wherein the third direction is opposite to the first direction, and the first direction is determined based on the input parameter of the first input.
In the embodiment of the application, in the display interface of the electronic device, when the user slides the page content to the top or the bottom, a rebound effect for the page content can be provided based on the parameters of the page content, which improves the user's interaction experience and meets the user's diverse and personalized needs.
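The top/bottom rebound can be sketched as reflecting any scroll overshoot back inside the content's valid range, so that after the target content reaches the target position the content scrolls back in the opposite (third) direction. The function and parameter names here are illustrative assumptions:

```python
# Hedged sketch: clamp a scroll offset into [0, max_offset], reflecting
# any overshoot back into range to produce a rebound in the third direction.

def rebound_offset(offset: float, max_offset: float) -> float:
    """Return the rebounded scroll offset for a page of scrollable range
    [0, max_offset]."""
    if offset < 0:
        return min(-offset, max_offset)                       # bounced off the top
    if offset > max_offset:
        return max(max_offset - (offset - max_offset), 0.0)   # bounced off the bottom
    return offset
```

For example, an inertial scroll that would carry the content 50 px past the top instead leaves it 50 px below the top, i.e. the content has scrolled back in the direction opposite to the first direction.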
The object control apparatus provided in this embodiment of the present application can implement each process implemented by the electronic device in the method embodiments of fig. 1 to 8, and is not described here again to avoid repetition.
The object control device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA), and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not specifically limited in this respect.
The object control device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
Optionally, as shown in fig. 10, an electronic device 1000 is further provided in this embodiment of the present application, and includes a processor 1001, a memory 1002, and a program or an instruction stored in the memory 1002 and executable on the processor 1001, where the program or the instruction is executed by the processor 1001 to implement each process of the above-mentioned embodiment of the object control method, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 11 is a schematic hardware structure diagram of another electronic device according to an embodiment of the present application.
The electronic device 1100 includes, but is not limited to: a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, and a processor 1110.
Those skilled in the art will appreciate that the electronic device 1100 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1110 via a power management system, so as to manage charging, discharging, and power consumption via the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently, which is not described again here.
The processor 1110 is configured to obtain a parameter of the display object when a first input to the display object is received, where the parameter includes at least one of the size, content complexity, and color complexity of the display object, and the first input is used to move the display object. The processor 1110 is further configured to move the display object based on the parameter and, in a case where it is detected that the first input has ended, continue to move the display object based on the parameter.
In the embodiment of the application, the electronic device may receive a first input for moving the display object, acquire at least one of the size, content complexity, and color complexity of the display object, control the display object to move through the first input, and continue to move the display object based on at least one of those parameters when the first input ends. Thus, when the electronic device receives an operation in which a user drags a control, it can obtain the size, content complexity, and color complexity of the control, and after detecting that the user has released the control, continue to move the control based on those parameters. This provides an innovative way of moving controls, can meet the user's diverse and personalized needs, and effectively improves the user experience.
In some embodiments of the present application, the processor 1110 is specifically configured to: determine a first distance based on the parameter; and move the display object the first distance in a first direction, the first direction being determined based on the input parameters of the first input.
In this embodiment, when it is detected that the user's first input for moving the display object has ended, the electronic device may determine the inertial movement distance of the display object, that is, the first distance, based on the size, content complexity, and color complexity of the display object. Compared with determining the inertial movement distance based only on the moving speed of the display object, determining it from display parameters such as size, content complexity, and color complexity is more innovative and interesting, and can effectively improve the user's interaction experience.
In some embodiments of the present application, the processor 1110 is specifically configured to: determine a mass of the display object based on the parameter; determine an initial velocity and an acceleration of the display object based on the input parameters of the first input; and determine the first distance based on the mass, the initial velocity, and the acceleration.
In the embodiment of the application, a concept of mass can be assigned to the display object in the interface, and based on the mass parameter and the parameters of the first input, the inertial movement of an object in nature is simulated, so that the inertial movement distance and movement effect of the display object in the interface better conform to the physical laws of nature. This effectively improves the movement effect of the display object and the user's interaction experience, and adds innovation and diversity.
In some embodiments of the present application, the display object comprises a page, the page comprises at least one control, and the content complexity comprises a number of the at least one control.
In the embodiment of the application, the display object may include a page. Therefore, when it is detected that the first input moving the page has ended, the inertial movement distance of the page may be determined according to the number of controls in the page, so that the page continues to move based on that distance. This enhances the innovation and interest of the page's movement and improves its visual effect.
In some embodiments of the present application, processor 1110 is specifically configured to: in a case where it is detected that the first input is ended, continuing to move the display object in a first direction based on the parameter, the first direction being determined based on the input parameter of the first input; in a case where the display position of the display object is located at the screen edge position, a second direction is determined based on the first direction, and the display object is moved in the second direction based on the parameter.
In the embodiment of the application, the display interface of the electronic device can simulate the resistance or elastic force an object in nature experiences when it touches another object. When the user moves the display object to the screen edge, the elastic effect of the screen border on the display object is simulated, so that the display object moves in the simulated direction of motion, that is, the second direction. This provides a rebound effect for the display object and increases the diversity of movement modes.
In some embodiments of the present application, processor 1110 is specifically configured to: and continuously moving the display object based on the parameter, and in the case that the target content in the page content is displayed at the target position, scrolling and displaying the page content in a third direction based on the parameter, wherein the third direction is opposite to the first direction, and the first direction is determined based on the input parameter of the first input.
In the embodiment of the application, in the display interface of the electronic device, when the user slides the page content to the top or the bottom, a rebound effect for the page content can be provided based on the parameters of the page content, which improves the user's interaction experience and meets the user's diverse and personalized needs.
It should be understood that in the embodiment of the present application, the input Unit 1104 may include a Graphics Processing Unit (GPU) 11041 and a microphone 11042, and the Graphics processor 11041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1106 may include a display panel 11061, and the display panel 11061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1107 includes a touch panel 11071 and other input devices 11072. A touch panel 11071, also called a touch screen. The touch panel 11071 may include two portions of a touch detection device and a touch controller. Other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 1109 may be used for storing software programs and various data including, but not limited to, application programs and an operating system. Processor 1110 may integrate an application processor that handles primarily operating systems, user interfaces, applications, etc. and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1110.
The embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above-mentioned embodiment of the object control method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device in the above embodiment. The readable storage medium includes a computer-readable storage medium, examples of which include non-transitory computer-readable storage media such as Read-Only Memory (ROM), Random Access Memory (RAM), magnetic disks, or optical disks.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the object control method, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An object control method, comprising:
acquiring parameters of a display object under the condition that a first input for the display object is received, wherein the parameters comprise at least one of the size, the content complexity and the color complexity of the display object, and the first input is used for moving the display object;
and moving the display object based on the parameter, and in a case where it is detected that the first input is ended, continuing to move the display object based on the parameter.
2. The method of claim 1, wherein continuing to move the display object based on the parameter comprises:
determining a first distance based on the parameter;
moving the display object the first distance in a first direction, the first direction being determined based on the input parameters of the first input.
3. The method of claim 2, wherein determining the first distance based on the parameter comprises:
determining a mass of the display object based on the parameter;
determining an initial velocity and acceleration of the display object based on the input parameters of the first input;
determining the first distance based on the mass, the initial velocity, and the acceleration.
4. The method of claim 1, wherein the display object comprises a page, wherein the page comprises at least one control, and wherein the content complexity comprises a number of the at least one control.
5. The method of claim 1, wherein the continuing to move the display object based on the parameter in the case that the end of the first input is detected comprises:
in a case where it is detected that the first input is ended, continuing to move the display object in a first direction based on the parameter, the first direction being determined based on an input parameter of the first input;
and determining a second direction based on the first direction and moving the display object in the second direction based on the parameter when the display position of the display object is located at the edge position of the screen.
6. The method of claim 1, wherein the display object comprises page content, and wherein continuing to move the display object based on the parameter comprises:
and continuing to move the display object based on the parameter, and scrolling and displaying the page content in a third direction based on the parameter when the target content in the page content is displayed at the target position, wherein the third direction is opposite to the first direction, and the first direction is determined based on the input parameter of the first input.
7. An object control apparatus, characterized by comprising:
an obtaining module, configured to obtain a parameter of a display object under a condition that a first input to the display object is received, where the parameter includes at least one of a size, a content complexity, and a color complexity of the display object, and the first input is used to move the display object;
and the moving module is used for moving the display object based on the parameters and continuously moving the display object based on the parameters under the condition that the first input is detected to be finished.
8. The apparatus of claim 7, wherein the moving module comprises:
a determining unit for determining a first distance based on the parameter;
a moving unit for moving the display object by the first distance in a first direction, the first direction being determined based on the input parameter of the first input.
9. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the object control method according to any one of claims 1-6.
10. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the object control method according to any one of claims 1-6.
CN202111277576.9A 2021-10-29 2021-10-29 Object control method, device, equipment and storage medium Pending CN113986067A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111277576.9A CN113986067A (en) 2021-10-29 2021-10-29 Object control method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111277576.9A CN113986067A (en) 2021-10-29 2021-10-29 Object control method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113986067A true CN113986067A (en) 2022-01-28

Family

ID=79744940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111277576.9A Pending CN113986067A (en) 2021-10-29 2021-10-29 Object control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113986067A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114911406A (en) * 2022-06-01 2022-08-16 北京字节跳动网络技术有限公司 Dynamic effect generation method, device, medium and equipment
CN114911406B (en) * 2022-06-01 2023-10-17 北京字节跳动网络技术有限公司 Dynamic effect generation method, dynamic effect generation device, dynamic effect generation medium and dynamic effect generation equipment

Similar Documents

Publication Publication Date Title
EP2812796B1 (en) Apparatus and method for providing for remote user interaction
AU2014244765B2 (en) Display method and apparatus for diversely displaying an object according to scroll speed
US9239674B2 (en) Method and apparatus for providing different user interface effects for different implementation characteristics of a touch event
EP2817704B1 (en) Apparatus and method for determining the position of a user input
US10182141B2 (en) Apparatus and method for providing transitions between screens
CN108920069B (en) Touch operation method and device, mobile terminal and storage medium
CN103197885A (en) Method for controlling mobile terminal and mobile terminal thereof
CN112099707A (en) Display method and device and electronic equipment
CN112433693A (en) Split screen display method and device and electronic equipment
US9019315B2 (en) Method of controlling display
JP6758922B2 (en) Electronic devices and their control methods
CN113986067A (en) Object control method, device, equipment and storage medium
US10001906B2 (en) Apparatus and method for providing a visual indication of an operation
CN105468094B (en) Method for operating computer terminal and computer terminal
US20140111551A1 (en) Information-processing device, storage medium, information-processing method, and information-processing system
CN112181252B (en) Screen capturing method and device and electronic equipment
CN113783995A (en) Display control method, display control device, electronic apparatus, and medium
CN112596660A (en) Writing display processing method and electronic equipment
US9626742B2 (en) Apparatus and method for providing transitions between screens
JP2012238086A (en) Image processing apparatus, image processing method and image processing program
JP6339550B2 (en) Terminal program, terminal device, and terminal control method
JP6907368B2 (en) Electronic devices and their control methods
CN114138141A (en) Display method and device and electronic equipment
CN113190162A (en) Display method, display device, electronic equipment and readable storage medium
CN112732214B (en) Control method, electronic device, and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination