WO2016188289A1 - Method and device for drawing an interface element - Google Patents

Method and device for drawing an interface element

Info

Publication number
WO2016188289A1
WO2016188289A1 · PCT/CN2016/080306 · CN2016080306W
Authority
WO
WIPO (PCT)
Prior art keywords
interface element
focus
target
target location
time
Prior art date
Application number
PCT/CN2016/080306
Other languages
English (en)
Chinese (zh)
Inventor
李剑波
Original Assignee
阿里巴巴集团控股有限公司
李剑波
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 阿里巴巴集团控股有限公司, 李剑波
Publication of WO2016188289A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8146 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics

Definitions

  • the present application relates to the field of computer technology, and in particular, to a method and an apparatus for drawing interface elements.
  • interface elements in the display interface may have a location tracking relationship with each other.
  • Taking the two interface elements of a focus and a focus frame as an example, there is a position tracking relationship between them.
  • The focus frame tracks and moves to the current position of the focus; that is, the focus frame control draws the focus frame at the current position of the focus.
  • Conversely, the focus, under the control of the focus control, tracks and moves to the current position of the focus frame; that is, the focus control draws the focus at the current position of the focus frame.
  • the prior art implements location tracking as follows:
  • The focus frame control draws the focus frame at the target position according to the acquired target position information, so as to achieve the purpose of having the focus frame track the movement of the focus.
  • The problem with the above method is that the moment at which the focus frame reaches the target position lags noticeably behind the moment at which the focus reaches it. This lag may cause the user to mistakenly believe that the focus frame has not moved to the target position as desired and to re-trigger the movement of the focus and the focus frame, which causes misoperation and an unnecessary waste of processing resources.
  • The embodiment of the present application provides a method for drawing an interface element, to solve the problem that the interface element location tracking method used in the prior art may cause an unnecessary waste of processing resources.
  • The embodiment of the present application further provides an apparatus for drawing an interface element, to solve the same problem that the interface element location tracking method used in the prior art may cause an unnecessary waste of processing resources.
  • A method of drawing interface elements, including: acquiring information that characterizes a motion trend of a first interface element; predicting, according to the information, a target location to which the first interface element is to be moved; and drawing a second interface element at a second interface element drawing position determined according to the target location.
  • a drawing device for an interface element comprising:
  • a motion trend information acquiring unit configured to acquire information that characterizes a motion trend of the first interface element
  • a target location prediction unit configured to predict, according to the information, a target location to which the first interface element is to be moved
  • a drawing unit configured to draw the second interface element at the second interface element drawing position determined according to the target position.
  • Since the target position to which the first interface element is to be moved can be predicted and the second interface element is drawn according to that target position, the second interface element does not have to be drawn only after the first interface element has moved to the target position. This reduces the lag of the movement of the second interface element and avoids the unnecessary waste of processing resources caused by repeated movement operations of the user.
  • FIG. 1 is a schematic flowchart of a specific implementation process of a method for drawing an interface element according to an embodiment of the present application
  • FIG. 1a is a schematic diagram of a time axis according to an embodiment of the present application.
  • FIG. 1b is a schematic diagram of a focus of a digital television screen interface and a positional relationship of a focus frame according to an embodiment of the present application;
  • FIG. 1c is a schematic diagram of a focus of a digital television screen interface and a positional relationship of a focus frame according to an embodiment of the present application;
  • FIG. 1d is a schematic structural diagram of a drawing component of an interface element according to an embodiment of the present application;
  • FIG. 2 is a schematic flowchart of a specific implementation process of a method for drawing a focus or a focus frame in a digital television according to an embodiment of the present disclosure
  • FIG. 3 is a schematic structural diagram of a device for drawing an interface element according to an embodiment of the present application.
  • the embodiment of the present application provides a method for drawing an interface element, which is used to solve the lag problem that occurs when the interface element is moved in the prior art.
  • the schematic diagram of the specific implementation process of the method is shown in FIG. 1 and mainly includes the following steps:
  • Step 11: Obtain information that characterizes the motion trend of the first interface element.
  • The first interface element is, of the two interface elements that have a location tracking relationship, the one whose movement is to be tracked.
  • Acquiring the information indicating the movement trend of the first interface element includes: acquiring at least two position parameters of the first interface element, and the times at which the first interface element is located at the positions respectively indicated by the two position parameters; determining, according to the at least two position parameters, the moving direction of the first interface element; and determining, according to the at least two position parameters and the times at which the first interface element is located at the positions respectively indicated by the two position parameters, the speed at which the first interface element moves.
  • For example, at time T1 the position parameter of the first interface element is (0, 10), and at time T2, 10 s later, the position parameter of the first interface element is (0, 20). From these two position parameters it can be determined that the first interface element moves in the vertical direction (for example, along the Y axis of the coordinate system).
  • Accordingly, the movement speed of the first interface element can be calculated as (20-10)/10 = 1 unit length per second. Alternatively, when the first interface element moves at a uniform speed according to a default speed set by the system, the movement speed information of the first interface element can be obtained from the system settings information.
  • the location parameter refers to any information that can indicate the location of the interface element, such as a coordinate value and the like.
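  • As an illustration of the computation just described, the sketch below (in Java, with hypothetical names such as PositionSample and MotionTrend that do not appear in the original text) derives a moving direction and speed from two sampled position parameters. It is only a minimal example under the assumptions of the preceding paragraphs, not the patented implementation.

```java
// Minimal sketch (hypothetical names): derive a motion trend from two position samples.
final class PositionSample {
    final double x, y;   // position parameter, e.g. coordinates in the screen coordinate system
    final long timeMs;   // time at which the element was at this position
    PositionSample(double x, double y, long timeMs) { this.x = x; this.y = y; this.timeMs = timeMs; }
}

final class MotionTrend {
    final double dirX, dirY;  // unit direction of movement
    final double speed;       // unit lengths per millisecond
    MotionTrend(double dirX, double dirY, double speed) { this.dirX = dirX; this.dirY = dirY; this.speed = speed; }

    // Compute the trend from two samples, e.g. (0, 10) at T1 and (0, 20) at T2.
    static MotionTrend fromSamples(PositionSample a, PositionSample b) {
        double dx = b.x - a.x, dy = b.y - a.y;
        double dist = Math.hypot(dx, dy);
        long dt = b.timeMs - a.timeMs;
        if (dist == 0 || dt <= 0) return new MotionTrend(0, 0, 0); // no movement detected
        return new MotionTrend(dx / dist, dy / dist, dist / dt);
    }
}
```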
  • the manner of obtaining information indicating the trend of the movement of the first interface element may include at least the following two types:
  • Manner 1 According to the preset position parameter sampling period, information for characterizing the movement trend of the first interface element is obtained.
  • For example, the system automatically acquires, every 30 ms, the coordinate value of the position of the focus in the screen coordinate system as the position parameter of the first interface element, and obtains, according to the acquired position parameters of the first interface element, the information characterizing the motion trend of the first interface element.
  • At least two position parameters of the first interface element need to be sampled in order to obtain the movement direction information of the first interface element.
  • Manner 2 After receiving the first interface element position change triggering instruction, obtaining information indicating a trend of the first interface element motion.
  • the position of the focus in the screen coordinate system can be obtained. Further, according to the preset position parameter sampling period, the position parameter of the focus during the movement process is acquired; and the motion trend information of the focus is obtained through the acquired position parameters.
  • For the manner of obtaining the motion trend information of the focus according to the position parameters, reference may be made to the foregoing description; it is not repeated here.
  • the embodiment of the present application provides a method for timing sampling.
  • the method specifically includes:
  • A system timer is set, and the position parameter of the first interface element is sampled whenever the timer reaches a sampling time determined according to the sampling period. For example, if the position parameter of the focus needs to be acquired every 30 ms, the system timer is set to 30 ms, and the position parameter of the focus is then sampled every 30 ms according to the timer.
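  • A minimal sketch of such timer-driven sampling is given below (Java; FocusView and SampleSink are assumed names introduced only for illustration, and java.util.Timer stands in for whatever system timer the platform provides).

```java
import java.util.Timer;
import java.util.TimerTask;

// Minimal sketch of Manner 1: sample the position parameter of the first interface
// element on a fixed timer period (e.g. 30 ms).
final class PositionSampler {
    interface FocusView  { double getX(); double getY(); }                   // element being tracked
    interface SampleSink { void onSample(double x, double y, long timeMs); } // receives each sample

    private final Timer timer = new Timer("position-sampler", true);         // daemon system timer

    void start(FocusView focus, long samplingPeriodMs, SampleSink sink) {
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override public void run() {
                // Sample the current position parameter of the first interface element.
                sink.onSample(focus.getX(), focus.getY(), System.currentTimeMillis());
            }
        }, 0, samplingPeriodMs); // e.g. 30 ms
    }

    void stop() { timer.cancel(); }
}
```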
  • Step 12: Predict, according to the information indicating the motion trend of the first interface element, the target location to which the first interface element is to be moved.
  • Specifically, step 12 may include: predicting, according to the drawing period of the display interface, the moving speed, and the moving direction, the target location to which the first interface element is to be moved.
  • The drawing period of the display interface is the period followed when the interface (such as an interface including the first interface element and the second interface element) is drawn; the drawing period of the display interface may be 30 ms, 40 ms, and so on.
  • If the drawing period of the display interface is 30 ms, one frame of the interface is drawn every 30 ms.
  • the specific size of the drawing period of the display interface is not limited.
  • The target position to which the first interface element is to be moved when the (N+1)th frame or the (N+2)th frame of the interface is drawn may be predicted according to the drawing period of the display interface of 30 ms and the acquired information indicating the motion trend of the first interface element.
  • For example, the drawing period of the display interface is 30 ms, and the information indicating the motion trend of the first interface element includes: the moving direction is to the right along the horizontal direction, and the moving speed is 5/30 ms. If the current position parameter of the first interface element is (10, 0), it can be predicted that the first interface element will be at the target position (15, 0) when the (N+1)th frame of the interface is drawn.
  • The 10 in the above (10, 0), the 5 in the speed expression 5/30 ms, and the 15 in the above (15, 0) are each expressed in unit lengths of the interface coordinate system.
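  • The following short sketch shows this prediction step as code (Java; the TargetPredictor name is hypothetical); with the numbers above it maps (10, 0) to (15, 0) for the next frame.

```java
// Minimal sketch of the prediction in step 12: from the current position, the moving
// direction, the moving speed and the drawing period, compute the target position for
// the next frame. Example: (10, 0), direction (1, 0), speed 5/30 ms -> (15, 0).
final class TargetPredictor {
    static double[] predict(double curX, double curY,
                            double dirX, double dirY,   // unit direction, e.g. (1, 0) for "right"
                            double speedPerMs,          // e.g. 5.0 / 30.0 unit lengths per ms
                            long drawingPeriodMs) {     // e.g. 30 ms
        double distance = speedPerMs * drawingPeriodMs; // distance covered in one drawing period
        return new double[] { curX + dirX * distance, curY + dirY * distance };
    }
}
```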
  • The embodiment of the present application provides an estimation algorithm, which can be implemented by a first interface element moving distance estimator.
  • In addition to completing step 12, the estimator can also be used to complete step 11.
  • Specifically, the estimator can calculate the moving speed of the first interface element according to the obtained position parameters of the first interface element and determine the moving direction of the first interface element, and can further predict, according to the instantaneous speed and the moving direction, the target location to which the first interface element is to be moved.
  • the focus is displayed on the interface of the smart TV
  • the smart TV draws each frame of the display interface with a drawing period of 30 ms;
  • the smart TV receives the focus movement command, that is, the focus starts to move at time T0;
  • the position parameter of the focus at time T0 is (x0, y1);
  • sampling period is the same as the drawing period of the display interface, which is 30 ms.
  • the estimator can acquire the position parameter (x0, y1) of the focus at time T0, and acquire the position parameter (x1, y1) of the focus at time T1;
  • the position parameter (x2, y1) of the focus is obtained at time T2.
  • the time interval between T0 and T1 and the interval between T1 and T2 is a drawing period, that is, 30 ms.
  • expectedDistanceX = focusViewMovingVelocityX × (T2 − T0); that is, the predicted distance that the focus will have moved at the upcoming time T2 relative to its position at T0.
  • From the sampled position parameters, the movement trend of the focus can be determined to be along the X axis in the horizontal direction.
  • Therefore, the target position to which the focus is to be moved at time T2 can be predicted; that is, the position parameter after the focus moves along the horizontal X axis is (x0 + expectedDistanceX, y1).
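  • A minimal sketch of such a first interface element moving distance estimator follows (Java). The text above does not spell out how focusViewMovingVelocityX is obtained; the sketch assumes it is the average velocity between the samples at T0 and T1, which is an assumption, not a statement of the patented algorithm.

```java
// Minimal sketch of a "first interface element moving distance estimator" along the X axis,
// for the example above (movement along X at a roughly uniform speed; names are hypothetical).
final class FocusMovingDistanceEstimator {
    // x0/x1 are the X positions sampled at times t0/t1 (ms); t2 is the upcoming drawing time.
    static double predictX(double x0, long t0, double x1, long t1, long t2) {
        double focusViewMovingVelocityX = (x1 - x0) / (double) (t1 - t0); // assumed: average velocity, units per ms
        double expectedDistanceX = focusViewMovingVelocityX * (t2 - t0);  // distance moved by T2 relative to T0
        return x0 + expectedDistanceX;                                    // predicted X of the target position
    }
}
```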
  • As another example, the position parameter of the focus at time t0 is (0, 9).
  • According to the preset focus position parameter sampling period, the position parameter of the focus acquired at time t1 is (0, 12).
  • From the position parameters of the focus at time t0 and time t1, it can be determined that the moving direction of the focus is along the Y axis in the vertical direction, and the moving speed of the focus from t0 to t1 can also be calculated.
  • Since the time interval between t2 and t1 is the system's preset sampling period of 30 ms, the position to which the focus will have moved at time t2 can be calculated.
  • Step 13: The second interface element is drawn according to the target position.
  • Case 1: At the moment when the first interface element moves to the target position, the second interface element is drawn according to the target position.
  • For example, if the first interface element reaches the target position at time T1 (that is, the specific moment mentioned above), then at time T1 the second interface element is drawn according to the target position.
  • For example, assume that: 1. a display interface including the first interface element and the second interface element is drawn; 2. the information obtained by performing step 11 to represent the movement trend of the first interface element is the position parameters of the first interface element in the first frame display interface and the second frame display interface respectively; 3. the first interface element is currently in the second frame display interface.
  • Then the moment at which the second interface element is drawn according to the target position may be the moment at which the third frame display interface is drawn, that is, a moment 30 ms later than the drawing moment of the second frame display interface.
  • Case 2: At a moment before the first interface element moves to the target position, the second interface element is drawn according to the target position.
  • For example, if the first interface element starts moving at time T0 and reaches the target position at time T1, then the second interface element can be drawn according to the target position at a certain moment before time T1 (that is, the specific moment mentioned above).
  • the second interface element is drawn according to the determined target location, and specifically includes the following two implementation manners:
  • Implementation 1: Determine a first drawing position according to the target position and a set distance threshold; and, at the specific moment, draw the second interface element at the first drawing position.
  • When the distance between the target position and the position represented by the current position parameter of the second interface element (hereinafter referred to as the current position of the element) is less than the set distance threshold, the current position of the element may be determined as the first drawing position, so that at the specific moment the second interface element is drawn at the current position of the element. In particular, the target position may coincide with the current position of the element, i.e. the above distance is zero.
  • When the distance between the target position and the current position of the element is not less than the set distance threshold, the target position may be determined as the first drawing position; thus, at the specific moment, the second interface element is drawn at the target position.
  • The embodiment of the present application provides an estimation algorithm for this purpose, which may be implemented by a second interface element moving distance estimator.
  • the specific algorithm is as follows:
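  • The original text does not reproduce the algorithm here; the sketch below only illustrates, under the assumptions of Implementation 1, how such a second interface element moving distance estimator could choose the drawing position (Java; class and parameter names are hypothetical).

```java
// Minimal sketch: choose the drawing position of the second interface element from the
// predicted target position and a set distance threshold (Implementation 1 above).
final class FrameMovingDistanceEstimator {
    static double[] drawingPosition(double targetX, double targetY,
                                    double frameX, double frameY, double distanceThreshold) {
        double distance = Math.hypot(targetX - frameX, targetY - frameY);
        if (distance < distanceThreshold) {
            // The target is close to (or coincides with) the element's current position:
            // keep drawing the second interface element where it already is.
            return new double[] { frameX, frameY };
        }
        // Otherwise draw the second interface element at the predicted target position.
        return new double[] { targetX, targetY };
    }
}
```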
  • Implementation 2: Determine, according to the target position, whether the first interface element will move outside the display area currently occupied by the second interface element when it moves to the target position; determine the second interface element drawing position according to the determination result; and draw the second interface element at the second interface element drawing position at the moment when the first interface element moves to the target position.
  • If the first interface element will move outside the display area currently occupied by the second interface element, the target position is determined as the second interface element drawing position;
  • otherwise, the location of the display area currently occupied by the second interface element is determined as the second interface element drawing position.
  • For example, the first interface element is the focus and the second interface element is the focus frame. It is assumed that in the initial state, on the screen as shown in FIG. 1b, the focus and the focus frame are at the same position, and the display area currently occupied by the focus frame is the gray area in the figure.
  • If the focus moves to the position shown in FIG. 1c, then, since the position of the focus does not move outside the gray area shown in FIG. 1c, the position of the gray area is determined as the second interface element drawing position.
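  • A minimal sketch of this containment check follows (Java); the axis-aligned rectangular area and all names are assumptions made for illustration.

```java
// Minimal sketch of Implementation 2: if the predicted target of the focus stays inside the
// display area currently occupied by the focus frame (the gray area in FIG. 1b/1c), keep the
// frame where it is; otherwise the target becomes the drawing position.
final class FrameAreaDecision {
    static double[] drawingPosition(double targetX, double targetY,
                                    double areaLeft, double areaTop, double areaRight, double areaBottom) {
        boolean insideCurrentArea =
                targetX >= areaLeft && targetX <= areaRight &&
                targetY >= areaTop  && targetY <= areaBottom;
        // Inside: reuse the current area's position; outside: draw at the predicted target.
        return insideCurrentArea ? new double[] { areaLeft, areaTop }
                                 : new double[] { targetX, targetY };
    }
}
```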
  • The manner of drawing the second interface element at the moment when the first interface element moves to the target position may include: drawing the second interface element according to the features of the third interface element located at the target position at that moment.
  • Drawing the second interface element according to the features of the third interface element may mean that the second interface element is drawn in a manner such that it has those features, or in a manner such that it does not have those features.
  • For example, when the focus frame is drawn at the target position reached by the focus, the focus frame is drawn as a rectangle according to the shape feature of poster B located there.
  • the features of the third interface element include: shape, color, size, animation effect, and the like.
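  • The sketch below illustrates one way the focus frame could adopt such features (Java; the ElementFeatures type and the choice to copy every feature are assumptions for illustration only).

```java
// Minimal sketch: adapt the focus frame to the features (shape, color, size) of the third
// interface element, such as a poster, located at the predicted target position.
final class ElementFeatures {
    final String shape;        // e.g. "rectangle" or "triangle"
    final int color;           // e.g. 0xFFCC0000
    final double width, height;
    ElementFeatures(String shape, int color, double width, double height) {
        this.shape = shape; this.color = color; this.width = width; this.height = height;
    }
}

final class FocusFrameStyler {
    // Copy the relevant features of the element under the target position onto the focus frame.
    static ElementFeatures frameFeaturesFor(ElementFeatures elementAtTarget) {
        // Here the frame simply adopts the element's shape, color and size; an implementation
        // could equally keep its own color or ignore some features.
        return new ElementFeatures(elementAtTarget.shape, elementAtTarget.color,
                                   elementAtTarget.width, elementAtTarget.height);
    }
}
```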
  • The above method may be performed by a drawing component of an interface element as shown in FIG. 1d. The drawing component specifically includes the following modules: a motion trend information acquiring module, a target position prediction module, a drawing module, a system timer, and a core calculation module, where the core calculation module includes a first interface element moving distance estimator and a second interface element moving distance estimator.
  • the above step 11 can be completed by the motion trend information acquisition module.
  • the target position to which the first interface element is to be moved in step 12 may be predicted by the target position prediction module according to the parameter calculated by the first interface element moving distance estimator.
  • the function implementation manner of the first interface element moving distance estimator is as described above, and is not described here.
  • the system timer can set a fixed time interval as the sampling period, so that the position parameter obtaining module can sample the position parameter of the first interface element according to the time interval set by the system timer.
  • The second interface element moving distance estimator is used for determining the second interface element drawing position according to the target position and the set distance threshold.
  • the function realization manner of the second interface element moving distance estimator is as described above, and will not be described here.
  • The drawing module may draw the first interface element according to the predicted target position to which the first interface element is to be moved, and may draw the second interface element at the determined second interface element drawing position.
  • The method can predict the target location to which the first interface element is to be moved by calculating the information on the motion trend of the first interface element, and can then draw the second interface element based on that target location at a specific moment. This reduces the lag of the movement of the second interface element, thereby avoiding repeated movement operations by the user, reducing the occupation of system resources and improving the operating efficiency of the system.
  • the execution bodies of the steps of the method provided in Embodiment 1 may all be the same device, or the method may also be performed by different devices.
  • For example, the execution body of steps 11 and 12 may be a first interface element control, and the execution body of step 13 may be a second interface element control;
  • or the execution body of step 11 may be a first interface element control, and the execution body of steps 12 and 13 may be a second interface element control;
  • or the execution bodies of steps 11 to 13 may each be a second interface element control; and so on.
  • the embodiment of the present invention provides a method for drawing a focus or a focus frame in a digital television, which is used to solve the lag problem that occurs when the focus or the focus frame is moved in the prior art.
  • the schematic diagram of the specific implementation process of the method is shown in FIG. 2, and mainly includes the following steps:
  • Step 21: The digital television set-top box acquires the motion trend information of the first interface element.
  • The motion trend information of the focus is obtained by using the position parameters of the focus at different times, including: acquiring at least two position parameters of the focus, and the times at which the focus is located at the positions respectively indicated by the two position parameters; determining the moving direction of the focus according to the at least two position parameters; and determining the moving speed of the focus according to the at least two position parameters and the times at which the focus is located at the positions respectively indicated by the two position parameters.
  • The information on the movement trend of the focus can be obtained in, for example, either of the two manners described in the first embodiment.
  • Step 22: Predict, according to the acquired motion trend information of the focus, the target position to which the focus is to be moved.
  • Specifically, step 22 may include: predicting the target position to which the focus is to be moved according to the drawing period of the display interface, the moving speed, and the moving direction.
  • the digital television draws a display interface of each frame with a drawing period of 30 ms;
  • the focus moves along the horizontal X axis
  • the digital television receives the focus movement command, that is, the focus starts to move at time T0;
  • the position parameter of the focus at time T0 is (x0, y1).
  • sampling period is the same as the drawing period of the display interface, which is 30 ms.
  • the position parameter of the focus acquired at time t2 is (x2, y1).
  • the prediction of the target position to which the focus is to be moved includes the following sub-steps:
  • Sub-step 4: According to the position parameters of the focus acquired at time t1 and time t2, the moving direction of the focus can be determined to be along the X axis in the horizontal direction.
  • Sub-step 5: According to the moving direction of the focus and the distance it moves in a certain time, the target position to which the focus is to be moved can be predicted; that is, the focus moves along the horizontal X axis to reach the target position,
  • whose position parameter is (x2 + expectedDistanceX, y1).
  • Step 23: Draw the focus frame according to the target position.
  • In this case the determined second interface element drawing position is the current position of the focus frame.
  • For example, the position parameter of the target position to which the focus moves is (x3, y3), and this position is still within the display area currently occupied by the focus frame;
  • the focus frame is therefore redrawn at its current position.
  • The size, color, shape and the like of TV poster A and TV poster B may be different; therefore, when the focus frame is drawn on TV poster B, the size, color, shape and the like of the focus frame are matched with the features of TV poster B.
  • For this purpose, the embodiment of the present application provides a method in which the focus frame is drawn according to the features of the third interface element located at the position where the focus is when the focus has moved to the target position.
  • the focus frame is drawn in the target area.
  • For example, when the third interface element at the target position is a triangle, the focus frame is drawn as a triangle.
  • The method can predict the target position to which the focus (or the focus frame) is to be moved by calculating the motion trend information of the focus (or the focus frame), and can then draw the focus frame (or the focus) at the target position. This reduces the lag of the movement of the second interface element, thereby avoiding the unnecessary waste of processing resources caused by repeated movement operations of the user and improving the operating efficiency of the system.
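  • Putting the three steps of this embodiment together, the sketch below reuses the hypothetical TargetPredictor and FrameMovingDistanceEstimator sketches from the first embodiment to show one possible per-frame flow on the set-top box; the wiring and names are assumptions, not the patented code.

```java
// Minimal end-to-end sketch of steps 21-23: sample -> predict -> decide where to draw the frame.
final class FocusFrameTracker {
    interface Renderer { void drawFocusFrame(double x, double y); }

    private static final long DRAWING_PERIOD_MS = 30; // drawing period of the display interface

    // Called once per drawing period with the focus positions sampled at t1 and t2 (step 21).
    static void onFrame(double x1, double y1, long t1,
                        double x2, double y2, long t2,
                        double frameX, double frameY, double distanceThreshold, Renderer renderer) {
        // Step 22: estimate speed and direction from the two samples, then predict the target.
        double dt = Math.max(1, t2 - t1);
        double dx = x2 - x1, dy = y2 - y1;
        double dist = Math.hypot(dx, dy);
        double[] target = (dist == 0)
                ? new double[] { x2, y2 }
                : TargetPredictor.predict(x2, y2, dx / dist, dy / dist, dist / dt, DRAWING_PERIOD_MS);
        // Step 23: decide where the focus frame should be drawn and draw it there.
        double[] pos = FrameMovingDistanceEstimator.drawingPosition(
                target[0], target[1], frameX, frameY, distanceThreshold);
        renderer.drawFocusFrame(pos[0], pos[1]);
    }
}
```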
  • the embodiment of the present application provides a device for drawing an interface element, which is used to solve the lag problem that occurs when the interface element is moved in the prior art.
  • A schematic structural diagram of the device is shown in FIG. 3; the device mainly includes a motion trend information acquiring unit 31, a target position predicting unit 32, and a drawing unit 33.
  • the motion trend information acquiring unit is configured to obtain information that represents a trend of motion of the first interface element
  • a target location prediction unit configured to predict, according to the information, a target location to which the first interface element is to be moved
  • a drawing unit configured to draw the second interface element at the second interface element drawing position determined according to the target position.
  • the drawing unit is configured to: at a specific moment, draw the second interface element according to the target location.
  • the specific moment is no later than the moment when the first interface element moves to the target location.
  • The drawing unit is configured to: determine a first drawing position according to the target position and the set distance threshold, and draw the second interface element at the first drawing position at the specific moment; or determine, according to the target position, whether the first interface element will move outside the display area currently occupied by the second interface element at the specific moment, determine a second drawing position according to the determination result, and draw the second interface element at the second drawing position at the specific moment.
  • The drawing unit is configured to: when the distance between the target position and the position represented by the current position parameter of the second interface element is less than the set distance threshold, draw the second interface element, at the moment when the first interface element moves to the target position, at the position represented by the current position parameter of the second interface element; and when the distance between the target position and the position represented by the current position parameter of the second interface element is not less than the set distance threshold, draw the second interface element at the target position at the moment when the first interface element moves to the target position.
  • The drawing unit is configured to: when the distance between the target position and the position represented by the current position parameter of the second interface element is not less than the set distance threshold, draw the second interface element at the target position at the moment when the first interface element moves to the target position.
  • the drawing unit is configured to: acquire a feature of the third interface element at the target position at the specific time; and draw a second interface element according to the feature.
  • the features include at least one of the following: color, size, animation effect, shape, and the like.
  • The motion trend information acquiring unit is configured to: after receiving a first interface element position change triggering instruction, acquire information indicating a motion trend of the first interface element; or acquire, according to a preset position parameter sampling period, information that characterizes the motion trend of the first interface element.
  • the motion trend information acquiring unit is configured to: obtain information that characterizes a motion trend of the first interface element according to a preset position parameter sampling period.
  • the information characterizing the motion trend of the first interface element includes: a moving speed and a moving direction of the first interface element.
  • The motion trend information acquiring unit is configured to: acquire at least two position parameters of the first interface element, and the times at which the first interface element is located at the positions respectively indicated by the two position parameters; determine the moving direction of the first interface element according to the at least two position parameters; and determine the moving speed of the first interface element according to the at least two position parameters and the times at which the first interface element is located at the positions respectively indicated by the two position parameters.
  • The target location prediction unit is configured to: predict, according to the drawing period of the display interface, the moving speed, and the moving direction, the target position to which the first interface element is to be moved.
  • By calculating the motion trend information of the first interface element, the target position prediction unit can predict the target position to which the first interface element is to be moved, and the second interface element can then be drawn at the target position. This reduces the lag of the movement of the second interface element, thereby avoiding repeated movement operations by the user, reducing the occupation of system resources, and improving the operating efficiency of the system.
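  • A minimal structural sketch of this device follows (Java), with the three units expressed as hypothetical interfaces and simple double[] payloads; how the units exchange data is an assumption made only for illustration.

```java
// Minimal structural sketch of the drawing device of FIG. 3 (units 31-33).
final class InterfaceElementDrawingDevice {
    interface MotionTrendInfoAcquiringUnit { double[] acquireTrend(); }            // unit 31: {dirX, dirY, speed}
    interface TargetLocationPredictionUnit { double[] predict(double[] trend); }   // unit 32: {targetX, targetY}
    interface DrawingUnit                  { void draw(double[] targetLocation); } // unit 33

    private final MotionTrendInfoAcquiringUnit acquiringUnit;
    private final TargetLocationPredictionUnit predictionUnit;
    private final DrawingUnit drawingUnit;

    InterfaceElementDrawingDevice(MotionTrendInfoAcquiringUnit a,
                                  TargetLocationPredictionUnit p,
                                  DrawingUnit d) {
        this.acquiringUnit = a; this.predictionUnit = p; this.drawingUnit = d;
    }

    // One update cycle: acquire the motion trend, predict the target location,
    // then draw the second interface element at the determined drawing position.
    void update() {
        drawingUnit.draw(predictionUnit.predict(acquiringUnit.acquireTrend()));
    }
}
```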
  • embodiments of the present invention can be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or a combination of software and hardware. Moreover, the invention can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.
  • the computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture comprising the instruction device.
  • The instruction device implements the functions specified in one or more flows of the flowchart and/or in one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device such that a series of operational steps are performed on a computer or other programmable device to produce computer-implemented processing for execution on a computer or other programmable device.
  • the instructions provide steps for implementing the functions specified in one or more of the flow or in a block or blocks of a flow diagram.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer readable medium, such as read only memory (ROM) or flash memory.
  • Memory is an example of a computer readable medium.
  • Computer readable media includes both permanent and non-persistent, removable and non-removable media.
  • Information storage can be implemented by any method or technology.
  • the information can be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic tape cassettes, magnetic tape storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device.
  • As defined herein, computer readable media does not include transitory computer readable media, such as modulated data signals and carrier waves.
  • embodiments of the present application can be provided as a method, system, or computer program product.
  • the present application can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment in combination of software and hardware.
  • the application can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.

Abstract

The invention relates to a drawing method for an interface element, which is used to solve the problem of unnecessary waste of processing resources caused by adopting the interface element position tracking method of the prior art. The method comprises the steps of: acquiring information indicating a movement trend of a first interface element; predicting, according to the information, a target position to which the first interface element is to be moved; and drawing, according to the target position, a second interface element. The invention also relates to a drawing device for an interface element.
PCT/CN2016/080306 2015-05-27 2016-04-27 Procédé et dispositif de tracé pour élément d'interface WO2016188289A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510278764.1A CN106303652B (zh) 2015-05-27 2015-05-27 一种界面元素的绘制方法及装置
CN201510278764.1 2015-05-27

Publications (1)

Publication Number Publication Date
WO2016188289A1 true WO2016188289A1 (fr) 2016-12-01

Family

ID=57392532

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/080306 WO2016188289A1 (fr) 2015-05-27 2016-04-27 Procédé et dispositif de tracé pour élément d'interface

Country Status (2)

Country Link
CN (1) CN106303652B (fr)
WO (1) WO2016188289A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110990005A (zh) * 2019-11-20 2020-04-10 金现代信息产业股份有限公司 网页元素定位方法
WO2021249104A1 (fr) * 2020-06-09 2021-12-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Système et procédé de détermination de la position d'un nouvel élément d'interface sur une interface utilisateur

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107835461B (zh) * 2017-11-02 2021-09-28 深圳市雷鸟网络传媒有限公司 焦点移动控制方法、智能电视及计算机可读存储介质
CN108388465B (zh) * 2018-03-15 2021-04-23 阿里巴巴(中国)有限公司 动态形变开关组件的实现方法、装置及终端
CN110737963B (zh) * 2019-12-20 2020-03-31 广东博智林机器人有限公司 海报元素布局方法、系统和计算机可读存储介质
CN113821152A (zh) * 2020-11-06 2021-12-21 北京沃东天骏信息技术有限公司 界面元素的位置确定方法及装置、介质和设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101814031A (zh) * 2010-04-22 2010-08-25 四川长虹电器股份有限公司 用户界面元素的焦点框样式可定制的实现方法
WO2014031191A1 (fr) * 2012-08-20 2014-02-27 Google Inc. Focalisation sur un élément d'interface d'utilisateur en fonction du regard de l'utilisateur
WO2014200811A1 (fr) * 2013-06-12 2014-12-18 Microsoft Corporation Interface graphique utilisateur commandée par la focalisation de l'utilisateur à l'aide d'un dispositif monté sur la tête
CN104461256A (zh) * 2014-12-30 2015-03-25 广州视源电子科技股份有限公司 界面元素显示方法和系统
CN104486686A (zh) * 2014-12-23 2015-04-01 深圳市九洲电器有限公司 一种电子节目菜单的导航方法和系统

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019518B (zh) * 2012-12-14 2016-06-22 广东欧珀移动通信有限公司 一种自动调整人机交互界面的方法
CN103530040B (zh) * 2013-10-22 2016-03-30 腾讯科技(深圳)有限公司 目标元素移动方法、装置及电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101814031A (zh) * 2010-04-22 2010-08-25 四川长虹电器股份有限公司 用户界面元素的焦点框样式可定制的实现方法
WO2014031191A1 (fr) * 2012-08-20 2014-02-27 Google Inc. Focalisation sur un élément d'interface d'utilisateur en fonction du regard de l'utilisateur
WO2014200811A1 (fr) * 2013-06-12 2014-12-18 Microsoft Corporation Interface graphique utilisateur commandée par la focalisation de l'utilisateur à l'aide d'un dispositif monté sur la tête
CN104486686A (zh) * 2014-12-23 2015-04-01 深圳市九洲电器有限公司 一种电子节目菜单的导航方法和系统
CN104461256A (zh) * 2014-12-30 2015-03-25 广州视源电子科技股份有限公司 界面元素显示方法和系统

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110990005A (zh) * 2019-11-20 2020-04-10 金现代信息产业股份有限公司 网页元素定位方法
CN110990005B (zh) * 2019-11-20 2023-05-30 金现代信息产业股份有限公司 网页元素定位方法
WO2021249104A1 (fr) * 2020-06-09 2021-12-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Système et procédé de détermination de la position d'un nouvel élément d'interface sur une interface utilisateur

Also Published As

Publication number Publication date
CN106303652A (zh) 2017-01-04
CN106303652B (zh) 2019-09-06

Similar Documents

Publication Publication Date Title
WO2016188289A1 (fr) Procédé et dispositif de tracé pour élément d'interface
US10445132B2 (en) Method and apparatus for switching applications
KR102139587B1 (ko) 디스플레이 성능 제어
CN106547420B (zh) 一种页面处理方法和装置
US20180011818A1 (en) Webpage Update Method And Apparatus
EP2592537B1 (fr) Procédé et appareil de désignation de zone complète par toucher de la zone partielle dans un équipement portatif
AU2016203156B2 (en) Processing touch gestures in hybrid applications
US20180109751A1 (en) Electronic device and method for controlling the same
CN105824491B (zh) 一种在移动设备中的分屏处理方法和装置
WO2017202314A1 (fr) Procédé et dispositif de traitement de données
WO2016101810A1 (fr) Procédé pour commuter un objet d'affichage dans un système à fenêtres multiples et dispositif associé
CN103888605A (zh) 一种信息处理的方法及电子设备
CN110102044B (zh) 基于智能手环的游戏控制方法、智能手环及存储介质
WO2016179912A1 (fr) Procédé et appareil de commande de programme d'application, et terminal mobile
US10061390B2 (en) Building space control
CN109656639B (zh) 一种界面滚动方法、装置、设备及介质
US20140362109A1 (en) Method for transforming an object and electronic device thereof
WO2015103819A1 (fr) Procédé et dispositif d'obtention d'une exploitation en lot rapide sur un écran tactile
JP2016531356A (ja) ウィジェットエリアの調整方法および調整装置
WO2016101816A1 (fr) Procédé et dispositif pour l'affichage d'informations dans une messagerie instantanée
US11314388B2 (en) Method for viewing application program, graphical user interface, and terminal
EP3614256A1 (fr) Procédé de traitement de données, dispositif informatique et support de stockage
CN105282521A (zh) 一种网络摄像机巡航中的运动检测方法及装置
CN114489314A (zh) 增强现实影像显示方法及相关装置
CN105824590B (zh) 一种在移动设备中的分屏处理方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16799178

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16799178

Country of ref document: EP

Kind code of ref document: A1