WO2020087218A1 - Interface control method and electronic terminal - Google Patents

Interface control method and electronic terminal

Info

Publication number
WO2020087218A1
WO2020087218A1 (PCT/CN2018/112469)
Authority
WO
WIPO (PCT)
Prior art keywords
interactive interface
boundary
display area
touch display
trigger
Prior art date
Application number
PCT/CN2018/112469
Other languages
English (en)
French (fr)
Inventor
王金周
付洋
Original Assignee
深圳市柔宇科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市柔宇科技有限公司 filed Critical 深圳市柔宇科技有限公司
Priority to CN201880096005.5A priority Critical patent/CN112740166A/zh
Priority to PCT/CN2018/112469 priority patent/WO2020087218A1/zh
Publication of WO2020087218A1 publication Critical patent/WO2020087218A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • This application relates to the technical field of interface adjustment, in particular to an interface control method and an electronic terminal.
  • in traditional screen zooming and moving, the original display image is reduced and displayed in a designated small-screen area by modifying the size and position of the underlying display area.
  • as a result, the screen cannot be moved and zoomed freely, which makes for a poor user experience.
  • This application provides an interface control method and an electronic terminal.
  • the present application provides an interface control method applied to an electronic terminal.
  • the electronic terminal includes a touch display area, and the touch display area displays an interactive interface.
  • the control method includes:
  • the interface control method can adjust the position and size of the interactive interface, so that the interactive interface can be zoomed and adjusted according to needs, thereby improving user experience.
  • a touch monitoring module configured to detect a trigger event in the touch display area;
  • a processing module configured to determine whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface; if it does, to determine whether a continuous trigger point is detected in the touch display area after the trigger event; and, if a continuous trigger point is detected in the touch display area after the trigger event, to adjust the position or size of the interactive interface according to the trigger event and the continuous trigger point;
  • a display module configured to display the adjusted interactive interface in the touch display area.
  • An electronic terminal includes a touch display area, a processor, and a memory; the processor is connected to the touch display area, the touch display area is used to display an interactive interface, and the memory stores computer-readable instructions that, when executed by the processor, cause the processor to execute the interface control method of the foregoing embodiment.
  • the electronic terminal can adjust the position and size of the interactive interface, so that the interactive interface can be zoomed and adjusted according to needs, and the user experience is improved.
  • FIG. 1 is a flowchart of an interface control method according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a touch display area and an interactive interface of an electronic terminal according to an embodiment of the present application.
  • FIG. 3 is a flowchart of an interface control method according to an embodiment of the present application.
  • FIG. 4 is a sub-flow diagram of step S161 in FIG. 3.
  • FIG. 5 is a schematic diagram of adjustment of an interactive interface of an electronic terminal according to an embodiment of the present application.
  • FIG. 6 is another schematic diagram of adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 7 is another schematic diagram of adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 8 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 9 is a sub-flow diagram of step S162 in FIG. 3.
  • FIG. 10 is a schematic diagram of yet another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 11 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 12 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 13 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 14 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 15 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 16 is another flowchart of the interface control method according to the embodiment of the present application.
  • FIG. 17 is another flowchart of the interface control method according to the embodiment of the present application.
  • FIG. 18 is a sub-flow diagram of step S18 in FIG. 17.
  • FIG. 19 is a schematic diagram of adjustment of an interactive interface of an electronic terminal according to an embodiment of the present application.
  • FIG. 20 is another schematic diagram of adjusting the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 21 is a schematic diagram of another adjustment of the interactive interface of the electronic terminal according to the embodiment of the present application.
  • FIG. 22 is a schematic block diagram of an electronic terminal according to an embodiment of the present application.
  • FIG. 23 is another schematic block diagram of an electronic terminal according to an embodiment of the present application.
  • FIG. 24 is a schematic structural diagram of an electronic terminal according to an embodiment of the present application.
  • Reference numerals: electronic terminal 1000, electronic terminal 100, touch display area 101, input focus area 1011, processor 1001, memory 1002, touch display screen 1003, system bus 1004, zoom trigger point 11, interactive interface 12, movement trigger point 13, touch monitoring module 102, processing module 103, display module 104, focus correction module 105.
  • The terms “first” and “second” are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature qualified by “first” or “second” may explicitly or implicitly include one or more such features. In the description of this application, “plurality” means two or more, unless otherwise specifically defined.
  • the terms “mounted”, “connected”, and “connection” should be understood in a broad sense: a connection may be fixed, detachable, or integral; it may be mechanical or electrical, or the elements may communicate with each other; it may be direct, or indirect through an intermediary; and it may be an internal connection between two elements or an interaction between two elements.
  • a first feature being “on” or “under” a second feature may include the first and second features being in direct contact, or being in contact not directly but through another feature between them.
  • a first feature being “on”, “above”, or “over” a second feature includes the first feature being directly above or obliquely above the second feature, or simply means that the first feature is at a higher level than the second feature.
  • a first feature being “under”, “below”, or “beneath” a second feature includes the first feature being directly below or obliquely below the second feature, or simply means that the first feature is at a lower level than the second feature.
  • the interface control method of the embodiment of the present application is applied to the electronic terminal 100.
  • the electronic terminal 100 includes a touch display area 101, and the touch display area 101 displays an interactive interface 12.
  • the control method includes:
  • Step S11: detect a trigger event in the touch display area 101;
  • Step S12: determine whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface 12;
  • if so, step S13: determine whether a continuous trigger point is detected in the touch display area 101 after the trigger event;
  • if so, step S14: adjust the position or size of the interactive interface 12 according to the trigger event and the continuous trigger point;
  • step S15: display the adjusted interactive interface 12 in the touch display area 101.
  • the interface control method can adjust the position and size of the interactive interface 12, so that the interactive interface 12 can be zoomed and adjusted according to needs, thereby improving user experience.
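  • The step flow above can be sketched in code. The following is only an illustrative reconstruction in Python; the helper names (detect_trigger_event, track_continuous_points, classify_trigger, and so on) are assumptions for illustration and are not defined by the patent.

```python
# Illustrative sketch of steps S11-S15 (not the patent's actual implementation).
def control_interface(touch_area, interface):
    event = touch_area.detect_trigger_event()           # S11: first touch point of a new trigger event
    kind = classify_trigger(event, interface)            # S12: position or size adjustment condition?
    if kind is None:
        return
    points = touch_area.track_continuous_points(event)   # S13: uninterrupted drag after the event
    if not points:
        return
    end_point = points[-1]                                # the "moving point" at the end of the drag
    if kind == "move":                                    # S14: adjust position or size
        interface.move_by(offset_between(event, end_point))
    else:
        interface.scale_by(zoom_factor_between(event, end_point))
    touch_area.display(interface)                         # S15: show the adjusted interactive interface
```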
  • the trigger event indicates that a touch point is detected for the first time in the touch area; in other words, there is no previous trigger point continuous with the trigger event.
  • the "continuous trigger point” referred to above means a continuous trigger point after the trigger point corresponding to the trigger event.
  • the above-mentioned continuous trigger point means that the touch display area 101 detects that the user presses the trigger point and drags the trigger point without disconnection in the middle. For example, if the user presses the trigger point and then leaves the trigger point, and then presses the trigger point again, it belongs to a discontinuous trigger point.
  • the continuous trigger point includes a sliding operation.
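  • As a rough illustration of how a continuous trigger point (an uninterrupted press-and-drag) could be distinguished from separate taps, the sketch below buffers touch samples until a lift-off is seen. The event names ("down", "move", "up") are assumptions for illustration, not part of the patent.

```python
def collect_continuous_points(touch_events):
    """Return the drag points that follow the initial trigger point.

    touch_events is assumed to be an iterable of (kind, x, y) tuples where
    kind is "down", "move", or "up"; the sequence counts as continuous only
    while no lift-off ("up") occurs between samples.
    """
    points = []
    started = False
    for kind, x, y in touch_events:
        if kind == "down":
            if started:            # a second press after a lift-off is a new,
                break              # discontinuous trigger point
            started = True
            points.append((x, y))  # trigger point of the trigger event
        elif kind == "move" and started:
            points.append((x, y))  # continuous trigger points (sliding operation)
        elif kind == "up":
            break                  # the continuous trigger point is interrupted
    return points
```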
  • the electronic terminal 1000 includes a touch display screen 1003 that has a touch display area 101.
  • the touch display screen 1003 lights up, the touch display area 101 displays the interaction interface 12 to enable the user to interact with the electronic terminal 1000.
  • the touch display screen 1003 may be a flexible screen (such as an OLED screen).
  • the touch display screen 1003 may be a capacitive touch screen.
  • detecting the trigger event on the touch display area 101 includes detecting a trigger point on the touch display area 101.
  • only after the touch display area 101 has detected a trigger point can the subsequent adjustment of the interactive interface 12 be performed according to that trigger point.
  • the interactive interface 12 includes a first area and a second area.
  • when the trigger event is located in the first area, the trigger event corresponds to the position adjustment trigger condition of the interactive interface 12; when the trigger event is located in the second area, the trigger event corresponds to the size adjustment trigger condition of the interactive interface 12.
  • the setting of the first area and the second area makes operation convenient for the user.
  • the user only needs to press a trigger point in the corresponding area to adjust the interactive interface 12, which facilitates one-handed operation and improves the user experience.
  • the first area includes: the area surrounded by the title bar and the border of the interactive interface 12; the second area includes: the border of the interactive interface 12 or the four corners of the interactive interface 12.
  • This setting facilitates the user's one-handed operation and enhances the user experience.
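  • One possible way to decide which trigger condition a touch corresponds to is a simple hit test against the interface rectangle; the following is a sketch under that assumption, and the edge margin used to approximate "border or corners" is an assumed value for illustration only.

```python
def classify_trigger_region(x, y, left, top, right, bottom, edge=24):
    """Return "resize" if (x, y) lies on the border/corners of the interactive
    interface (second area), "move" if it lies inside the interface (first
    area, i.e. the title bar plus the region enclosed by the border),
    otherwise None."""
    inside = left <= x <= right and top <= y <= bottom
    if not inside:
        return None
    on_border = (x - left <= edge or right - x <= edge or
                 y - top <= edge or bottom - y <= edge)
    return "resize" if on_border else "move"
```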
  • the interface control method includes:
  • Step S16: determine whether the starting point of the trigger event is the movement trigger point 13 in the first area or the zoom trigger point 11 in the second area;
  • if the starting point is the movement trigger point 13, step S161: determine that the trigger event corresponds to the position adjustment trigger condition of the interactive interface 12;
  • if the starting point is the zoom trigger point 11, step S162: determine that the trigger event corresponds to the size adjustment trigger condition of the interactive interface 12.
  • the setting of the first area and the second area makes it convenient to detect whether the user intends to move or to zoom the interactive interface 12, and the size or position of the interactive interface 12 is then adjusted according to the user's intention.
  • for example, when the movement trigger point 13 in the first area is clicked, the electronic terminal 100 determines that the trigger event is a position adjustment of the interactive interface 12, so that the position of the interactive interface 12 can be adjusted according to the moving point at the end of the continuous trigger point.
  • this adjustment enables the electronic terminal 100 to move the interactive interface 12 by the distance the user moves, so that the interactive interface 12 can be displayed at different positions in the touch display area 101.
  • as another example, when the zoom trigger point 11 in the second area is clicked, the electronic terminal 100 determines that the trigger event is a size adjustment of the interactive interface 12, so that the size of the interactive interface 12 can be adjusted according to the moving point at the end of the continuous trigger point; this enables the electronic terminal 100 to adjust the boundary of the interactive interface 12 according to the distance the user moves, so that the boundary of the interactive interface 12 is scaled in the same proportion as the boundary of the touch display area 101.
  • the above-mentioned moving point refers to a trigger point at the end of a continuous trigger point.
  • the zoom trigger point 11 in the second area and the movement trigger point 13 in the first area can be located at any position on the touch display area 101.
  • the zoom trigger point 11 and the movement trigger point 13 may be points that users habitually touch when using the touch display area 101.
  • for example, the zoom trigger point 11 in the second area is located on the border of the touch display area 101,
  • and the movement trigger point 13 in the first area is located within the touch display screen 1003.
  • the movement trigger point 13 or the zoom trigger point 11 may be manipulated by the user's finger pressing or touching the zoom trigger point 11 or the movement trigger point 13 on the touch display area 101, or in a non-contact manner by the finger approaching the touch display area 101 near the zoom trigger point 11 or the movement trigger point 13.
  • for example, the user's finger hovers at a position above the zoom trigger point 11 or the movement trigger point 13.
  • step S161 includes:
  • Step S1611: determine the offset of the interactive interface 12 according to the trigger event and the travel distance of the continuous trigger point;
  • Step S1612: determine the post-trigger boundary of the interactive interface 12 according to the offset;
  • Step S1613: compare the post-trigger boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the post-trigger boundary of the interactive interface 12 crosses the boundary;
  • if so, step S1614: perform boundary correction on the interactive interface 12;
  • if not, step S1615: set the post-trigger boundary as the boundary of the interactive interface 12.
  • the interactive interface 12 can be accurately displayed on the touch display area 101, and the interactive accuracy of the interactive interface 12 is improved.
  • the length of the interactive interface 12 is 1536 and the width is 864.
  • the trigger point corresponding to the trigger event is the movement trigger point 13, and the coordinates of its starting point C are (860, 1500).
  • the continuous trigger point moves the starting point C to the right by 100.
  • the coordinates of the moving point C1 change to (960, 1500).
  • the offset of the moving point C1 relative to the starting point C is 100 to the right and 0 downward.
  • accordingly, the interactive interface 12 is offset 100 to the right and 0 downward; by calculation, the upper-left point A2 of the moved interactive interface 12 changes to (100, 0) and the lower-right point B2 changes to (964, 1536).
  • judging whether the boundary of the interactive interface 12 crosses the boundary means the following: while the movement trigger point 13 is moved, the coordinates of the moving point C1 are obtained and the movement offset is calculated, giving the coordinates of the upper-left and lower-right points of the moved interactive interface 12; these are then compared with the coordinates of the upper-left and lower-right points of the touch display area 101. If the coordinates of the interactive interface 12 fall outside the range defined by those of the touch display area 101, the boundary is crossed and the coordinates of the moving point must be re-acquired.
  • the starting point C is shifted to the right by 300.
  • the coordinates of the moving point C2 of the movement trigger point 13 change to (1160, 1500); the offset of the moving point C2 relative to the starting point C is 300 to the right and 0 downward.
  • accordingly, the interactive interface 12 as a whole is offset 300 to the right and 0 downward.
  • by calculation, the upper-left point A3 of the moved interactive interface 12 changes to (300, 0) and the lower-right point B3 changes to (1164, 1536); as the figure shows, the interactive interface 12 is now too large for the display and out of bounds.
  • the width of the interactive interface 12 now extends to 1164, which is greater than the width 1080 of the touch display area 101.
  • the boundary of the interactive interface 12 has exceeded the boundary of the touch display area 101; the interactive interface 12 is out of bounds, and its boundary needs to be corrected.
  • since the coordinates of the lower-right point B3 of the interactive interface 12 are (1164, 1536), the interactive interface 12 is out of bounds: it has been shifted 300 to the right and exceeds the touch display area 101 by a distance of 84.
  • the processing module 103 shifts the moving point C2 to the left by 84, which drives the interactive interface 12 to shift left by 84, as shown in FIG. 8.
  • the interactive interface 12 is then located within the touch display area 101: the coordinates of the moving point C3 change to (1076, 1500), the upper-left point A4 of the interactive interface 12 changes to (216, 0), and the lower-right point B4 changes to (1080, 1536), so the interactive interface 12 lies within the touch display area 101 without crossing the boundary.
  • the interactive interface 12 can be automatically corrected when it crosses the boundary, so that the interactive interface 12 does not cross the boundary, which is convenient for the user to operate with one hand.
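  • The position adjustment and boundary correction described above can be sketched as follows, using the numbers from the example (an 864x1536 interface inside a 1080x1920 display). This is an illustrative reconstruction, not the patent's own code.

```python
def move_with_correction(iface, disp, dx, dy):
    """iface and disp are (left, top, right, bottom) rectangles.
    Shift iface by (dx, dy) and push it back inside disp if it crosses
    the display boundary (the 'boundary correction' of step S1614)."""
    l, t, r, b = (iface[0] + dx, iface[1] + dy, iface[2] + dx, iface[3] + dy)
    # amount by which the moved interface exceeds the display on each axis
    dx_fix = min(0, disp[2] - r) + max(0, disp[0] - l)
    dy_fix = min(0, disp[3] - b) + max(0, disp[1] - t)
    return (l + dx_fix, t + dy_fix, r + dx_fix, b + dy_fix)

# Worked example from the text: moving right by 300 exceeds the display by 84,
# so the interface is pushed back left by 84.
print(move_with_correction((0, 0, 864, 1536), (0, 0, 1080, 1920), 300, 0))
# -> (216, 0, 1080, 1536), i.e. points A4(216, 0) and B4(1080, 1536)
```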
  • step S162 includes:
  • Step S1621: determine the zoom factor of the interactive interface 12 according to the trigger event and the travel distance of the continuous trigger point;
  • Step S1622: determine the post-trigger boundary of the interactive interface 12 according to the zoom factor;
  • Step S1623: compare the post-trigger boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the post-trigger boundary of the interactive interface 12 crosses the boundary;
  • if so, step S1624: perform boundary correction on the interactive interface 12;
  • if not, step S1625: set the post-trigger boundary as the boundary of the interactive interface 12.
  • the interactive interface 12 can be accurately displayed on the touch display area 101, and the interactive accuracy of the interactive interface 12 is improved.
  • the length of the interactive interface 12 is 1920, and the width is 1080.
  • the coordinate of the starting point D of the zoom trigger point 11 is (1080, 860)
  • the starting point D is shifted to the left by 216
  • the coordinate of the moving point D1 of the zoom trigger point 11 is changed to (864, 860), as shown in FIG. 10,
  • the offset of the moving point D1 from the starting point D to the left is 216.
  • the coordinate of the point A2 in the upper left corner of the interactive interface 12 is (0, 0)
  • the coordinate of the point B2 in the lower right corner is (864, 1920). 864 is divided by 1080 to obtain 0.8.
  • the zoom factor is 0.8.
  • to make the interactive interface 12 scale in the same proportion as the touch display area 101, the unchanged value in the coordinates of the lower-right point B2 is multiplied by 0.8, which gives the scaled interactive interface 12; as shown in FIG. 11, the coordinates of the upper-left point A3 of the scaled interactive interface 12 are (0, 0) and those of the lower-right point B3 are (864, 1536).
  • that is, the lower-right point B3 of the interactive interface 12 is offset 384 upward and 216 to the left relative to the original lower-right point B1.
  • the coordinate values of the lower-right point B3 of the interactive interface 12 are 0.8 times those of the lower-right point B of the touch display area 101, i.e., the zoom factor is 0.8.
  • in this way, the change in the coordinates of the moving point D1 relative to the starting point D is used to calculate the offset and obtain the zoom factor, and the interactive interface 12 is scaled proportionally according to that zoom factor.
  • the zoom factor is obtained by dividing the horizontal coordinate of the moving point of the zoom trigger point 11 by the horizontal coordinate of its starting point, or the vertical coordinate of the moving point by the vertical coordinate of its starting point.
  • to avoid the image being enlarged beyond the physical size of the screen or shrunk so much that the user cannot operate the interactive interface 12, the zoom factor is restricted to a preset interval, for example [0.3, 1].
  • judging whether the boundary of the interactive interface 12 crosses the boundary means: while the zoom trigger point 11 is moved, the coordinates of the moving point are obtained and the zoom factor is calculated; the obtained zoom factor is compared with the numerical interval, and if it is not within the preset zoom factor interval, the boundary is crossed.
  • the starting point D is shifted to the left by 864, and the coordinate of the moving point D2 of the zoom trigger point 11 is changed to (216, 860),
  • the offset of the moving point D2 to the left from the starting point D is 864.
  • the coordinate of the point A4 in the upper left corner of the interactive interface 12 is (0, 0)
  • the coordinate of the point B4 in the lower right corner is (216, 1920).
  • 216 is divided by 1080 to obtain 0.2.
  • the zoom factor is 0.2.
  • to make the interactive interface 12 scale in the same proportion as the touch display area 101, the unchanged value in the coordinates of the lower-right point B4 is multiplied by 0.2, which gives the scaled interactive interface 12; as shown in FIG. 13, the coordinates of the upper-left point A5 of the scaled interactive interface 12 are (0, 0) and those of the lower-right point B5 are (216, 384), that is, the lower-right point B5 of the interactive interface 12 is offset 1536 upward and 864 to the left.
  • the coordinate values of the lower-right point B5 of the interactive interface 12 are 0.2 times those of the lower-right point B of the touch display area 101, i.e., the zoom factor is 0.2.
  • as the figure shows, at a zoom factor of 0.2 the size of the interactive interface 12 is too small and out of bounds.
  • an interactive interface 12 with a width of 216 and a length of 384 is too small for the user to operate conveniently; therefore, the boundary of the interactive interface 12 needs to be corrected.
  • since the coordinates of the lower-right point B5 of the interactive interface 12 are (216, 384), the interactive interface 12 is out of bounds: its zoom factor of 0.2 is not within the interval 0.3 to 1.
  • the processing module 103 therefore changes the zoom factor to 0.3.
  • the coordinates of the moving point D3 change to (324, 860), the coordinates of the upper-left point A6 of the interactive interface 12 are (0, 0), and the coordinates of the lower-right point B6 change to (324, 576).
  • the interactive interface 12 is located in the touch display area 101 without crossing the boundary.
  • the interactive interface 12 can be automatically corrected when it crosses the boundary, so that the interactive interface 12 does not cross the boundary, which is convenient for the user to operate with one hand.
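  • The zoom-factor computation and its clamping to the preset interval [0.3, 1] can be sketched like this. It is an illustrative reconstruction using the example's numbers; the interval and the anchoring at the upper-left corner follow the worked example and are otherwise assumptions.

```python
def zoom_with_correction(iface, start_x, moved_x, factor_range=(0.3, 1.0)):
    """iface is (left, top, right, bottom) anchored at the upper-left corner.
    The zoom factor is the moved x-coordinate of the zoom trigger point
    divided by its starting x-coordinate, clamped to factor_range
    (the 'boundary correction' of step S1624)."""
    factor = moved_x / start_x
    factor = max(factor_range[0], min(factor_range[1], factor))
    l, t, r, b = iface
    return (l, t, l + (r - l) * factor, t + (b - t) * factor), factor

# Example from the text: dragging from x=1080 to x=864 gives factor 0.8,
# scaling a 1080x1920 interface to 864x1536.
print(zoom_with_correction((0, 0, 1080, 1920), 1080, 864))
# -> ((0, 0, 864.0, 1536.0), 0.8)
# Dragging to x=216 would give 0.2, which is clamped to 0.3 (width 324, length 576).
```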
  • the interface control method includes:
  • after the size or position adjustment of the interactive interface 12 is completed, the input focus area 1011 of the touch display area 101 is adjusted so that the input focus area 1011 of the interactive interface 12 matches the adjusted input focus area 1011 of the touch display area 101.
  • the interface control method includes:
  • Step S17: determine whether the continuous trigger point is interrupted;
  • if so, step S18: adjust the input focus area 1011 of the touch display area 101 according to the boundary that has been set for the interactive interface 12, so that the interactive interface 12 matches the input focus area 1011.
  • generally, when the continuous trigger point is interrupted, the electrical signal (such as a voltage) output by the touch display area 101 drops to the level of an untouched touch display area 101, so the electrical signal can be used to determine whether the continuous trigger point is interrupted. For example, if a finger presses the movement trigger point 13 or the zoom trigger point 11, moves to a certain position on the touch display area 101, and is then lifted, the electrical signal output by the touch display area 101 drops to the level of an untouched touch display area 101, and it can therefore be determined that the continuous trigger point has been interrupted. If the finger keeps pressing the touch display area 101 without moving, the output electrical signal remains large, and it can be determined that the continuous trigger point is not interrupted.
  • when the continuous trigger point is interrupted, the adjusted interactive interface 12 is obtained.
  • since the size or position of the interface region changes after the adjustment, input focus correction needs to be performed on the adjusted interactive interface 12 so that the content of the adjusted interactive interface 12 matches the content of the interactive interface 12 before adjustment.
  • the interface control method includes:
  • Step S181: determine the boundary of the input focus area 1011 of the touch display area 101;
  • Step S182: determine the boundary of the input focus area 1011 of the adjusted interactive interface 12;
  • Step S183: according to the trigger event and the continuous trigger point, adjust the boundary of the input focus area 1011 of the touch display area 101 so that the boundary of the adjusted input focus area 1011 of the touch display area 101 matches the boundary of the input focus area 1011 of the adjusted interactive interface 12.
  • the content of the interactive interface 12 after adjustment can be matched with the content of the interactive interface 12 before adjustment.
  • the touch display area 101 is rectangular, the resolution of the touch display area 101 is 1080 * 1920, and the upper and left borders of the touch display area 101 are used as the xy axis to establish a coordinate system.
  • the coordinates of the upper-left point A of the touch display area 101 are (0, 0)
  • the coordinate of the point B in the lower right corner is (1080, 1920)
  • the length of the touch display area 101 is 1920 and the width is 1080.
  • the coordinate of the point A1 in the upper left corner of the interactive interface 12 is (0, 0) and the coordinate of the point B1 in the lower right corner is (1080, 1920).
  • the length of the interactive interface 12 is 1920 and the width is 1080.
  • assume also that the input focus area 1011 of the touch input area is rectangular, with the upper-left point E of the input focus area 1011 at (500, 800) and the lower-right point F at (600, 1060), and that the zoom factor is 0.5. After adjustment, referring to FIG. 20, the coordinates of the upper-left point A2 of the interactive interface 12 are (0, 0) and those of the lower-right point B2 are (540, 960); the boundary of the input focus area 1011 therefore needs to be adjusted.
  • the coordinates of the boundary points of the input focus area 1011 are each multiplied by the zoom factor 0.5 (see FIG. 21 and the sketch below).
  • the coordinates of the upper-left point E1 of the input focus area 1011 are then (250, 400).
  • the coordinates of the lower-right point F1 of the input focus area 1011 are (300, 530).
  • in this way, the input focus area 1011 of the adjusted interactive interface 12 is matched with the input focus area 1011 of the touch display area 101, so that the content of the adjusted interactive interface 12 matches the content of the touch input area, which improves the interaction accuracy of the interactive interface 12.
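  • A small sketch of the input-focus correction described in this example (multiplying the focus-area boundary points by the zoom factor); the rectangle representation is an assumption for illustration.

```python
def rescale_focus_area(focus, factor):
    """focus is (left, top, right, bottom); every boundary coordinate is
    multiplied by the zoom factor so that the focus area of the touch
    display area matches the focus area of the scaled interactive interface."""
    return tuple(v * factor for v in focus)

# Example from the text: E(500, 800), F(600, 1060) with zoom factor 0.5
print(rescale_focus_area((500, 800, 600, 1060), 0.5))
# -> (250.0, 400.0, 300.0, 530.0), i.e. points E1(250, 400) and F1(300, 530)
```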
  • an electronic terminal 100 which includes:
  • the touch monitoring module 102 is used to detect a trigger event in the touch display area 101;
  • the processing module 103 is used to determine whether the trigger event corresponds to the position or size adjustment trigger condition of the interactive interface 12; if it does, to determine whether a continuous trigger point is detected in the touch display area 101 after the trigger event; and, if a continuous trigger point is detected in the touch display area 101 after the trigger event, to adjust the position or size of the interactive interface 12 according to the trigger event and the continuous trigger point;
  • the display module 104 is used to display the adjusted interactive interface 12 on the touch display area 101.
  • the electronic terminal 100 can adjust the position and size of the interactive interface 12, so that the interactive interface 12 can be zoomed and adjusted according to needs, thereby improving user experience.
  • the interface control method according to the embodiment of the present application can be applied to the electronic terminal 100. It should be noted that the above explanations of the implementation manner and beneficial effects of the interface control method are also applicable to the electronic terminal 100 of this embodiment. In order to avoid redundancy, they will not be detailed here.
  • the touch monitoring module 102 is used to determine that, when the trigger event is located in the first area, the trigger event corresponds to the position adjustment trigger condition of the interactive interface 12, and that, when the trigger event is located in the second area, the trigger event corresponds to the size adjustment trigger condition of the interactive interface 12.
  • when the position of the interactive interface 12 is adjusted according to the trigger event and the continuous trigger point, the processing module 103 is used to determine the offset of the interactive interface 12 according to the travel distance of the trigger event and the continuous trigger point, determine the post-trigger boundary of the interactive interface 12 according to the offset, compare the post-trigger boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the post-trigger boundary crosses the boundary, and, if so, perform boundary correction on the interactive interface 12, or, if not, set the post-trigger boundary as the boundary of the interactive interface 12.
  • when the size of the interactive interface 12 is adjusted according to the trigger event and the continuous trigger point, the processing module 103 is used to determine the zoom factor of the interactive interface 12 according to the travel distance of the trigger event and the continuous trigger point, determine the post-trigger boundary of the interactive interface 12 according to the zoom factor, compare the post-trigger boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the post-trigger boundary crosses the boundary, and, if so, perform boundary correction on the interactive interface 12, or, if not, set the post-trigger boundary as the boundary of the interactive interface 12.
  • the electronic terminal 100 further includes:
  • the focus correction module 105 is used to adjust, after the size or position adjustment of the interactive interface 12 is completed, the input focus area 1011 of the touch display area 101 so that the input focus area 1011 of the interactive interface 12 matches the adjusted input focus area 1011 of the touch display area 101.
  • the focus correction module 105 is used to determine the boundary of the input focus area 1011 of the touch display area 101, determine the boundary of the input focus area 1011 of the adjusted interactive interface 12, and adjust the boundary of the input focus area 1011 of the touch display area 101 according to the trigger event so that the boundary of the adjusted input focus area 1011 of the touch display area 101 matches the boundary of the input focus area 1011 of the adjusted interactive interface 12.
  • this application provides an electronic terminal 1000, including a touch display area 101, a processor 1001, and a memory 1002.
  • the processor 1001 is connected to the touch display area 101.
  • the touch display area 101 is used to display the interactive interface 12, and the memory 1002 stores computer-readable instructions that, when executed by the processor 1001, cause the processor 1001 to execute the interface control method of the above embodiments.
  • the electronic terminal 1000 can adjust the position and size of the interactive interface 12, so that the interactive interface 12 can be zoomed and adjusted according to needs, thereby improving user experience.
  • the input focus is corrected for the adjusted interactive interface 12, so that the trigger of the interactive interface 12 matches the content of the interactive interface 12, and the interactive accuracy of the interactive interface 12 is improved.
  • the interface control method according to the embodiment of the present application may be applied to the electronic terminal 1000. It should be noted that the above-mentioned explanations of the implementation manner and beneficial effects of the interface control method are also applicable to the electronic terminal 1000 of this embodiment. To avoid redundancy, they will not be described in detail here.
  • the electronic terminal includes a touch display screen 1003 having a touch display area 101.
  • the electronic terminal 1000 connects the touch display area 101, the processor 1001, and the memory 1002 (for example, a non-volatile storage medium) through the system bus 1004.
  • the memory 1002 stores computer readable instructions.
  • the computer readable instructions can be executed by the processor 1001 to implement the interface control method of any one of the above embodiments.
  • the processor 1001 can be used to provide computing and control capabilities to support the operation of the entire electronic terminal 1000.
  • the electronic terminal 1000 may be a device that supports touch operation, such as a mobile phone or a tablet computer. As those skilled in the art will understand, the structure shown is only a partial illustration related to the solution of this application and does not limit the electronic terminal 1000 to which the solution is applied.
  • a specific electronic terminal 1000 may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
  • Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes other implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • more specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM).
  • the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
  • each part of the present application may be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be performed using software or firmware stored in memory and executed by a suitable instruction execution system.
  • for example, if implemented in hardware, as in another embodiment, it may be implemented by any one or a combination of the following techniques known in the art: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit with suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and so on.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or in the form of software functional modules. If an integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interface control method, applied to an electronic terminal (100). The electronic terminal (100) includes a touch display area (101), and the touch display area (101) displays an interactive interface (12). The control method includes: detecting a trigger event in the touch display area (101); determining whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface (12); if so, determining whether a continuous trigger point is detected in the touch display area (101) after the trigger event; if so, adjusting the position or size of the interactive interface (12) according to the trigger event and the continuous trigger point; and displaying the adjusted interactive interface (12) in the touch display area (101).

Description

Interface control method and electronic terminal
Technical Field
This application relates to the technical field of interface adjustment, and in particular to an interface control method and an electronic terminal.
Background Art
In traditional screen zooming and moving, the original display image is reduced and then displayed in a designated small-screen area by modifying the size and position of the underlying display area. However, this approach means the screen cannot be moved and zoomed freely, resulting in a poor user experience.
Summary of the Invention
This application provides an interface control method and an electronic terminal.
This application provides an interface control method applied to an electronic terminal. The electronic terminal includes a touch display area, and the touch display area displays an interactive interface. The control method includes:
detecting a trigger event in the touch display area;
determining whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface;
if so, determining whether a continuous trigger point is detected in the touch display area after the trigger event;
if so, adjusting the position or size of the interactive interface according to the trigger event and the continuous trigger point; and
displaying the adjusted interactive interface in the touch display area.
The interface control method of the embodiments of this application can adjust the position and size of the interactive interface, so that the interactive interface can be zoomed and repositioned as needed, improving the user experience.
An electronic terminal according to an embodiment of this application includes:
a touch display area for displaying an interactive interface;
a touch monitoring module for detecting a trigger event in the touch display area;
a processing module for determining whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface; if it does, determining whether a continuous trigger point is detected in the touch display area after the trigger event; and, if a continuous trigger point is detected in the touch display area after the trigger event, adjusting the position or size of the interactive interface according to the trigger event and the continuous trigger point; and
a display module for displaying the adjusted interactive interface in the touch display area.
An electronic terminal according to an embodiment of this application includes a touch display area, a processor, and a memory. The processor is connected to the touch display area, the touch display area is used to display an interactive interface, and the memory stores computer-readable instructions that, when executed by the processor, cause the processor to execute the interface control method of the above embodiments.
The electronic terminal of the embodiments of this application can adjust the position and size of the interactive interface, so that the interactive interface can be zoomed and repositioned as needed, improving the user experience.
Additional aspects and advantages of this application will be set forth in part in the following description, will become apparent in part from the following description, or will be learned through practice of this application.
Brief Description of the Drawings
The above and/or additional aspects and advantages of this application will become apparent and readily understood from the following description of the embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a flowchart of an interface control method according to an embodiment of this application.
FIG. 2 is a schematic diagram of a touch display area and an interactive interface of an electronic terminal according to an embodiment of this application.
FIG. 3 is a flowchart of an interface control method according to an embodiment of this application.
FIG. 4 is a sub-flowchart of step S161 in FIG. 3.
FIG. 5 is a schematic diagram of adjustment of an interactive interface of an electronic terminal according to an embodiment of this application.
FIG. 6 is another schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 7 is another schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 8 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 9 is a sub-flowchart of step S162 in FIG. 3.
FIG. 10 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 11 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 12 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 13 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 14 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 15 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 16 is another flowchart of the interface control method according to an embodiment of this application.
FIG. 17 is another flowchart of the interface control method according to an embodiment of this application.
FIG. 18 is a sub-flowchart of step S18 in FIG. 17.
FIG. 19 is a schematic diagram of adjustment of an interactive interface of an electronic terminal according to an embodiment of this application.
FIG. 20 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 21 is a further schematic diagram of adjustment of the interactive interface of the electronic terminal according to an embodiment of this application.
FIG. 22 is a block diagram of an electronic terminal according to an embodiment of this application.
FIG. 23 is another block diagram of an electronic terminal according to an embodiment of this application.
FIG. 24 is a schematic structural diagram of an electronic terminal according to an embodiment of this application.
Description of the main reference numerals:
electronic terminal 1000, electronic terminal 100, touch display area 101, input focus area 1011, processor 1001, memory 1002, touch display screen 1003, system bus 1004, zoom trigger point 11, interactive interface 12, movement trigger point 13, touch monitoring module 102, processing module 103, display module 104, focus correction module 105.
Detailed Description of the Embodiments
The embodiments of this application are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which identical or similar reference numerals throughout denote identical or similar elements or elements having identical or similar functions. The embodiments described below with reference to the drawings are exemplary; they are only intended to explain this application and should not be construed as limiting it.
In the description of this application, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In the description of this application, "plurality" means two or more, unless otherwise specifically defined.
In the description of this application, it should be noted that, unless otherwise expressly specified and limited, the terms "mounted", "connected", and "connection" should be understood in a broad sense: a connection may be fixed, detachable, or integral; it may be mechanical or electrical, or the elements may communicate with each other; it may be direct, or indirect through an intermediary; and it may be an internal connection between two elements or an interaction between two elements. For a person of ordinary skill in the art, the specific meanings of the above terms in this application can be understood according to the specific circumstances.
In this application, unless otherwise expressly specified and limited, a first feature being "on" or "under" a second feature may include the first and second features being in direct contact, or being in contact not directly but through another feature between them. Moreover, a first feature being "on", "above", or "over" a second feature includes the first feature being directly above or obliquely above the second feature, or simply means that the first feature is at a higher level than the second feature. A first feature being "under", "below", or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or simply means that the first feature is at a lower level than the second feature.
The following disclosure provides many different embodiments or examples for implementing different structures of this application. To simplify the disclosure of this application, the components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit this application. In addition, this application may repeat reference numerals and/or reference letters in different examples; such repetition is for simplicity and clarity and does not in itself indicate a relationship between the various embodiments and/or arrangements discussed. Furthermore, this application provides examples of various specific processes and materials, but a person of ordinary skill in the art will recognize the applicability of other processes and/or the use of other materials.
Referring to FIG. 1, FIG. 2, and FIG. 24, the interface control method of the embodiments of this application is applied to an electronic terminal 100. The electronic terminal 100 includes a touch display area 101, and the touch display area 101 displays an interactive interface 12. The control method includes:
step S11: detecting a trigger event in the touch display area 101;
step S12: determining whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface 12;
if so, step S13: determining whether a continuous trigger point is detected in the touch display area 101 after the trigger event;
if so, step S14: adjusting the position or size of the interactive interface 12 according to the trigger event and the continuous trigger point;
step S15: displaying the adjusted interactive interface 12 in the touch display area 101.
The interface control method of the embodiments of this application can adjust the position and size of the interactive interface 12, so that the interactive interface 12 can be zoomed and repositioned as needed, improving the user experience.
Here, the trigger event indicates that a touch point is detected in the touch display area for the first time; in other words, there is no earlier trigger point continuous with the trigger event. The "continuous trigger point" mentioned above refers to continuous trigger points that follow the trigger point corresponding to the trigger event.
The above continuous trigger point means that the touch display area 101 detects that the user presses a trigger point and drags it without any break in between. For example, if the user presses a trigger point, lifts off, and then presses again, this is a discontinuous trigger point.
In some embodiments, the continuous trigger point includes a sliding operation.
In this way, operating the touch display area 101 by sliding is more convenient and quick, making user operation simpler and improving the user experience. In one example, the electronic terminal 1000 includes a touch display screen 1003, and the touch display screen 1003 has the touch display area 101. When the touch display screen 1003 is lit, the touch display area 101 displays the interactive interface 12 so that the user can interact with the electronic terminal 1000. The touch display screen 1003 may be a flexible screen (such as an OLED screen). The touch display screen 1003 may be a capacitive touch screen.
In some embodiments, detecting a trigger event in the touch display area 101 includes detecting a trigger point on the touch display area 101.
Only after the touch display area 101 has detected a trigger point can the subsequent operation of adjusting the interactive interface 12 be performed according to that trigger point.
In some embodiments, the interactive interface 12 includes a first area and a second area. When the trigger event is located in the first area, the trigger event corresponds to the position adjustment trigger condition of the interactive interface 12; when the trigger event is located in the second area, the trigger event corresponds to the size adjustment trigger condition of the interactive interface 12.
The setting of the first area and the second area makes operation convenient for the user. The user only needs to press a trigger point in the corresponding area to adjust the interactive interface 12, which facilitates one-handed operation and improves the user experience.
In some embodiments, the first area includes the title bar and the region enclosed by the border of the interactive interface 12; the second area includes the border of the interactive interface 12 or the four corners of the interactive interface 12.
This arrangement facilitates one-handed operation by the user and improves the user experience.
Referring to FIG. 3, in some embodiments the interface control method includes:
step S16: determining whether the starting point of the trigger event is the movement trigger point 13 in the first area or the zoom trigger point 11 in the second area;
if the starting point of the trigger event is the movement trigger point 13, step S161: determining that the trigger event corresponds to the position adjustment trigger condition of the interactive interface 12;
if the starting point of the trigger event is the zoom trigger point 11, step S162: determining that the trigger event corresponds to the size adjustment trigger condition of the interactive interface 12.
In this way, the setting of the first area and the second area makes it convenient to detect whether the user intends to move the interactive interface 12 or to zoom the interactive interface 12, and the size or position of the interactive interface 12 is then adjusted according to the user's intention.
For example, when the movement trigger point 13 in the first area is clicked, the electronic terminal 100 determines that the trigger event is a position adjustment of the interactive interface 12, so that the position of the interactive interface 12 can be adjusted according to the moving point at the end of the continuous trigger point; this enables the electronic terminal 100 to move the interactive interface 12 by the distance the user moves, so that the interactive interface 12 can be displayed at different positions in the touch display area 101.
As another example, when the zoom trigger point 11 in the second area is clicked, the electronic terminal 100 determines that the trigger event is a size adjustment of the interactive interface 12, so that the size of the interactive interface 12 can be adjusted according to the moving point at the end of the continuous trigger point; this enables the electronic terminal 100 to adjust the boundary of the interactive interface 12 according to the distance the user moves, so that the boundary of the interactive interface 12 is scaled in the same proportion as the boundary of the touch display area 101.
Specifically, the above moving point refers to the trigger point at the end of the continuous trigger point.
Referring to FIG. 2, in general, the zoom trigger point 11 in the second area and the movement trigger point 13 in the first area may be located at any position on the touch display area 101; the zoom trigger point 11 and the movement trigger point 13 may be points that users habitually touch when using the touch display area 101. For example, the zoom trigger point 11 in the second area is located on the border of the touch display area 101, and the movement trigger point 13 in the first area is located within the touch display screen 1003.
The movement trigger point 13 or the zoom trigger point 11 may be manipulated by the user's finger pressing or touching the zoom trigger point 11 or the movement trigger point 13 on the touch display area 101, or in a non-contact manner by approaching the touch display area 101 near the zoom trigger point 11 or the movement trigger point 13, for example when the user's finger hovers at a position above the zoom trigger point 11 or the movement trigger point 13.
Referring to FIG. 4, in some embodiments, step S161 includes:
step S1611: determining the offset of the interactive interface 12 according to the trigger event and the travel distance of the continuous trigger point;
step S1612: determining the post-trigger boundary of the interactive interface 12 according to the offset;
step S1613: comparing the post-trigger boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the post-trigger boundary of the interactive interface 12 crosses the boundary;
if so, step S1614: performing boundary correction on the interactive interface 12;
if not, step S1615: setting the post-trigger boundary as the boundary of the interactive interface 12.
In this way, correcting the boundary of the interactive interface 12 enables the interactive interface 12 to be displayed accurately in the touch display area 101, improving the interaction accuracy of the interactive interface 12.
Referring to FIG. 5 and FIG. 6, the following explains how the boundary of the interactive interface 12 is obtained and corrected in step S161.
Assume that the touch display area 101 is rectangular with a resolution of 1080*1920, and a coordinate system is established with the upper and left borders of the touch display area 101 as the x and y axes. Assume the coordinates of the upper-left point A of the touch display area 101 are (0, 0) and those of the lower-right point B are (1080, 1920), i.e., x = 1080, y = 1920; the touch display area 101 is thus 1920 long and 1080 wide. Also assume the coordinates of the upper-left point A1 of the interactive interface 12 are (0, 0) and those of the lower-right point B1 are (864, 1536); the interactive interface 12 is thus 1536 long and 864 wide. Assume the trigger point corresponding to the trigger event is the movement trigger point 13 and the coordinates of its starting point C are (860, 1500). The continuous trigger point moves the starting point C to the right by 100, so the coordinates of the moving point C1 of the movement trigger point 13 change to (960, 1500), as shown in FIG. 6; the offset of the moving point C1 relative to the starting point C is 100 to the right and 0 downward. The interactive interface 12 is therefore offset 100 to the right and 0 downward, and by calculation the coordinates of the upper-left point A2 of the moved interactive interface 12 change to (100, 0) and those of the lower-right point B2 change to (964, 1536). In this way, the change in the coordinates of the moving point C1 relative to the starting point C is used to calculate the offset of the moving point C1 relative to the starting point C, and thereby the boundary of the interactive interface 12 is obtained.
Judging whether the boundary of the interactive interface 12 crosses the boundary means the following: while the movement trigger point 13 is manipulated to move, the coordinates of the moving point C1 are obtained and the offset of the movement is calculated, thereby obtaining the coordinates of the upper-left and lower-right points of the moved interactive interface 12; these are compared with the coordinates of the upper-left and lower-right points of the touch display area 101. When the coordinate values of the upper-left and lower-right points of the interactive interface 12 are not between the coordinate values of the upper-left and lower-right points of the touch display area 101, the boundary is crossed, and the coordinates of the moving point need to be re-acquired.
For example, referring to FIG. 5 and FIG. 7 and using the values of the above example, the starting point C is moved to the right by 300. The coordinates of the moving point C2 of the movement trigger point 13 change to (1160, 1500), and the offset of the moving point C2 relative to the starting point C is 300 to the right and 0 downward, so the interactive interface 12 as a whole is offset 300 to the right and 0 downward. By calculation, the coordinates of the upper-left point A3 of the moved interactive interface 12 change to (300, 0) and those of the lower-right point B3 change to (1164, 1536); as can be seen from the figure, the interactive interface 12 is too large for the display and out of bounds. Its width now extends to 1164, greater than the width 1080 of the touch display area 101, so the boundary of the interactive interface 12 has exceeded the boundary of the touch display area 101; the interactive interface 12 is out of bounds, and its boundary needs to be corrected.
For example, referring to FIG. 7, the coordinates of the lower-right point B3 of the interactive interface 12 are now (1164, 1536), so the interactive interface 12 is out of bounds: it has been shifted 300 to the right and exceeds the touch display area 101 by a distance of 84. The processing module 103 then shifts the moving point C2 to the left by 84, which drives the interactive interface 12 to shift left by 84, as shown in FIG. 8, so that the interactive interface 12 lies within the touch display area 101. The coordinates of the moving point C3 change to (1076, 1500), those of the upper-left point A4 of the interactive interface 12 change to (216, 0), and those of the lower-right point B4 change to (1080, 1536); the interactive interface 12 thus lies within the touch display area 101 without crossing the boundary. This correction allows the interactive interface 12 to be corrected automatically whenever it crosses the boundary, so that the interactive interface 12 never goes out of bounds, which facilitates one-handed operation by the user.
This prevents the user from moving the interactive interface 12 out of bounds, which would reduce the accuracy of interaction with the interactive interface 12, and it facilitates one-handed operation and improves the user experience.
Referring to FIG. 9, in some embodiments, step S162 includes:
step S1621: determining the zoom factor of the interactive interface 12 according to the trigger event and the travel distance of the continuous trigger point;
step S1622: determining the post-trigger boundary of the interactive interface 12 according to the zoom factor;
step S1623: comparing the post-trigger boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the post-trigger boundary of the interactive interface 12 crosses the boundary;
if so, step S1624: performing boundary correction on the interactive interface 12;
if not, step S1625: setting the post-trigger boundary as the boundary of the interactive interface 12.
Correcting the boundary of the interactive interface 12 enables the interactive interface 12 to be displayed accurately in the touch display area 101, improving the interaction accuracy of the interactive interface 12.
Referring to FIG. 10 to FIG. 12, the following explains how the boundary of the interactive interface 12 is obtained and corrected in step S162.
Assume that the touch display area 101 is rectangular with a resolution of 1080*1920, and a coordinate system is established with the upper and left borders of the touch display area 101 as the x and y axes. Assume the coordinates of the upper-left point A of the touch display area 101 are (0, 0) and those of the lower-right point B are (1080, 1920), i.e., x = 1080, y = 1920; the touch display area 101 is 1920 long and 1080 wide. Also assume the coordinates of the upper-left point A1 of the interactive interface 12 are (0, 0) and those of the lower-right point B1 are (1080, 1920); the interactive interface 12 is 1920 long and 1080 wide. Assume the coordinates of the starting point D of the zoom trigger point 11 are (1080, 860) and the starting point D is moved to the left by 216, so the coordinates of the moving point D1 of the zoom trigger point 11 change to (864, 860), as shown in FIG. 10; the offset of the moving point D1 relative to the starting point D is 216 to the left. At this time, the coordinates of the upper-left point A2 of the interactive interface 12 are (0, 0) and those of the lower-right point B2 are (864, 1920). Dividing 864 by 1080 gives 0.8, so the zoom factor is 0.8. For the interactive interface 12 to scale in the same proportion as the touch display area 101, the unchanged value in the coordinates of the lower-right point B2 is multiplied by 0.8, giving the scaled interactive interface 12, as shown in FIG. 11: the coordinates of the upper-left point A3 of the scaled interactive interface 12 are (0, 0) and those of the lower-right point B3 are (864, 1536); that is, the lower-right point B3 of the interactive interface 12 is offset 384 upward and 216 to the left. The coordinate values of the lower-right point B3 of the interactive interface 12 are now 0.8 times those of the lower-right point B of the touch display area 101, i.e., the zoom factor is 0.8. In this way, the change in the coordinates of the moving point D1 relative to the starting point D is used to calculate the offset and obtain the zoom factor, and the interactive interface 12 is scaled proportionally according to the zoom factor. The zoom factor is obtained by dividing the horizontal coordinate of the moving point of the zoom trigger point 11 by the horizontal coordinate of its starting point, or the vertical coordinate of the moving point by the vertical coordinate of its starting point.
To prevent the image from being enlarged beyond the physical size of the screen or shrunk so much that the user cannot operate the interactive interface 12, an interval is defined for the zoom factor, i.e., the zoom factor must lie within a preset interval, for example [0.3, 1].
Specifically, judging whether the boundary of the interactive interface 12 crosses the boundary means the following: while the zoom trigger point 11 is manipulated to move, the coordinates of the moving point are obtained and the zoom factor is calculated; the obtained zoom factor is compared with the numerical interval, and if it is not within the preset zoom factor interval, the boundary is crossed.
For example, referring to FIG. 9, FIG. 13, and FIG. 14 and using the above values, the starting point D is moved to the left by 864, so the coordinates of the moving point D2 of the zoom trigger point 11 change to (216, 860); the offset of the moving point D2 relative to the starting point D is 864 to the left. At this time, the coordinates of the upper-left point A4 of the interactive interface 12 are (0, 0) and those of the lower-right point B4 are (216, 1920). Dividing 216 by 1080 gives 0.2, so the zoom factor is 0.2. For the interactive interface 12 to scale in the same proportion as the touch display area 101, the unchanged value in the coordinates of the lower-right point B4 is multiplied by 0.2, giving the scaled interactive interface 12, as shown in FIG. 13: the coordinates of the upper-left point A5 of the scaled interactive interface 12 are (0, 0) and those of the lower-right point B5 are (216, 384); that is, the lower-right point B5 of the interactive interface 12 is offset 1536 upward and 864 to the left. The coordinate values of the lower-right point B5 of the interactive interface 12 are now 0.2 times those of the lower-right point B of the touch display area 101, i.e., the zoom factor is 0.2. As can be seen from the figure, at a zoom factor of 0.2 the interactive interface 12 is too small and out of bounds: with a width of 216 and a length of 384, the interactive interface 12 is too small for the user to operate conveniently, so its boundary needs to be corrected.
For example, referring to FIG. 14, the coordinates of the lower-right point B5 of the interactive interface 12 are now (216, 384); the interactive interface 12 is out of bounds because its zoom factor of 0.2 is not within the interval 0.3 to 1. The processing module 103 then changes the zoom factor to 0.3; referring to FIG. 15, the coordinates of the moving point D3 change to (324, 860), the coordinates of the upper-left point A6 of the interactive interface 12 are (0, 0), and the coordinates of the lower-right point B6 change to (324, 576), so the interactive interface 12 lies within the touch display area 101 without crossing the boundary. This correction allows the interactive interface 12 to be corrected automatically whenever it crosses the boundary, so that the interactive interface 12 never goes out of bounds, which facilitates one-handed operation by the user.
This prevents the user from moving the interactive interface 12 out of bounds, which would reduce the accuracy of interaction with the interactive interface 12, and it facilitates one-handed operation and improves the user experience.
In some embodiments, the interface control method includes:
after the size or position adjustment of the interactive interface 12 is completed, adjusting the input focus area 1011 of the touch display area 101 so that the input focus area 1011 of the interactive interface 12 matches the adjusted input focus area 1011 of the touch display area 101.
Specifically, referring to FIG. 16 and FIG. 17, in some embodiments the interface control method includes:
step S17: determining whether the continuous trigger point is interrupted;
if so, step S18: adjusting the input focus area 1011 of the touch display area 101 according to the boundary that has been set for the interactive interface 12, so that the interactive interface 12 matches the input focus area 1011.
Generally, when the continuous trigger point is interrupted, the electrical signal (such as a voltage) output by the touch display area 101 drops to the level of an untouched touch display area 101, so the electrical signal can be used to determine whether the continuous trigger point is interrupted. For example, if a finger presses the movement trigger point 13 or the zoom trigger point 11, moves to a certain position on the touch display area 101, and is then lifted, the electrical signal output by the touch display area 101 drops to the level of an untouched touch display area 101, and it can therefore be determined that the continuous trigger point has been interrupted. If the finger keeps pressing the touch display area 101 without moving, the output electrical signal remains large, and it can be determined that the continuous trigger point is not interrupted.
At the end of the trigger event, the starting point and the moving point need to be re-acquired, and then steps S6 and S7 are repeated until the continuous trigger point has been interrupted.
After the continuous trigger point is interrupted, the adjusted interactive interface 12 is obtained.
Since the size or position of the interface region changes after the interactive interface 12 is adjusted, input focus correction needs to be performed on the adjusted interactive interface 12 to ensure that the content of the adjusted interactive interface 12 matches the content of the interactive interface 12 before adjustment.
Referring to FIG. 18, the interface control method includes:
step S181: determining the boundary of the input focus area 1011 of the touch display area 101;
step S182: determining the boundary of the input focus area 1011 of the adjusted interactive interface 12;
step S183: according to the trigger event and the continuous trigger point, adjusting the boundary of the input focus area 1011 of the touch display area 101 so that the boundary of the adjusted input focus area 1011 of the touch display area 101 matches the boundary of the input focus area 1011 of the adjusted interactive interface 12.
In this way, the content of the adjusted interactive interface 12 can be matched with the content of the interactive interface 12 before adjustment.
Referring to FIG. 19 to FIG. 21, the following explains how, in step S183, the input focus area 1011 of the adjusted interactive interface 12 is matched with the input focus area 1011 of the touch display area 101.
Specifically, the following explains how the input focus area 1011 of the adjusted interactive interface 12 is matched with the input focus area 1011 of the touch display area 101 after the size of the interactive interface 12 has been adjusted.
Referring to FIG. 19, assume that the touch display area 101 is rectangular with a resolution of 1080*1920, and a coordinate system is established with the upper and left borders of the touch display area 101 as the x and y axes. Assume the coordinates of the upper-left point A of the touch display area 101 are (0, 0) and those of the lower-right point B are (1080, 1920), i.e., x = 1080, y = 1920; the touch display area 101 is 1920 long and 1080 wide. Assume the coordinates of the upper-left point A1 of the interactive interface 12 are (0, 0) and those of the lower-right point B1 are (1080, 1920); the interactive interface 12 is 1920 long and 1080 wide. Also assume that the input focus area 1011 of the touch input area is rectangular, with the upper-left point E of the input focus area 1011 at (500, 800) and the lower-right point F at (600, 1060), and that the zoom factor is 0.5. After adjustment, referring to FIG. 20, the coordinates of the upper-left point A2 of the interactive interface 12 are (0, 0) and those of the lower-right point B2 are (540, 960); the boundary of the input focus area 1011 now needs to be adjusted, and the coordinates of the boundary points of the input focus area 1011 are each multiplied by the zoom factor 0.5. Referring to FIG. 21, the coordinates of the upper-left point E1 of the input focus area 1011 are now (250, 400) and those of the lower-right point F1 are (300, 530). In this way, the input focus area 1011 of the adjusted interactive interface 12 is matched with the input focus area 1011 of the touch display area 101, so that the content of the adjusted interactive interface 12 matches the content of the touch input area, improving the interaction accuracy of the interactive interface 12.
After a position adjustment of the interactive interface 12, the matching of the input focus area 1011 of the adjusted interactive interface 12 with the input focus area 1011 of the touch display area 101 is similar to the above and is therefore not described again.
Referring to FIG. 22, an embodiment of this application provides an electronic terminal 100, which includes:
a touch display area 101 for displaying an interactive interface 12;
a touch monitoring module 102 for detecting a trigger event in the touch display area 101;
a processing module 103 for determining whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface 12; if it does, determining whether a continuous trigger point is detected in the touch display area 101 after the trigger event; and, if a continuous trigger point is detected in the touch display area 101 after the trigger event, adjusting the position or size of the interactive interface 12 according to the trigger event and the continuous trigger point; and
a display module 104 for displaying the adjusted interactive interface 12 in the touch display area 101.
The electronic terminal 100 of the embodiments of this application can adjust the position and size of the interactive interface 12, so that the interactive interface 12 can be zoomed and repositioned as needed, improving the user experience.
The interface control method of the embodiments of this application may be applied to the electronic terminal 100. It should be noted that the above explanations of the implementations and beneficial effects of the interface control method also apply to the electronic terminal 100 of this embodiment; to avoid redundancy, they are not elaborated here.
In some embodiments, the touch monitoring module 102 is configured to determine that, when the trigger event is located in the first area, the trigger event corresponds to the position adjustment trigger condition of the interactive interface 12, and that, when the trigger event is located in the second area, the trigger event corresponds to the size adjustment trigger condition of the interactive interface 12.
In some embodiments, when the position of the interactive interface 12 is adjusted according to the trigger event and the continuous trigger point, the processing module 103 is configured to determine the offset of the interactive interface 12 according to the travel distance of the trigger event and the continuous trigger point, determine the post-trigger boundary of the interactive interface 12 according to the offset, compare the post-trigger boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the post-trigger boundary of the interactive interface 12 crosses the boundary, and, if so, perform boundary correction on the interactive interface 12, or, if not, set the post-trigger boundary as the boundary of the interactive interface 12.
In some embodiments, when the size of the interactive interface 12 is adjusted according to the trigger event and the continuous trigger point, the processing module 103 is configured to determine the zoom factor of the interactive interface 12 according to the travel distance of the trigger event and the continuous trigger point, determine the post-trigger boundary of the interactive interface 12 according to the zoom factor, compare the post-trigger boundary of the interactive interface 12 with the boundary of the touch display area 101 to determine whether the post-trigger boundary of the interactive interface 12 crosses the boundary, and, if so, perform boundary correction on the interactive interface 12, or, if not, set the post-trigger boundary as the boundary of the interactive interface 12.
Referring to FIG. 23, in some embodiments, the electronic terminal 100 further includes:
a focus correction module 105 for adjusting, after the size or position adjustment of the interactive interface 12 is completed, the input focus area 1011 of the touch display area 101 so that the input focus area 1011 of the interactive interface 12 matches the adjusted input focus area 1011 of the touch display area 101.
The focus correction module 105 is configured to determine the boundary of the input focus area 1011 of the touch display area 101, determine the boundary of the input focus area 1011 of the adjusted interactive interface 12, and adjust the boundary of the input focus area 1011 of the touch display area 101 according to the trigger event so that the boundary of the adjusted input focus area 1011 of the touch display area 101 matches the boundary of the input focus area 1011 of the adjusted interactive interface 12.
Referring to FIG. 24, this application provides an electronic terminal 1000 that includes a touch display area 101, a processor 1001, and a memory 1002. The processor 1001 is connected to the touch display area 101, the touch display area 101 is used to display the interactive interface 12, and the memory 1002 stores computer-readable instructions that, when executed by the processor 1001, cause the processor 1001 to execute the interface control method of the above embodiments.
The electronic terminal 1000 of the embodiments of this application can adjust the position and size of the interactive interface 12, so that the interactive interface 12 can be zoomed and repositioned as needed, improving the user experience. Moreover, the input focus is corrected for the adjusted interactive interface 12, so that triggers on the interactive interface 12 match the content of the interactive interface 12, improving the interaction accuracy of the interactive interface 12.
The interface control method of the embodiments of this application may be applied to the electronic terminal 1000. It should be noted that the above explanations of the implementations and beneficial effects of the interface control method also apply to the electronic terminal 1000 of this embodiment; to avoid redundancy, they are not elaborated here. In addition, the electronic terminal includes a touch display screen 1003, and the touch display screen 1003 has the touch display area 101.
FIG. 24 is a schematic diagram of the internal modules of the electronic terminal 1000 in one embodiment. The electronic terminal 1000 connects the touch display area 101, the processor 1001, and the memory 1002 (for example, a non-volatile storage medium) through a system bus 1004. The memory 1002 stores computer-readable instructions, which can be executed by the processor 1001 to implement the interface control method of any of the above embodiments. The processor 1001 can be used to provide computing and control capabilities and to support the operation of the entire electronic terminal 1000. The electronic terminal 1000 may be a device that supports touch operation, such as a mobile phone or a tablet computer. As those skilled in the art will understand, the structure shown in FIG. 24 is only a partial illustration related to the solution of this application and does not limit the electronic terminal 1000 to which the solution is applied; a specific electronic terminal 1000 may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In the description of this specification, references to the terms "one embodiment", "some embodiments", "exemplary embodiment", "example", "specific example", or "some examples" mean that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of this application. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of this application also includes other implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of this application belong.
The logic and/or steps represented in the flowchart or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transmit a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that the parts of this application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented with software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit with suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and so on.
A person of ordinary skill in the art can understand that all or part of the steps carried by the method of the above embodiments can be completed by instructing relevant hardware through a program, the program may be stored in a computer-readable storage medium, and the program, when executed, includes one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the embodiments of this application may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The above-mentioned storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like. Although the embodiments of this application have been shown and described above, it can be understood that the above embodiments are exemplary and should not be construed as limiting this application; a person of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of this application.

Claims (15)

  1. An interface control method, applied to an electronic terminal, the electronic terminal including a touch display area, the touch display area displaying an interactive interface, characterized in that the control method includes:
    detecting a trigger event in the touch display area;
    determining whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface;
    if so, determining whether a continuous trigger point is detected in the touch display area after the trigger event;
    if so, adjusting the position or size of the interactive interface according to the trigger event and the continuous trigger point; and
    displaying the adjusted interactive interface in the touch display area.
  2. The interface control method of claim 1, characterized in that detecting a trigger event in the touch display area includes detecting a trigger point on the touch display area.
  3. The interface control method of claim 1, characterized in that the interactive interface includes a first area and a second area; when the trigger event is located in the first area, the trigger event corresponds to the position adjustment trigger condition of the interactive interface; and when the trigger event is located in the second area, the trigger event corresponds to the size adjustment trigger condition of the interactive interface.
  4. The interface control method of claim 3, characterized in that the first area includes the title bar and the region enclosed by the border of the interactive interface, and the second area includes the border of the interactive interface or the four corners of the interactive interface.
  5. The interface control method of claim 3, characterized in that, when the trigger event is located in the first area, adjusting the position of the interactive interface according to the trigger event and the continuous trigger point includes:
    determining the offset of the interactive interface according to the trigger event and the travel distance of the continuous trigger point;
    determining the post-trigger boundary of the interactive interface according to the offset;
    comparing the post-trigger boundary of the interactive interface with the boundary of the touch display area to determine whether the post-trigger boundary of the interactive interface crosses the boundary;
    if so, performing boundary correction on the interactive interface;
    if not, setting the post-trigger boundary as the boundary of the interactive interface.
  6. The interface control method of claim 3, characterized in that, when the trigger event is located in the second area, adjusting the size of the interactive interface according to the trigger event and the continuous trigger point includes:
    determining the zoom factor of the interactive interface according to the trigger event and the travel distance of the continuous trigger point;
    determining the post-trigger boundary of the interactive interface according to the zoom factor;
    comparing the post-trigger boundary of the interactive interface with the boundary of the touch display area to determine whether the post-trigger boundary of the interactive interface crosses the boundary;
    if so, performing boundary correction on the interactive interface;
    if not, setting the post-trigger boundary as the boundary of the interactive interface.
  7. The interface control method of claim 1, characterized in that the interface control method includes:
    after the size or position adjustment of the interactive interface is completed, adjusting the input focus area of the touch display area so that the input focus area of the interactive interface matches the adjusted input focus area of the touch display area.
  8. The interface control method of claim 7, characterized in that the interface control method includes:
    determining the boundary of the input focus area of the touch display area;
    determining the boundary of the input focus area of the adjusted interactive interface;
    according to the trigger event and the continuous trigger point, adjusting the boundary of the input focus area of the touch display area so that the boundary of the adjusted input focus area of the touch display area matches the boundary of the input focus area of the adjusted interactive interface.
  9. An electronic terminal, characterized by including:
    a touch display area for displaying an interactive interface;
    a touch monitoring module for detecting a trigger event in the touch display area;
    a processing module for determining whether the trigger event corresponds to a position or size adjustment trigger condition of the interactive interface; if it does, determining whether a continuous trigger point is detected in the touch display area after the trigger event; and, if a continuous trigger point is detected in the touch display area after the trigger event, adjusting the position or size of the interactive interface according to the trigger event and the continuous trigger point; and
    a display module for displaying the adjusted interactive interface in the touch display area.
  10. The electronic terminal of claim 9, characterized in that the touch monitoring module is configured to determine that, when the trigger event is located in the first area, the trigger event corresponds to the position adjustment trigger condition of the interactive interface, and that, when the trigger event is located in the second area, the trigger event corresponds to the size adjustment trigger condition of the interactive interface.
  11. The electronic terminal of claim 9, characterized in that, when the position of the interactive interface is adjusted according to the trigger event and the continuous trigger point, the processing module is configured to determine the offset of the interactive interface according to the travel distance of the trigger event and the continuous trigger point, determine the post-trigger boundary of the interactive interface according to the offset, compare the post-trigger boundary of the interactive interface with the boundary of the touch display area to determine whether the post-trigger boundary of the interactive interface crosses the boundary, and, if so, perform boundary correction on the interactive interface, or, if not, set the post-trigger boundary as the boundary of the interactive interface.
  12. The electronic terminal of claim 9, characterized in that, when the size of the interactive interface is adjusted according to the trigger event and the continuous trigger point, the processing module is configured to determine the zoom factor of the interactive interface according to the travel distance of the trigger event and the continuous trigger point, determine the post-trigger boundary of the interactive interface according to the zoom factor, compare the post-trigger boundary of the interactive interface with the boundary of the touch display area to determine whether the post-trigger boundary of the interactive interface crosses the boundary, and, if so, perform boundary correction on the interactive interface, or, if not, set the post-trigger boundary as the boundary of the interactive interface.
  13. The electronic terminal of claim 11 or 12, characterized by including:
    a focus correction module for adjusting, after the size or position adjustment of the interactive interface is completed, the input focus area of the touch display area so that the input focus area of the interactive interface matches the adjusted input focus area of the touch display area.
  14. The electronic terminal of claim 13, characterized in that the focus correction module is configured to determine the boundary of the input focus area of the touch display area, determine the boundary of the input focus area of the adjusted interactive interface, and adjust the boundary of the input focus area of the touch display area according to the trigger event so that the boundary of the adjusted input focus area of the touch display area matches the boundary of the input focus area of the adjusted interactive interface.
  15. An electronic terminal, characterized by including a touch display area, a processor, and a memory, wherein the processor is electrically connected to the touch display area, the touch display area is used to display an interactive interface, and the memory stores computer-readable instructions that, when executed by the processor, cause the processor to execute the control method of any one of claims 1 to 8.
PCT/CN2018/112469 2018-10-29 2018-10-29 界面的控制方法及电子终端 WO2020087218A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880096005.5A CN112740166A (zh) 2018-10-29 2018-10-29 界面的控制方法及电子终端
PCT/CN2018/112469 WO2020087218A1 (zh) 2018-10-29 2018-10-29 界面的控制方法及电子终端

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/112469 WO2020087218A1 (zh) 2018-10-29 2018-10-29 界面的控制方法及电子终端

Publications (1)

Publication Number Publication Date
WO2020087218A1 true WO2020087218A1 (zh) 2020-05-07

Family

ID=70464222

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/112469 WO2020087218A1 (zh) 2018-10-29 2018-10-29 界面的控制方法及电子终端

Country Status (2)

Country Link
CN (1) CN112740166A (zh)
WO (1) WO2020087218A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111782103A (zh) * 2020-07-15 2020-10-16 网易(杭州)网络有限公司 一种交互控件位置调整的方法及装置、设备、介质

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115098205A (zh) * 2022-06-17 2022-09-23 来也科技(北京)有限公司 基于rpa和ai实现ia的流程编辑界面的控制方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104932821A (zh) * 2015-06-02 2015-09-23 青岛海信移动通信技术股份有限公司 一种智能终端操作界面的显示方法及智能终端
CN105867715A (zh) * 2015-10-30 2016-08-17 乐视移动智能信息技术(北京)有限公司 界面显示处理方法、装置及终端设备
US20160306540A1 (en) * 2013-12-26 2016-10-20 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Terminal operating method and terminal
CN106527860A (zh) * 2016-11-07 2017-03-22 上海与德信息技术有限公司 一种屏幕界面显示方法及装置

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102479041A (zh) * 2010-11-25 2012-05-30 英业达股份有限公司 小型触控屏幕单手缩放画面的操作方法
CN102981596A (zh) * 2012-12-21 2013-03-20 东莞宇龙通信科技有限公司 终端和屏幕界面显示方法
CN104461232A (zh) * 2014-09-30 2015-03-25 小米科技有限责任公司 在屏幕显示过程中确定缩小比例的方法及装置
CN104298433A (zh) * 2014-09-30 2015-01-21 小米科技有限责任公司 屏幕显示方法、装置及移动终端
CN105677173A (zh) * 2015-12-25 2016-06-15 珠海格力电器股份有限公司 终端的单手操作模式控制方法和装置
KR20170088691A (ko) * 2016-01-25 2017-08-02 엘지전자 주식회사 페어링된 장치, 알림 및 어플리케이션의 제어에 관한 한 손 조작 모드를 적용한 이동 통신 단말기
CN106445354A (zh) * 2016-11-24 2017-02-22 北京小米移动软件有限公司 终端设备的触摸控制方法及装置
CN108696638B (zh) * 2018-05-10 2020-06-26 维沃移动通信有限公司 一种移动终端的控制方法及移动终端

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160306540A1 (en) * 2013-12-26 2016-10-20 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Terminal operating method and terminal
CN104932821A (zh) * 2015-06-02 2015-09-23 青岛海信移动通信技术股份有限公司 一种智能终端操作界面的显示方法及智能终端
CN105867715A (zh) * 2015-10-30 2016-08-17 乐视移动智能信息技术(北京)有限公司 界面显示处理方法、装置及终端设备
CN106527860A (zh) * 2016-11-07 2017-03-22 上海与德信息技术有限公司 一种屏幕界面显示方法及装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111782103A (zh) * 2020-07-15 2020-10-16 网易(杭州)网络有限公司 一种交互控件位置调整的方法及装置、设备、介质
CN111782103B (zh) * 2020-07-15 2022-02-08 网易(杭州)网络有限公司 一种交互控件位置调整的方法及装置、设备、介质

Also Published As

Publication number Publication date
CN112740166A (zh) 2021-04-30

Similar Documents

Publication Publication Date Title
US20160004373A1 (en) Method for providing auxiliary information and touch control display apparatus using the same
TWI463361B (zh) 觸控面板之局部控制方法與系統
US10019139B2 (en) System and method for content size adjustment
US8751955B2 (en) Scrollbar user interface for multitouch devices
US9696871B2 (en) Method and portable terminal for moving icon
TWI656472B (zh) 介面調整方法及電子裝置
EP3079340A1 (en) Processing method and apparatus, and terminal
WO2020199723A1 (zh) 移动终端显示画面的控制方法、装置、设备和存储介质
WO2017059734A1 (zh) 一种图片缩放方法及电子设备
US20090135152A1 (en) Gesture detection on a touchpad
US11740754B2 (en) Method for interface operation and terminal, storage medium thereof
WO2020052347A1 (zh) 一种窗口调节的方法、移动终端及计算机可读存储介质
WO2021068381A1 (zh) 界面显示方法、装置、设备及存储介质
WO2020087218A1 (zh) 界面的控制方法及电子终端
US20190339858A1 (en) Method and apparatus for adjusting virtual key of mobile terminal
JP2023530395A (ja) アプリアイコン制御方法、装置及び電子機器
US10296130B2 (en) Display control apparatus, display control method, and storage medium storing related program
WO2023092992A1 (zh) 一种页面元素的处理方法、设备及计算机可读存储介质
JP5657866B2 (ja) 入力装置、ポインタの表示位置調整方法およびプログラム
JP2015138360A (ja) オブジェクト操作システム及びオブジェクト操作制御プログラム並びにオブジェクト操作制御方法
US9632697B2 (en) Information processing apparatus and control method thereof, and non-transitory computer-readable medium
US10599326B2 (en) Eye motion and touchscreen gestures
TWI462000B (zh) 觸控面板的信號處理方法與觸控面板系統
WO2017101340A1 (zh) 多点触控调整视频窗口的方法及设备
CN106325613B (zh) 触控显示装置及其方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18938821

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18938821

Country of ref document: EP

Kind code of ref document: A1