CN108958619A - Operation method of user interface, equipment and computer readable storage medium - Google Patents


Info

Publication number
CN108958619A
CN108958619A (Application No. CN201710385948.7A)
Authority
CN
China
Prior art keywords
user interface
operating point
operating
operational order
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710385948.7A
Other languages
Chinese (zh)
Inventor
郑容艳
刘欢
方刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Advanced New Technologies Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN201710385948.7A
Publication of CN108958619A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

This application discloses an operation method of a user interface, a device, and a computer-readable storage medium. The method includes: providing an operating point in the user interface; when an operating gesture around the operating point is detected, determining an operational instruction according to the position of the operating point and the position of the operating gesture; and, according to the operational instruction, controlling the user interface to execute a moving operation. Because an operating point, a non-fixed point, is provided in the user interface, a single finger touching the user interface can take the operating point as a reference point, so the operational instruction generated by the single-finger touch can be determined and the user interface can be operated precisely. Furthermore, because the operating point is provided in the user interface, a rotation operation of the user interface can be executed with a single-finger touch, which effectively improves the flexibility and accuracy of user-interface operation and thereby improves the user's experience of the user interface.

Description

Operation method of user interface, equipment and computer readable storage medium
Technical field
This application relates to the field of communication technology, and in particular to an operation method of a user interface, a device, and a computer-readable storage medium.
Background art
A user interface (UI) is the medium through which a device exchanges information with a user; it converts between the computer's internal representation and a form acceptable to people.
At present, a user interface can be operated with one hand or with both hands. In either the one-handed or the two-handed mode, a corresponding operational instruction is generated, and the user interface moves according to that instruction.
However, research has found the following. In the one-handed mode, one case is that a single finger touches the touch screen of the user device to generate an operational instruction, and the user interface moves according to it; another case is that two fingers touch the touch screen to generate an operational instruction, and the user interface moves according to it. Both cases have the following problems:
1. When the user interface moves according to an operational instruction generated by a single finger touching the touch screen of the user device, the movement range tends to be excessive, so the operating accuracy is poor. Moreover, a rotation operation cannot be executed with a single finger.
2. When the user interface moves according to an operational instruction generated by two fingers touching the touch screen of the user device, in particular when it executes a rotation operation, the two fingers constrain each other: once the finger defined as the rotation center leaves the touch screen, the rotation operation fails.
It can be seen that operation methods of user interfaces in the prior art suffer from poor operational stability and low operating accuracy.
Summary of the invention
In view of this, the embodiments of the present application provide an operation method of a user interface, a device, and a computer-readable storage medium, to solve the problem of low operating accuracy when a user interface is operated with one hand in the prior art.
The embodiments of the present application adopt the following technical solutions:
An embodiment of the present application provides an operation method of a user interface, applied to an electronic device with a touch screen, the method including:
providing an operating point in the user interface;
when an operating gesture around the operating point is detected, determining an operational instruction according to the position of the operating point and the position of the operating gesture; and
according to the operational instruction, controlling the user interface to execute a moving operation.
An embodiment of the present application also provides a user-interface operating device, applied to an electronic device with a touch screen, the operating device including:
a providing unit, which provides an operating point in the user interface;
a determination unit, which, when an operating gesture around the operating point is detected, determines an operational instruction according to the position of the operating point and the position of the operating gesture; and
a control unit, which, according to the operational instruction, controls the user interface to execute a moving operation.
An embodiment of the present application also provides an electronic device, including one or more processors and a memory; the memory stores a program configured to be executed by the one or more processors to perform the following steps:
providing an operating point in the user interface;
when an operating gesture around the operating point is detected, determining an operational instruction according to the position of the operating point and the position of the operating gesture; and
according to the operational instruction, controlling the user interface to execute a moving operation.
An embodiment of the present application also provides a computer-readable storage medium, including a program used in combination with an electronic device, the program executable by a processor to complete the following steps:
providing an operating point in the user interface;
when an operating gesture around the operating point is detected, determining an operational instruction according to the position of the operating point and the position of the operating gesture; and
according to the operational instruction, controlling the user interface to execute a moving operation.
At least one of the technical solutions adopted by the embodiments of the present application can achieve the following beneficial effects:
With the technical solution provided by the embodiments of the present application, an operating point is provided in the user interface; when an operating gesture around the operating point is detected, an operational instruction is determined according to the position of the operating point and the position of the operating gesture; and, according to the operational instruction, the user interface is controlled to execute a moving operation. Because an operating point, a non-fixed point, is provided in the user interface, a single finger touching the user interface can take the operating point as a reference point, so the operational instruction generated by the single-finger touch can be determined and the user interface can be operated precisely. Furthermore, because the operating point is provided in the user interface, a rotation operation of the user interface can be executed with a single-finger touch, which effectively improves the flexibility and accuracy of user-interface operation and thereby improves the user's experience of the user interface.
Detailed description of the invention
The drawings described herein provide a further understanding of the present application and constitute a part of it; the illustrative embodiments and their descriptions explain the present application and do not unduly limit it. In the drawings:
Fig. 1 is a schematic flowchart of an operation method of a user interface provided by an embodiment of the present application;
Fig. 2(a) is a schematic diagram of a user touching the user interface provided by an embodiment of the present application;
Fig. 2(b) is a schematic diagram of determining an operational instruction provided by an embodiment of the present application;
Fig. 2(c) is a schematic diagram of controlling the user interface to execute an operation provided by an embodiment of the present application;
Fig. 3 is a schematic structural diagram of a user-interface operating device provided by an embodiment of the present application.
Specific embodiments
To realize the purpose of the present application, the embodiments of the present application provide an operation method of a user interface, a device, and a computer-readable storage medium: an operating point is provided in the user interface; when an operating gesture around the operating point is detected, an operational instruction is determined according to the position of the operating point and the position of the operating gesture; and, according to the operational instruction, the user interface is controlled to execute a moving operation. This effectively realizes one-finger rotation of the user interface and improves the naturalness and efficiency of the interaction.
To make the purposes, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described clearly and completely below with reference to specific embodiments and the corresponding drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in the present application without creative work fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the drawings.
Fig. 1 is a schematic flowchart of an operation method of a user interface provided by an embodiment of the present application. The method is as follows. The method is applied to an electronic device with a touch screen; the execution subject of this embodiment may be the electronic device, such as a smartphone or a tablet, or a client that provides the user interface. Here the electronic device is taken as the execution subject for illustration.
Step 101: provide an operating point in the user interface.
In this embodiment, an operating point is provided in the user interface. The operating point is a non-fixed point: the client can determine its position at run time according to the currently displayed content; that is, different displayed content may give the operating point a different position.
It should be noted that, for the convenience and accuracy of one-finger operation of the user interface, the technical solution provided by this embodiment provides a variable operating point in the user interface. The operating point serves as the reference point for one-finger operation of the user interface, so that single-finger operation of the user interface can be controlled accurately.
How to provide an operating point in the user interface is described in detail below.
First way: preset the relative position between the operating point and the user interface.
Specifically, in the software development phase of the client, the condition to be satisfied by the position of the operating point can be preset. For example, suppose the position coordinates of the operating point are (x, y): x can be determined from the abscissas of the four vertex coordinates of the currently displayed user interface, and y from their ordinates. As another example, the position coordinates of the operating point can be determined from the four vertex coordinates of the touch screen of the electronic device running the client; no specific limitation is made here.
Specifically, the position of the currently displayed user interface is determined; then, from that position and the preset relative position between the operating point and the user interface, the actual position of the operating point in the user interface is determined.
For example, suppose the preset relative position places the operating point at the horizontal center of the user interface, a quarter of the way toward its lower side, and the position of the currently displayed user interface is represented by the vertices (x1, y1), (x2, y1), (x1, y2), (x2, y2), where x1 < x2 and y1 < y2.
Based on the above, the position of the operating point can be determined from the offsets [half of (x2 − x1); a quarter of (y2 − y1)]. The position satisfying this condition is determined, and the icon of the operating point is displayed at that position.
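As an illustrative sketch of this first way (the function name is hypothetical, and measuring the bracketed offsets from the top-left vertex (x1, y1) in screen coordinates is an assumption about the preset):

```python
def operating_point(x1, y1, x2, y2):
    """Operating point for the interface bounded by vertices
    (x1, y1)..(x2, y2), with x1 < x2 and y1 < y2, using the example
    preset of half the width and a quarter of the height."""
    return (x1 + (x2 - x1) / 2, y1 + (y2 - y1) / 4)

print(operating_point(0, 0, 400, 800))  # → (200.0, 200.0)
```

Any other preset relative position would be computed the same way; only the fractions change.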
Second way: determine the operating point from the user's touch operation.
Specifically, a touch instruction sent by the user is received, the touch instruction containing the position coordinates of the user's touch; the position coordinates are mapped into the user interface to obtain the position of the operating point.
In this embodiment, when one-finger operation is used, a touch instruction can be sent to the user interface with one finger; the position coordinates of the single-finger touch are determined from the touch instruction and can be used as the position coordinates of the operating point.
It should be noted that this differs from the two-finger rotation operation of the prior art. In the prior art, one of the two fingers must always touch the user interface; once it leaves, the rotation operation fails. In the scheme of this embodiment, if a rotation operation is performed on the user interface with two fingers, the client stores the position coordinates at which one finger touches the user interface and uses them as the position coordinates of the operating point, so the rotation operation can still be executed normally even if that finger leaves the user interface.
For the case described above, in this embodiment, when the rotation operation is completed, the client deletes the position coordinates of the operating point determined for that rotation operation.
Third way: determine the position of the operating point by positioning.
A current location is determined by a preset positioning method; according to the current location, an operating point for determining operational instructions is provided in the user interface.
Specifically, if the user interface is the interface of a map application, the current location can be determined by a preset positioning method when the interface of the map application is loaded, and the current location is taken as the position of the operating point.
It should be noted that the operating point described in this embodiment may be displayed in the user interface in real time, or may be displayed only when a single-finger touch that moves the user interface is detected and otherwise remain hidden; the display state of the operating point is not specifically limited here.
Step 102: when an operating gesture around the operating point is detected, determine an operational instruction according to the position of the operating point and the position of the operating gesture.
In this embodiment, a touch instruction sent by the user is detected, the touch instruction containing the position coordinates of at least one touch point around the operating point; the operating gesture around the operating point is determined from the position coordinates of the touch point.
Specifically, when a touch instruction from a first touch point to a second touch point is detected, the user's operating gesture around the operating point can be determined from the position coordinates of the first touch point and the second touch point. Fig. 2(a) is a schematic diagram of a user touching the user interface provided by an embodiment of the present application; it shows the user's touch instruction from the first touch point to the second touch point.
Specifically, determining the operational instruction according to the position of the operating point and the position of the operating gesture includes:
First, determining, according to the position of the operating point and the position of the operating gesture, a movement parameter with the operating point as the movement center, the movement parameter including one or more of a movement angle, a movement distance, and a movement direction.
Specifically, the first position coordinates of the operating point and the second position coordinates of the operating gesture are determined; according to the first position coordinates and the second position coordinates, the movement angle and movement distance with the operating point as the movement center are calculated.
Fig. 2(b) is a schematic diagram of determining an operational instruction provided by an embodiment of the present application. As can be seen from Fig. 2(b), the operating gesture corresponds to a first touch point A and a second touch point B; the coordinates of A and B together can be called the second position coordinates of the operating gesture.
It should be noted that the first position coordinates and the second position coordinates in this embodiment respectively indicate the position of the operating point and the position of the operating gesture; "first" and "second" carry no special meaning and merely distinguish the two sets of coordinates.
In this embodiment, the moving operation is illustrated by taking a rotation operation as an example.
After the operating gesture from the first touch point A to the second touch point B is detected, the rotation angle O and the rotation direction F from A to B around the operating point C are calculated.
The operating gesture may be one of a rotation gesture I from A to B, a straight-line gesture II from A to B, or a curve gesture III from A to B; no specific limitation is made here.
It should be noted that, in practice, the operating gestures the user traces on the user interface are not all ideal: the rotation gesture I is not necessarily an arc of a standard circle, the straight-line gesture II is not necessarily perfectly straight, and the curve gesture III is not necessarily a regular waveform with identical crest and trough heights. This does not affect the subsequent determination of the operational instruction.
How to calculate the rotation angle O and the rotation direction F around the operating point C from the first touch point A to the second touch point B is described in detail below.
First way:
First, the first distance a between the first touch point A and the operating point C, the second distance b between the second touch point B and the operating point C, and the third distance c between the first touch point A and the second touch point B are calculated.
Second, the rotation angle O is calculated from the first distance a, the second distance b, and the third distance c.
Specifically, the calculation includes but is not limited to: using the law of cosines, cos O = (a² + b² − c²) / (2ab) is found, and the rotation angle O is then obtained by the arccosine.
The steps for calculating the first distance a, the second distance b, and the third distance c can include but are not limited to:
when the operating point C is provided in step 101, its coordinates (x0, y0) are stored;
when one of the operating gestures I, II, or III from the first touch point A to the second touch point B is detected in step 102, the coordinates (x1, y1) of the first touch point A and the coordinates (x2, y2) of the second touch point B are stored;
then, according to the coordinates of the first touch point A, the second touch point B, and the operating point C, the distances are calculated separately:
the first distance a = √((x1 − x0)² + (y1 − y0)²);
the second distance b = √((x2 − x0)² + (y2 − y0)²);
the third distance c = √((x2 − x1)² + (y2 − y1)²).
Therefore, O = arccos((a² + b² − c²) / (2ab)).
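The first way can be sketched as follows (a minimal sketch; the names are illustrative, and the clamp before the arccosine guards against floating-point values just outside [−1, 1], a detail the description does not mention):

```python
import math

def rotation_angle(a_pt, b_pt, c_pt):
    """Rotation angle O (radians) around operating point C for a gesture
    from touch point A to touch point B, via the law of cosines."""
    (x1, y1), (x2, y2), (x0, y0) = a_pt, b_pt, c_pt
    a = math.hypot(x1 - x0, y1 - y0)   # first distance  |CA|
    b = math.hypot(x2 - x0, y2 - y0)   # second distance |CB|
    c = math.hypot(x2 - x1, y2 - y1)   # third distance  |AB|
    cos_o = (a * a + b * b - c * c) / (2 * a * b)
    return math.acos(max(-1.0, min(1.0, cos_o)))

# A quarter turn: A = (1, 0), B = (0, 1) around C = (0, 0)
print(round(math.degrees(rotation_angle((1, 0), (0, 1), (0, 0)))))  # → 90
```

Note that the law of cosines yields only the magnitude of the angle; the direction must be determined separately, as the description does below.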
The second way:
First, according to the coordinates of the first touch point A, the second touch point B, and the operating point C, the first vector from the operating point C to the first touch point A, CA = (x1 − x0, y1 − y0), and the second vector from the operating point C to the second touch point B, CB = (x2 − x0, y2 − y0), are calculated.
Second, the rotation angle O is determined from the first vector CA and the second vector CB.
Specifically, the rotation angle O is the angle between CA and CB; using the vector dot product, cos O = (CA · CB) / (|CA| |CB|), and the rotation angle O is then obtained by the arccosine.
Finally, the rotation direction F from the first touch point A to the second touch point B around the operating point C is calculated.
Specifically, the first vector CA = (x1 − x0, y1 − y0) and the second vector CB = (x2 − x0, y2 − y0) are used.
The rotation direction F is judged from the first vector CA and the second vector CB.
Specifically, the cross product of CA and CB is used: CA × CB = (x1 − x0)(y2 − y0) − (y1 − y0)(x2 − x0).
When the calculated cross product is positive, the rotation direction F is clockwise;
when the calculated cross product is negative, the rotation direction F is counterclockwise; or the reverse, depending on the coordinate convention.
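The second way and the direction test can be sketched together (a sketch under the assumption of screen coordinates with the y-axis pointing down, under which a positive cross product corresponds to a clockwise sweep on screen; with a y-up convention the two labels swap):

```python
import math

def rotation(a_pt, b_pt, c_pt):
    """Angle (radians) and direction of the sweep from A to B around C,
    via the dot product (angle) and the 2-D cross product (direction)."""
    (x1, y1), (x2, y2), (x0, y0) = a_pt, b_pt, c_pt
    cax, cay = x1 - x0, y1 - y0            # first vector  CA
    cbx, cby = x2 - x0, y2 - y0            # second vector CB
    dot = cax * cbx + cay * cby
    angle = math.acos(max(-1.0, min(1.0,
        dot / (math.hypot(cax, cay) * math.hypot(cbx, cby)))))
    cross = cax * cby - cay * cbx          # z-component of CA x CB
    direction = "clockwise" if cross > 0 else "counterclockwise"
    return angle, direction

angle, direction = rotation((1, 0), (0, 1), (0, 0))
print(round(math.degrees(angle)), direction)  # → 90 clockwise
```

The dot product gives the same angle magnitude as the law-of-cosines route; the cross product adds the sign the first way lacks.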
It should be noted that the movement parameters for left-right or up-down translation can be calculated in a similar way; no specific limitation is made here.
Second, the operational instruction is generated according to the movement parameter.
Specifically, once the rotation angle and the rotation direction are obtained, the operational instruction is generated according to the rotation angle and the rotation direction.
It should be noted that the operating gesture described in this embodiment may be one made around the operating point, or may be any operating gesture on the user interface; no specific limitation is made here.
Step 103: according to the operational instruction, control the user interface to execute a moving operation.
In this embodiment, the user interface is controlled to move with the operating point as the movement center, according to the movement angle, movement distance, and movement direction contained in the operational instruction.
Suppose the operational instruction contains a rotation angle and a rotation direction; then, according to the operational instruction, the user interface is controlled to rotate around the operating point by the calculated rotation angle and in the calculated rotation direction. Fig. 2(c) is a schematic diagram of controlling the user interface to execute an operation provided by an embodiment of the present application. As can be seen from Fig. 2(c), according to the operational instruction, the user interface rotates around the operating point by the calculated rotation angle and in the calculated rotation direction.
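As a sketch of step 103 (hedged: a real client would apply the transform through its UI framework or, for maps, the map SDK; here a single interface coordinate is rotated around the operating point directly, with the direction string choosing the sign of the angle, under the same y-down screen convention as above):

```python
import math

def rotate_about(pt, center, angle, direction):
    """Rotate one interface coordinate `pt` around the operating point
    `center` by `angle` radians; `direction` selects the sign."""
    if direction == "counterclockwise":
        angle = -angle
    x, y = pt[0] - center[0], pt[1] - center[1]
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    return (center[0] + x * cos_a - y * sin_a,
            center[1] + x * sin_a + y * cos_a)

# Rotate (2, 0) a quarter turn clockwise-on-screen around (0, 0)
x, y = rotate_about((2, 0), (0, 0), math.pi / 2, "clockwise")
print(round(x), round(y))  # → 0 2
```

Applying this to every coordinate of the interface (or handing the angle and center to the rendering layer) realizes the rotation around the operating point.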
It should be noted that, when map application software is installed on the electronic device, after an operational instruction containing the rotation angle and rotation direction has been generated in the above way, the electronic device can call the map SDK (Software Development Kit) to control the map to rotate. This realizes one-finger rotation of an electronic map interface; furthermore, by changing the operating point through a single touch, the map interface can be controlled to rotate around any point.
In a further embodiment, the pattern currently displayed in the user interface can also be controlled, according to the generated operational instruction, to execute a rotation operation so as to adjust the user's viewing angle.
With the technical solution provided by the embodiments of the present application, an operating point is provided in the user interface; when an operating gesture around the operating point is detected, an operational instruction is determined according to the position of the operating point and the position of the operating gesture; and, according to the operational instruction, the user interface is controlled to execute a moving operation. Because an operating point, a non-fixed point, is provided in the user interface, a single finger touching the user interface can take the operating point as a reference point, so the operational instruction generated by the single-finger touch can be determined and the user interface can be operated precisely. Furthermore, because the operating point is provided in the user interface, a rotation operation of the user interface can be executed with a single-finger touch, which effectively improves the flexibility and accuracy of user-interface operation and thereby improves the user's experience of the user interface.
Fig. 3 is a schematic structural diagram of a user-interface operating device provided by an embodiment of the present application. The operating device includes a providing unit 301, a determination unit 302, and a control unit 303, in which:
the providing unit 301 provides an operating point in the user interface;
the determination unit 302, when an operating gesture around the operating point is detected, determines an operational instruction according to the position of the operating point and the position of the operating gesture;
the control unit 303, according to the operational instruction, controls the user interface to execute a moving operation.
In another embodiment of the application, the providing unit 301 providing an operating point in the user interface includes:
determining the position of the currently displayed user interface;
determining the actual position of the operating point in the user interface according to that position and the preset relative position between the operating point and the user interface.
In another embodiment of the application, the providing unit 301 providing an operating point in the user interface includes:
receiving a touch instruction sent by the user, the touch instruction containing the position coordinates of the user's touch;
mapping the position coordinates into the user interface to obtain the position of the operating point.
In another embodiment of the application, the providing unit 301 providing an operating point in the user interface includes:
determining a current location by a preset positioning method;
providing, according to the current location, an operating point for determining operational instructions in the user interface.
In another embodiment of the application, the determination unit 302 determining an operational instruction according to the position of the operating point and the position of the operating gesture includes:
determining, according to the position of the operating point and the position of the operating gesture, a movement parameter with the operating point as the movement center, the movement parameter including one or more of a movement angle, a movement distance, and a movement direction;
generating the operational instruction according to the movement parameter.
In another embodiment of the application, the determination unit 302 determining, according to the position of the operating point and the position of the operating gesture, a moving parameter that takes the operating point as the center of movement comprises:
determining a first position coordinate of the operating point and a second position coordinate of the operating gesture;
calculating, according to the first position coordinate and the second position coordinate, the move angle and the moving distance with the operating point as the center of movement.
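One natural way to realize this calculation — offered as a sketch, since the application does not fix the formula or the angle convention — is the standard two-argument arctangent for the angle and the Euclidean norm for the distance:

```python
import math

def moving_parameters(first, second):
    """Compute the move angle (degrees, counter-clockwise from the
    positive x-axis) and the moving distance of the operating gesture,
    taking the operating point `first` as the center of movement."""
    dx = second[0] - first[0]
    dy = second[1] - first[1]
    angle = math.degrees(math.atan2(dy, dx))  # atan2 handles all quadrants
    distance = math.hypot(dx, dy)             # Euclidean distance
    return angle, distance
```

`atan2` avoids the division-by-zero and quadrant ambiguities of a plain `atan(dy/dx)`.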
In another embodiment of the application, the determination unit 302 detecting an operating gesture around the operating point comprises:
detecting a touch instruction sent by a user, the touch instruction containing the position coordinates of at least one touch point around the operating point;
determining, according to the position coordinates of the touch point(s), the operating gesture around the operating point.
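The application does not define how "around the operating point" is decided; one plausible sketch (all thresholds and names are assumptions) checks that every touch sample stays within a radius of the operating point and accumulates the angular sweep of the samples as the gesture's rotation component:

```python
import math

def gesture_around_point(operating_point, touch_points, max_radius=200.0):
    """Return the net angular sweep (degrees, counter-clockwise) of a
    sequence of touch-point coordinates around the operating point,
    or None if any sample strays farther than `max_radius`."""
    ox, oy = operating_point
    angles = []
    for (x, y) in touch_points:
        if math.hypot(x - ox, y - oy) > max_radius:
            return None  # touch strayed too far: not around the point
        angles.append(math.atan2(y - oy, x - ox))
    sweep = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # unwrap across the +/- pi boundary so a continuous circling
        # motion accumulates instead of jumping by 2*pi
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        sweep += d
    return math.degrees(sweep)
```

The unwrap step is what lets a single finger circle the point through more than half a turn without the sweep flipping sign.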
In another embodiment of the application, the control unit 303 controlling the user interface to execute a moving operation according to the operational instruction comprises:
controlling the user interface to move with the operating point as the center of movement, according to the move angle, moving distance, and moving direction contained in the operational instruction.
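Moving the interface "with the operating point as the center of movement" amounts, for the rotation case, to applying a 2D rotation about that point to the interface's coordinates. The following is a sketch under that assumption (the application does not state the transform):

```python
import math

def rotate_about(point, center, angle_deg):
    """Rotate one interface coordinate about the operating point
    `center` by `angle_deg` degrees counter-clockwise. Applying this
    to every rendered coordinate moves the whole interface with the
    operating point as the center of movement."""
    a = math.radians(angle_deg)
    dx, dy = point[0] - center[0], point[1] - center[1]
    # standard 2D rotation matrix applied to the offset from the center
    return (center[0] + dx * math.cos(a) - dy * math.sin(a),
            center[1] + dx * math.sin(a) + dy * math.cos(a))
```

In practice a UI framework would set this rotation on a transform matrix (translate to the center, rotate, translate back) rather than rotating points one by one.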
It should be noted that the operating device described in the embodiments of the present application may be implemented in software, in hardware, or in a combination of both; no specific limitation is made here. The operating device described in the embodiments of the present application may be applied to an electronic device with a touch screen, which provides an operating point in the user interface; upon detecting an operating gesture around the operating point, determines an operational instruction according to the position of the operating point and the position of the operating gesture; and controls the user interface to execute a moving operation according to the operational instruction. Because the operating point provided in the user interface is not a fixed point, a single finger touching the user interface can take the operating point as a reference to determine the operational instruction generated by the single-finger touch, thereby operating the user interface precisely. Furthermore, because an operating point is provided in the user interface, a rotation operation of the user interface can be realized by a single-finger touch, which effectively improves the flexibility and accuracy of operating the user interface and thus improves the user's experience of the interface.
An embodiment of the present application further provides an electronic device, comprising one or more processors and a memory, the memory storing a program configured to be executed by the one or more processors to perform the following steps:
providing an operating point in the user interface;
upon detecting an operating gesture around the operating point, determining an operational instruction according to the position of the operating point and the position of the operating gesture;
controlling the user interface to execute a moving operation according to the operational instruction.
It should be noted that the electronic device has the functions of the above operating device, which are not repeated here one by one.
An embodiment of the present application further provides a computer-readable storage medium, including a program used in combination with an electronic device, the program being executable by a processor to complete the following steps:
providing an operating point in the user interface;
upon detecting an operating gesture around the operating point, determining an operational instruction according to the position of the operating point and the position of the operating gesture;
controlling the user interface to execute a moving operation according to the operational instruction.
It should be noted that the computer-readable storage medium has the functions of the above operating device, which are not repeated here one by one.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (for example, an improvement to circuit structures such as diodes, transistors, and switches) or an improvement in software (an improvement to a method flow). With the development of technology, however, improvements to many of today's method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore it cannot be said that an improvement to a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (PLD), such as a field programmable gate array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a piece of PLD by programming it, without asking a chip manufacturer to design and fabricate a dedicated integrated circuit chip. Moreover, instead of manually fabricating integrated circuit chips, this programming is nowadays mostly carried out with "logic compiler" software, which is similar to the software compilers used in program development; the source code to be compiled must likewise be written in a specific programming language, called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used at present. Those skilled in the art should also understand that a hardware circuit implementing a logical method flow can easily be obtained merely by programming the method flow in logic using one of the above hardware description languages and programming it into an integrated circuit.
A controller may be implemented in any suitable manner. For example, a controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicone Labs C8051F320; a memory controller may also be implemented as part of a memory's control logic. Those skilled in the art also know that, besides implementing a controller purely as computer-readable program code, it is entirely possible, by programming the method steps in logic, to make the controller realize the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included within it for realizing various functions may also be regarded as structures within the hardware component. Indeed, means for realizing various functions may be regarded both as software modules implementing a method and as structures within a hardware component.
The systems, devices, modules, or units illustrated in the above embodiments may be specifically implemented by a computer chip or entity, or by a product with certain functions. A typical implementation device is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an e-mail device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above apparatus is described as being divided into various units by function. Of course, when implementing the present application, the functions of the units may be realized in one or more pieces of software and/or hardware.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical memory) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps is executed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), an input/output interface, a network interface, and memory.
The memory may include non-permanent memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may realize information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, commodity, or device that includes a series of elements not only includes those elements but also includes other elements not explicitly listed, or further includes elements inherent to such a process, method, commodity, or device. In the absence of further limitation, an element qualified by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, commodity, or device that includes the element.
The application may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform specific tasks or implement specific abstract data types. The application may also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules may be located in local and remote computer storage media, including storage devices.
All the embodiments in this specification are described in a progressive manner; for identical or similar parts of the embodiments, reference may be made between them, and each embodiment focuses on its differences from the others. In particular, since the system embodiment is substantially similar to the method embodiment, its description is relatively brief; for relevant details, refer to the description of the method embodiment.
The above are merely embodiments of the present application and are not intended to limit it. For those skilled in the art, various modifications and variations of the present application are possible. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included within the scope of the claims of this application.

Claims (18)

1. A user interface operating method, applied to an electronic device with a touch screen, the method comprising:
providing an operating point in the user interface;
upon detecting an operating gesture around the operating point, determining an operational instruction according to the position of the operating point and the position of the operating gesture;
controlling the user interface to execute a moving operation according to the operational instruction.
2. The user interface operating method as claimed in claim 1, wherein providing an operating point in the user interface comprises:
determining the position of the currently displayed user interface;
determining the actual position of the operating point in the user interface according to that position and a preset relative position between the operating point and the user interface.
3. The user interface operating method as claimed in claim 1, wherein providing an operating point in the user interface comprises:
receiving a touch instruction sent by a user, the touch instruction containing the position coordinates of the user's touch;
mapping the position coordinates into the user interface to obtain the position of the operating point.
4. The user interface operating method as claimed in claim 1, wherein providing an operating point in the user interface comprises:
determining the current position by a preset positioning method;
providing, according to the current position, an operating point in the user interface for determining operational instructions.
5. The user interface operating method as claimed in any one of claims 1 to 4, wherein determining an operational instruction according to the position of the operating point and the position of the operating gesture comprises:
determining, according to the position of the operating point and the position of the operating gesture, a moving parameter that takes the operating point as the center of movement, the moving parameter including one or more of a move angle, a moving distance, and a moving direction;
generating an operational instruction according to the moving parameter.
6. The user interface operating method as claimed in claim 5, wherein determining, according to the position of the operating point and the position of the operating gesture, a moving parameter that takes the operating point as the center of movement comprises:
determining a first position coordinate of the operating point and a second position coordinate of the operating gesture;
calculating, according to the first position coordinate and the second position coordinate, the move angle and the moving distance with the operating point as the center of movement.
7. The user interface operating method as claimed in claim 1, wherein detecting an operating gesture around the operating point comprises:
detecting a touch instruction sent by a user, the touch instruction containing the position coordinates of at least one touch point around the operating point;
determining, according to the position coordinates of the touch point(s), the operating gesture around the operating point.
8. The user interface operating method as claimed in claim 5, wherein controlling the user interface to execute a moving operation according to the operational instruction comprises:
controlling the user interface to move with the operating point as the center of movement, according to the move angle, moving distance, and moving direction contained in the operational instruction.
9. A user interface operating device, applied to an electronic device with a touch screen, the operating device comprising:
a providing unit, which provides an operating point in the user interface;
a determination unit, which, upon detecting an operating gesture around the operating point, determines an operational instruction according to the position of the operating point and the position of the operating gesture;
a control unit, which controls the user interface to execute a moving operation according to the operational instruction.
10. The user interface operating device as claimed in claim 9, wherein the providing unit providing an operating point in the user interface comprises:
determining the position of the currently displayed user interface;
determining the actual position of the operating point in the user interface according to that position and a preset relative position between the operating point and the user interface.
11. The user interface operating device as claimed in claim 9, wherein the providing unit providing an operating point in the user interface comprises:
receiving a touch instruction sent by a user, the touch instruction containing the position coordinates of the user's touch;
mapping the position coordinates into the user interface to obtain the position of the operating point.
12. The user interface operating device as claimed in claim 9, wherein the providing unit providing an operating point in the user interface comprises:
determining the current position by a preset positioning method;
providing, according to the current position, an operating point in the user interface for determining operational instructions.
13. The user interface operating device as claimed in any one of claims 9 to 12, wherein the determination unit determining an operational instruction according to the position of the operating point and the position of the operating gesture comprises:
determining, according to the position of the operating point and the position of the operating gesture, a moving parameter that takes the operating point as the center of movement, the moving parameter including one or more of a move angle, a moving distance, and a moving direction;
generating an operational instruction according to the moving parameter.
14. The user interface operating device as claimed in claim 13, wherein the determination unit determining, according to the position of the operating point and the position of the operating gesture, a moving parameter that takes the operating point as the center of movement comprises:
determining a first position coordinate of the operating point and a second position coordinate of the operating gesture;
calculating, according to the first position coordinate and the second position coordinate, the move angle and the moving distance with the operating point as the center of movement.
15. The user interface operating device as claimed in claim 9, wherein the determination unit detecting an operating gesture around the operating point comprises:
detecting a touch instruction sent by a user, the touch instruction containing the position coordinates of at least one touch point around the operating point;
determining, according to the position coordinates of the touch point(s), the operating gesture around the operating point.
16. The user interface operating device as claimed in claim 13, wherein the control unit controlling the user interface to execute a moving operation according to the operational instruction comprises:
controlling the user interface to move with the operating point as the center of movement, according to the move angle, moving distance, and moving direction contained in the operational instruction.
17. An electronic device, comprising one or more processors and a memory, the memory storing a program configured to be executed by the one or more processors to perform the following steps:
providing an operating point in the user interface;
upon detecting an operating gesture around the operating point, determining an operational instruction according to the position of the operating point and the position of the operating gesture;
controlling the user interface to execute a moving operation according to the operational instruction.
18. A computer-readable storage medium, including a program used in combination with an electronic device, the program being executable by a processor to complete the following steps:
providing an operating point in the user interface;
upon detecting an operating gesture around the operating point, determining an operational instruction according to the position of the operating point and the position of the operating gesture;
controlling the user interface to execute a moving operation according to the operational instruction.
CN201710385948.7A 2017-05-26 2017-05-26 Operation method of user interface, equipment and computer readable storage medium Pending CN108958619A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710385948.7A CN108958619A (en) 2017-05-26 2017-05-26 Operation method of user interface, equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN108958619A true CN108958619A (en) 2018-12-07

Family

ID=64494777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710385948.7A Pending CN108958619A (en) 2017-05-26 2017-05-26 Operation method of user interface, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108958619A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103246476A (en) * 2013-04-27 2013-08-14 华为技术有限公司 Method, device and terminal device for rotating screen contents
US20140279029A1 (en) * 2013-03-15 2014-09-18 Paschar Llc Mobile device user interface with dynamic advertising control interface area
US20150362998A1 (en) * 2014-06-17 2015-12-17 Amazon Technologies, Inc. Motion control for managing content
CN105242841A (en) * 2014-07-10 2016-01-13 阿里巴巴集团控股有限公司 Method and device for controlling display object zoom
CN105867819A (en) * 2016-03-30 2016-08-17 惠州Tcl移动通信有限公司 Display content rotating detection method and device thereof
CN106227451A (en) * 2016-07-26 2016-12-14 维沃移动通信有限公司 The operational approach of a kind of mobile terminal and mobile terminal
CN106484207A (en) * 2015-08-25 2017-03-08 阿里巴巴集团控股有限公司 A kind of touch control device interface Zoom method and device


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109831687A (en) * 2018-12-12 2019-05-31 深圳慧源创新科技有限公司 Unmanned plane figure passes video editing method and technology
CN109828807A (en) * 2018-12-24 2019-05-31 天津字节跳动科技有限公司 Method, apparatus, electronic equipment and the storage medium of the small routine gesture switching page
CN109814784A (en) * 2019-01-07 2019-05-28 平安科技(深圳)有限公司 Picture rotation method, apparatus, computer equipment and storage medium
CN109814784B (en) * 2019-01-07 2022-07-08 平安科技(深圳)有限公司 Picture rotation method and device, computer equipment and storage medium
CN113961130A (en) * 2021-11-15 2022-01-21 宝宝巴士股份有限公司 Method for simulating knob operation based on Unity3D

Similar Documents

Publication Publication Date Title
CN108958619A (en) Operation method of user interface, equipment and computer readable storage medium
CN106651987B (en) Paths planning method and device
US8581901B2 (en) Methods and apparatus for interactive rotation of 3D objects using multitouch gestures
US8427440B2 (en) Contact grouping and gesture recognition for surface computing
US9542068B2 (en) System and method for constrained manipulations of 3D objects by multitouch inputs
US9612675B2 (en) Emulating pressure sensitivity on multi-touch devices
CN103412720B (en) Process method and the device thereof of touch control type input signal
CN109214632A (en) A kind of risk control method and equipment
CN102722331A (en) Touch unlocking method and device and electronic equipment
WO2019119975A1 (en) Information input method and device
US9478070B2 (en) Coordinate information updating device
CN110389810A (en) A kind of method, device and equipment for quickly putting UI control on virtual canvas
CN110276024A (en) A kind of method and device that information is shown
Kang et al. Editing 3D models on smart devices
CN103761094A (en) Method for polygon combination in planar drawing
CN110530398A (en) A kind of method and device of electronic map accuracy detection
CN110119381A (en) A kind of index updating method, device, equipment and medium
CN110032328A (en) A kind of size adjustment method and device of operation object
CN109657088A (en) A kind of picture risk checking method, device, equipment and medium
CN109656946A (en) A kind of multilist relation query method, device and equipment
TWI544403B (en) System for adjusting user interface and adjustment method thereof
CN107766703A (en) Watermark addition processing method, device and client
US10379639B2 (en) Single-hand, full-screen interaction on a mobile device
CN110334554A (en) The methods of exhibiting and device of graphic code
JP2014175012A (en) Mouse pointer control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201019

Address after: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman Islands

Applicant after: Innovative advanced technology Co.,Ltd.

Address before: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman Islands

Applicant before: Advanced innovation technology Co.,Ltd.

Effective date of registration: 20201019

Address after: Cayman Enterprise Centre, 27 Hospital Road, George Town, Grand Cayman Islands

Applicant after: Advanced innovation technology Co.,Ltd.

Address before: A four-storey 847 mailbox in Grand Cayman Capital Building, British Cayman Islands

Applicant before: Alibaba Group Holding Ltd.

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20181207