WO2018177207A1 - Operation control method, apparatus, and storage medium - Google Patents

Operation control method, apparatus, and storage medium (操作控制方法、装置及存储介质)

Info

Publication number
WO2018177207A1
Authority
WO
WIPO (PCT)
Prior art keywords
release
target object
display position
movement
icon
Application number
PCT/CN2018/080205
Other languages
English (en)
French (fr)
Inventor
弓弢
黄源超
龙海
寇敬
张逸轩
Original Assignee
腾讯科技(深圳)有限公司
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2018177207A1
Priority to US16/433,914 (published as US10845981B2)

Classifications

    • A63F13/22: Setup operations, e.g. calibration, key configuration or button assignment
    • A63F13/23: Input arrangements for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426: Mapping input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/533: Controlling output signals with additional visual information provided to the game scene for prompting the player, e.g. by displaying a game menu
    • G06F3/023: Arrangements for converting discrete items of information into a coded form, e.g. interpreting keyboard-generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0238: Programmable keyboards
    • G06F3/03543: Mice or pucks
    • G06F3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F3/04817: Interaction techniques using icons
    • G06F3/04845: Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0486: Drag-and-drop
    • G06F3/0487: Interaction techniques using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489: Interaction techniques using dedicated keyboard keys or combinations thereof
    • G06F3/04897: Special input arrangements or commands for improving display capability
    • A63F2300/1018: Calibration; key and button assignment
    • A63F2300/1025: Details of the interface with the game device, e.g. USB version detection
    • G06F2203/0381: Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • The embodiments of the present invention relate to the field of terminal technologies, and in particular to an operation control method, apparatus, and storage medium.
  • An embodiment of the present invention provides an operation control method. The method includes: acquiring a button instruction when a triggering operation on a designated button of the paired keyboard is obtained, the button instruction including valid button information; acquiring a target object attribute from the object attributes of the operation object according to the valid button information; and, when a release instruction for the target object attribute is acquired, determining a designated area according to the first display position of the operation object and the second display position of the paired mouse, and controlling the operation object to release the target object attribute within the designated area.
  • An embodiment of the present invention further provides an operation control apparatus. The apparatus comprises a memory and a processor, wherein the memory stores computer readable instructions and the processor executes the computer readable instructions in the memory to perform the method described above.
  • The embodiment of the present application further provides a non-transitory computer readable storage medium storing computer readable instructions, which may cause at least one processor to perform the method described above.
  • FIG. 1 is a schematic diagram of an implementation environment involved in an operation control method according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an operation interface provided by another embodiment of the present application.
  • FIG. 4 is a schematic diagram of an operation control process provided by another embodiment of the present application.
  • FIG. 5 is a schematic diagram of an operation interface according to another embodiment of the present application.
  • FIG. 6 is a schematic diagram of an operation control process provided by another embodiment of the present application.
  • FIG. 7 is a schematic diagram of an operation interface provided by another embodiment of the present application.
  • FIG. 8 is a schematic diagram of an operation control process provided by another embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an apparatus for operation control according to another embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an operation control terminal according to an embodiment of the present application.
  • Operation object: an object that the terminal needs to control while the specified application is running.
  • For example, the specified application may be a game application, and the operating system compatible with the specified application may be the Android operating system, the iOS operating system, or the like.
  • For example, the operation object may be a virtual character, a virtual creature, or the like in a game application.
  • Object attribute: a property that the operation object has in the specified application.
  • For example, in a game the object attributes are the skills that the operation object has, including attack, healing, defense, and so on.
  • Attribute icon: an icon on the operation interface that carries an object attribute.
  • For example, the attribute icon can be a skill wheel in a game.
  • Move icon: an icon on the operation interface that controls the movement of the operation object.
  • For example, the move icon can be a movement wheel in a game.
  • Release range: the effective scope within which an object attribute takes effect.
  • For example, in a game the release range is the skill range.
  • The control of the operation object mainly includes the following two aspects: one is to control the movement of the operation object, and the other is to control the release of the object attributes of the operation object.
  • For movement control, the prior art is mainly implemented based on the four keys W, A, S, and D on the keyboard and their combinations: the key W is used to control the operation object to move upward, the key A to move left, the key D to move right, and the key S to move downward; the combination A+W controls the operation object to move to the upper left, A+S to the lower left, D+W to the upper right, and D+S to the lower right.
  • Accordingly, the existing operation control process is: receiving a button message sent by the keyboard, the button message including the triggered button information; determining the control direction of the operation object according to the triggered button information; and controlling the operation object to move according to the control direction.
  • It can be seen that when controlling the movement of the operation object, the prior art can only implement the eight directions of up, down, left, right, upper left, lower left, upper right, and lower right, whereas in actual operation the moving direction of the operation object may be any direction within 360 degrees. The existing operation control method therefore supports only a limited set of moving directions.
  • To address this, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby expanding the range of movement directions of the operation object.
  • For releasing object attributes, the existing operation control process is: during the running of the game application, receiving a click command sent by the connected mouse, the click command including a click position; determining whether the click position is located at the center of an attribute icon; if the click position is located at the center of any attribute icon and a drag command sent by the mouse is received, dragging that attribute icon according to the drag command; and, during the dragging of the attribute icon, if a release instruction sent by the mouse is received, controlling the operation object to release the object attribute corresponding to the attribute icon, taking its current position as the starting point and the direction parallel to the drag track of the attribute icon as the release direction.
  • This operation process is complicated, because the user needs to click the center position of the attribute icon with the mouse and also needs to drag the attribute icon.
  • In contrast, the present application controls the release of the object attributes of the operation object in a way that not only reduces operation complexity but also improves release precision.
  • FIG. 1 illustrates an implementation environment involved in an operation control method according to an embodiment of the present invention.
  • the implementation environment includes a terminal 101 , a keyboard 102 , and a mouse 103 .
  • the terminal 101 may be a notebook computer, a desktop computer, or the like.
  • The terminal 101 is installed with an operating system simulator, and the operating system simulator can simulate the running environment of the specified application on the terminal 101, so that the terminal can smoothly run the specified application.
  • The keyboard 102 is a wired or wireless keyboard for inputting key information to the terminal 101.
  • The mouse 103 is a wired or wireless mouse for transmitting control commands to the terminal 101.
  • The terminal 101 is paired with the keyboard 102 and the mouse 103, so that operation commands can be acquired through the keyboard 102 and the mouse 103 while the specified application is running.
  • the manner in which the terminal 101 is connected to the keyboard 102 and the mouse 103 includes, but is not limited to, one of a wired connection and a wireless connection.
  • the embodiment of the present invention provides an operation control method, which is applied to an operation control device.
  • the method flow provided by the embodiment of the present invention includes:
  • the terminal displays an operation interface of the specified application.
  • the first display position of the operation object and the second display position of the paired mouse are displayed on the operation interface.
  • The operation object is an object managed by the user account logged into the specified application, and it has at least one object attribute, that is, operations the operation object can perform, such as attack, defense, and healing.
  • the second display position is actually the cursor position.
  • the terminal When acquiring a trigger operation on a designated button on the paired connection keyboard, the terminal acquires a button instruction.
  • the terminal when the terminal obtains a triggering operation on the designated button on the paired connection keyboard, the terminal may acquire a button instruction, and the button instruction includes valid button information.
  • the designated button can be defined in advance by the user.
  • the valid button information includes the identifier of the triggered button, the trigger duration of the triggered button, and the like.
  • the terminal acquires the target object attribute from the object attribute of the operation object according to the valid key information.
  • the terminal obtains the target object attribute from at least one object attribute of the operation object according to the valid key information, including but not limited to the following methods:
  • In the first method, the terminal may maintain an object attribute database that stores the correspondence between button identifiers and object attributes, as well as the release range of each object attribute. Based on this database, the terminal may acquire the target object attribute from at least one object attribute of the operation object. For example, suppose the object attribute corresponding to the button identifier Q represents an attack function and the object attribute corresponding to the button identifier W represents a healing function; if the button identifier in the valid button information is Q, the acquired target object attribute represents the attack function.
  • In the second method, the terminal may set a distinct value for each button identifier in advance and likewise set a distinct value for each object attribute of the operation object. After obtaining the valid button information, the terminal obtains the value of the button identifier it contains and compares that value with the value of each object attribute of the operation object. If the value of the button identifier is the same as the value of any object attribute of the operation object, that object attribute is the target object attribute.
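As an illustrative sketch of the two lookup methods above (all names and values here are hypothetical, not taken from the patent), the correspondences can be kept in simple in-memory tables:

```python
# Hypothetical tables; the patent only specifies that such correspondences exist.
SKILLS_BY_KEY = {"Q": "attack", "W": "healing"}   # method 1: button id -> object attribute
KEY_VALUES = {"Q": 1, "W": 2}                     # method 2: per-button values
ATTRIBUTE_VALUES = {"attack": 1, "healing": 2}    # method 2: per-attribute values

def target_attribute_by_lookup(key_id):
    """Method 1: direct lookup in the object attribute database."""
    return SKILLS_BY_KEY.get(key_id)

def target_attribute_by_value(key_id):
    """Method 2: compare the button's value with each attribute's value."""
    key_value = KEY_VALUES.get(key_id)
    for attribute, value in ATTRIBUTE_VALUES.items():
        if value == key_value:
            return attribute
    return None
```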
  • In other words, an object attribute is an encapsulated function; by invoking the function corresponding to an object attribute, the operation object performs the corresponding action.
  • The target object attribute is the object attribute corresponding to the designated button. Acquiring the target object attribute includes: determining the object attribute corresponding to the designated button, taking that object attribute as the target object attribute, and granting the operation object permission to call the function corresponding to the target object attribute.
  • the terminal acquires a release instruction for the target object attribute.
  • Under the trigger of the release operation, the terminal acquires a release instruction for the target object attribute; the release instruction is used to control the operation object to release the target object attribute in the designated area.
  • the terminal determines the designated area according to the first display position and the second display position.
  • Specifically, the terminal may take as the designated area a strip-shaped region with the first display position as the starting point and the second display position as the end point, or a sector-shaped region containing both the first display position and the second display position.
  • the terminal may determine the designated area by other means, which is not specifically limited in this embodiment of the present invention.
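The embodiment leaves the exact geometry of the designated area open. As one hedged illustration, a strip-shaped area from the first display position toward the second display position could be tested as follows (the half-width is an assumed parameter, not specified in the patent):

```python
import math

def in_strip(point, start, end, half_width):
    """Return True if `point` lies in the strip-shaped area running from
    `start` (first display position) to `end` (second display position)."""
    ax, ay = end[0] - start[0], end[1] - start[1]
    length = math.hypot(ax, ay)
    if length == 0:  # degenerate strip: fall back to a disc around start
        return math.hypot(point[0] - start[0], point[1] - start[1]) <= half_width
    ux, uy = ax / length, ay / length               # unit vector along the strip
    dx, dy = point[0] - start[0], point[1] - start[1]
    along = dx * ux + dy * uy                       # projection along the strip axis
    across = abs(dy * ux - dx * uy)                 # perpendicular distance to the axis
    return 0.0 <= along <= length and across <= half_width
```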
  • the terminal controls the operation object to release the target object attribute in the specified area.
  • Optionally, after acquiring the release instruction, the terminal may wait for a first preset duration, and when the duration of the release instruction reaches the first preset duration, control the operation object to release the target object attribute within the designated area.
  • the first preset duration is determined by the responsiveness of the processor, and the first preset duration may be 50 milliseconds, 60 milliseconds, or the like.
  • Here, the release instruction refers to the operation of the user releasing the designated key; releasing the target object attribute means that the function corresponding to the target object attribute is invoked in response to the release instruction, so that the operation object performs the action corresponding to the target object attribute.
  • In addition, when the release operation on the operation object is acquired, the terminal may also display a releaser of the target object attribute within the designated area.
  • the releaser is used to indicate whether the target object attribute has been released, and has an indication direction that is the same as the release direction of the target object attribute.
  • the display properties of the release device of different object properties on the operation interface may be the same or different.
  • each object attribute of the operation object has a release scope, and each object attribute of the operation object can only be released within the corresponding release range, and cannot be released outside the release range.
  • The size of the release range is determined by the operation object, and the release ranges of different operation objects are different.
  • The shape of the release range is generally circular, although in actual operation the release range may be displayed as an ellipse on the operation interface.
  • For example, FIG. 3 shows an operation interface of a game application. The release range of the object attribute W 302 of the operation object 301 is an ellipse; specifically, the release range of the object attribute W is the ellipse area 303 in FIG. 3.
  • the terminal In order to facilitate the user to control the operation object through an input device such as a mouse or a keyboard, the terminal also displays an object icon carrying each object attribute on the operation interface of the specified application.
  • Each object icon has a movement range; the shape of the movement range is generally circular, and the movement ranges of the object icons of the same operation object are the same. In actual operation, each object icon can only be dragged within its movement range and cannot be dragged outside it.
  • Taking FIG. 3, an operation interface of a game application, as an example, the object icons displayed on the operation interface include an icon 302 representing the skill wheel W, an icon 304 representing the skill wheel S, and an icon 305 representing the skill wheel D. Taking the icon 302 of the skill wheel W as an example, the movement range of the skill wheel W 302 is the dashed circular area 306 in FIG. 3.
  • In order to guide the user during operation of the specified application, the method provided by the embodiment of the present invention also displays the release range of the target object attribute and the movement range of the target object icon on the operation interface.
  • Based on the release range of the target object attribute and the movement range of the target object icon displayed on the operation interface, when the release instruction is acquired, the terminal determines a first mapping position of the second display position within the movement range according to the second display position, the release range of the target object attribute, and the movement range of the target object icon. When controlling the operation object to release the target object attribute in the designated area, the terminal controls the target object icon to move to the first mapping position, and displays the movement track of the target object icon to the first mapping position on the operation interface.
  • The manner in which the terminal determines the first mapping position of the second display position within the movement range, according to the second display position, the release range of the target object attribute, and the movement range of the target object icon, includes but is not limited to the following two cases:
  • In the first case, the second display position is outside the release range of the target object attribute.
  • the terminal When the second display position of the mouse is outside the release range of the target object attribute, the terminal cannot control the operation object to release the target object attribute, and the terminal may according to the second display position of the mouse and the center point position of the release range of the target object attribute, Determine a line and calculate the intersection of the line and the release range of the target object attribute, and then use the intersection as the first map position.
  • For example, suppose the release range is an ellipse whose semi-major axis length is a, whose semi-minor axis length is b, and whose center point coordinates are (0, 0), so that its boundary satisfies (x/a)^2 + (y/b)^2 = 1. The intersection coordinates of the straight line and the ellipse can then be calculated, and the first mapping position is further calculated from the intersection coordinates.
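A minimal sketch of this Case 1 computation, assuming the release-range centre is the origin and a, b are the semi-axis lengths (the function name is illustrative):

```python
import math

def ellipse_boundary_point(mouse_x, mouse_y, a, b):
    """Intersection of the line from the ellipse centre (0, 0) through the
    mouse position with the ellipse boundary (x/a)^2 + (y/b)^2 = 1.
    Assumes the mouse is not exactly at the centre."""
    # Scale t so that (t*x, t*y) satisfies the ellipse equation:
    # (t*x/a)^2 + (t*y/b)^2 = 1  =>  t = 1 / sqrt((x/a)^2 + (y/b)^2)
    t = 1.0 / math.sqrt((mouse_x / a) ** 2 + (mouse_y / b) ** 2)
    return (t * mouse_x, t * mouse_y)
```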
  • In the second case, the second display position is within the release range of the target object attribute.
  • In this case, the terminal can scale the display position of the mouse according to the release range of the target object attribute to obtain a scaled position, and then determine the first mapping position according to the scaled position and the movement range of the target object icon.
  • For example, with the elliptical release range above, the terminal can scale the horizontal coordinate x of the mouse by the semi-major axis length a and the vertical coordinate y by the semi-minor axis length b, so that the scaled position coordinates are (x/a, y/b); the first mapping position is then determined from this scaled position and the movement range.
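For Case 2, the text specifies only the scaled coordinates (x/a, y/b); how they are combined with the movement range is not spelled out. One plausible reading, sketched under that assumption, projects the normalised offset into the circular movement range of the icon:

```python
def map_into_wheel(mouse_x, mouse_y, a, b, wheel_radius):
    """Scale the mouse offset by the semi-axis lengths (giving (x/a, y/b),
    each in [-1, 1] inside the ellipse), then project into the circular
    movement range of the target object icon -- an assumed interpretation."""
    nx, ny = mouse_x / a, mouse_y / b
    return (nx * wheel_radius, ny * wheel_radius)
```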
  • By mapping the position of the mouse on the operation interface into the movement range, the method provided by the embodiment of the invention ensures that when the user controls the operation object with the mouse, the direction pointed to by the mouse cursor is the actual release direction of the target object attribute.
  • In order to release the target object attribute accurately into the area the user wishes, the user can continuously adjust the second display position of the mouse. Accordingly, while acquiring the second display position of the mouse on the operation interface, the terminal also monitors mouse movement in real time.
  • If no mouse movement is detected, then when the release operation on the operation object is acquired, the terminal controls the operation object to release the target object attribute in the designated area and displays the releaser of the target object attribute in that area. If the mouse is moved, the terminal acquires the third display position of the mouse; when the release operation is acquired, it adjusts the designated area according to the first display position and the third display position to obtain the adjusted area, controls the operation object to release the target object attribute in the adjusted area, and displays the releaser of the target object attribute in the adjusted area.
  • Further, when acquiring a moving operation of the mouse, the terminal acquires the third display position of the mouse, and when acquiring the release instruction for the target object attribute, determines a second mapping position of the third display position within the movement range according to the third display position, the release range, and the movement range. When controlling the operation object to release the target object attribute in the designated area, the terminal controls the target object icon to move to the second mapping position and displays the movement track of the target object icon to the second mapping position on the operation interface.
  • After controlling the target object icon to move to the second mapping position, the terminal may wait for a second preset duration; when the second preset duration is reached, the update of the second mapping position is performed again.
  • The second preset duration is determined by the responsiveness of the processor, and the second preset duration may be 10 milliseconds, 20 milliseconds, or the like.
  • The process by which the terminal controls the operation object to release an object attribute is described below, taking FIG. 3 as an example.
  • Step 400: Determine whether the Q key is pressed.
  • Step 402: When acquiring a trigger operation on the Q key of the paired keyboard, the terminal acquires a button instruction that includes the Q key information, and determines the skill corresponding to the Q key information.
  • Step 406: Determine the mapping position according to the display position of the mouse, the skill range of the skill corresponding to the Q key, and the movement range of the skill wheel corresponding to the Q key.
  • Step 408: Detect whether the position of the mouse changes.
  • Step 410: Determine whether the Q key is released; if released, perform step 412, and if not, return to step 404.
  • Step 412: When the release operation of the Q key is acquired, the terminal controls the operation object to release the skill corresponding to the Q key.
  • In parallel, step 401 may also be performed: the terminal simulates the operation of the skill wheel corresponding to the Q key on the operation interface, and controls the skill wheel movement based on the mapping position.
  • Step 403: While controlling the skill wheel to move, determine whether the skill wheel has reached the mapping position; if it has, step 405 is performed, and if not, step 407 is performed.
  • Step 405: Wait for the mapping position to update.
  • Step 407: Simulate the finger moving a short distance toward the mapping position to continue controlling the skill wheel movement.
  • Step 409: After waiting 10 milliseconds, return to step 403 and repeat the above determination until the Q key is released.
  • Step 411: The simulated finger is lifted; when the release operation of the Q key is acquired, the terminal stops controlling the skill wheel movement so that the skill wheel returns to its initial position.
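Steps 401 through 411 amount to a small polling loop. The sketch below is a hypothetical rendering of that loop; `simulator`, `get_mapping_position`, and `q_key_released` (and the simulator's methods) are invented placeholders for whatever touch-simulation interface the terminal actually uses:

```python
import time

def drive_skill_wheel(simulator, get_mapping_position, q_key_released,
                      step_distance=5.0, tick_seconds=0.01):
    """Steps 401-411: nudge a simulated finger toward the (possibly moving)
    mapping position every ~10 ms until the Q key is released."""
    simulator.touch_down()                          # step 401: press the skill wheel
    while not q_key_released():                     # loop until step 411
        target = get_mapping_position()             # recomputed as the mouse moves
        if simulator.finger_at(target):
            pass                                    # step 405: wait for an update
        else:
            simulator.move_toward(target, step_distance)  # step 407: short move
        time.sleep(tick_seconds)                    # step 409: ~10 ms tick
    simulator.touch_up()                            # step 411: lift; wheel resets
```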
  • When acquiring a click operation of the mouse, the terminal acquires a click instruction, and the click instruction includes the click position.
  • After obtaining the click instruction, the terminal displays the click position of the mouse on the operation interface in a preset manner, so that the click position of the mouse can be shown to the user intuitively.
  • The preset manner may be a ripple response or the like. Referring to FIG. 5, the position 501 in FIG. 5 is the click position of the mouse.
  • the terminal determines the moving direction and the moving distance of the operating object according to the click position and the first display position.
  • the terminal constructs a ray with the first display position of the operation object as the starting point and the click position as the end point, and uses the direction of the ray as the moving direction of the operation object. Since the direction of the ray can be any direction, the method provided by the embodiment of the present invention extends the moving direction of the operation object, and can control the operation object to move within a range of 360 degrees.
  • the terminal acquires the coordinates of the click position and the coordinates of the first display position of the operation object, and calculates the moving distance of the operation object by using a two-point distance formula.
  • Suppose the click position coordinates are (x1, y1) and the first display position coordinates of the operation object are (x2, y2); then the moving distance of the operation object is sqrt((x1 - x2)^2 + (y1 - y2)^2).
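In code, the moving direction (as a unit vector) and the moving distance follow directly from the two positions; this is a minimal sketch of the computation described above:

```python
import math

def move_vector(click_pos, object_pos):
    """Direction (unit vector) and distance from the operation object's
    first display position to the mouse click position."""
    dx = click_pos[0] - object_pos[0]
    dy = click_pos[1] - object_pos[1]
    distance = math.hypot(dx, dy)   # sqrt((x1-x2)^2 + (y1-y2)^2)
    if distance == 0:
        return (0.0, 0.0), 0.0      # click on the object itself: no movement
    return (dx / distance, dy / distance), distance
```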
  • the terminal calculates the moving duration of the operation object according to the moving distance.
  • the terminal when the moving speed of the operating object in the specified application is set to a fixed value, the terminal can calculate the moving duration of the operating object according to the moving distance and the moving speed of the operating object.
  • The terminal controls the operation object to move toward the click position according to the moving direction of the operation object and the movement duration of the operation object.
  • Specifically, the terminal controls the operation object to move toward the click position in the moving direction, and uses the movement duration to determine whether the operation object has reached the click position: if the time the operation object has been moving is less than the movement duration, the operation object has not yet reached the click position and is controlled to continue moving toward it; if the time the operation object has been moving equals the movement duration, the operation object has reached the click position and is controlled to stop moving.
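A hedged sketch of this duration-based control, with the movement duration computed as distance / speed per the preceding steps (the per-frame update scheme is an assumption):

```python
def advance(object_pos, direction, speed, elapsed, move_duration, dt):
    """Advance the object along `direction` for one frame of `dt` seconds,
    stopping once the accumulated time reaches `move_duration`."""
    if elapsed >= move_duration:
        return object_pos, elapsed                  # already at the click position
    step = min(dt, move_duration - elapsed)         # don't overshoot the duration
    new_pos = (object_pos[0] + direction[0] * speed * step,
               object_pos[1] + direction[1] * speed * step)
    return new_pos, elapsed + step
```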
  • Note that the click position coordinates and the first display position coordinates of the operation object acquired in the above steps are coordinates on the current operation interface, whereas controlling the operation object to move to the click position requires the corresponding coordinates in the scene map of the specified application, which are generally difficult to obtain. To realize movement control of the operation object nonetheless, the embodiment of the present invention controls the movement of the operation object according to the movement duration.
  • the terminal displays the movement track of the operation object on the operation interface.
  • Further, when controlling the operation object to move to the click position, the terminal also controls the move icon to move according to the movement trajectory of the operation object, and when the moving operation of the move icon is acquired, displays the movement track of the move icon on the operation interface.
  • The move icon is an icon that controls the movement of the operation object.
  • For example, when the specified application is a game application, referring to FIG. 5, the terminal can control the move icon 503 to move in the direction of the arrow according to the movement trajectory of the operation object 502.
  • Step 601: Acquire the click position of the mouse when a click operation of the mouse on the operation interface is detected.
  • Step 602: Determine the moving direction and the moving distance according to the click position of the mouse and the center point position of the operation object, where the moving direction is the direction in which the user wants the operation object to move, and the moving distance is the distance between the click position of the mouse and the center point position of the operation object.
  • Step 603: Determine the movement duration according to the moving distance.
  • Step 604: Drag the movement wheel according to the moving direction and the movement duration to control the operation object to move to the click position.
  • In addition, after the specified application is started, the terminal detects whether the specified application supports the operation control method provided in the present application (that is, whether the new operation mode is supported). If the specified application supports the operation control method provided by the embodiment of the present invention, the user is guided to turn on the corresponding control function; if the specified application does not support it, or the user does not enable the corresponding function, the operation object in the specified application is controlled by the conventional method.
  • The movement control and skill release control of the operation object using the method provided by the embodiment of the present invention are described below, taking FIG. 7 and FIG. 8 as an example, and include the following steps:
  • Step 801: After the game application is started, the terminal detects whether the game application supports the new operation mode. If the game application supports the new operation mode, step 802 is performed; if it does not, step 803 is performed.
  • Step 802: Guide the user to turn on the new function. If the user turns it on, step 804 is performed; if not, step 803 is performed.
  • Step 803 Perform operation control using a conventional scheme.
  • Step 804: Play the game using the new technical solution, including controlling movement with the mouse and releasing skills with the keyboard and the mouse. Specifically, when controlling the movement of the operation object in the game, the terminal acquires the click position of the mouse and controls the operation object to move to that click position.
  • Referring to FIG. 7, the click position 702 of the mouse is displayed, and the move icon 703 is controlled to move according to the movement trajectory of the operation object 701.
  • When controlling skill release in the game, the terminal acquires the display position of the mouse and the skill corresponding to the triggered key on the keyboard based on the mouse and the keyboard, thereby controlling the operation object to release the skill when the key release message is received.
  • Referring to FIG. 7, the skill wheel 704 is controlled to move according to the position 702 of the mouse.
  • In summary, the method provided by the embodiment of the present invention determines the target object attribute according to the valid button information of the triggered button and, when the release operation of the designated button is acquired, controls the operation object to release the target object attribute within the designated area determined by the first display position and the second display position.
  • This process requires neither clicking the center point of the target object icon with the mouse nor dragging the target object icon, which not only reduces operation complexity but also improves release precision.
  • the present application controls the movement of the operation object by the click position of the mouse and the position of the operation object, thereby expanding the movement direction of the operation object.
  • an embodiment of the present invention provides an operation control apparatus, where the apparatus includes:
  • the display module 901 is configured to display an operation interface of the specified application, on which the first display position of the operation object and the second display position of the paired mouse are displayed, the operation object being an object managed by the user account logged into the specified application;
  • the obtaining module 902 is configured to acquire a button instruction, where the button instruction includes valid button information, when acquiring a triggering operation on the designated button on the paired connection keyboard;
  • the obtaining module 902 is further configured to acquire the target object attribute from the at least one object attribute of the operation object according to the valid key information;
  • the obtaining module 902 is further configured to acquire a release instruction for the target object attribute when the release operation of the specified key is acquired;
  • a determining module 903 configured to determine a designated area according to the first display position and the second display position when the release instruction is acquired;
  • control module 904 configured to control the operation object to release the target object attribute in the specified area
  • the display module 905 is configured to display a releaser of the target object attribute in the specified area when the release operation of the operation object is acquired.
  • In one embodiment, the release range of the target object attribute and the movement range of the target object icon are also displayed on the operation interface, and the target object icon is an icon that carries the target object attribute;
  • a determining module 903 configured to determine, according to the second display position, the release range, and the movement range, the first mapping position of the second display position within the movement range when the release instruction is acquired;
  • a control module 904 configured to: when the control operation object releases the target object attribute in the specified area, control the target object icon to move to the first mapping position;
  • the display module 905 is configured to display, when the movement operation of the target object icon is acquired, a movement trajectory of the target object icon to the first mapping position on the operation interface.
  • the determining module 903 is configured to determine a straight line according to the central point position of the second display position and the release range when the second display position is outside the release range; calculate the straight line and the release range Intersection point, the intersection point is taken as the first mapping position; or,
  • the determining module 903 is configured to: when the second display position is within the release range, scale the second display position according to the release range to obtain the scaled position; and determine the first mapping position according to the scaled position and the moving range.
  • the obtaining module 902 is configured to acquire a third display position of the mouse when the moving operation of the mouse is acquired;
  • the adjusting module is configured to adjust the designated area according to the first display position and the third display position when the release command is obtained, to obtain the adjusted area;
  • the control module 904 is configured to control the operation object to release the target object attribute in the adjusted area
  • the display module 905 is configured to display a releaser of the target object attribute in the adjusted area when the release operation of the operation object is acquired.
  • the obtaining module 902 is configured to acquire a third display position of the mouse when the moving operation of the mouse is acquired;
  • a determining module 903 configured to determine, according to the third display position, the release range, and the movement range, a second mapping position of the third display position within the movement range when the release instruction is acquired;
  • control module 904 configured to: when the control operation object releases the target object attribute in the specified area, control the target object icon to move to the second mapping position;
  • the display module 905 is configured to display, on the operation interface, a movement trajectory of the target object icon to the second mapping position when the movement operation of the target object icon is acquired.
  • the obtaining module 902 is configured to acquire a click instruction when the click operation of the mouse is acquired, where the click instruction includes a click position;
  • the display module 905 is configured to display a click position on the operation interface when the click instruction is obtained;
  • a determining module 903 configured to determine a moving direction and a moving distance of the operating object according to the click position and the first display position;
  • a calculation module configured to calculate a movement duration of the operation object according to the movement distance
  • the control module 904 is configured to control the operation object to move to the click position according to the moving direction and the movement duration;
  • the display module 905 is configured to display a movement track of the operation object on the operation interface when the movement operation of the operation object is acquired.
  • control module 904 is configured to control, when the control operation object moves to the click position, the movement icon to move according to the movement trajectory of the operation object, where the movement icon is an icon for controlling the operation object to move;
  • the display module 905 is configured to display a movement track of the mobile icon on the operation interface when the moving operation of the mobile icon is acquired.
  • In summary, the apparatus determines the target object attribute according to the valid button information of the triggered button and, when the release operation of the designated button is acquired, controls the operation object to release the target object attribute within the designated area determined by the first display position and the second display position.
  • This process requires neither clicking the center point of the target object icon with the mouse nor dragging the target object icon, which not only reduces operation complexity but also improves release precision.
  • the present application controls the movement of the operation object by the click position of the mouse and the position of the operation object, thereby expanding the movement direction of the operation object.
  • FIG. 10 is a schematic structural diagram of an operation control terminal according to an embodiment of the present invention.
  • the terminal may be used to implement the operation control method provided in the foregoing embodiment. Specifically:
  • The terminal 1000 may include an RF (Radio Frequency) circuit 110, a memory 120 including one or more computer readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a WiFi (Wireless Fidelity) module 170, a processor 180 having one or more processing cores, a power supply 190, and the like.
  • It will be understood by those skilled in the art that the terminal structure shown in FIG. 10 does not constitute a limitation on the terminal, which may include more or fewer components than illustrated, combine some components, or use a different arrangement of components. Among them:
  • The RF circuit 110 can be used for receiving and transmitting signals during the sending and receiving of information or during a call. Specifically, after receiving downlink information from a base station, the RF circuit hands it to the one or more processors 180 for processing, and it also sends uplink data to the base station.
  • In general, the RF circuit 110 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like.
  • RF circuitry 110 can also communicate with the network and other devices via wireless communication.
  • The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
  • The memory 120 may be used to store software programs and modules; the processor 180 executes various functional applications and performs data processing by running the software programs and modules stored in the memory 120.
  • The memory 120 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required by at least one function (such as a sound playback function or an image playback function), and the like, and the data storage area may store data created according to the use of the terminal 1000 (such as audio data and a phone book) and the like.
  • The memory 120 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 120 may also include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
  • The input unit 130 may be configured to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
  • The input unit 130 may include a touch-sensitive surface 131 as well as other input devices 132.
  • The touch-sensitive surface 131, also referred to as a touch display screen or touchpad, can collect touch operations performed by the user on or near it (such as operations performed on or near the touch-sensitive surface 131 by the user with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program.
  • Optionally, the touch-sensitive surface 131 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the touch orientation of the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 180.
  • The touch controller can also receive commands sent by the processor 180 and execute them.
  • The touch-sensitive surface 131 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave.
  • In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132.
  • The other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
  • The display unit 140 may be used to display information input by the user or provided to the user, as well as the various graphical user interfaces of the terminal 1000, which may be composed of graphics, text, icons, video, and any combination thereof.
  • The display unit 140 may include a display panel 141.
  • Optionally, the display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
  • Further, the touch-sensitive surface 131 may cover the display panel 141; when the touch-sensitive surface 131 detects a touch operation on or near it, the operation is transmitted to the processor 180 to determine the type of the touch event, and the processor 180 then provides a corresponding visual output on the display panel 141 according to the type of the touch event.
  • Although the touch-sensitive surface 131 and the display panel 141 are implemented as two separate components to realize the input and output functions, in some embodiments the touch-sensitive surface 131 may be integrated with the display panel 141 to realize the input and output functions.
  • The terminal 1000 may also include at least one type of sensor 150, such as a light sensor, a motion sensor, and other sensors.
  • The light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the terminal 1000 is moved close to the ear.
  • As one type of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (generally three axes) and can detect the magnitude and direction of gravity when stationary.
  • The terminal 1000 may further be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described here again.
  • The audio circuit 160, a speaker 161, and a microphone 162 may provide an audio interface between the user and the terminal 1000.
  • The audio circuit 160 may transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data; after the audio data is processed by the processor 180, it is sent, for example, to another terminal via the RF circuit 110, or output to the memory 120 for further processing.
  • The audio circuit 160 may also include an earphone jack to allow communication between a peripheral earphone and the terminal 1000.
  • WiFi is a short-range wireless transmission technology.
  • Through the WiFi module 170, the terminal 1000 can help the user send and receive e-mail, browse web pages, access streaming media, and the like; it provides the user with wireless broadband Internet access.
  • Although FIG. 10 shows the WiFi module 170, it can be understood that the module is not an essential part of the terminal 1000 and may be omitted as needed without changing the essence of the invention.
  • The processor 180 is the control center of the terminal 1000; it connects all parts of the entire device using various interfaces and lines, and performs the various functions of the terminal 1000 and processes data by running or executing the software programs and/or modules stored in the memory 120 and invoking the data stored in the memory 120, thereby monitoring the device as a whole.
  • Optionally, the processor 180 may include one or more processing cores; optionally, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and the like, and the modem processor mainly handles wireless communication.
  • It can be understood that the modem processor may alternatively not be integrated into the processor 180.
  • The terminal 1000 further includes a power supply 190 (such as a battery) that supplies power to the various components.
  • Optionally, the power supply may be logically connected to the processor 180 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
  • The power supply 190 may further include any component such as one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
  • Although not shown, the terminal 1000 may further include a camera, a Bluetooth module, and the like, which are not described here again.
  • Specifically, in this embodiment, the display unit of the terminal 1000 is a touch screen display.
  • The terminal 1000 further includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors.
  • The one or more programs include instructions for performing the operation control method:
  • The terminal provided by this embodiment determines the target object attribute according to the valid key message of the triggered key and, when the release operation on the designated key is acquired, controls the operation object to release the target object attribute within the designated region determined by the first display position and the second display position.
  • This process requires neither a mouse click on the center point of the target object icon nor dragging of the target object icon, which both reduces operation complexity and improves release precision.
  • In addition, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the directions in which the operation object can move.
  • The embodiment of the present invention further provides a computer-readable storage medium, which may be the computer-readable storage medium included in the memory of the above embodiment, or may exist separately without being assembled into the terminal.
  • The computer-readable storage medium stores one or more programs that are used by one or more processors to perform the operation control method.
  • With the computer-readable storage medium provided by this embodiment, the target object attribute is determined according to the valid key message of the triggered key, and when the release operation on the designated key is acquired, the operation object is controlled to release the target object attribute within the designated region determined by the first display position and the second display position.
  • This process requires neither a mouse click on the center point of the target object icon nor dragging of the target object icon, which both reduces operation complexity and improves release precision.
  • In addition, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the directions in which the operation object can move.
  • An embodiment of the present invention further provides a graphical user interface, which is used on a terminal that performs the operation control method; the terminal includes a touch screen display, a memory, and one or more processors for executing one or more programs.
  • With the graphical user interface, the target object attribute is determined according to the valid key message of the triggered key, and when the release operation on the designated key is acquired, the operation object is controlled to release the target object attribute within the designated region determined by the first display position and the second display position.
  • This process requires neither a mouse click on the center point of the target object icon nor dragging of the target object icon, which both reduces operation complexity and improves release precision.
  • In addition, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the directions in which the operation object can move.
  • It should be noted that when the operation control device provided by the above embodiment performs operation control, the division into the functional modules described above is used only as an example; in practical applications, the functions may be allocated to different functional modules as needed.
  • That is, the internal structure of the operation control device may be divided into different functional modules to complete all or part of the functions described above.
  • In addition, the operation control device provided by the above embodiment and the operation control method embodiments belong to the same concept; the specific implementation process is described in detail in the method embodiments and is not repeated here.
  • A person skilled in the art may understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium.
  • The storage medium mentioned may be a read-only memory, a magnetic disk, an optical disc, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present invention discloses an operation control method and device. The method includes: when a trigger operation on a designated key of a paired keyboard is acquired, acquiring valid key information; acquiring a target object attribute according to the valid key information; when a release operation on the designated key is acquired, determining a first designated region according to a first display position and a second display position; and controlling an operation object to release the target object attribute within the first designated region.

Description

Operation control method and device, and storage medium
This application claims priority to Chinese Patent Application No. 201710193663.3, entitled "Operation control method and device" and filed with the Chinese Patent Office on March 28, 2017, which is incorporated herein by reference in its entirety.
Technical Field
The embodiments of the present invention relate to the field of terminal technologies, and in particular to an operation control method, device, and storage medium.
Background
In modern life, many users like to play games on terminals to relieve work pressure. Since the operation control mode of a game application directly affects the user's gaming experience, how to perform operation control on game applications has become a question of considerable interest to those skilled in the art.
Summary
An embodiment of the present invention provides an operation control method, the method including:
displaying an operation interface of a designated application, the operation interface displaying a first display position of an operation object and a second display position of a paired mouse;
when a trigger operation on a designated key of a paired keyboard is acquired, acquiring a key instruction, the key instruction including valid key information;
acquiring a target object attribute from the object attributes of the operation object according to the valid key information;
when a release operation on the designated key is acquired, acquiring a release instruction for the target object attribute;
when the release instruction is acquired, determining a first designated region according to the first display position and the second display position;
controlling the operation object to release the target object attribute within the first designated region.
An embodiment of the present invention further provides an operation control device, the device including a memory and a processor, where the memory stores computer-readable instructions and the processor executes the computer-readable instructions in the memory to:
display an operation interface of a designated application, the operation interface displaying a first display position of an operation object and a second display position of a paired mouse;
when a trigger operation on a designated key of a paired keyboard is acquired, acquire a key instruction, the key instruction including valid key information;
acquire a target object attribute from at least one object attribute of the operation object according to the valid key information;
when a release operation on the designated key is acquired, acquire a release instruction for the target object attribute;
when the release instruction is acquired, determine a first designated region according to the first display position and the second display position;
control the operation object to release the target object attribute within the first designated region.
An embodiment of the present application further provides a non-volatile computer-readable storage medium storing computer-readable instructions that can cause at least one processor to perform the method described above.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings described below are only some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a schematic diagram of an implementation environment involved in an operation control method according to an embodiment of the present application;
FIG. 2 is a flowchart of an operation control method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an operation interface according to another embodiment of the present application;
FIG. 4 is a schematic diagram of an operation control process according to another embodiment of the present application;
FIG. 5 is a schematic diagram of an operation interface according to another embodiment of the present application;
FIG. 6 is a schematic diagram of an operation control process according to another embodiment of the present application;
FIG. 7 is a schematic diagram of an operation interface according to another embodiment of the present application;
FIG. 8 is a schematic diagram of an operation control process according to another embodiment of the present application;
FIG. 9 is a schematic structural diagram of an operation control device according to another embodiment of the present application;
FIG. 10 is a schematic structural diagram of an operation control terminal involved in an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the implementations of the present application are described in further detail below with reference to the accompanying drawings.
Before the detailed description, the concepts involved in the embodiments of the present invention are first explained as follows:
Operation object: the object that the terminal needs to control during the running of a designated application. The designated application may be a game application, and the operating system with which it is compatible may be the Android operating system, the iOS operating system, or the like. When the designated application is a game application, the operation object may be a virtual character, a virtual animal, or the like in the game.
Object attribute: an attribute that the operation object has in the designated application. When the designated application is a game application, the object attributes are the skills that the operation object has, including attack, healing, defense, and so on.
Attribute icon: an icon on the operation interface that carries an object attribute. When the designated application is a game application, the attribute icon may be a skill wheel in the game.
Movement icon: an icon on the operation interface that controls the movement of the operation object. When the designated application is a game application, the movement icon may be a movement wheel in the game.
Release range: the actual effective range of an object attribute. When the designated application is a game application, the release range is the skill range.
Movement range: the range within which an attribute icon can be dragged.
At present, for a designated application, control of the operation object mainly involves two aspects: controlling the movement of the operation object, and controlling the release of the operation object's object attributes. For the first aspect, the existing technology is mainly implemented based on the four keys A, W, D, and S on the keyboard and their combinations: key W controls the operation object to move up, key A to move left, key D to move right, key S to move down, keys A+W to move up-left, keys A+S to move down-left, keys D+W to move up-right, and keys D+S to move down-right. The existing operation control process is: receiving a key message sent by the keyboard, the key message including triggered-key information; determining the control direction for the operation object according to the triggered-key information; and controlling the operation object to move according to that control direction. However, limited by the keys on the keyboard and their possible combinations, the existing technology can control the movement of the operation object in only eight directions: up, down, left, right, up-left, down-left, up-right, and down-right, whereas in actual control operations the movement direction of the operation object may span 360 degrees. It can be seen that the movement directions controllable by the existing operation control method are rather limited. In contrast, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the movement directions of the operation object.
For the second aspect, the existing operation control process is: during the running of a game application, receiving a click instruction sent by the paired mouse, the click instruction including a click position; determining whether the click position is at the center of an attribute icon; if the click position is at the center of any attribute icon and a drag instruction sent by the mouse is received, performing a drag operation on the attribute icon according to the drag instruction; and, during the dragging of the attribute icon, if a display instruction sent by the mouse is received, controlling the operation object to display the object attribute corresponding to the attribute icon, taking its current position as the starting point and the direction parallel to the drag trajectory of the attribute icon as the display direction. Because the user must click the center of the attribute icon with the mouse and must drag the attribute icon, the operation process is rather complex. In contrast, the present application controls the release of the operation object's object attributes according to the display position of the mouse and the valid key information of the keyboard, which not only reduces operation complexity but also improves release precision.
Referring to FIG. 1, which shows the implementation environment involved in the operation control method provided by an embodiment of the present invention, the implementation environment includes a terminal 101, a keyboard 102, and a mouse 103.
The terminal 101 may be a device such as a laptop or desktop computer. An operating system simulator is installed in the terminal 101; the simulator can emulate the running environment of the designated application on the terminal 101, so that the terminal can run the designated application smoothly.
The keyboard 102 is a wired or wireless keyboard used to input key information to the terminal 101.
The mouse 103 is a wired or wireless mouse used to send control instructions to the terminal 101.
The terminal 101 is paired and connected with the keyboard 102 and the mouse 103, so that operation instructions can be acquired through the keyboard 102 and the mouse 103 during the running of the designated application. The connection between the terminal 101 and the keyboard 102 or the mouse 103 includes, but is not limited to, a wired connection or a wireless connection.
An embodiment of the present invention provides an operation control method applied to an operation control device. Referring to FIG. 2, the method flow provided by this embodiment includes:
201. During the running of the designated application, the terminal displays the operation interface of the designated application.
The operation interface displays the first display position of the operation object and the second display position of the paired mouse. The operation object is an object managed by the user account logged in to the designated application, and it has at least one object attribute, namely operations the operation object can perform, such as attack, defense, healing, and so on. For the mouse, the second display position is in fact the cursor position.
202. When a trigger operation on a designated key of the paired keyboard is acquired, the terminal acquires a key instruction.
In this embodiment of the present invention, when the terminal acquires a trigger operation on a designated key of the paired keyboard, it may acquire a key instruction including valid key information. The designated key may be defined by the user in advance. The valid key information includes the identifier of the triggered key, the duration for which the key was triggered, and so on.
203. The terminal acquires the target object attribute from the object attributes of the operation object according to the valid key information.
The ways in which the terminal acquires the target object attribute from the at least one object attribute of the operation object according to the valid key information include, but are not limited to, the following:
In one embodiment of the present application, the terminal may maintain an object database that stores the correspondence between key identifiers and object attributes, as well as the release range of each object attribute and so on. Based on this object database, the terminal may acquire the target object attribute from the at least one object attribute of the operation object. For example, if in the object database the object attribute corresponding to key identifier Q represents an attack function and the object attribute corresponding to key identifier W represents a healing function, then when the key identifier in the valid key information is Q, the acquired target object attribute represents the attack function.
In another embodiment of the present application, the terminal may set a different value for each key identifier in advance and a different value for each object attribute of the operation object. When valid key information is acquired, the terminal may obtain the value of the key identifier from the valid key information and compare it with the value of each object attribute of the operation object; when the value of the key identifier is the same as the value of any object attribute of the operation object, that object attribute is taken as the target object attribute.
In some examples, the object attribute is an encapsulated function: when the function is called and executed, the operation object performs the corresponding action. The target object attribute is the object attribute corresponding to the designated key, and acquiring the target object attribute includes: determining the object attribute corresponding to the designated key, taking it as the target object attribute, and granting the operation object permission to call the function corresponding to the target object attribute.
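To make the key-to-attribute correspondence concrete, the following is a minimal sketch in Python of the object-database lookup described above; the Skill class, the OBJECT_DATABASE table, the target_attribute function, and the key_id field are illustrative assumptions, not identifiers from this application:

```python
class Skill:
    """One object attribute: an encapsulated action plus its release range."""

    def __init__(self, name, release_range):
        self.name = name
        self.release_range = release_range  # e.g. (a, b) ellipse semi-axes

    def release(self, region):
        # Calling the encapsulated function makes the operation object act.
        print(f"operation object releases {self.name} within {region}")


# Object database: correspondence between key identifiers and object attributes.
OBJECT_DATABASE = {
    "Q": Skill("attack", release_range=(300.0, 180.0)),
    "W": Skill("heal", release_range=(250.0, 150.0)),
}


def target_attribute(valid_key_info):
    """Resolve the triggered key's identifier to the target object attribute."""
    return OBJECT_DATABASE.get(valid_key_info["key_id"])
```

With this mapping, a valid-key message such as {"key_id": "Q"} resolves to the attack skill, matching the Q/W example in the text.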
204. When the release operation on the designated key is acquired, the terminal acquires a release instruction for the target object attribute.
When the release operation on the designated key is acquired, triggered by the release operation, the terminal acquires a release instruction for the target object attribute; the release instruction is used to control the operation object to release the target object attribute within the designated region.
205. When the release instruction is acquired, the terminal determines the designated region according to the first display position and the second display position.
When determining the designated region according to the first display position and the second display position, the terminal may take as the designated region a bar-shaped region starting at the first display position and ending at the second display position, or a sector-shaped region starting at the first display position and containing both the first display position and the second display position. Of course, the terminal may determine the designated region in other ways, which is not specifically limited in this embodiment of the present invention.
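As an illustration of one option described in step 205, the following sketch builds a sector-shaped region anchored at the first display position and opening toward the second display position; the function name, the default half-angle, and the tuple representation are assumptions for illustration only:

```python
import math


def designated_sector(first_pos, second_pos, half_angle_deg=15.0):
    """Return (apex, central direction, half-angle, radius) for a sector that
    starts at first_pos and contains the segment from first_pos to second_pos."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    direction = math.atan2(dy, dx)  # central direction toward the mouse
    radius = math.hypot(dx, dy)     # just long enough to contain second_pos
    return first_pos, direction, math.radians(half_angle_deg), radius
```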
206. The terminal controls the operation object to release the target object attribute within the designated region.
Since the processor has a certain response latency, to enable the terminal to recognize the operation to be performed, the terminal may wait for a first preset duration after the release instruction is acquired, and control the operation object to release the target object attribute within the designated region once the time since the release instruction was acquired reaches the first preset duration. The first preset duration is determined by the responsiveness of the processor and may be 50 milliseconds, 60 milliseconds, and so on. Here, the release instruction refers to the user's operation of letting go of the designated key; releasing the target object attribute means calling, in response to the release instruction, the function corresponding to the target object attribute, so as to control the operation object to perform the action corresponding to the target object attribute.
In some embodiments, when the release operation of the operation object is acquired, the terminal may further display a release indicator of the target object attribute within the designated region.
To show the user intuitively whether the target object attribute has been released, the method provided by this embodiment of the present invention also displays a release indicator of the target object attribute within the designated region when the release operation of the operation object is acquired. The release indicator indicates whether the target object attribute has been released and has an indication direction, which is the same as the release direction of the target object attribute. For a given operation object, the display properties (including display shape, display color, and so on) of the release indicators of different object attributes on the operation interface may be the same or different.
In this embodiment of the present invention, each object attribute of the operation object has a release range; each object attribute can be released only within the corresponding release range and cannot be released outside it. The size of the release range is determined by the operation object, and the display ranges of different operation objects differ. The shape of the release range is usually circular; however, since the designated application is generally displayed in an oblique-angle 3D display mode, the shape of the release range during actual operation is an ellipse. FIG. 3 shows the operation interface of a game application; as can be seen from FIG. 3, the release range of object attribute W 302 of the operation object 301 is elliptical, namely the elliptical region 303 in FIG. 3.
To make it convenient for the user to control the operation object through input devices such as the mouse and keyboard, the terminal also displays, on the operation interface of the designated application, an object icon carrying each object attribute. Each object icon has a movement range, whose shape is generally circular, and the movement ranges of the object icons of the same operation object are the same. During actual operation, each object icon can be dragged only within its movement range and cannot be dragged outside it. As can be seen from FIG. 3, the object icons displayed on the operation interface include an icon 302 representing skill wheel W, an icon 304 representing skill wheel S, and an icon 305 representing skill wheel D. Taking the icon 302 for skill wheel W as an example, the movement range of skill wheel W 302 is the dashed circular region 306 in FIG. 3.
Given the roles of the release range and the movement range, in order to guide the user's operation during the running of the designated application, the method provided by this embodiment of the present invention also displays on the operation interface the release range of the target object attribute and the movement range of the target object icon. Based on these, when the release instruction is acquired, the method further determines, according to the second display position, the release range of the target object attribute, and the movement range of the target object icon, a first mapping position of the second display position within the movement range; when controlling the operation object to release the target object attribute within the designated region, it controls the target object icon to move toward the first mapping position, and when the movement operation of the target object icon is acquired, it displays on the operation interface the movement trajectory of the target object icon toward the first mapping position.
Depending on the positional relationship between the second display position and the release range of the target object attribute, when the terminal determines the first mapping position of the second display position within the movement range according to the second display position, the release range of the target object attribute, and the movement range of the target object icon, the cases include, but are not limited to, the following two:
Case 1: the second display position lies outside the release range of the target object attribute.
When the second display position of the mouse lies outside the release range of the target object attribute, the terminal cannot control the operation object to release the target object attribute. In this case, the terminal may determine a straight line from the second display position of the mouse and the center point of the release range of the target object attribute, compute the intersection of that line with the release range, and take the intersection as the first mapping position.
Suppose the second display position of the mouse has coordinates (x, y) and the release range is an ellipse with semi-major axis a and semi-minor axis b, that is, the ellipse equation is x²/a² + y²/b² = 1 and the center of the ellipse is at (0, 0). The straight line determined by the mouse position (x, y) and the center (0, 0) is y = kx; solving the line equation y = kx together with the ellipse equation x²/a² + y²/b² = 1 yields the coordinates of the intersection of the line with the ellipse, which are then taken as the first mapping position.
The above process can also be implemented in code; in the original publication the listing is reproduced only as an image (Figure PCTCN2018080205-appb-000001).
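Since the listing survives only as an image, the following is a reconstruction sketched from the surrounding description rather than the application's actual code; the function name and the convention that the ellipse is centered at the origin are assumptions:

```python
import math


def first_mapping_position_outside(x, y, a, b):
    """Map a mouse position (x, y) lying outside the elliptical release range
    x^2/a^2 + y^2/b^2 = 1 onto the ellipse boundary, along the line y = kx
    through the center (0, 0) and the mouse position."""
    if x == 0.0:
        # Vertical line: the intersections are (0, +b) and (0, -b);
        # keep the one on the mouse's side.
        return (0.0, math.copysign(b, y))
    k = y / x  # slope of the line through the center and the mouse position
    # Substituting y = kx into the ellipse equation gives
    # x^2 * (b^2 + a^2 * k^2) = a^2 * b^2.
    ix = (a * b) / math.sqrt(b * b + (a * k) ** 2)
    ix = math.copysign(ix, x)  # take the intersection on the mouse's side
    return (ix, k * ix)
```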
Case 2: the second display position lies within the release range of the target object attribute.
When the second display position of the mouse lies within the release range of the target object attribute, since different target object attributes have different release ranges, the terminal may scale the display position of the mouse according to the release range of the target object attribute to obtain a scaled position, and then determine the first mapping position according to the scaled position and the movement range of the target object icon.
Suppose the second display position of the mouse has coordinates (x, y), the release range is an ellipse with semi-major axis a and semi-minor axis b (ellipse equation x²/a² + y²/b² = 1), and the movement range of the target object icon is a circle of radius r. The terminal may scale the horizontal coordinate x of the mouse by the semi-major axis to obtain x' = x/a and scale the vertical coordinate y by the semi-minor axis to obtain y' = y/b, so the scaled position is (x/a, y/b). Multiplying the scaled coordinates by the radius r of the movement range of the target object icon gives the first mapping position (rx/a, ry/b).
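A one-line sketch of this second case, under the same assumed coordinate conventions (ellipse and movement circle both centered at the origin):

```python
def first_mapping_position_inside(x, y, a, b, r):
    """Scale a mouse position (x, y) lying inside the elliptical release range
    (semi-axes a, b) into the circular movement range of radius r, giving
    (r*x/a, r*y/b) as derived above."""
    return (r * x / a, r * y / b)
```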
By mapping the position of the mouse on the operation interface into the movement range, the method provided by this embodiment of the present invention ensures that when the user controls the operation object with the mouse, the direction the mouse cursor points in is the actual display direction of the target object attribute.
During actual operation, to release the target object attribute precisely onto the region the user intends, the user may continually adjust the second display position of the mouse. Therefore, after acquiring the second display position of the mouse on the operation interface and before controlling the operation object to release the target object attribute, the terminal also acquires mouse movement operations in real time. If no mouse movement operation is acquired, then when the release operation of the operation object is acquired, the terminal may control the operation object to release the target object attribute within the designated region and display the release indicator of the target object attribute within the designated region. If a mouse movement operation is acquired, the terminal acquires the third display position of the mouse; when the release operation is acquired, it adjusts the designated region according to the first display position and the third display position to obtain an adjusted region, controls the operation object to release the target object attribute within the adjusted region, and, when the release operation of the operation object is acquired, displays the release indicator of the target object attribute within the adjusted region.
In another embodiment of the present application, when a mouse movement operation is acquired, the terminal acquires the third display position of the mouse and, when the release instruction for the target object attribute is acquired, determines, according to the third display position, the release range, and the movement range, a second mapping position of the third display position within the movement range. Then, when controlling the operation object to release the target object attribute within the designated region, it controls the target object icon to move toward the second mapping position and, when the movement operation of the target object icon is acquired, displays on the operation interface the movement trajectory of the target object icon toward the second mapping position. While the display position of the mouse is being updated, to enable the terminal's processor to recognize the movement operation of the target object icon, the terminal may wait for a second preset duration after controlling the target object icon to move toward the second mapping position, and perform the update of the second mapping position again when the second preset duration is reached. The second preset duration is determined by the responsiveness of the processor and may be 10 milliseconds, 20 milliseconds, and so on.
It should be noted that, for game applications, the above process of controlling the operation object to release the target object attribute within the designated region is in fact the skill release process.
For ease of understanding, the above process of controlling the operation object to release an object attribute is described below taking FIG. 3 as an example.
Referring to FIG. 4, suppose the designated application is a game application and the skill to be released is the one corresponding to the Q key. Step 400: determine whether the Q key is pressed. Step 402: when the trigger operation on the Q key of the paired keyboard is acquired, the terminal acquires a key instruction that includes Q-key information and determines the skill corresponding to the Q-key information. Step 404: the terminal acquires the position of the mouse. Step 406: determine the mapping position according to the display position of the mouse, the skill range of the skill corresponding to the Q key, and the movement range of the skill wheel corresponding to the Q key. Step 408: detect whether the position of the mouse has changed; if the display position of the mouse has changed, update the mapping position according to the updated display position. Step 410: determine whether the Q key has been released; if it has, perform step 412; if not, return to step 404. Step 412: when the release operation on the Q key is acquired, the terminal controls the operation object to release the skill corresponding to the Q key. After step 400, that is, after the trigger operation on the Q key of the paired keyboard is acquired, step 401 may also be performed: the terminal simulates, on the operation interface, a finger pressing the skill wheel corresponding to the Q key, and controls the skill wheel to move according to the mapping position. Step 403: while controlling the movement of the skill wheel, determine whether the skill wheel has reached the mapping position; if it has, perform step 405; if not, perform step 407. Step 405: wait for the mapping position to be updated. Step 407: simulate the finger moving a small distance toward the mapping position to continue controlling the movement of the skill wheel. Step 409: after waiting 10 milliseconds, return to step 403 and repeat the above determination until the release operation on the Q key is acquired, then perform step 411: simulate the finger lifting. After the release operation on the Q key is acquired, the terminal stops controlling the movement of the skill wheel, so that the skill wheel returns to its initial position.
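The simulated-finger loop of steps 401 to 411 can be sketched as follows; the wheel object, its methods, and the two callbacks are assumptions used only for illustration, and the 10 ms wait corresponds to the second preset duration mentioned above:

```python
import time


def drive_skill_wheel(key_is_down, mapping_position, wheel, step=5.0, tick=0.010):
    """While the designated key is held, keep nudging the skill wheel toward the
    mapping position of the current mouse position, re-checking every 10 ms;
    lift the simulated finger once the key is released."""
    wheel.press()                             # step 401: finger down on the wheel
    while key_is_down():                      # step 410: has the key been released?
        target = mapping_position()           # steps 404-408: (re)compute mapping
        if wheel.position == target:
            time.sleep(tick)                  # step 405: wait for the mapping to update
        else:
            wheel.move_toward(target, step)   # step 407: move a small distance
            time.sleep(tick)                  # step 409: wait 10 ms, then re-check
    wheel.release()                           # steps 411-412: finger up, skill fires
```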
So far, steps 201 to 206 above complete the process of releasing the target object attribute of the operation object. For the process of moving the operation object, see steps a to d below:
a. During the running of the designated application, when a click operation of the mouse is acquired, the terminal acquires a click instruction.
The click instruction includes the click position.
After acquiring the click instruction, the terminal displays the click position of the mouse on the operation interface in a preset manner, so that the click position can be shown to the user intuitively. The preset manner may be a ripple-response manner or the like. Referring to FIG. 5, position 501 (marked 1 in FIG. 5) is the click position of the mouse.
b. The terminal determines the movement direction and movement distance of the operation object according to the click position and the first display position.
The terminal constructs a ray starting at the first display position of the operation object and passing through the click position, and takes the direction of the ray as the movement direction of the operation object. Since the direction of this ray can be arbitrary, the method provided by this embodiment of the present invention extends the movement directions of the operation object and can control the operation object to move anywhere within a 360-degree range.
The terminal obtains the coordinates of the click position and of the first display position of the operation object, and computes the movement distance of the operation object using the two-point distance formula. With the click position at (x₁, y₁) and the first display position of the operation object at (x₂, y₂), the movement distance of the operation object is √((x₁ − x₂)² + (y₁ − y₂)²).
c. The terminal calculates the movement duration of the operation object from the movement distance.
In this embodiment of the present invention, the movement speed of the operation object in the designated application is set to a fixed value, so the terminal can calculate the movement duration of the operation object from the movement distance and the movement speed of the operation object.
d. The terminal controls the operation object to move toward the click position according to the movement direction and movement duration of the operation object.
The terminal controls the operation object to move toward the click position according to the movement direction of the operation object. While controlling the movement of the operation object, the terminal uses the movement duration to judge whether the operation object has reached the click position: if the time the operation object has been moving toward the click position is less than the movement duration, the operation object has not yet reached the click position and is controlled to keep moving toward it; if that time equals the movement duration, the operation object has reached the click position and is controlled to stop moving.
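Steps a to d can be summarized in the following sketch; the object with a pos attribute, the fixed speed value, and the frame interval are assumptions used only for illustration:

```python
import math
import time


def move_to_click(obj, click_pos, speed=120.0, tick=0.016):
    """Move obj toward click_pos: the direction comes from the ray through both
    points, the distance from the two-point distance formula, and the duration
    from distance / speed; movement stops once the duration has elapsed."""
    dx = click_pos[0] - obj.pos[0]
    dy = click_pos[1] - obj.pos[1]
    distance = math.hypot(dx, dy)   # step b: movement distance
    angle = math.atan2(dy, dx)      # step b: movement direction (any of 360 degrees)
    duration = distance / speed     # step c: movement duration at a fixed speed
    elapsed = 0.0
    while elapsed < duration:       # step d: stop when the duration is reached
        obj.pos = (obj.pos[0] + speed * tick * math.cos(angle),
                   obj.pos[1] + speed * tick * math.sin(angle))
        time.sleep(tick)
        elapsed += tick
```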
It should be noted that the click-position coordinates and the first-display-position coordinates obtained in the above steps are coordinates on the current operation interface, whereas to control the operation object to move toward the click position the terminal in the present application would need the coordinates of the click position and of the operation object within the scene map of the designated application, and those scene-map coordinates are generally difficult to obtain. Therefore, to control the movement of the operation object, this embodiment of the present invention may control the movement according to the movement duration.
e. When the movement operation of the operation object is acquired, the terminal displays the movement trajectory of the operation object on the operation interface.
In another embodiment of the present application, while controlling the operation object to move toward the click position, the terminal also controls the movement icon to move along the movement trajectory of the operation object and, when the movement operation of the movement icon is acquired, displays the movement trajectory of the movement icon on the operation interface. The movement icon is the icon that controls the movement of the operation object. Referring to FIG. 5, where the designated application is a game application, while controlling the movement of the operation object 502 the terminal may control the movement icon 503 to move in the direction of the arrow according to the movement trajectory of the operation object 502.
For ease of understanding, the above process of controlling the movement of the operation object is described below taking FIG. 6 as an example.
Referring to FIG. 6, during the running of a game application, the following steps are performed:
Step 601: when a mouse click on any position of the operation interface is detected, acquire the click position of the mouse.
Step 602: determine the movement direction and movement distance according to the click position of the mouse and the position of the center point of the operation object, where the movement direction is the direction in which the user wants the operation object to move, and the movement distance is the distance between the click position of the mouse and the position of the center point of the operation object.
Step 603: determine the movement duration from the movement distance, and then proceed according to the movement direction and the movement duration.
Step 604: drag the movement wheel to control the operation object to move toward the click position.
Among the game applications installed in a terminal, some involve movement and release operations while others do not. Therefore, to save resources, after the designated application is started the terminal also detects whether the designated application supports the operation control method provided in the present application (that is, whether it supports the new operation mode). If the designated application supports the operation control method provided by this embodiment of the present invention, the user is guided to enable the corresponding control function; if the designated application does not support it, or the user has not enabled the corresponding function, the operation object in the designated application is controlled in the traditional manner.
The use of the method provided by this embodiment of the present invention for movement control and skill-release control of the operation object is described below taking FIG. 7 and FIG. 8 as examples, including the following steps:
Referring to FIG. 8, step 801: after the game application starts, the terminal detects whether the game application supports the new operation mode; if it does, perform step 802; if it does not, or the user has not enabled the new function, perform step 803. Step 802: guide the user to enable the new function; if the user enables it, perform step 804; if not, perform step 803. Step 803: use the traditional scheme for operation control. Step 804: play the game using the new scheme, including moving with the mouse and releasing skills with the keyboard and mouse. Specifically, when controlling the movement of the operation object in the game, the terminal may obtain the click position of the mouse and thereby control the operation object to move to the click position.
Referring to FIG. 7, while controlling the movement of the operation object 701, the click position 702 of the mouse is displayed, and the movement icon 703 is controlled to move according to the movement trajectory of the operation object 701. When controlling the skill release of the operation object in the game, the terminal may, based on the mouse and keyboard, obtain the display position of the mouse and the skill corresponding to the triggered key on the keyboard, and thereby control the operation object to release the skill when the key release message is received. Referring to FIG. 7, when controlling the operation object 701 to release a skill, the skill wheel 704 is controlled to move according to the mouse position 702.
With the method provided by this embodiment of the present invention, the target object attribute is determined according to the valid key message of the triggered key, and when the release operation on the designated key is acquired, the operation object is controlled to release the target object attribute within the designated region determined by the first display position and the second display position. This process requires neither a mouse click on the center point of the target object icon nor dragging of the target object icon, which both reduces operation complexity and improves release precision. In addition, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the movement directions of the operation object.
Referring to FIG. 9, an embodiment of the present invention provides an operation control device, the device including:
a display module 901, configured to display the operation interface of the designated application, the operation interface displaying the first display position of the operation object and the second display position of the paired mouse, the operation object being an object managed by the user account logged in to the designated application;
an acquisition module 902, configured to acquire a key instruction when the trigger operation on the designated key of the paired keyboard is acquired, the key instruction including valid key information;
the acquisition module 902 being further configured to acquire the target object attribute from the at least one object attribute of the operation object according to the valid key information;
the acquisition module 902 being further configured to acquire the release instruction for the target object attribute when the release operation on the designated key is acquired;
a determination module 903, configured to determine the designated region according to the first display position and the second display position when the release instruction is acquired;
a control module 904, configured to control the operation object to release the target object attribute within the designated region;
a display module 905, configured to display the release indicator of the target object attribute within the designated region when the release operation of the operation object is acquired.
In another embodiment of the present application, the operation interface further displays the release range of the target object attribute and the movement range of the target object icon, the target object icon being the icon carrying the target object attribute;
the determination module 903 is configured to determine, when the release instruction is acquired, the first mapping position of the second display position within the movement range according to the second display position, the release range, and the movement range;
the control module 904 is configured to control the target object icon to move toward the first mapping position while the operation object is controlled to release the target object attribute within the designated region;
the display module 905 is configured to display, on the operation interface, the movement trajectory of the target object icon toward the first mapping position when the movement operation of the target object icon is acquired.
In another embodiment of the present application, the determination module 903 is configured to: when the second display position lies outside the release range, determine a straight line according to the second display position and the center point position of the release range, compute the intersection of the straight line with the release range, and take the intersection as the first mapping position; or
the determination module 903 is configured to: when the second display position lies within the release range, scale the second display position according to the release range to obtain a scaled position, and determine the first mapping position according to the scaled position and the movement range.
In another embodiment of the present application, the acquisition module 902 is configured to acquire the third display position of the mouse when the movement operation of the mouse is acquired;
an adjustment module is configured to adjust the designated region according to the first display position and the third display position when the release instruction is acquired, to obtain an adjusted region;
the control module 904 is configured to control the operation object to release the target object attribute within the adjusted region;
the display module 905 is configured to display the release indicator of the target object attribute within the adjusted region when the release operation of the operation object is acquired.
In another embodiment of the present application, the acquisition module 902 is configured to acquire the third display position of the mouse when the movement operation of the mouse is acquired;
the determination module 903 is configured to determine, when the release instruction is acquired, the second mapping position of the third display position within the movement range according to the third display position, the release range, and the movement range;
the control module 904 is configured to control the target object icon to move toward the second mapping position while the operation object is controlled to release the target object attribute within the designated region;
the display module 905 is configured to display, on the operation interface, the movement trajectory of the target object icon toward the second mapping position when the movement operation of the target object icon is acquired.
In another embodiment of the present application, the acquisition module 902 is configured to acquire a click instruction when the click operation of the mouse is acquired, the click instruction including a click position;
the display module 905 is configured to display the click position on the operation interface when the click instruction is acquired;
the determination module 903 is configured to determine the movement direction and movement distance of the operation object according to the click position and the first display position;
a calculation module is configured to calculate the movement duration of the operation object according to the movement distance;
the control module 904 is configured to control the operation object to move toward the click position according to the movement direction and the movement duration;
the display module 905 is configured to display the movement trajectory of the operation object on the operation interface when the movement operation of the operation object is acquired.
In another embodiment of the present application, the control module 904 is configured to control the movement icon to move along the movement trajectory of the operation object while the operation object is controlled to move toward the click position, the movement icon being the icon that controls the movement of the operation object;
the display module 905 is configured to display the movement trajectory of the movement icon on the operation interface when the movement operation of the movement icon is acquired.
In summary, the device provided by this embodiment of the present invention determines the target object attribute according to the valid key message of the triggered key and, when the release operation on the designated key is acquired, controls the operation object to release the target object attribute within the designated region determined by the first display position and the second display position. This process requires neither a mouse click on the center point of the target object icon nor dragging of the target object icon, which both reduces operation complexity and improves release precision. In addition, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the movement directions of the operation object.
Referring to FIG. 10, which shows a schematic structural diagram of the operation control terminal involved in an embodiment of the present invention, the terminal may be used to implement the operation control method provided in the above embodiments. Specifically:
The terminal 1000 may include an RF (Radio Frequency) circuit 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a WiFi (Wireless Fidelity) module 170, a processor 180 having one or more processing cores, a power supply 190, and other components. Those skilled in the art will understand that the terminal structure shown in FIG. 10 does not constitute a limitation on the terminal; the terminal may include more or fewer components than illustrated, combine certain components, or use a different arrangement of components. Among them:
The RF circuit 110 may be used to receive and send signals while sending and receiving information or during a call. In particular, after downlink information from a base station is received, it is handed to one or more processors 180 for processing; in addition, uplink data is sent to the base station. Generally, the RF circuit 110 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuit 110 may also communicate with networks and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and so on.
The memory 120 may be used to store software programs and modules; the processor 180 executes various functional applications and performs data processing by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required by at least one function (such as a sound playback function or an image playback function), and the like, and the data storage area may store data created according to the use of the terminal 1000 (such as audio data and a phone book) and the like. In addition, the memory 120 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 120 may also include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. Specifically, the input unit 130 may include a touch-sensitive surface 131 and other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or touchpad, can collect touch operations performed by the user on or near it (such as operations performed on or near the touch-sensitive surface 131 by the user with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program. Optionally, the touch-sensitive surface 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch orientation of the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 180, and can receive and execute commands sent by the processor 180. In addition, the touch-sensitive surface 131 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch-sensitive surface 131, the input unit 130 may include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by the user or provided to the user, as well as the various graphical user interfaces of the terminal 1000; these graphical user interfaces may be composed of graphics, text, icons, video, and any combination thereof. The display unit 140 may include a display panel 141. Optionally, the display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141; when the touch-sensitive surface 131 detects a touch operation on or near it, the operation is transmitted to the processor 180 to determine the type of the touch event, and the processor 180 then provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in FIG. 10 the touch-sensitive surface 131 and the display panel 141 are implemented as two separate components to realize the input and output functions, in some embodiments the touch-sensitive surface 131 and the display panel 141 may be integrated to realize the input and output functions.
The terminal 1000 may also include at least one type of sensor 150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the terminal 1000 is moved close to the ear. As one type of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used in applications that recognize the posture of the mobile phone (such as landscape/portrait switching, related games, and magnetometer posture calibration) and in vibration-recognition-related functions (such as a pedometer or tapping). As for the gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors that may also be configured on the terminal 1000, details are not repeated here.
The audio circuit 160, a speaker 161, and a microphone 162 may provide an audio interface between the user and the terminal 1000. The audio circuit 160 may transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data; after the audio data is processed by the processor 180, it is sent, for example, to another terminal via the RF circuit 110, or output to the memory 120 for further processing. The audio circuit 160 may also include an earphone jack to allow communication between a peripheral earphone and the terminal 1000.
WiFi is a short-range wireless transmission technology. Through the WiFi module 170, the terminal 1000 can help the user send and receive e-mail, browse web pages, access streaming media, and the like; it provides the user with wireless broadband Internet access. Although FIG. 10 shows the WiFi module 170, it can be understood that the module is not an essential part of the terminal 1000 and may be omitted as needed without changing the essence of the invention.
The processor 180 is the control center of the terminal 1000; it connects all parts of the entire device using various interfaces and lines, and performs the various functions of the terminal 1000 and processes data by running or executing the software programs and/or modules stored in the memory 120 and invoking the data stored in the memory 120, thereby monitoring the device as a whole. Optionally, the processor 180 may include one or more processing cores; optionally, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 180.
The terminal 1000 further includes a power supply 190 (such as a battery) that supplies power to the various components. Preferably, the power supply may be logically connected to the processor 180 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system. The power supply 190 may further include any component such as one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
Although not shown, the terminal 1000 may further include a camera, a Bluetooth module, and the like, which are not described here again. Specifically, in this embodiment, the display unit of the terminal 1000 is a touch screen display; the terminal 1000 further includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors, and the one or more programs contain instructions for performing the operation control method.
The terminal provided by this embodiment of the present invention determines the target object attribute according to the valid key message of the triggered key and, when the release operation on the designated key is acquired, controls the operation object to release the target object attribute within the designated region determined by the first display position and the second display position. This process requires neither a mouse click on the center point of the target object icon nor dragging of the target object icon, which both reduces operation complexity and improves release precision. In addition, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the movement directions of the operation object.
An embodiment of the present invention further provides a computer-readable storage medium, which may be the computer-readable storage medium included in the memory in the above embodiment, or may exist separately without being assembled into the terminal. The computer-readable storage medium stores one or more programs that are used by one or more processors to perform the operation control method.
With the computer-readable storage medium provided by this embodiment of the present invention, the target object attribute is determined according to the valid key message of the triggered key, and when the release operation on the designated key is acquired, the operation object is controlled to release the target object attribute within the designated region determined by the first display position and the second display position. This process requires neither a mouse click on the center point of the target object icon nor dragging of the target object icon, which both reduces operation complexity and improves release precision. In addition, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the movement directions of the operation object.
An embodiment of the present invention provides a graphical user interface, which is used on a terminal that performs the operation control method; the terminal includes a touch screen display, a memory, and one or more processors for executing one or more programs.
With the graphical user interface provided by this embodiment of the present invention, the target object attribute is determined according to the valid key message of the triggered key, and when the release operation on the designated key is acquired, the operation object is controlled to release the target object attribute within the designated region determined by the first display position and the second display position. This process requires neither a mouse click on the center point of the target object icon nor dragging of the target object icon, which both reduces operation complexity and improves release precision. In addition, the present application controls the movement of the operation object according to the click position of the mouse and the position of the operation object, thereby extending the movement directions of the operation object.
It should be noted that when the operation control device provided by the above embodiment performs operation control, the division into the functional modules described above is used only as an example; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the operation control device may be divided into different functional modules to complete all or part of the functions described above. In addition, the operation control device provided by the above embodiment and the operation control method embodiments belong to the same concept; the specific implementation process is described in detail in the method embodiments and is not repeated here.
A person of ordinary skill in the art may understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above descriptions are merely preferred embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (18)

  1. An operation control method, applied to an operation control device, the method comprising:
    displaying an operation interface of a designated application, the operation interface displaying a first display position of an operation object and a second display position of a paired mouse;
    when a trigger operation on a designated key of a paired keyboard is acquired, acquiring a key instruction, the key instruction comprising valid key information;
    acquiring a target object attribute from object attributes of the operation object according to the valid key information;
    when a release operation on the designated key is acquired, acquiring a release instruction for the target object attribute;
    determining a first designated region according to the first display position and the second display position;
    controlling the operation object to release the target object attribute within the first designated region.
  2. The method according to claim 1, wherein controlling the operation object to release the target object attribute within the first designated region further comprises:
    when a release operation of the operation object is acquired, displaying a release indicator of the target object attribute within the first designated region.
  3. The method according to claim 2, wherein the operation interface further displays a release range of the target object attribute and a movement range of a target object icon, the target object icon being an icon carrying the target object attribute;
    wherein the method further comprises:
    when the release instruction is acquired, determining, according to the second display position, the release range, and the movement range, a first mapping position of the second display position within the movement range;
    while controlling the operation object to release the target object attribute within the first designated region, controlling the target object icon to move toward the first mapping position;
    when a movement operation of the target object icon is acquired, displaying, on the operation interface, a movement trajectory of the target object icon toward the first mapping position.
  4. The method according to claim 3, wherein determining, according to the second display position, the release range, and the movement range, the first mapping position of the second display position within the movement range comprises:
    if the second display position lies outside the release range, determining a straight line according to the second display position and a center point position of the release range;
    computing an intersection of the straight line with the release range, and taking the intersection as the first mapping position.
  5. The method according to claim 3, wherein determining, according to the second display position, the release range, and the movement range, the first mapping position of the second display position within the movement range comprises:
    if the second display position lies within the release range, scaling the second display position according to the release range to obtain a scaled position;
    determining the first mapping position according to the scaled position and the movement range.
  6. The method according to claim 1, wherein the method further comprises:
    when a movement operation of the mouse is acquired, acquiring a third display position of the mouse;
    when the release instruction is acquired, adjusting the first designated region according to the first display position and the third display position to obtain a second designated region;
    controlling the operation object to release the target object attribute within the second designated region;
    when the release operation of the operation object is acquired, displaying the release indicator of the target object attribute within the adjusted region.
  7. The method according to claim 3, wherein the method further comprises:
    when a movement operation of the mouse is acquired, acquiring a third display position of the mouse;
    when the release instruction is acquired, determining, according to the third display position, the release range, and the movement range, a second mapping position of the third display position within the movement range;
    while controlling the operation object to release the target object attribute within the second designated region, controlling the target object icon to move toward the second mapping position;
    when the movement operation of the target object icon is acquired, displaying, on the operation interface, a movement trajectory of the target object icon toward the second mapping position.
  8. The method according to claim 1, wherein the method further comprises:
    when a click operation of the mouse is acquired, acquiring a click instruction, the click instruction comprising a click position;
    when the click instruction is acquired, displaying the click position on the operation interface;
    determining a movement direction and a movement distance of the operation object according to the click position and the first display position;
    calculating a movement duration of the operation object according to the movement distance;
    controlling the operation object to move toward the click position according to the movement direction and the movement duration;
    when a movement operation of the operation object is acquired, displaying a movement trajectory of the operation object on the operation interface.
  9. The method according to claim 8, wherein the method further comprises:
    while controlling the operation object to move toward the click position, controlling a movement icon to move along the movement trajectory of the operation object, the movement icon being an icon that controls the movement of the operation object;
    when a movement operation of the movement icon is acquired, displaying a movement trajectory of the movement icon on the operation interface.
  10. An operation control device, the device comprising a memory and a processor, wherein the memory stores computer-readable instructions, and the processor executes the computer-readable instructions in the memory to:
    display an operation interface of a designated application, the operation interface displaying a first display position of an operation object and a second display position of a paired mouse, the operation object being an object managed by a user account logged in to the designated application;
    when a trigger operation on a designated key of a paired keyboard is acquired, acquire a key instruction, the key instruction comprising valid key information;
    acquire a target object attribute from object attributes of the operation object according to the valid key information;
    when a release operation on the designated key is acquired, acquire a release instruction for the target object attribute;
    when the release instruction is acquired, determine a first designated region according to the first display position and the second display position;
    control the operation object to release the target object attribute within the first designated region;
    when a release operation of the operation object is acquired, display a release indicator of the target object attribute within the first designated region.
  11. The device according to claim 10, wherein the processor further executes the computer-readable instructions to: when the release operation of the operation object is acquired, display the release indicator of the target object attribute within the first designated region.
  12. The device according to claim 11, wherein the operation interface further displays a release range of the target object attribute and a movement range of a target object icon, the target object icon being an icon carrying the target object attribute;
    the processor further executes the computer-readable instructions to: when the release instruction is acquired, determine, according to the second display position, the release range, and the movement range, a first mapping position of the second display position within the movement range;
    while controlling the operation object to release the target object attribute within the first designated region, control the target object icon to move toward the first mapping position;
    when a movement operation of the target object icon is acquired, display, on the operation interface, a movement trajectory of the target object icon toward the first mapping position.
  13. The device according to claim 12, wherein the processor further executes the computer-readable instructions to: when the second display position lies outside the release range, determine a straight line according to the second display position and a center point position of the release range, compute an intersection of the straight line with the release range, and take the intersection as the first mapping position; or,
    when the second display position lies within the release range, scale the second display position according to the release range to obtain a scaled position, and determine the first mapping position according to the scaled position and the movement range.
  14. The device according to claim 10, wherein the processor further executes the computer-readable instructions to: when a movement operation of the mouse is acquired, acquire a third display position of the mouse;
    when the release instruction is acquired, adjust the first designated region according to the first display position and the third display position to obtain a second designated region;
    control the operation object to release the target object attribute within the second designated region;
    when the release operation of the operation object is acquired, display the release indicator of the target object attribute within the adjusted region.
  15. The device according to claim 12, wherein the processor further executes the computer-readable instructions to: when a movement operation of the mouse is acquired, acquire a third display position of the mouse;
    when the release instruction is acquired, determine, according to the third display position, the release range, and the movement range, a second mapping position of the third display position within the movement range;
    while controlling the operation object to release the target object attribute within the second designated region, control the target object icon to move toward the second mapping position;
    when the movement operation of the target object icon is acquired, display, on the operation interface, a movement trajectory of the target object icon toward the second mapping position.
  16. The device according to claim 10, wherein the processor further executes the computer-readable instructions to: when a click operation of the mouse is acquired, acquire a click instruction, the click instruction comprising a click position;
    when the click instruction is acquired, display the click position on the operation interface;
    determine a movement direction and a movement distance of the operation object according to the click position and the first display position;
    calculate a movement duration of the operation object according to the movement distance;
    control the operation object to move toward the click position according to the movement direction and the movement duration;
    when a movement operation of the operation object is acquired, display a movement trajectory of the operation object on the operation interface.
  17. The device according to claim 16, wherein the processor further executes the computer-readable instructions to: while controlling the operation object to move toward the click position, control a movement icon to move along the movement trajectory of the operation object, the movement icon being an icon that controls the movement of the operation object;
    when a movement operation of the movement icon is acquired, display a movement trajectory of the movement icon on the operation interface.
  18. A non-volatile computer-readable storage medium storing computer-readable instructions that cause at least one processor to perform the method according to any one of claims 1 to 9.
PCT/CN2018/080205 2017-03-28 2018-03-23 Operation control method and device, and storage medium WO2018177207A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/433,914 US10845981B2 (en) 2017-03-28 2019-06-06 Operation control method, device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710193663.3A CN107066173B (zh) 2017-03-28 2017-03-28 操作控制方法及装置
CN201710193663.3 2017-03-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/433,914 Continuation US10845981B2 (en) 2017-03-28 2019-06-06 Operation control method, device and storage medium

Publications (1)

Publication Number Publication Date
WO2018177207A1 true WO2018177207A1 (zh) 2018-10-04

Family

ID=59620437

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/080205 WO2018177207A1 (zh) 2017-03-28 2018-03-23 操作控制方法、装置及存储介质

Country Status (3)

Country Link
US (1) US10845981B2 (zh)
CN (1) CN107066173B (zh)
WO (1) WO2018177207A1 (zh)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107066173B (zh) 2017-03-28 2018-06-05 腾讯科技(深圳)有限公司 Operation control method and device
CN109582187B (zh) * 2017-09-29 2021-05-25 腾讯科技(深圳)有限公司 File sending method and device, computer device, and storage medium
CN110795191A (zh) * 2019-10-29 2020-02-14 北京一起教育信息咨询有限责任公司 Method and device for adjusting the position of a line-connection object
CN112486321B (zh) * 2020-11-30 2022-12-13 郑州捷安高科股份有限公司 Three-dimensional model operation control method and device, and terminal device
CN113238699B (zh) * 2021-05-31 2022-07-01 东莞市铭冠电子科技有限公司 Method for a mouse assist function for drawing straight lines
US11853480B2 (en) * 2021-06-04 2023-12-26 Zouheir Taher Fadlallah Capturing touchless inputs and controlling a user interface with the same
US11507197B1 (en) 2021-06-04 2022-11-22 Zouheir Taher Fadlallah Capturing touchless inputs and controlling an electronic device with the same
CN116974680B (zh) * 2023-08-02 2024-06-18 纽扣数字智能科技(深圳)集团有限公司 Computer desktop remote control method, device, and equipment based on a peripheral mouse and keyboard

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1801080A (zh) * 2005-01-04 2006-07-12 凌阳科技股份有限公司 Development method and system for a programmable control device
CN104922906A (zh) * 2015-07-15 2015-09-23 网易(杭州)网络有限公司 Action execution method and device
CN105117579A (zh) * 2015-07-21 2015-12-02 网易(杭州)网络有限公司 Object selection method and device
CN107066173A (zh) * 2017-03-28 2017-08-18 腾讯科技(深圳)有限公司 Operation control method and device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8648970B2 (en) * 2010-08-02 2014-02-11 Chip Goal Electronics Corporation, Roc Remote controllable video display system and controller and method therefor
WO2012125990A2 (en) * 2011-03-17 2012-09-20 Laubach Kevin Input device user interface enhancements
US9086794B2 (en) * 2011-07-14 2015-07-21 Microsoft Technology Licensing, Llc Determining gestures on context based menus
US9582165B2 (en) * 2012-05-09 2017-02-28 Apple Inc. Context-specific user interfaces
US9295919B2 (en) * 2013-03-15 2016-03-29 Zynga Inc. Smart ping system
CN103886198B (zh) * 2014-03-17 2016-12-07 腾讯科技(深圳)有限公司 Data processing method, terminal, server, and system
DE212015000213U1 (de) * 2014-09-02 2017-05-02 Apple Inc. Multidimensional object reordering
CN104965655A (zh) * 2015-06-15 2015-10-07 北京极品无限科技发展有限责任公司 Touch screen game control method
CN106201265B (zh) * 2016-06-30 2019-07-05 网易(杭州)网络有限公司 Web-based mobile terminal control method, device, and system
US10403050B1 (en) * 2017-04-10 2019-09-03 WorldViz, Inc. Multi-user virtual and augmented reality tracking systems


Also Published As

Publication number Publication date
CN107066173B (zh) 2018-06-05
US20190310753A1 (en) 2019-10-10
US10845981B2 (en) 2020-11-24
CN107066173A (zh) 2017-08-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18776630

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18776630

Country of ref document: EP

Kind code of ref document: A1