CN111142689B - Mobile control method, device and equipment and computer readable storage medium - Google Patents


Info

Publication number
CN111142689B
Authority
CN
China
Prior art keywords
movement
control
moving
sub
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911409408.3A
Other languages
Chinese (zh)
Other versions
CN111142689A (en)
Inventor
何晶晶
仇蒙
田聪
张书婷
崔维健
刘博艺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911409408.3A
Publication of CN111142689A
Application granted
Publication of CN111142689B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 — Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 — Input arrangements for video game devices
    • A63F2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 — Features characterized by input arrangements for converting player-generated signals into game device control signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide a movement control method, apparatus, device, and storage medium. The method includes: after entering a control-class application, displaying a direction movement control in a movement control area on a control interface, and displaying a moving object in a movement area on the control interface, where the direction movement control represents a control of at least one movement direction in a first dimension, the direction movement control is used to move the moving object in the three-dimensional space of the movement area, and the first dimension belongs to that three-dimensional space; receiving a touch operation acting on the movement control area; in response to the touch operation, determining a target movement type triggered by the direction movement control and acquiring track information of a touch-end focus; determining a target movement direction in the three-dimensional space according to the movement direction corresponding to the target movement type and the track information; and controlling the moving object to move in the target movement direction in the movement area. The embodiments of the present invention can simplify the processing flow for controlling the movement of a moving object.

Description

Mobile control method, device and equipment and computer readable storage medium
Technical Field
The present invention relates to control technologies in the field of computers, and in particular, to a movement control method, apparatus, device, and computer-readable storage medium.
Background
A virtual joystick is a control presented on the display screen of a touch screen device for controlling the movement of a moving object. Specifically, a virtual joystick displayed on the screen receives a touch operation acting on it and controls the moving object to move in accordance with that operation. Virtual joysticks are commonly used in games to move a game character.
Generally, virtual joysticks are of two kinds: one for left/right/forward/backward movement and one for up/down movement. To control the movement of a moving object in three-dimensional space, both kinds must therefore be provided; when, for example, the moving object is controlled to move upward and forward at the same time, the touch commands of two virtual joysticks must be detected and processed simultaneously. The processing flow for controlling the movement of the moving object is therefore complicated.
Disclosure of Invention
Embodiments of the present invention provide a movement control method, apparatus, device, and computer-readable storage medium, which can simplify a processing flow for controlling movement of a moving object.
The technical scheme of the embodiment of the invention is realized as follows:
An embodiment of the present invention provides a movement control method, which includes the following steps:
after entering a control-class application, displaying a direction movement control in a movement control area on a control interface, and displaying a moving object in a movement area on the control interface;
wherein the direction movement control represents a control of at least one movement direction in a first dimension; the direction movement control is used to move the moving object in the three-dimensional space of the movement area; and the first dimension belongs to the three-dimensional space;
receiving a touch operation acting on the movement control area;
in response to the touch operation, determining a target movement type triggered by the direction movement control, and acquiring track information of a touch-end focus;
determining a target movement direction in the three-dimensional space according to the movement direction corresponding to the target movement type and the track information; and
controlling the moving object to move in the target movement direction in the movement area.
An embodiment of the present invention provides a movement control apparatus, including:
a display module, configured to display a direction movement control in a movement control area on a control interface and display a moving object in a movement area on the control interface after entering a control-class application; wherein the direction movement control represents a control of at least one movement direction in a first dimension; the direction movement control is used to move the moving object in the three-dimensional space of the movement area; and the first dimension belongs to the three-dimensional space;
a touch module, configured to receive a touch operation acting on the movement control area;
a response module, configured to determine, in response to the touch operation, a target movement type triggered by the direction movement control and acquire track information of a touch-end focus;
a direction determination module, configured to determine a target movement direction in the three-dimensional space according to the movement direction corresponding to the target movement type and the track information; and
a movement control module, configured to control the moving object to move in the target movement direction in the movement area.
An embodiment of the present invention provides a movement control device, including:
a memory for storing executable instructions; and
a processor, configured to implement the movement control method provided by the embodiments of the present invention when executing the executable instructions stored in the memory.
An embodiment of the present invention provides a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the movement control method provided by the embodiments of the present invention.
The embodiments of the present invention have the following beneficial effect: when a moving object is controlled to move in three-dimensional space, its movement direction can be determined from a touch operation acting on a single movement control area together with the relationship between that touch operation and the direction movement control displayed in the area. In other words, only the touch operation on one movement control area needs to be detected and processed while the moving object is being moved, which simplifies the processing flow for controlling the movement of the moving object.
Drawings
FIG. 1 is a schematic diagram of an exemplary motion control implementation;
fig. 2 is a schematic diagram of an alternative architecture of the mobile control system 100 according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a terminal 400 according to an embodiment of the present invention;
fig. 4 is an alternative flow chart of the movement control method according to the embodiment of the present invention;
FIG. 5 is a diagram of an exemplary display direction movement control provided by an embodiment of the present invention;
FIG. 6 is a diagram of another exemplary display direction movement control provided by embodiments of the present invention;
FIG. 7 is a diagram illustrating an exemplary acquisition of trace information according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating another exemplary method for obtaining trace information according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of yet another exemplary display direction movement control provided by an embodiment of the present invention;
fig. 10 is a schematic flow chart of another alternative movement control method provided in the embodiment of the present invention;
fig. 11 is a schematic diagram illustrating an application of an exemplary movement control method according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of an exemplary upward movement implementation provided by embodiments of the present invention;
FIG. 13 is a diagram illustrating an exemplary movement trajectory provided by embodiments of the present invention;
FIG. 14 is a schematic diagram of an exemplary target movement direction provided by embodiments of the present invention;
FIG. 15 is a schematic diagram of an exemplary implementation of horizontal movement provided by embodiments of the present invention;
FIG. 16 is a schematic diagram of yet another exemplary target movement direction provided by an embodiment of the present invention;
FIG. 17 is a schematic diagram of an exemplary downward movement implementation provided by embodiments of the present invention;
FIG. 18 is a schematic diagram of another exemplary target movement direction provided by embodiments of the present invention;
FIG. 19 is a schematic structural diagram of a virtual joystick according to an embodiment of the present invention;
fig. 20 is a schematic diagram of an exemplary touch display provided by an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described below in further detail with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present invention, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first/second/third" are used only to distinguish similar objects and do not denote a particular order. It should be understood that "first/second/third" may be interchanged in a specific order or sequence, where permissible, so that the embodiments of the invention described herein can be practiced in an order other than that shown or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the embodiments of the present invention is for the purpose of describing the embodiments of the present invention only and is not intended to be limiting of the present invention.
Before the embodiments of the present invention are described in further detail, the terms and expressions used in the embodiments are explained; these terms and expressions apply as explained below.
1) Three-dimensional space: a space composed of a first dimension, a second dimension, and a third dimension; it is the space that people can see and feel. Three-dimensional space is generally represented by the x-axis, the y-axis, and the z-axis. In the embodiments of the present invention, the first, second, and third dimensions may correspond to the x-, y-, and z-axes in any combination; for example, the first dimension may be the z-axis, the second dimension the x-axis, and the third dimension the y-axis.
2) Virtual joystick (virtual rocker): a control, implemented on the device as a virtual button, for controlling the movement direction of a moving object, for example, an up-down virtual joystick, a front-back virtual joystick, or a left-right virtual joystick.
3) Control-class application: a type of application whose functions require a moving object to be controlled to move in various directions, for example, role-playing games (RPG) and first-person shooter (FPS) games on mobile devices.
4) Control: an operable object used to trigger a corresponding processing operation; it may be virtual or physical.
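The dimension-to-axis mapping in definition 1) can be sketched as a small lookup table. This is purely an illustrative assumption (the names `AXIS_OF_DIMENSION` and `axis_for` are hypothetical), matching the example mapping given above (first → z, second → x, third → y):

```python
# Hypothetical sketch of the dimension-to-axis mapping from definition 1):
# the first dimension is taken as the z-axis (vertical), the second as the
# x-axis, and the third as the y-axis. Names are illustrative only.
AXIS_OF_DIMENSION = {
    "first": "z",   # up/down movement directions live on this axis
    "second": "x",
    "third": "y",
}

def axis_for(dimension: str) -> str:
    """Return the spatial axis a dimension is mapped to."""
    return AXIS_OF_DIMENSION[dimension]
```

Any other combination (e.g., first → x) would work the same way, as the text notes that the correspondence may be combined freely.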
Generally, in control-class applications, when a preset function is implemented by controlling the movement of a moving object, functional components such as virtual joysticks receive the control signals that move the object. Movement control is usually implemented by two virtual joysticks: one controlling forward/backward/left/right movement and one controlling up/down movement. Controlling the object's movement in three-dimensional space therefore requires operating two virtual joysticks with both hands, which makes the operation flow complicated; moreover, the two virtual joysticks occupy display space and degrade the display effect.
Illustratively, referring to fig. 1, fig. 1 is a schematic diagram of an exemplary movement control implementation. As shown in fig. 1, the game application interface 1-1 includes a movement control area 1-21, a movement control area 1-22, and a movement area 1-3; the movement control areas 1-21 and 1-22 are displayed semi-transparently over the movement area 1-3. A front-back/left-right virtual joystick 1-211 is displayed in the movement control area 1-21 and controls the airplane model 1-4 to move forward, backward, left, and right in the movement area 1-3, typically with the left hand; an up-down virtual joystick 1-221 is displayed in the movement control area 1-22 and controls the airplane model 1-4 to move up and down in the movement area 1-3, typically with the right hand. That is, moving the airplane model forward/backward/left/right while simultaneously moving it up/down requires two fingers or both hands, which is inefficient and yields a poor display effect.
Based on this, embodiments of the present invention provide a movement control method, apparatus, device, and computer-readable storage medium, which can simplify the processing flow of moving a moving object, reduce the occupied display space, and improve the display effect. An exemplary application of the movement control device provided by the embodiments of the present invention is described below. The movement control device may be implemented as various types of user terminals, such as a smartphone, a tablet computer, or a notebook computer, and may also be implemented as a server. Next, an exemplary application in which the movement control device is implemented as a terminal is described.
Referring to fig. 2, fig. 2 is an alternative architecture diagram of the mobile control system 100 according to an embodiment of the present invention, in order to support a control-class application, the terminal 400 includes a graphical interface 410, the terminal 400 is connected to the server 200 through the network 300, and the network 300 may be a wide area network or a local area network, or a combination of the two.
The terminal 400 is configured to interact with the server 200 through the network 300 to obtain the function service corresponding to the control-class application, so that, after entering the control-class application, it displays a direction movement control in a movement control area on a control interface (i.e., the graphical interface 410) and displays a moving object in a movement area on the control interface, where the direction movement control represents a control of at least one movement direction in a first dimension, the direction movement control is used to move the moving object in the three-dimensional space of the movement area, and the first dimension belongs to the three-dimensional space; receives a touch operation acting on the movement control area; in response to the touch operation, determines a target movement type triggered by the direction movement control and acquires track information of a touch-end focus; determines a target movement direction in the three-dimensional space according to the movement direction corresponding to the target movement type and the track information; and controls the moving object to move in the target movement direction in the movement area.
A server 200 for providing a function service corresponding to the control application to the terminal 400 through the network 300; here, the server 200 is a server corresponding to the control class application.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a terminal 400 according to an embodiment of the present invention, where the terminal 400 shown in fig. 3 includes: at least one processor 410, memory 450, at least one network interface 420, and a user interface 430. The various components in the terminal 400 are coupled together by a bus system 440. It is understood that the bus system 440 is used to enable communications among the components. The bus system 440 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 440 in FIG. 3.
The processor 410 may be an integrated circuit chip having signal processing capabilities, such as a general-purpose processor, a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components; the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 430 includes one or more output devices 431, including one or more speakers and/or one or more visual displays, that enable the presentation of media content. The user interface 430 also includes one or more input devices 432, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 450 may be volatile memory or nonvolatile memory, and may also include both. The nonvolatile memory may be a read-only memory (ROM), and the volatile memory may be a random access memory (RAM). The memory 450 described in the embodiments of the present invention is intended to comprise any suitable type of memory. The memory 450 optionally includes one or more storage devices physically located remote from the processor 410.
In some embodiments, memory 450 is capable of storing data, examples of which include programs, modules, and data structures, or a subset or superset thereof, to support various operations, as exemplified below.
An operating system 451, including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
a network communication module 452 for communicating with other computing devices via one or more (wired or wireless) network interfaces 420; exemplary network interfaces 420 include Bluetooth, wireless fidelity (Wi-Fi), universal serial bus (USB), etc.;
a display module 453 for enabling presentation of information (e.g., user interfaces for operating peripherals and displaying content and information) via one or more output devices 431 (e.g., display screens, speakers, etc.) associated with user interface 430;
an input processing module 454 for detecting one or more user inputs or interactions from one of the one or more input devices 432 and translating the detected inputs or interactions.
In some embodiments, the mobile control device provided by the embodiments of the present invention may be implemented in software, and fig. 3 illustrates a mobile control device 455 stored in a memory 450, which may be software in the form of programs and plug-ins, and includes the following software modules: a display module 4551, a touch module 4552, a response module 4553, a direction determination module 4554, and a joystick display module 4555, the functions of which will be described below.
In other embodiments, the mobile control Device provided in the embodiments of the present invention may be implemented in hardware, and for example, the mobile control Device provided in the embodiments of the present invention may be a processor in the form of a hardware decoding processor, which is programmed to execute the mobile control method provided in the embodiments of the present invention, for example, the processor in the form of the hardware decoding processor may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components.
In the following, the movement control method provided by the embodiments of the present invention is described in conjunction with an exemplary application and implementation of the terminal provided by the embodiments of the present invention.
Referring to fig. 4, fig. 4 is an alternative flow chart of a movement control method according to an embodiment of the present invention, which will be described with reference to the steps shown in fig. 4.
S101, after entering a control-class application, displaying a direction movement control in a movement control area on a control interface, and displaying a moving object in a movement area on the control interface; wherein the direction movement control represents a control of at least one movement direction in a first dimension; the direction movement control is used to move the moving object in the three-dimensional space of the movement area; and the first dimension belongs to the three-dimensional space.
In the embodiment of the present invention, a control-class application is installed on the movement control device; after an enabling instruction for the control-class application is received, the application is run and entered. After the movement control device enters the control-class application, a control interface corresponding to the application is displayed on a display device (e.g., a display screen) of the movement control device, so that the movement direction of the moving object can be controlled on the control interface through received instructions.
It should be noted that the control interface includes a movement control area and a movement area. The movement control area may float over the movement area, may be displayed independently, or may be displayed in another manner; this is not specifically limited in the embodiments of the present invention. The movement control area is the area that receives instructions for controlling the movement of the moving object, and the movement area is the area in which the moving object moves. The movement control device therefore displays, in the movement control area, a direction movement control for moving the moving object in the three-dimensional space of the movement area, where the first dimension belongs to that three-dimensional space; and displays the moving object in the movement area, the moving object being an object in the control-class application, such as an airplane model, a human model, or a sniper model.
In addition, the direction movement control represents a control of at least one movement direction in the first dimension; when it is triggered, the moving object can be triggered to move in at least one movement direction in the first dimension. Here, the at least one movement direction refers to different directions in the first dimension. For example, when the first dimension is the z-axis, the at least one movement direction may be three directions, namely the positive z direction, the z-axis origin (no vertical movement), and the negative z direction; it may also be two directions, namely the positive z direction including the origin and the negative z direction.
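As a non-authoritative sketch of the three-direction case just described, a single control could classify a touch by its vertical offset from the control's center into "up" (+z), "horizontal" (origin, z unchanged), or "down" (-z). The zone boundary (`dead_zone`) and all names are illustrative assumptions, not the patented layout:

```python
# Minimal sketch (not the patented implementation) of how one direction
# movement control might expose three movement types on the first
# dimension: "up" (+z), "horizontal" (z unchanged), and "down" (-z).
# The dead-zone size is an illustrative assumption.

def movement_type_for_touch(touch_y: float, control_center_y: float,
                            dead_zone: float = 10.0) -> str:
    """Classify a touch by its vertical offset from the control's center."""
    offset = control_center_y - touch_y  # screen y grows downward
    if offset > dead_zone:
        return "up"        # triggers movement along +z
    if offset < -dead_zone:
        return "down"      # triggers movement along -z
    return "horizontal"    # keeps z unchanged
```

The two-direction variant mentioned in the text would simply merge the "horizontal" zone into one of the other two.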
S102, receiving a touch operation acting on the movement control area.
In the embodiment of the present invention, since the movement control area is the instruction area for controlling the movement of the moving object, the movement control device obtains the touch operation through the movement control area, that is, it receives the touch operation acting on the movement control area.
It should be noted that the direction movement control is displayed in the movement control area, so the touch operation may or may not act on the direction movement control itself. Moreover, the touch operation may take the form of a tap, a slide, or a combination of several forms; this is not specifically limited in the embodiments of the present invention. The operation may even be a non-contact operation: as long as it acts on the movement control area, it is treated as a touch operation in the embodiments of the present invention.
S103, in response to the touch operation, determining the target movement type triggered by the direction movement control, and acquiring track information of the touch-end focus.
In the embodiment of the present invention, the touch operation is used to control the moving object to move in three-dimensional space; therefore, after obtaining the touch operation, the movement control device responds to it to implement movement control of the moving object.
Here, different touch operations determine different movement types of the moving object in the first dimension. The movement control device can therefore determine, in response to the touch operation, which of the at least one movement direction in the first dimension applies, thereby obtaining the target movement type.
In addition, different touch operations yield different track information for the touch-end focus. When responding to the touch operation, the movement control device therefore obtains the track information of the touch-end focus. The track information is used, in combination with the direction movement control, to determine the movement results of the moving object in the second and third dimensions of the three-dimensional space, for example, whether to move in those dimensions, or the movement directions in them.
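One plausible representation of the "track information of the touch-end focus", assumed purely for illustration, is the planar displacement of the final touch point from the control's center; this displacement can later supply the second- and third-dimension components:

```python
# Hedged sketch: one way to derive track information of the touch-end
# focus is the planar displacement from the control's center to the last
# recorded touch point. All names and the representation are assumptions.

def trajectory_info(touch_points, center):
    """Return (dx, dy) of the final touch point relative to the center."""
    end_x, end_y = touch_points[-1]      # the touch-end focus
    cx, cy = center
    return (end_x - cx, end_y - cy)
```

Other representations (e.g., the full point sequence, or an angle and magnitude) would serve the same role of fixing the second- and third-dimension movement results.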
S104, determining the target movement direction in the three-dimensional space according to the movement direction corresponding to the target movement type and the track information.
In the embodiment of the present invention, the target movement type represents the movement direction of the moving object among the at least one movement direction in the first dimension, and the track information is used, in combination with the direction movement control, to determine the movement results of the moving object in the second and third dimensions of the three-dimensional space. Therefore, after obtaining the target movement type and the track information, the movement control device can determine the movement direction of the moving object in the three dimensions (first, second, and third) of the three-dimensional space, that is, the target movement direction.
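The composition in S104 can be sketched as follows: the target movement type fixes the first-dimension (z) component, the track information supplies the x and y components, and the resulting vector is normalized. The equal weighting of the components is an assumption chosen for illustration, not the patented formula:

```python
import math

# Illustrative composition of the target movement direction (S104): the
# target movement type fixes the first-dimension (z) component, while the
# track information supplies the second/third-dimension (x, y) components.
# The equal weighting is an assumption, not the patented formula.

Z_OF_TYPE = {"up": 1.0, "horizontal": 0.0, "down": -1.0}

def target_direction(movement_type, dx, dy):
    """Build and normalize a 3D direction vector (x, y, z)."""
    vx, vy, vz = dx, dy, Z_OF_TYPE[movement_type]
    length = math.sqrt(vx * vx + vy * vy + vz * vz)
    if length == 0.0:
        return (0.0, 0.0, 0.0)   # no movement requested
    return (vx / length, vy / length, vz / length)
```

For example, an "up" type with no planar displacement yields pure +z movement, while a "horizontal" type with displacement (3, 4) yields a purely planar direction.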
And S105, controlling the moving object to move along the target moving direction in the moving area.
In the embodiment of the present invention, after the movement control device obtains the target movement direction, the movement direction of the moving object in the three-dimensional space is also specified, and the moving object can be controlled to move in the target movement direction in the movement area.
It should be noted that, when the movement control device controls the moving object to move in the target movement direction in the moving area, the moving speed of the moving object may be a preset speed, a speed determined according to a touch operation, a speed obtained in another manner, or the like, which is not specifically limited in this embodiment of the present invention.
It can be understood that the movement control method provided by the embodiment of the invention can realize the movement of the moving object in the three-dimensional space by using only one movement control area, thereby reducing the space occupied on the control interface and improving the display effect of the control interface. Meanwhile, with the movement control method provided by the embodiment of the invention, the moving direction of the moving object in the three-dimensional space can be obtained by combining a single-finger or single-hand touch operation with the action relationship between the touch operation and the direction movement control, so that the moving object is controlled to move in the three-dimensional space with a single finger or a single hand, which simplifies the movement control process and improves the movement control efficiency.
Further, in the embodiment of the present invention, the direction movement control includes sub movement controls of at least one moving direction; in this case, the movement control device displays the direction movement control in the movement control area on the control interface in S101 through S1011-S1012, which will be described below with reference to each step.
S1011, dividing the movement control area on the control interface into at least one sub movement control area.
In the embodiment of the present invention, since the direction movement control includes sub movement controls of at least one moving direction, the direction movement control contains at least one control; accordingly, the movement control device divides the movement control area on the control interface into at least one area for displaying the sub movement controls of the at least one moving direction, thereby obtaining at least one sub movement control area.
It should be noted that, when the movement control device divides the movement control area, the area may be divided evenly or unevenly; the embodiment of the present invention does not specifically limit this. The sub movement controls are the set of controls corresponding one-to-one to the at least one moving direction.
And S1012, respectively displaying the sub movement control in at least one movement direction on at least one sub movement control area.
In the embodiment of the present invention, when the mobile control device obtains at least one sub mobile control area, a control in a corresponding moving direction in the sub mobile controls is displayed on each sub mobile control area, so that processing of respectively displaying the sub mobile controls in at least one moving direction on at least one sub mobile control area is realized.
Further, in the embodiment of the present invention, when the at least one sub movement control area includes a first sub movement control area, a second sub movement control area, and a third sub movement control area, S1012 may be implemented by S10121-S10123; that is, the movement control device displays the sub movement controls of the at least one moving direction on the at least one sub movement control area respectively through S10121-S10123, which will be described below in conjunction with the respective steps.
S10121, displaying a first sub-movement control on the first sub-movement control area; the first sub-movement control is used for triggering the movement of the moving object in the positive direction of the first dimension.
It should be noted that, when the at least one sub movement control area includes three sub movement control areas, correspondingly, the at least one sub movement control in the movement direction also includes three sub movement controls: a first sub-move control, a second sub-move control, and a third sub-move control. Therefore, the mobile control equipment displays the first sub-mobile control on the first sub-mobile control area, and when the first sub-mobile control is triggered, the mobile object is triggered to move in the positive direction of the first dimension; here, the positive direction of the first dimension belongs to at least one direction of the first dimension.
S10122, displaying a second sub-movement control on the second sub-movement control area; the second sub-movement control is used for triggering the moving object to move in the direction of the origin of the first dimension.
Similar to the description of the implementation process of S10121, the mobile control device displays a second sub-mobile control on the second sub-mobile control area, and when the second sub-mobile control is triggered, the mobile object is triggered to move in the direction of the origin of the first dimension; here, the direction of the origin of the first dimension belongs to at least one direction of the first dimension. It is easy to know that the moving object does not move in the first dimension at this time.
S10123, displaying a third sub-movement control on the third sub-movement control area; the third sub-movement control is used for triggering the movement of the moving object in the negative direction of the first dimension.
Similar to the descriptions of the implementation processes of S10121 and S10122, the mobile control device displays a third sub-movement control on the third sub-movement control area, and when the third sub-movement control is triggered, the mobile object is triggered to move in the negative direction of the first dimension; here, the negative direction of the first dimension belongs to at least one direction of the first dimension.
It should be noted that S10121-S10123 are not in sequence in the execution order; the at least one direction of movement includes a positive direction of the first dimension, a direction of an origin of the first dimension, and a negative direction of the first dimension.
Illustratively, referring to fig. 5, fig. 5 is a schematic diagram of an exemplary display of the direction movement control provided by an embodiment of the present invention. As shown in fig. 5, the movement control area 5-1 is located at the lower left corner of the control interface 5-2, and the at least one sub movement control area consists of three sub movement control areas: a first sub movement control area 5-3, a second sub movement control area 5-4, and a third sub movement control area 5-5, obtained by dividing the lower left corner evenly into 30-degree sectors. The sub movement controls include a first sub movement control 5-31, a second sub movement control 5-41, and a third sub movement control 5-51, which are displayed on the first sub movement control area 5-3, the second sub movement control area 5-4, and the third sub movement control area 5-5, respectively.
Illustratively, referring to fig. 6, fig. 6 is a schematic diagram of another exemplary display of the direction movement control provided by an embodiment of the present invention. As shown in fig. 6, the movement control area 6-1 is located at the lower left corner of the control interface 6-2, and the at least one sub movement control area consists of three sub movement control areas: a first sub movement control area 6-3, a second sub movement control area 6-4, and a third sub movement control area 6-5, obtained by dividing the movement control area into three areas from top to bottom in sequence. The sub movement controls include a first sub movement control 6-31, a second sub movement control 6-41, and a third sub movement control 6-51, which are displayed on the first sub movement control area 6-3, the second sub movement control area 6-4, and the third sub movement control area 6-5, respectively.
Further, in the embodiment of the present invention, based on this manner of displaying the direction movement control, the movement control device in S103 determines, in response to the touch operation, the target movement type triggered on the direction movement control through S1031-S1033, which will be described below with reference to the steps.
And S1031, responding to the touch operation, and acquiring touch position information of a touch ending focus corresponding to the touch operation.
In the embodiment of the invention, since three different sub-movement controls are respectively displayed in the three sub-movement control areas, the movement control device can determine which of the three different sub-movement controls is triggered by acquiring the position of the touch end focus of the touch operation on the movement control area; here, the position of the touch end focus on the movement control area is the touch position information.
S1032, determining a target sub-movement control area corresponding to the touch position information from the at least one sub-movement control area.
It should be noted that, after the mobile control device obtains the touch position information, it determines to which area of the at least one sub-mobile control area the touch position information belongs, that is, determines the sub-mobile control area corresponding to the touch position information from the at least one sub-mobile control area, and thus obtains the target sub-mobile control area.
S1033, determining a control displayed on the target sub-movement control area from the sub-movement controls to obtain an acted target sub-movement control, so as to obtain a target movement type.
It should be noted that, because each sub-movement control area displays a corresponding control, after the movement control device obtains the target sub-movement control area, it obtains the control displayed on that area, which is the acted-on target sub-movement control. In addition, the three different sub-movement controls respectively identify the three directions of the first dimension, that is, three movement types in the first dimension; therefore, once the target sub-movement control among the three different sub-movement controls is determined, the target movement type is obtained.
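As an illustrative aid only, the region lookup of S1031-S1033 can be sketched in Python for the Fig. 5 layout, in which the lower-left quadrant is split into three equal 30-degree sub movement control areas. The function, variable names, and the assumption that the first (up) sub control occupies the topmost sector are all hypothetical, not the patent's API:

```python
import math

# Hypothetical sector labels, ordered from the lowest angle (near the screen
# bottom) to the highest; the first sub movement control is assumed topmost.
SUB_CONTROLS = ["third (down)", "second (origin)", "first (up)"]

def target_movement_type(corner_x, corner_y, touch_x, touch_y):
    """Map the touch end focus to the triggered sub movement control.

    The 90-degree quadrant above and to the right of the lower-left corner
    is split into three equal 30-degree sectors; screen y grows downward,
    so the y offset is flipped before computing the angle.
    """
    dx = touch_x - corner_x
    dy = corner_y - touch_y          # flip: screen y axis points down
    angle = math.degrees(math.atan2(dy, dx))
    if not 0 <= angle <= 90:
        return None                  # touch fell outside the movement control area
    sector = min(int(angle // 30), 2)
    return SUB_CONTROLS[sector]
```

For example, with the corner at (0, 100), a touch end focus at (10, 95) lies in the lowest 30-degree sector and triggers the "down" control, while (2, 90) lies in the steepest sector and triggers the "up" control.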
Further, in the embodiment of the present invention, based on this manner of displaying the direction movement control, the movement control device in S103 acquires, in response to the touch operation, the track information of the touch end focus, which may be implemented through S1034-S1035; these steps are described below.
S1034, when the touch operation is a click operation, acquiring a track vector by which the touch end focus corresponding to the click operation is offset from a first preset position corresponding to the target sub-movement control, so as to obtain the track information.
In the embodiment of the invention, the touch operation comprises two types, namely click operation and sliding operation; when the touch operation is a click operation, the mobile control device acquires a track vector from a first preset position corresponding to the target sub-mobile control to a touch end focus, and track information is obtained.
It should be noted that the first preset position corresponding to the target sub-movement control is used to represent a position of the target sub-movement control on the target sub-movement control area, for example, a center position of the target sub-movement control.
Exemplarily, referring to fig. 7, fig. 7 is a schematic diagram of an exemplary acquired track information according to an embodiment of the present invention, as shown in fig. 7, a track vector from a first preset position 7-1 to a touch end focus 7-2 is track information 7-3, where the first preset position 7-1 is a central position of a target sub-movement control.
And S1035, when the touch operation is the sliding operation, acquiring a track from a touch start focus to a touch end focus of the sliding operation to obtain track information.
In the embodiment of the present invention, when the touch operation is a sliding operation, a track from a touch start focus to a touch end focus of the sliding operation on the target sub movement control area, that is, track information, is obtained.
Referring to fig. 8, fig. 8 is a schematic diagram of another example of obtaining track information according to an embodiment of the present invention. As shown in fig. 8, the track from the touch start focus 8-1 to the touch end focus 8-2 in the target sub-movement control area 8-5 is the track information 8-3; the center position of the target sub-movement control, that is, the first preset position 8-4, is also shown.
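A minimal sketch of S1034-S1035, under the assumption that track information can be reduced to a 2D vector: for a click, the vector runs from the control's preset (center) position to the touch end focus; for a slide, from the touch start focus to the touch end focus. All names are illustrative:

```python
# Hypothetical helpers; coordinates are (x, y) pixel pairs on the screen.

def track_info(op_type, end_focus, preset_pos=None, start_focus=None):
    """Return the track vector for a click or slide touch operation."""
    if op_type == "click":
        ox, oy = preset_pos          # first/second preset position (control center)
    elif op_type == "slide":
        ox, oy = start_focus         # where the sliding operation began
    else:
        raise ValueError("unknown touch operation")
    ex, ey = end_focus
    return (ex - ox, ey - oy)        # offset of the touch end focus

def target_distance(vec):
    """Distance information of the track information (S1042)."""
    return (vec[0] ** 2 + vec[1] ** 2) ** 0.5
```

A click ending at (13, 14) on a control centered at (10, 10) yields the track vector (3, 4) and a target distance of 5 px.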
Further, another way of displaying the direction movement control is provided in the embodiments of the present invention, which is described as follows. Here, for convenience of description, the embodiment of the present invention refers to this display manner as a switching mode.
In an embodiment of the present invention, the direction movement control comprises at least one movement type icon; thus, the movement control device in S101 displays the direction movement control in the movement control area on the control interface, which can be implemented in S1013, and this step is explained below.
And S1013, displaying, in the movement control area on the control interface, a direction movement control that includes at least one movement type icon displayed with a mark.
It should be noted that the mobile control device displays a direction movement control on the movement control area, and realizes the movement of the moving object in the at least one moving direction of the first dimension by displaying, with a mark, at least one movement type icon on the direction movement control.
Here, the at least one movement type icon corresponds one-to-one to the at least one movement direction.
Further, in the embodiment of the present invention, the at least one movement type icon includes a first movement type icon, a second movement type icon, and a third movement type icon; the first movement type icon corresponds to the positive direction of the first dimension, the second movement type icon corresponds to the direction of the origin of the first dimension, and the third movement type icon corresponds to the negative direction of the first dimension. In this case, S1013 can be realized by S10131-S10133; that is, the movement control device displays, in the movement control area on the control interface, a direction movement control including at least one marked movement type icon through S10131-S10133, which will be described below in conjunction with the respective steps.
S10131, displaying a direction moving control comprising a preset mark display icon on a moving control area on a control interface; the preset mark display icon is obtained by marking and displaying any one of the first movement type icon, the second movement type icon, and the third movement type icon.
In the embodiment of the present invention, the mobile control device marks at least one movement type icon on the direction movement control on the mobile control area in a preset manner, that is, obtains the direction movement control including the preset mark display icon, and the preset mark display icon is obtained by marking and displaying any one of the first movement type icon, the second movement type icon, and the third movement type icon.
Note that the mark display is a display mode that achieves a highlighting effect, for example, bolding or changing color; the embodiment of the present invention does not specifically limit this.
S10132, acquiring a preset sequence of mark display among the first mobile type icon, the second mobile type icon and the third mobile type icon to obtain a switching strategy.
It should be noted that the mobile control device displays the first mobile type icon, the second mobile type icon, and the third mobile type icon according to a preset sequence; therefore, the mobile control device obtains the preset sequence of the mark display among the first mobile type icon, the second mobile type icon and the third mobile type icon, and the switching strategy is obtained.
S10133, when a switching operation acting on the direction movement control including the preset mark display icon is received, switching the mark display among the first movement type icon, the second movement type icon, and the third movement type icon in the direction movement control according to the switching policy, thereby completing the display, on the movement control area, of the direction movement control including at least one marked movement type icon.
That is to say, the mobile control device marks and displays the first movement type icon, the second movement type icon, and the third movement type icon in sequence, according to the mark display order represented by the switching policy and the received switching operations, so as to represent the moving direction of the moving object among the at least one moving direction of the first dimension.
Illustratively, referring to fig. 9, fig. 9 is a schematic diagram of yet another exemplary display of the direction movement control provided by an embodiment of the present invention. As shown in fig. 9, the preset mark display icon corresponds to the second movement type icon, and the switching policy is a cyclic mark display sequence: the second movement type icon, the first movement type icon, the third movement type icon, and back to the second movement type icon. Initially, the mobile control apparatus displays, on the movement control area 9-1, a direction movement control 9-3 including the first movement type icon 9-21, the marked second movement type icon 9-32, and the third movement type icon 9-41, to identify that the moving object moves in the direction of the origin of the first dimension.
When a switching operation acting on the direction movement control 9-3 including the first movement type icon 9-21, the marked second movement type icon 9-32, and the third movement type icon 9-41 is received, the direction movement control 9-3 including the marked first movement type icon 9-22, second movement type icon 9-31, and third movement type icon 9-41 is displayed on the movement control area 9-1 to identify that the moving object moves in the positive direction of the first dimension.
When a switching operation is received that acts on the direction movement control 9-3 including the marked first movement type icon 9-22, the second movement type icon 9-31, and the third movement type icon 9-41, the direction movement control 9-3 including the first movement type icon 9-21, the second movement type icon 9-31, and the marked third movement type icon 9-42 is displayed on the movement control area 9-1 to identify that the moving object moves in the negative direction of the first dimension.
When a switching operation is received that acts on the direction movement control 9-3 including the first movement type icon 9-21, the second movement type icon 9-31, and the marked third movement type icon 9-42, the direction movement control 9-3 including the first movement type icon 9-21, the marked second movement type icon 9-32, and the third movement type icon 9-41 is displayed again on the movement control area 9-1, identifying that the moving object moves in the direction of the origin of the first dimension. Here, the at least one moving direction includes the positive direction of the first dimension, the origin direction of the first dimension, and the negative direction of the first dimension, and the mark display of the movement type icons proceeds in this cyclic switching manner.
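The cyclic switching policy of S10131-S10133 and Fig. 9 can be sketched as a small state machine. This is a hedged illustration, not the patent's implementation; the class and attribute names are invented, and the cycle order follows the Fig. 9 example (origin, then positive, then negative, then back to origin):

```python
# Hypothetical switching policy: the order in which the three movement
# type icons are marked (highlighted) on the direction movement control.
SWITCH_POLICY = ["origin", "positive", "negative"]  # cyclic mark order (Fig. 9)

class DirectionMoveControl:
    """Sketch of a direction movement control in switching mode."""

    def __init__(self):
        self.index = 0               # preset mark display: the "origin" icon

    @property
    def marked_icon(self):
        """The currently marked icon, i.e. the first-dimension movement type."""
        return SWITCH_POLICY[self.index]

    def on_switch(self):
        """Handle one switching operation: advance the mark to the next icon."""
        self.index = (self.index + 1) % len(SWITCH_POLICY)
        return self.marked_icon
```

Three consecutive switching operations walk the mark from origin through positive and negative back to origin, matching the Fig. 9 cycle.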
Further, based on the manner of displaying the direction movement control, the movement control device in S103 determines the target movement type triggered by the direction movement control in response to the touch operation, which can also be implemented by S1036, and this step is described below.
S1036, responding to the touch operation, obtaining a movement type icon marked and displayed on the direction movement control after the direction movement control is touched, and obtaining a target movement type.
It should be noted that, when the mobile control device receives the touch operation, it obtains the movement type icon that is marked and displayed on the direction movement control after the control is touched; the moving direction corresponding to that marked movement type icon is the target movement type.
Further, based on the manner of displaying the direction movement control, the movement control device in S103 responds to the touch operation to acquire the track information of the touch end focus, which can also be implemented through S1037-S1038, and this step is described below.
S1037, when the touch operation is a click operation, acquiring a track vector by which the touch end focus corresponding to the click operation is offset from a second preset position corresponding to the direction movement control, so as to obtain the track information.
In the embodiment of the invention, the touch operation comprises two types, namely the click operation and the sliding operation; when the touch operation is a click operation, the movement control device acquires the track vector from the second preset position corresponding to the direction movement control to the touch end focus, thereby obtaining the track information.
It should be noted that the second preset position corresponding to the direction movement control is used to represent a position of the direction movement control on the movement control area, for example, a center position of the direction movement control.
And S1038, when the touch operation is a sliding operation, acquiring a track from a touch start focus to a touch end focus of the sliding operation to obtain track information.
In the embodiment of the present invention, when the touch operation is a sliding operation, the track from the touch start focus to the touch end focus of the sliding operation on the movement control area is the track information; this is similar to the implementation of S1035 described above.
It is easy to know that, in practical applications, when the touch operation is a sliding operation, the touch end focus deviates from the preset position corresponding to the direction movement control, which simultaneously triggers the acquisition of the moving directions in the second dimension and the third dimension.
Further, referring to fig. 10, fig. 10 is another schematic flow chart of an alternative movement control method provided in the embodiment of the present invention, as shown in fig. 10, in the embodiment of the present invention, S104 may be implemented by S1041-S1044; that is, the movement control device determines the target movement direction in the three-dimensional space according to the movement direction and trajectory information corresponding to the target movement type, including S1041-S1044, which will be described with reference to the steps shown in fig. 10.
S1041, taking the moving direction corresponding to the target moving type as a first moving direction; the first moving direction refers to a moving direction of the moving object in the first dimension.
In the embodiment of the present invention, since the target movement type represents the movement direction of the mobile object in the at least one movement direction in the first dimension, when the movement control device determines the movement direction corresponding to the target movement type, the movement direction of the mobile object in the at least one movement direction in the first dimension is also obtained, which is referred to as the first movement direction herein.
S1042, obtaining distance information of the track information to obtain a target distance.
It should be noted that, when the touch operation is a click operation, the track information is a track vector, so the movement control device obtains the magnitude of the track vector as the distance information of the track information, thereby obtaining the target distance; it is easy to know that, in this case, the target distance is the distance by which the touch end focus deviates from the control. When the touch operation is a sliding operation, the track information is the sliding track corresponding to the sliding operation, so the movement control device obtains the length of the sliding track as the distance information of the track information, thereby obtaining the target distance.
S1043, when the target distance is not greater than the preset distance, taking the first moving direction as a target moving direction; the preset distance is used for determining the moving result of the moving object in the second dimension and the third dimension of the three-dimensional space.
In the embodiment of the present invention, the movement control device does not determine the movement results of the moving object in the second dimension and the third dimension of the three-dimensional space according to the received operation, but determines whether to move in the second dimension and the third dimension and the moved direction according to the comparison result of the target distance and the preset distance. Here, the movement control apparatus has two processing modes, one is that when the target distance is not greater than the preset distance, at this time, it indicates that the moving object is at the origin of the second dimension and the origin of the third dimension, and moves in the first dimension according to the direction corresponding to the target movement type, that is, the target movement direction of the moving object in the three-dimensional space is the first movement direction.
It should be noted that the preset distance is a distance preset by the movement control device; for example, it may be set according to the touch sensitivity requirement (the size of the preset distance is negatively correlated with the sensitivity requirement), and it is used to determine the movement result of the moving object in the second dimension and the third dimension of the three-dimensional space: whether to move in the second and third dimensions, and in which direction. The preset distance may be, for example, 0 px or 0.1 px; the embodiment of the present invention does not specifically limit its size.
S1044, when the target distance is greater than the preset distance, taking the direction of the track information as a second moving direction, and obtaining the target moving direction in the three-dimensional space by adopting the first moving direction and the second moving direction; the second moving direction refers to a moving direction of the moving object in the second dimension and the third dimension.
In this embodiment of the present invention, the other processing manner of the movement control device applies when the target distance is greater than the preset distance. In this case, the movement result of the moving object in the second dimension and the third dimension is that the moving object moves in those dimensions along the direction of the track information; that direction is the moving direction of the moving object in the second dimension and the third dimension, namely the second moving direction. Therefore, the movement control device combines the first moving direction and the second moving direction to obtain the target moving direction of the moving object in the three-dimensional space.
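The combination in S1041-S1044 can be condensed into one hedged sketch. Axes follow Fig. 13 (x right, y forward, z up as the first dimension); the screen-y-down convention, the threshold value, and all names are assumptions made for illustration only:

```python
import math

PRESET_DISTANCE = 0.0    # illustrative threshold, e.g. 0 px; tune for sensitivity

# First moving direction (z component) implied by the target movement type.
FIRST_DIM = {"positive": 1.0, "origin": 0.0, "negative": -1.0}

def target_direction(move_type, track_vec):
    """Combine the target movement type and track vector into a 3D direction."""
    z = FIRST_DIM[move_type]                  # S1041: first moving direction
    dx, dy = track_vec
    dist = math.hypot(dx, dy)                 # S1042: target distance
    if dist <= PRESET_DISTANCE:               # S1043: stay at 2nd/3rd-dim origin
        return (0.0, 0.0, z)
    # S1044: second moving direction = direction of the track vector;
    # dragging up on the screen (negative dy) is assumed to mean "forward".
    return (dx / dist, -dy / dist, z)
```

For instance, an upward movement type with no drag yields pure upward motion (0, 0, 1), while the same type with a 20 px drag up the screen yields upward plus forward motion (0, 1, 1) before any normalization.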
Further, in the embodiment of the present invention, S1045 is further included after S1042; that is, after the movement control apparatus acquires the distance information of the trajectory information and obtains the target distance, the movement control method further includes S1045, which will be described below.
S1045, when the target distance is larger than the preset distance, displaying a virtual rocker in the mobile control area, and displaying a touch control ending focus mark on the virtual rocker; the virtual rocker is used for controlling the moving object to move in a second dimension and a third dimension, and the touch ending focus mark is used for marking the touch ending focus.
In the embodiment of the invention, when the target distance is greater than the preset distance, indicating that the moving object moves in the second dimension and the third dimension, the movement control device displays the virtual joystick in the movement control area; the moving object is then controlled to move in the second dimension and the third dimension through touch operations received by the virtual joystick.
here, the mobile control device displays the virtual joystick at the position of the direction movement control, or displays the virtual joystick at the position of the target sub-movement control, and displays a touch end focus identifier for identifying a touch end focus on the virtual joystick.
Further, S1045 may be realized by S10451; that is, the movement control device displays the touch end focus identifier on the virtual joystick through S10451, which will be described below.
S10451, displaying a touch control ending focus mark carrying a first moving direction mark on a virtual rocker; the first moving direction mark is used for marking a first moving direction.
It should be noted that the movement control device also displays, on the touch end focus identifier displayed on the virtual joystick, a first moving direction identifier for identifying the first moving direction.
In the embodiments of the present invention, the directions mentioned are all directions when facing the screen.
In the following, an exemplary application of the embodiments of the present invention in a practical application scenario will be described.
For example, referring to fig. 11, fig. 11 is an application schematic diagram of an exemplary movement control method provided by the embodiment of the present invention. As shown in fig. 11, after entering the Peace Elite game (a control-type application), the game interface 11-1 (control interface) includes a movement control area 11-11 and a movement area 11-12. The movement control area 11-11 is displayed transparently, floating over the movement area 11-12, and is divided into 30-degree sectors from the lower left corner to obtain an area 11-111 (first sub movement control area), an area 11-112 (second sub movement control area), and an area 11-113 (third sub movement control area). An up button 11-21 (first sub movement control) is displayed on the area 11-111, a horizontal button 11-22 (second sub movement control) is displayed on the area 11-112, and a down button 11-23 (third sub movement control) is displayed on the area 11-113. In addition, the button 11-21 corresponds to a center point 11-211, the button 11-22 corresponds to a center point 11-221, and the button 11-23 corresponds to a center point 11-231. An airplane model 11-3 (moving object) is displayed on the movement area 11-12.
Based on fig. 11 and referring to fig. 12, when the touch operation is pressing the up button 11-21 and dragging it to a position 11-212 (touch end focus), the upward movement type (target movement type) of the moving object is triggered because the touch operation acts on the area 11-111, and the moving object is determined to move upward (first moving direction). At this time, the direction from the center point 11-211 to the position 11-212 gives the moving direction of the moving object on the front-back-left-right plane: the right front (second moving direction). The distance between the center point 11-211 and the position 11-212 (the distance information of the track information) is 20 px (target distance), which is greater than 0 px (preset distance); therefore, it is determined that the moving object moves upward and toward the right front (target moving direction) at a preset speed of 0.5 m/s. Meanwhile, the front-back-left-right virtual joystick 11-41 is displayed, the touch end focus 11-51 (touch end focus identifier) is displayed on the virtual joystick 11-41, and an up identifier 11-511 (first moving direction identifier) is displayed on the touch end focus 11-51.
That is, referring to fig. 13, when the positive direction of the z-axis represents an upward direction, the positive direction of the y-axis represents a forward direction, and the positive direction of the x-axis represents a rightward direction, the moving locus of the airplane model is 13-1 corresponding to the touch operation in fig. 12.
In summary, after the displayed up button is triggered, it is determined that the moving object moves upward; meanwhile, if the distance information of the touch track information, i.e., the dragging distance, is greater than the preset distance, the moving object also moves forward/backward/left/right while moving upward, and its forward/backward/left/right moving direction is the direction of the track information; if the distance information of the touch track information is not greater than the preset distance, the moving object only moves upward. As shown in fig. 14, after the finger presses the up button, if the drag distance is not greater than the preset distance, the moving object (airplane model) only moves upward in the three-dimensional space. If the drag distance is greater than the preset distance, the center point of the up button is taken as the reference, and the up, down, left and right of the screen are taken as the front, back, left and right of the moving object in the three-dimensional space, so that the front/back/left/right moving direction of the moving object in the three-dimensional space is determined by the direction in which the finger drags on the screen.
Here, if the finger drags upward on the screen, the moving object moves upward and forward in the three-dimensional space; if the finger drags downward, the moving object moves upward and backward; if the finger drags to the left, the moving object moves upward and to the left; if the finger drags to the right, the moving object moves upward and to the right; if the finger drags up and to the left, the moving object moves upward and toward the front left; if the finger drags up and to the right, the moving object moves upward and toward the front right; if the finger drags down and to the right, the moving object moves upward and toward the back right; and if the finger drags down and to the left, the moving object moves upward and toward the back left.
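The eight drag cases above amount to quantizing the drag vector from the button's center point into one of eight front/back/left/right plane directions, under the screen-to-space mapping described above (screen up maps to forward). A minimal sketch; the function name and the 45-degree sector boundaries are illustrative assumptions:

```python
import math

# Eight plane directions, counter-clockwise from "right", matching the
# mapping described above (screen up = forward, screen right = right).
PLANE_DIRECTIONS = ["right", "front-right", "front", "front-left",
                    "left", "back-left", "back", "back-right"]

def plane_direction(center, touch_end):
    """Quantize the drag vector (center point -> touch end focus, in
    screen coordinates with y growing downward) into one of the eight
    front/back/left/right plane directions."""
    dx = touch_end[0] - center[0]
    dy = center[1] - touch_end[1]           # flip y: screen up = forward
    angle = math.degrees(math.atan2(dy, dx)) % 360
    index = int((angle + 22.5) // 45) % 8   # 45-degree sectors
    return PLANE_DIRECTIONS[index]
```

For example, `plane_direction((100, 100), (120, 80))` classifies a drag toward the upper right as "front-right", matching the fig. 12 example.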
Based on fig. 11 and referring to fig. 15, when the touch operation presses the horizontal button 11-22 and drags it to the position 11-222 (touch end focus), the horizontal movement type (target movement type) of the moving object is activated because the touch operation acts on the area 11-112, and it is determined that the moving object moves horizontally (first movement direction), that is, it does not move up or down. At this time, the direction from the center point 11-221 to the position 11-222 is obtained, yielding the moving direction of the moving object on the front/rear/left/right plane: the right front (second moving direction). The distance between the center point 11-221 and the position 11-222 (track information) is obtained as 20px (target distance), which is greater than 0px (preset distance), so it is determined that the moving object moves toward the right front (target moving direction) at the preset speed of 0.5 m/s. At the same time, the front/back/left/right virtual joystick 11-42 is displayed, a touch end focus mark 11-52 is displayed on the virtual joystick 11-42, and a horizontal mark 11-521 (first moving direction mark) is displayed on the touch end focus mark 11-52.
In summary, after the displayed horizontal button is triggered, if the distance information of the touch track information, i.e., the dragging distance, is greater than the preset distance, the moving object moves forward/backward/left/right, and its moving direction is the direction of the track information; if the distance information of the touch track information is not greater than the preset distance, the moving object does not move. As shown in fig. 16, after the finger presses the horizontal button, if the drag distance is not greater than the preset distance, the moving object (airplane model) does not move. If the drag distance is greater than the preset distance, the center point of the horizontal button is taken as the reference, and the up, down, left and right of the screen are taken as the front, back, left and right of the moving object in the three-dimensional space, so that the front/back/left/right moving direction of the moving object in the three-dimensional space is determined by the direction in which the finger drags on the screen.
Here, if the finger drags upward on the screen, the moving object moves forward in the three-dimensional space; if the finger drags downward, the moving object moves backward; if the finger drags to the left, the moving object moves to the left; if the finger drags to the right, the moving object moves to the right; if the finger drags up and to the left, the moving object moves toward the front left; if the finger drags up and to the right, the moving object moves toward the front right; if the finger drags down and to the right, the moving object moves toward the back right; and if the finger drags down and to the left, the moving object moves toward the back left.
Based on fig. 11 and referring to fig. 17, when the touch operation presses the down button 11-23 and drags it to the position 11-232 (touch end focus), the downward movement type (target movement type) of the moving object is activated because the touch operation acts on the area 11-113, and it is determined that the moving object moves downward (first movement direction). At this time, the direction from the center point 11-231 to the position 11-232 is obtained, yielding the moving direction of the moving object on the front/rear/left/right plane: the right front (second moving direction). The distance between the center point 11-231 and the position 11-232 (track information) is obtained as 20px (target distance), which is greater than 0px (preset distance), so it is determined that the moving object moves downward and toward the right front (target moving direction) at a moving speed of 0.5 m/s. At the same time, the front/back/left/right virtual joystick 11-43 is displayed, a touch end focus mark 11-53 is displayed on the virtual joystick 11-43, and a down mark 11-531 (first moving direction mark) is displayed on the touch end focus mark 11-53.
In summary, after the displayed down button is triggered, it is determined that the moving object moves downward; meanwhile, if the distance information of the touch track information, i.e., the dragging distance, is greater than the preset distance, the moving object also moves forward/backward/left/right while moving downward, and its forward/backward/left/right moving direction is the direction of the track information; if the distance information of the touch track information is not greater than the preset distance, the moving object only moves downward. As shown in fig. 18, after the finger presses the down button, if the drag distance is not greater than the preset distance, the moving object (airplane model) only moves downward in the three-dimensional space. If the drag distance is greater than the preset distance, the center point of the down button is taken as the reference, and the up, down, left and right of the screen are taken as the front, back, left and right of the moving object in the three-dimensional space, so that the front/back/left/right moving direction of the moving object in the three-dimensional space is determined by the direction in which the finger drags on the screen.
Here, if the finger drags upward on the screen, the moving object moves downward and forward in the three-dimensional space; if the finger drags downward, the moving object moves downward and backward; if the finger drags to the left, the moving object moves downward and to the left; if the finger drags to the right, the moving object moves downward and to the right; if the finger drags up and to the left, the moving object moves downward and toward the front left; if the finger drags up and to the right, the moving object moves downward and toward the front right; if the finger drags down and to the right, the moving object moves downward and toward the back right; and if the finger drags down and to the left, the moving object moves downward and toward the back left.
It should be noted that the virtual joystick in the embodiment of the present invention is divided into two parts: a movement operation area and a central area, shown as movement operation area 19-1 and central area 19-2 in fig. 19. When the touch operation acts on the movement control area but not on the movement operation area of the virtual joystick, the touch end focus mark is displayed at the edge of the movement operation area in the corresponding direction. As shown in fig. 20, when the touched position 20-1 is outside the movement operation area 20-2 of the movement control area, the touch end focus mark 20-4 is displayed on the line connecting the central area 20-3 to the touched position 20-1, just inside the edge of the movement operation area 20-2.
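The edge behavior shown in fig. 20 can be sketched as clamping the touch end focus mark onto the boundary of the movement operation area along the line from the central area, assuming a circular operation area (the function name and the circular shape are illustrative assumptions):

```python
import math

def clamp_focus_marker(center, touch, radius):
    """Place the touch end focus mark: if the touch point lies outside a
    circular movement operation area of the given radius, clamp it to the
    point where the center-to-touch line crosses the area's edge."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist <= radius:
        return touch                       # inside the operation area
    scale = radius / dist                  # shrink the vector onto the edge
    return (center[0] + dx * scale, center[1] + dy * scale)
```

Keeping the mark on the center-to-touch line preserves the direction the user is indicating even when the finger leaves the operation area.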
Continuing with the exemplary structure of the mobile control device 455 provided by the embodiments of the present invention implemented as software modules, in some embodiments, as shown in fig. 3, the software modules stored in the mobile control device 455 of the memory 450 may include:
the display module 4551 is configured to display a direction movement control in a movement control area on a control interface after entering a control application, and display a moving object in the movement area on the control interface; wherein the direction movement control characterizes a control of at least one direction of movement in a first dimension; the direction movement control is used for realizing the movement of the moving object on the three-dimensional space of the moving area; the first dimension belongs to the three-dimensional space;
a touch module 4552 configured to receive a touch operation acting on the mobile control area;
a response module 4553, configured to determine, in response to the touch operation, a target movement type triggered by the direction movement control, and acquire track information of a touch end focus;
a direction determining module 4554, configured to determine a target moving direction in the three-dimensional space according to a moving direction corresponding to the target moving type and the trajectory information;
a movement control module 4555, configured to control the moving object to move in the target moving direction in the moving area.
Further, the direction movement control comprises a sub movement control of the at least one movement direction; the display module 4551 is further configured to divide the movement control area on the control interface into at least one sub movement control area; and respectively displaying the sub-movement controls in the at least one movement direction on the at least one sub-movement control area.
Further, the at least one sub-mobility control area comprises: a first sub-movement control area, a second sub-movement control area and a third sub-movement control area; the display module 4551 is further configured to display a first sub-movement control on the first sub-movement control area; the first sub-movement control is used for triggering the moving object to move in the positive direction of the first dimension; displaying a second sub-movement control on the second sub-movement control area; the second sub-movement control is used for triggering the moving object to move in the direction of the origin of the first dimension; displaying a third sub-movement control on the third sub-movement control area; the third sub-movement control is used for triggering the moving object to move in the negative direction of the first dimension; wherein the sub-movement controls of the at least one movement direction include the first sub-movement control, the second sub-movement control, and the third sub-movement control, the at least one movement direction including a positive direction of the first dimension, an origin direction of the first dimension, and a negative direction of the first dimension.
Further, the direction movement control comprises at least one movement type icon; the display module 4551 is further configured to mark and display, on the movement control area on the control interface, the direction movement control including the at least one movement type icon.
Further, the at least one movement type icon includes a first movement type icon, a second movement type icon and a third movement type icon; the display module 4551 is further configured to display, on the movement control area on the control interface, the direction movement control including a preset mark display icon, where the preset mark display icon is obtained by mark-displaying any one of the first movement type icon, the second movement type icon and the third movement type icon; to acquire a preset order of mark display among the first movement type icon, the second movement type icon and the third movement type icon to obtain a switching strategy; and, each time a switching operation is received on the direction movement control including the preset mark display icon, to switch the mark display among the first movement type icon, the second movement type icon and the third movement type icon according to the switching strategy, thereby completing the mark display of the direction movement control including the at least one movement type icon on the movement control area.
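The switching strategy described for the display module 4551 — cycling the mark-displayed icon through the three movement type icons in a preset order each time a switching operation is received — can be sketched as follows (the class and method names are illustrative assumptions):

```python
class MovementTypeSwitcher:
    """Cycle the marked (displayed) movement type icon through a preset
    order each time a switching operation is received on the control."""

    def __init__(self, order=("up", "horizontal", "down"), preset="up"):
        self.order = list(order)               # the switching strategy
        self.index = self.order.index(preset)  # preset mark display icon

    @property
    def displayed_icon(self):
        return self.order[self.index]

    def on_switch_operation(self):
        """Advance to the next icon in the preset order, wrapping around."""
        self.index = (self.index + 1) % len(self.order)
        return self.displayed_icon
```

A single control cycling through the three types keeps the movement control area down to one button, which is the space-saving point of this variant.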
Further, the response module 4553 is further configured to, in response to the touch operation, acquire touch position information of the touch end focus corresponding to the touch operation; determining a target sub-movement control area corresponding to the touch position information from the at least one sub-movement control area; and determining a control displayed on the target sub-movement control area from the sub-movement controls to obtain an acted target sub-movement control, so as to obtain the target movement type.
Further, the responding module 4553 is further configured to, in response to the touch operation, acquire a movement type icon marked and displayed on the direction movement control after the direction movement control is touched, and obtain the target movement type.
Further, the response module 4553 is further configured to: when the touch operation is a click operation, acquire a track vector by which the touch end focus corresponding to the click operation is offset from a first preset position of the target sub-movement control, to obtain the track information; and when the touch operation is a sliding operation, acquire the track from the touch start focus to the touch end focus of the sliding operation to obtain the track information.
Further, the response module 4553 is further configured to: when the touch operation is a click operation, acquire a track vector by which the touch end focus corresponding to the click operation is offset from a second preset position of the direction movement control, to obtain the track information; and when the touch operation is a sliding operation, acquire the track from the touch start focus to the touch end focus of the sliding operation to obtain the track information.
Further, the direction determining module 4554 is further configured to use a moving direction corresponding to the target moving type as a first moving direction; the first moving direction refers to a moving direction of the moving object in the first dimension; obtaining distance information of the track information to obtain a target distance; when the target distance is not greater than the preset distance, taking the first moving direction as the target moving direction; the preset distance is used for determining the moving result of the moving object in the second dimension and the third dimension of the three-dimensional space; when the target distance is greater than the preset distance, taking the direction of the track information as the second moving direction, and obtaining the target moving direction in the three-dimensional space by adopting the first moving direction and the second moving direction; the second movement direction refers to a movement direction of the moving object in the second dimension and the third dimension.
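The logic of the direction determining module 4554 can be sketched as follows: a first moving direction taken from the target movement type, and a second moving direction taken from the track information only when the target distance exceeds the preset distance. Representing directions as (x=right, y=forward, z=up) vectors, and the function name itself, are assumptions for illustration:

```python
import math

def determine_target_direction(movement_type, track_vector, preset_distance=0.0):
    """Combine the first moving direction (from the target movement type)
    with the second moving direction (from the track information).

    Returns a direction tuple (x, y, z) with x=right, y=forward, z=up.
    """
    # first moving direction: movement in the first dimension (up/down axis)
    first = {"up": 1.0, "horizontal": 0.0, "down": -1.0}[movement_type]
    dx, dy = track_vector                 # drag offset, screen up already mapped to forward
    target_distance = math.hypot(dx, dy)  # distance information of the track information
    if target_distance <= preset_distance:
        # not greater than the preset distance: no second/third-dimension movement
        return (0.0, 0.0, first)
    # second moving direction: the direction of the track information
    return (dx / target_distance, dy / target_distance, first)
```

With the down button pressed and a drag of (3, 4), this yields a down-and-front-right direction; with the horizontal button and no drag, the object does not move, matching the fig. 16 case.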
Further, the movement control device 455 further includes a joystick display module 4556, configured to display a virtual joystick in the movement control area and display a touch end focus mark on the virtual joystick when the target distance is greater than the preset distance; the virtual joystick is used for controlling the moving object to move in the second dimension and the third dimension, and the touch end focus mark is used for marking the touch end focus.
Further, the joystick display module 4556 is further configured to display, on the virtual joystick, the touch end focus mark carrying a first moving direction mark; wherein the first moving direction mark is used for identifying the first moving direction.
Embodiments of the present invention provide a computer-readable storage medium having stored thereon executable instructions, which when executed by a processor, will cause the processor to perform a method provided by embodiments of the present invention, for example, a mobility control method as shown in fig. 4.
In some embodiments, the storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a HyperText Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In summary, according to the embodiments of the present invention, when the moving object is controlled to move in the three-dimensional space, the moving direction of the moving object in the three-dimensional space can be determined according to the touch operation acting on one moving control area and the action relationship between the touch operation and the direction moving control displayed on the moving control area, that is, in the process of controlling the moving object to move, only the touch operation on one moving control area needs to be detected and processed, so that the processing flow of controlling the moving object to move is simplified.
The above description is only an example of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present invention are included in the protection scope of the present invention.

Claims (15)

1. A mobility control method, comprising:
after entering a control application, displaying a direction moving control in a moving control area on a control interface, and displaying a moving object in the moving area on the control interface;
wherein the direction movement control characterizes a control of at least one direction of movement in a first dimension; the direction movement control is used for realizing the movement of the moving object on the three-dimensional space of the moving area; the first dimension belongs to the three-dimensional space;
receiving touch operation acting on the mobile control area;
responding to the touch operation, determining a target movement type triggered by the direction movement control, and acquiring track information of a touch end focus;
determining a target moving direction on the three-dimensional space according to the moving direction corresponding to the target moving type and the track information;
controlling the moving object to move in the target moving direction in the moving area;
wherein, the determining the target moving direction on the three-dimensional space according to the moving direction corresponding to the target moving type and the track information includes: and determining the target moving direction in the three-dimensional space according to the moving direction of the moving object corresponding to the target moving type in at least one moving direction in the first dimension and the moving results of the moving object in the second dimension and the third dimension of the three-dimensional space determined by the track information.
2. The method of claim 1, wherein the direction-movement control comprises a sub-movement control of the at least one direction of movement;
the moving control area on the control interface displays a direction moving control, and the method comprises the following steps:
dividing the movement control area on the control interface into at least one sub-movement control area;
and respectively displaying the sub-movement controls in the at least one movement direction on the at least one sub-movement control area.
3. The method of claim 2, wherein the at least one sub-mobility control area comprises: a first sub-movement control area, a second sub-movement control area and a third sub-movement control area; the displaying the sub-movement controls of the at least one movement direction on the at least one sub-movement control area respectively includes:
displaying a first sub-movement control on the first sub-movement control area; the first sub-movement control is used for triggering the moving object to move in the positive direction of the first dimension;
displaying a second sub-movement control on the second sub-movement control area; the second sub-movement control is used for triggering the moving object to move in the direction of the origin of the first dimension;
displaying a third sub-movement control on the third sub-movement control area; the third sub-movement control is used for triggering the moving object to move in the negative direction of the first dimension;
wherein the sub-movement controls of the at least one movement direction include the first sub-movement control, the second sub-movement control, and the third sub-movement control, the at least one movement direction including a positive direction of the first dimension, an origin direction of the first dimension, and a negative direction of the first dimension.
4. The method of claim 1, wherein the directional movement control comprises at least one movement type icon;
the moving control area on the control interface displays a direction moving control, and the method comprises the following steps:
marking and displaying, on the movement control area on the control interface, the direction movement control including the at least one movement type icon.
5. The method of claim 4, wherein the at least one movement type icon comprises a first movement type icon, a second movement type icon, and a third movement type icon;
said marking and displaying, on said movement control area on said control interface, said direction movement control including said at least one movement type icon comprises:
displaying the direction movement control comprising a preset mark display icon on the movement control area on the control interface; the preset mark display icon is obtained by displaying any one of the first movement type icon, the second movement type icon and the third movement type icon by marking;
acquiring a preset sequence of mark display among the first mobile type icon, the second mobile type icon and the third mobile type icon to obtain a switching strategy;
and each time a switching operation is received on the direction movement control including the preset mark display icon, switching the mark display among the first movement type icon, the second movement type icon and the third movement type icon in the direction movement control according to the switching strategy, so as to complete the mark display of the direction movement control including the at least one movement type icon on the movement control area.
6. The method according to claim 2 or 3, wherein the determining, in response to the touch operation, a target movement type in which the direction movement control is triggered comprises:
responding to the touch operation, and acquiring touch position information of the touch ending focus corresponding to the touch operation;
determining a target sub-movement control area corresponding to the touch position information from the at least one sub-movement control area;
and determining a control displayed on the target sub-movement control area from the sub-movement controls to obtain an acted target sub-movement control, so as to obtain the target movement type.
7. The method according to claim 4 or 5, wherein the determining, in response to the touch operation, a target movement type of the directional movement control triggered comprises:
and responding to the touch operation, and acquiring a movement type icon marked and displayed on the direction movement control after the direction movement control is touched to obtain the target movement type.
8. The method according to claim 2 or 3, wherein the acquiring, in response to the touch operation, track information of a touch end focus comprises:
when the touch operation is a click operation, acquiring a track vector by which the touch end focus corresponding to the click operation is offset from a first preset position of the target sub-movement control, to obtain the track information;
and when the touch operation is a sliding operation, acquiring a track from a touch start focus to a touch end focus of the sliding operation to obtain track information.
9. The method according to claim 4 or 5, wherein the acquiring, in response to the touch operation, track information of a touch end focus comprises:
when the touch operation is a click operation, acquiring a track vector by which the touch end focus corresponding to the click operation is offset from a second preset position of the direction movement control, to obtain the track information;
and when the touch operation is a sliding operation, acquiring a track from a touch start focus to a touch end focus of the sliding operation to obtain track information.
10. The method according to any one of claims 1 to 5, wherein the determining the target movement direction on the three-dimensional space according to the movement direction corresponding to the target movement type and the trajectory information includes:
taking the moving direction corresponding to the target moving type as a first moving direction; the first moving direction refers to a moving direction of the moving object in the first dimension;
obtaining distance information of the track information to obtain a target distance;
when the target distance is not greater than a preset distance, taking the first moving direction as the target moving direction; the preset distance is used for determining the moving result of the moving object in the second dimension and the third dimension of the three-dimensional space;
when the target distance is greater than the preset distance, taking the direction of the track information as a second moving direction, and obtaining the target moving direction on the three-dimensional space by adopting the first moving direction and the second moving direction; the second movement direction refers to a movement direction of the moving object in the second dimension and the third dimension.
11. The method of claim 10, wherein after obtaining the distance information of the track information and obtaining the target distance, the method further comprises:
when the target distance is larger than the preset distance, displaying a virtual rocker in the mobile control area, and displaying a touch control ending focus mark on the virtual rocker;
the virtual rocker is used for controlling the moving object to move in the second dimension and the third dimension, and the touch ending focus mark is used for marking the touch ending focus.
12. The method of claim 11, wherein the displaying the touch end focus mark on the virtual joystick comprises:
displaying the touch control ending focus mark carrying a first moving direction mark on the virtual rocker; wherein the first moving direction identifier is used for identifying the first moving direction.
13. A movement control apparatus, comprising:
a display module, configured to display, after entering a control application, a direction movement control in a movement control area of a control interface, and to display a moving object in a movement area of the control interface; wherein the direction movement control represents a control of at least one movement direction in a first dimension, the direction movement control is used for moving the moving object in a three-dimensional space of the movement area, and the first dimension belongs to the three-dimensional space;
a touch module, configured to receive a touch operation acting on the movement control area;
a response module, configured to determine, in response to the touch operation, a target movement type triggered by the direction movement control, and to acquire trajectory information of a touch-end focus;
a direction determining module, configured to determine a target movement direction in the three-dimensional space according to a movement direction corresponding to the target movement type and the trajectory information; and
a movement control module, configured to control the moving object to move in the target movement direction in the movement area;
wherein the direction determining module is further configured to determine the target movement direction in the three-dimensional space according to the movement direction of the moving object, corresponding to the target movement type, in at least one movement direction of the first dimension, and movement results of the moving object in a second dimension and a third dimension of the three-dimensional space determined from the trajectory information.
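One way the modules of claim 13 could cooperate is as a small pipeline: the response module classifies the touch, the direction-determining module composes the 3D direction from the movement type and the trajectory, and the movement control module applies it to the moving object. This is a hypothetical sketch; the class and method names, the axis mapping, and the step size are all assumptions:

```python
import math

class MovementControlDevice:
    def __init__(self, preset_distance=20.0):
        self.preset_distance = preset_distance
        self.position = [0.0, 0.0, 0.0]  # moving object in the movement area

    def respond(self, control, trajectory):
        """Response module: target movement type + touch-end trajectory."""
        move_type = +1 if control == "up" else -1  # sign in the first dimension
        return move_type, trajectory

    def determine_direction(self, move_type, trajectory):
        """Direction-determining module: combine first-dimension movement
        with the trajectory's contribution in the other two dimensions."""
        (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
        dx, dy = x1 - x0, y1 - y0
        dist = math.hypot(dx, dy)
        if dist <= self.preset_distance:
            return (0.0, float(move_type), 0.0)
        v = (dx / dist, float(move_type), dy / dist)
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def move(self, direction, step=1.0):
        """Movement control module: advance the object along the direction."""
        self.position = [p + step * d for p, d in zip(self.position, direction)]
        return self.position
```

Each method corresponds to one module of the claimed apparatus, which keeps the touch handling, direction determination, and movement application separable.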
14. A mobile control device, comprising:
a memory for storing executable instructions;
a processor, configured to implement the method of any one of claims 1 to 12 when executing the executable instructions stored in the memory.
15. A computer-readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to implement the method of any one of claims 1 to 12.
CN201911409408.3A 2019-12-31 2019-12-31 Mobile control method, device and equipment and computer readable storage medium Active CN111142689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911409408.3A CN111142689B (en) 2019-12-31 2019-12-31 Mobile control method, device and equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911409408.3A CN111142689B (en) 2019-12-31 2019-12-31 Mobile control method, device and equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111142689A CN111142689A (en) 2020-05-12
CN111142689B true CN111142689B (en) 2021-02-05

Family

ID=70522580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911409408.3A Active CN111142689B (en) 2019-12-31 2019-12-31 Mobile control method, device and equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111142689B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112308988B (en) * 2020-11-20 2022-01-07 深圳羽迹科技有限公司 Three-dimensional posture editing method, device, terminal and storage medium
CN113747219A (en) * 2021-08-23 2021-12-03 贵州广电新媒体产业发展有限公司 Method for controlling movement of focus of network television

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150102363A (en) * 2014-02-28 2015-09-07 성신여자대학교 산학협력단 Apparatus for controlling user interface based on multi-touches, and Method thereof
CN108786109A (en) * 2018-04-04 2018-11-13 网易(杭州)网络有限公司 A kind of direction-controlling method and device of game role
CN108771858B (en) * 2018-05-11 2021-12-14 网易(杭州)网络有限公司 Skill control switching method, device, terminal and storage medium
CN109032493A (en) * 2018-08-03 2018-12-18 网易(杭州)网络有限公司 Information processing method, device and electronic equipment
CN109224438B (en) * 2018-10-26 2022-06-24 网易(杭州)网络有限公司 Method and device for controlling virtual character in game
CN109364476B (en) * 2018-11-26 2022-03-08 网易(杭州)网络有限公司 Game control method and device
CN109589604A (en) * 2019-01-24 2019-04-09 网易(杭州)网络有限公司 Control method, control device, storage medium and the processor of virtual objects
CN110270086B (en) * 2019-07-17 2023-03-24 网易(杭州)网络有限公司 Method and device for controlling movement of virtual character in game

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on virtual simulation of a series mechanism of a crank-rocker and a sine mechanism; Zhang Peng et al.; Journal of Jilin Institute of Chemical Technology (《吉林化工学院学报》); 2018-05-15; Vol. 35, No. 5; full text *

Also Published As

Publication number Publication date
CN111142689A (en) 2020-05-12

Similar Documents

Publication Publication Date Title
US11947792B2 (en) Devices, methods, and graphical user interfaces for providing multitouch inputs and hardware-based features using a single touch input
US9595238B2 (en) Electronic device, cover for electronic device, and method of performing a function in an electronic device
US9626104B2 (en) Thumb access area for one-handed touchscreen use
CN107019909B (en) Information processing method, information processing device, electronic equipment and computer readable storage medium
US8812994B2 (en) Device, method, and graphical user interface for configuring restricted interaction with a user interface
CN103530047B (en) Touch screen equipment event triggering method and device
CN103345312B (en) Using intelligent terminal simultaneously as the system and method for main frame, mouse and touch pad
CN102830926B (en) Mobile terminal and operational approach thereof
CN112162665B (en) Operation method and device
JP2019516189A (en) Touch screen track recognition method and apparatus
US20170336883A1 (en) Using a hardware mouse to operate a local application running on a mobile device
CN103229141A (en) Managing workspaces in a user interface
US20150277748A1 (en) Edit providing method according to multi-touch-based text block setting
KR20140074141A (en) Method for display application excution window on a terminal and therminal
EP3485358B1 (en) Electronic device and method thereof for managing applications
TW201044277A (en) Apparatus and method for handling tasks within a computing device
CN111142689B (en) Mobile control method, device and equipment and computer readable storage medium
CN105074644A (en) Information processing terminal, screen control method, and screen control program
KR20130112629A (en) Menu contolling method of media equipment, apparatus thereof, and medium storing program source thereof
WO2016183912A1 (en) Menu layout arrangement method and apparatus
EP2965181B1 (en) Enhanced canvas environments
US10895979B1 (en) Methods and user interfaces for positioning a selection, selecting, and editing, on a computing device running under a touch-based operating system, using gestures on a touchpad device
CN102693064B (en) Method and system for quitting protection screen by terminal
JP2006271841A (en) Game program
CN112221123B (en) Virtual object switching method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant