WO2024021635A1 - Movement control method and device, storage medium, and electronic device - Google Patents

Movement control method and device, storage medium, and electronic device

Info

Publication number
WO2024021635A1
WO2024021635A1 PCT/CN2023/082259
Authority
WO
WIPO (PCT)
Prior art keywords
joystick
control area
control
domain
point
Prior art date
Application number
PCT/CN2023/082259
Other languages
English (en)
French (fr)
Inventor
王莉莎
黄博宇
迟杰萌
Original Assignee
NetEase (Hangzhou) Network Co., Ltd. (网易(杭州)网络有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NetEase (Hangzhou) Network Co., Ltd. (网易(杭州)网络有限公司)
Publication of WO2024021635A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/533 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 Details of the user interface
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the present disclosure relates to the field of virtual interaction technology, and specifically, to a movement control method, a movement control device, a computer-readable storage medium, and an electronic device.
  • the purpose of the embodiments of the present disclosure is to provide a movement control method, a movement control device, an electronic device, and a computer-readable storage medium, thereby solving, at least to a certain extent, the problems of inconvenient and poorly coherent movement control operations, and improving the convenience and coherence of movement control.
  • a movement control method, which provides a graphical user interface through a terminal device.
  • the graphical user interface includes a preset first control area and a second control area located on the periphery of the first control area.
  • the method includes: displaying a joystick domain through the first control area; in response to a first sliding operation acting in the first control area, controlling the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface; and in response to a second sliding operation acting in the second control area, controlling the operating point of the joystick domain to move within the first control area according to the position change of the touch point of the second sliding operation in the second control area.
  • a movement control vector is generated according to the relative position of the joystick domain and the operating point, and movement control is performed based on the movement control vector.
  • a movement control device, which provides a graphical user interface through a terminal device.
  • the graphical user interface includes a preset first control area and a second control area located on the periphery of the first control area.
  • the device includes:
  • a joystick display module configured to display a joystick domain through the first control area;
  • a first joystick following module configured to, in response to a first sliding operation acting in the first control area, control the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface;
  • a second joystick following module configured to, in response to a second sliding operation acting in the second control area, control the operating point of the joystick domain to move within the first control area according to the position change of the touch point of the second sliding operation in the second control area;
  • a movement control module configured to generate a movement control vector according to the relative position of the joystick domain and the operating point, and perform movement control based on the movement control vector.
  • a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, any one of the methods described above is implemented.
  • an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform any of the above methods by executing the executable instructions.
  • in the embodiments of the present disclosure, the joystick domain is displayed through the first control area; in response to the first sliding operation acting in the first control area, the operating point of the joystick domain is controlled to move following the movement of the touch point on the graphical user interface; in response to the second sliding operation in the second control area, the operating point of the joystick domain is controlled to move within the first control area according to the position change of the touch point of the second sliding operation in the second control area.
  • a movement control vector is generated based on the relative position of the joystick domain and the operating point, and movement control is performed based on the movement control vector.
  • Figure 1 shows a schematic diagram of an exemplary system architecture in which a mobile control method and device according to embodiments of the present disclosure can be applied;
  • Figure 2 schematically illustrates a flow chart of a method of movement control according to an embodiment of the present disclosure
  • Figure 3 schematically shows an interface diagram of the first control area according to one embodiment of the present disclosure
  • Figure 4 schematically shows an interface diagram of a fixedly displayed joystick domain according to one embodiment of the present disclosure;
  • Figure 5A schematically illustrates an interface diagram of displaying a joystick domain in response to a first touch operation acting on the first sub-area according to an embodiment of the present disclosure;
  • Figure 5B schematically illustrates an interface diagram of displaying a joystick domain in response to a second touch operation acting on the second sub-area according to an embodiment of the present disclosure;
  • Figure 6 schematically illustrates an interface diagram of controlling the operating point to follow a first sliding operation acting in the first control area according to an embodiment of the present disclosure;
  • Figure 7 schematically illustrates an interface diagram for sliding from a second control area to a first control area according to an embodiment of the present disclosure
  • Figure 8 schematically shows an interface diagram for controlling in the second control area according to one embodiment of the present disclosure
  • Figure 9 schematically shows a structural block diagram of a mobile control device according to an embodiment of the present disclosure.
  • FIG. 10 schematically shows a structural diagram of a computer system suitable for implementing an electronic device according to an embodiment of the present disclosure.
  • Example embodiments will now be described more fully with reference to the accompanying drawings.
  • Example embodiments may, however, be embodied in various forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concepts of the example embodiments.
  • the described features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
  • numerous specific details are provided to provide a thorough understanding of embodiments of the disclosure.
  • those skilled in the art will appreciate that the technical solutions of the present disclosure may be practiced without one or more of the specific details described, or other methods, components, devices, steps, etc. may be adopted.
  • well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the disclosure.
  • FIG. 1 shows a schematic diagram of an exemplary application environment in which a mobile control method and device according to embodiments of the present disclosure can be applied.
  • the system architecture 100 may include one or more of terminal devices 101, 102, and 103.
  • the terminal devices 101, 102, and 103 may be various electronic devices with display screens, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and so on.
  • the terminal device can install and run virtual display programs, three-dimensional map programs, virtual game programs, etc.
  • the movement control method in one embodiment of the present disclosure can run on a local terminal device or on a server.
  • the method can be implemented and executed based on a cloud interaction system, where the cloud interaction system includes a server and a client device.
  • cloud applications such as cloud games
  • cloud gaming refers to a gaming method based on cloud computing.
  • in the running mode of cloud games, the running body of the game program and the rendering body of the game screen are separated: the storage and execution of the movement control method are completed on the cloud game server.
  • the client device is used to send and receive data and to present the game screen.
  • the client device can be a display device with data transmission functions close to the user side, such as a mobile terminal, a TV, a computer, or a handheld computer; however, the information processing is performed by the cloud game server in the cloud.
  • the player operates the client device to send operating instructions to the cloud game server.
  • the cloud game server runs the game according to the operating instructions, encodes and compresses the game screen and other data, and returns it to the client device through the network.
  • the data returned by the cloud game server is decoded by the client device, which then outputs the game screen.
  • the local terminal device stores the game program and is used to present the game screen.
  • the local terminal device is used to interact with the player through the graphical user interface, that is, conventionally, the game program is downloaded, installed and run through the terminal device.
  • the local terminal device may provide the graphical user interface to the player in a variety of ways. For example, it may be rendered and displayed on the display screen of the terminal, or provided to the player through holographic projection.
  • the local terminal device may include a display screen and a processor.
  • the display screen is used to present a graphical user interface.
  • the graphical user interface includes a game screen.
  • the processor is used to run the game, generate the graphical user interface, and control the graphical user interface to be displayed on the display screen.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code that contains one or more logic functions that implement the specified executable instructions.
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown one after another may actually execute substantially in parallel, or they may sometimes execute in the reverse order, depending on the functionality involved.
  • each block in the block diagrams or flowchart illustrations, and combinations of blocks in the block diagrams or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or operations, or by a combination of special purpose hardware and computer instructions.
  • the units involved in the embodiments of the present disclosure can be implemented in software or hardware, and the described units can also be provided in a processor; the names of these units do not, under certain circumstances, constitute a limitation on the units themselves.
  • the present disclosure can be about controlling virtual objects or virtual characters to move in a virtual scene.
  • it can be virtual objects in games including virtual characters, virtual objects in three-dimensional map programs, and so on.
  • the virtual scene involved in the exemplary embodiments of the present disclosure may be a digital scene outlined by digital technology on smart terminal devices such as computers, mobile phones, and tablets.
  • the virtual scene can include buildings or structures such as houses, buildings, gardens, bridges, and pools, as well as natural landscapes such as mountains, rivers, and lakes, as well as any virtual items or virtual props such as weapons, tools, and creatures.
  • the virtual scene may be a simulation scene of the real world, a purely fictitious virtual scene, or a partially simulated and partially fictitious virtual scene, which is not specifically limited in this exemplary embodiment.
  • the movement control method provides a graphical user interface through a terminal device.
  • the graphical user interface includes a preset first control area and a second control area located outside the first control area.
  • the methods include:
  • Step S210: display the joystick domain through the first control area;
  • Step S220: in response to the first sliding operation in the first control area, control the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface;
  • Step S230: in response to the second sliding operation in the second control area, control the operating point of the joystick domain to move within the first control area according to the position change of the touch point of the second sliding operation in the second control area;
  • Step S240: generate a movement control vector according to the relative position of the joystick domain and the operating point, and perform movement control based on the movement control vector.
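The four steps above can be sketched as a minimal, hypothetical controller; the class and method names, the tuple geometry, and the normalized output are illustrative assumptions, not part of the patent:

```python
import math

class JoystickController:
    """Minimal sketch of steps S210-S240: a joystick domain whose
    operating point follows touch input, producing a movement vector."""

    def __init__(self, center, radius):
        # Step S210: the joystick domain (chassis) displayed in the
        # first control area, with its operating point at the center.
        self.center = center
        self.radius = radius
        self.operating_point = center

    def on_slide(self, touch):
        """Steps S220/S230: move the operating point toward the touch
        point, clamped to the chassis edge when the touch is outside."""
        dx = touch[0] - self.center[0]
        dy = touch[1] - self.center[1]
        dist = math.hypot(dx, dy)
        if dist <= self.radius:
            self.operating_point = touch
        else:
            s = self.radius / dist
            self.operating_point = (self.center[0] + dx * s,
                                    self.center[1] + dy * s)

    def movement_vector(self):
        """Step S240: vector from the chassis center to the operating
        point, normalized by the radius so its magnitude is in [0, 1]."""
        return ((self.operating_point[0] - self.center[0]) / self.radius,
                (self.operating_point[1] - self.center[1]) / self.radius)
```

A movement vector of `(1.0, 0.0)` would then drive the virtual character at full speed to the right, while shorter vectors scale the speed down.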
  • the joystick domain is displayed through the first control area; in response to the first sliding operation acting in the first control area, the operating point of the joystick domain is controlled to move following the movement of the touch point on the graphical user interface; in response to the second sliding operation in the second control area, the operating point of the joystick domain is controlled to move within the first control area according to the position change of the touch point of the second sliding operation in the second control area; a movement control vector is generated based on the relative positions of the joystick domain and the operating point for movement control.
  • on the one hand, the embodiments of the present disclosure prevent the user from being able to operate only in a fixed or semi-fixed joystick domain, thereby increasing the operable range and improving operational convenience; on the other hand, the joystick domain does not simply always follow the user's operation, which avoids having to interrupt the operation when the joystick domain moves to an inconvenient place or when the movement direction needs to be changed, improving the coherence of movement control interaction.
  • the first control area is determined in the graphical user interface provided by the terminal device.
  • the first control area is located at the most convenient position for the user to perform movement control operations, which may be at the lower left of the graphical user interface, or at other positions in different movement control scenarios.
  • the first control area may be a circle; the first sub-area is a smaller circle inside the circle of the first control area, the second sub-area is a ring surrounding the first sub-area, and the combination of the first sub-area and the second sub-area forms the first control area.
  • the radius of the first sub-area may be half of the radius of the first control area, so that the width of the ring of the second sub-area equals the radius of the circle of the first sub-area.
  • the shapes of the first control area, the first sub-area, and the second sub-area can be freely configured, and the relationship between the above-mentioned radius and width can also be changed; this is not limited by the embodiments of the present disclosure.
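As an illustration of this geometry, a touch position can be classified against the circular first sub-area, the annular second sub-area, and the surrounding second control area. The concrete radii below follow the half-radius example above and are assumptions, not values mandated by the patent:

```python
import math

# Example geometry from the text: the first sub-area radius is half the
# first-control-area radius, so the ring width equals the inner radius.
FIRST_CONTROL_RADIUS = 2.0
FIRST_SUB_RADIUS = FIRST_CONTROL_RADIUS / 2


def classify_touch(touch, center=(0.0, 0.0)):
    """Return which area a touch point falls in, relative to the
    center of the (circular) first control area."""
    d = math.hypot(touch[0] - center[0], touch[1] - center[1])
    if d <= FIRST_SUB_RADIUS:
        return "first sub-area"
    if d <= FIRST_CONTROL_RADIUS:
        return "second sub-area"
    return "second control area"
```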
  • the area ranges of the first control area and the second control area may be visible or invisible.
  • the first control area 301 is located in the lower left corner of the horizontally arranged graphical user interface. Outside the first control area 301 is a second control area.
  • the specific area of the second control area is not limited.
  • the second control area may be an annular area surrounding the first control area 301 .
  • the right side of the graphical user interface can contain controls for controlling the virtual character to release skills or perform actions.
  • the first control area 301 includes a first sub-area corresponding to a smaller circle and a second sub-area corresponding to an annular shape surrounding the smaller circle.
  • in step S210, a joystick domain is displayed through the first control area.
  • a joystick domain is displayed in the first control area, and the joystick domain may include a chassis and an operating point. Before the operation starts, the position of the operating point may default to the center of the joystick domain, or the operating point may not be displayed at all. When the operating point is configured to always be located within the joystick domain, the area of the joystick domain on the graphical user interface can be the same as the area of its chassis.
  • in step S220, in response to the first sliding operation in the first control area, the operating point of the joystick domain is controlled to move following the movement of the touch point of the first sliding operation on the graphical user interface.
  • the position of the touch point of the first sliding operation on the graphical user interface is obtained, and the operating point of the joystick domain is controlled to move following the movement of the touch point.
  • the touch position of the first sliding operation can be determined by monitoring sliding events and touch events, or by monitoring screen pressure changes, and then obtaining the position coordinates of the touch point.
  • in step S230, in response to the second sliding operation in the second control area, the operating point of the joystick domain is controlled to move within the first control area according to the position change of the touch point of the second sliding operation in the second control area.
  • the second sliding operation may be a continuous operation with the first sliding operation, that is, the user slides out of the first control area and slides to the second control area.
  • in this case, the first touch point of the second sliding operation in the second control area is a point close to the first control area; the second sliding operation can also be an operation independent of the first sliding operation, in which case the first touch point can be any point within the second control area.
  • the chassis of the joystick domain can be fixedly displayed in the first control area.
  • the center of the joystick domain can be determined on the line connecting the center of the first control area and the first touch point of the second sliding operation in the second control area, with the chassis of the joystick domain inscribed within the first control area; the inscribed position determines the center of the joystick domain.
  • the inscribed position is where the line connecting the first control area and the touch point intersects the boundary of the first control area.
  • the chassis of the joystick domain is circular; the center of the joystick domain is located at point A, point B can be a point directly to the right outside the joystick domain, and point C can be a point directly below the outside of the joystick domain.
  • a connection line is determined based on the position of the touch point and the center of the joystick domain, and the intersection point of the connection line and the edge of the joystick domain is used as the operating point of the joystick domain.
  • the operating point is visualized as a smaller circle on the joystick domain chassis.
  • the intersection point of the connection line and the edge of the joystick domain can also be directly used as the center of the circle corresponding to the operating point; in this case, the circle is centered on the edge, rather than inscribed within it.
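The intersection described here can be computed directly. This sketch assumes a circular chassis and takes the intersection on the side of the touch point; the function name and argument layout are illustrative:

```python
import math

def operating_point_on_edge(touch, joystick_center, joystick_radius):
    """Intersect the ray from the joystick center toward the touch
    point with the circular edge of the joystick domain; the
    intersection is used as the operating point."""
    dx = touch[0] - joystick_center[0]
    dy = touch[1] - joystick_center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:               # touch exactly on the center: no direction
        return joystick_center
    s = joystick_radius / dist  # scale the offset onto the edge
    return (joystick_center[0] + dx * s, joystick_center[1] + dy * s)
```

For a touch far to the right of the chassis, the operating point lands on the rightmost point of the edge, so the resulting movement vector points fully to the right.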
  • in step S240, a movement control vector is generated according to the relative position of the joystick domain and the operating point, and movement control is performed based on the movement control vector.
  • the movement control vector is generated according to the relative position of the joystick domain and the operating point. For example, the location of the operating point is determined based on the touch point: if the operating point is on the right side of the joystick domain, the movement control vector generated from it can control the virtual character to walk to the right; if the operating point is on the upper-left part of the joystick domain, the movement control vector generated from it can control the virtual character to walk forward and to the left.
  • the joystick domain is a circle with a radius of 3 units.
  • the first operating point is located 1 unit directly to the right of the center of the circle, and the second operating point is located 2 units directly to the right of the center of the circle.
  • the first operating point may correspond to a moving speed of 10
  • the second operating point may correspond to a moving speed of 20
  • the first operating point may correspond to walking to the right
  • the second operating point may correspond to running to the right.
  • the disclosed embodiments are not limited here.
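The speed example above can be sketched as mappings from the operating point's offset to a speed or gait. The linear speed scaling, the walk/run threshold, and the numeric values are the illustrative numbers from the text combined with assumptions, not a mapping fixed by the patent:

```python
JOYSTICK_RADIUS = 3.0  # example value from the text


def speed_for_offset(offset_units):
    """Map the operating point's distance from the center to a movement
    speed: 1 unit -> 10, 2 units -> 20, i.e. speed scales linearly with
    the offset in this sketch."""
    return 10.0 * offset_units


def gait_for_offset(offset_units):
    """Alternative mapping from the text: a near offset corresponds to
    walking, a far offset to running (threshold assumed at half the
    joystick radius)."""
    return "walk" if offset_units <= JOYSTICK_RADIUS / 2 else "run"
```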
  • the present disclosure also provides an implementation of a movement control method.
  • the first control area includes a first sub-area and a second sub-area located on the periphery of the first sub-area; the step of displaying a joystick domain through the first control area includes:
  • in response to a first touch operation acting on the first sub-area, the touch position of the first touch operation is used as the center of the joystick domain and the joystick domain is displayed;
  • in response to a second touch operation acting on the second sub-area, the center of the joystick domain is determined based on the touch position of the second touch operation and the center of the first sub-area, and the joystick domain is displayed.
  • the first control area includes a first sub-area and a second sub-area located outside the first sub-area.
  • the first sub-area may be circular, and the second sub-area may be a ring surrounding the first sub-area.
  • when displaying the joystick domain in response to a touch operation acting on the first control area, the touch point may be in the first sub-area or in the second sub-area.
  • the touch position of the first touch operation is used as the center of the joystick domain and, combined with the preset value of the radius of the joystick domain, the joystick domain is displayed.
  • the circle in the middle is the first sub-region, and the second sub-region is annular.
  • point A can be any point in the first sub-area; according to the touch position of point A, the center of the joystick domain is determined and the joystick domain is displayed. Since the joystick domain is displayed based on the touch position, the touch position is located at the center of the joystick domain. At this time, no movement control vector is generated, and the controlled virtual character can remain motionless or maintain its original motion state without adjustment.
  • in response to a second touch operation acting on the second sub-area, the center of the joystick domain is determined according to the touch position of the second touch operation and the center of the first sub-area, and the joystick domain is displayed.
  • a connection line is determined according to the touch position and the center of the first sub-area, and the center of the joystick domain is located on the connection line.
  • the touch position is used as the center of the circle corresponding to the operating point.
  • the circle corresponding to the operating point is inscribed within the joystick domain.
  • based on this, the center position of the joystick domain can be determined and the joystick domain can then be displayed.
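Under the geometry described here (operating-point circle centered at the touch position and inscribed in the joystick domain, with the joystick center on the connection line through the first sub-area center), the joystick center can be computed as in this sketch; the function name and radii are assumptions for illustration:

```python
import math

def joystick_center_for_second_subarea_touch(touch, sub_center,
                                             joystick_radius, op_radius):
    """The operating-point circle is centered at the touch position and
    inscribed in the joystick domain, so the joystick center lies on
    the line from the touch point toward the first sub-area center, at
    distance (joystick_radius - op_radius) from the touch point."""
    dx = sub_center[0] - touch[0]
    dy = sub_center[1] - touch[1]
    dist = math.hypot(dx, dy)
    if dist == 0:  # touch on the sub-area center: degenerate case
        return touch
    s = (joystick_radius - op_radius) / dist
    return (touch[0] + dx * s, touch[1] + dy * s)
```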
  • the circle in the middle is the first sub-region, and the second sub-region is annular.
  • the center of the joystick domain determines the center of the joystick domain and display the joystick domain.
  • the specific process of determining the center of the joystick domain and displaying the joystick domain is as above and will not be repeated here. Repeat.
  • the present disclosure also provides an implementation of the movement control method.
  • the step of controlling the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface includes:
  • when the touch point of the first sliding operation lies within the current joystick domain, the position of the chassis of the joystick domain remains unchanged, and only the operating point moves with the position of the touch point.
  • touch points A and B are both located within the joystick domain.
  • while the first sliding operation moves from touch point A to touch point B, the display position of the joystick domain does not change. If the joystick domain corresponding to touch points A and B is taken as the initial joystick domain, point C lies in an area outside the initial joystick domain; therefore, when the touch point of the first sliding operation is point C, the chassis and operating point of the joystick domain are controlled to move following the movement of the touch point. During this process, the circle corresponding to the operating point remains inscribed in the joystick domain.
  • embodiments of the present disclosure control the chassis of the joystick domain to remain stationary when the touch point of the first sliding operation lies within the joystick domain;
  • when the touch point lies in the part of the first control area outside the joystick domain, the chassis and operating point of the joystick domain are controlled to move following the touch point. This prevents the operating range from becoming too large while still allowing the range of the user's movement control operations to be extended, improving the convenience and fault tolerance of the operation.
  • since the first control area is in practice the area best suited to the user's movement control operations, even if the user performs movement control in the second control area during a movement operation, they will to a large extent return to the first control area to continue movement control.
  • the present disclosure also provides an implementation of the movement control method.
  • the method also includes:
  • first, the joystick domain is currently fixedly displayed in the first control area.
  • in response to a third sliding operation sliding from the second control area into the first control area, the movement control vector at that moment is determined, and the joystick domain is controlled to translate from its current position in the direction of the current movement control vector until the joystick domain is inscribed in the second control area. At this time, the position at which the circle corresponding to the operating point within the joystick domain is inscribed in the first control area coincides with the touch position of the third sliding operation.
  • point A is a touch point directly below the joystick domain and outside the first control area;
  • point B is a touch point at the lower left of the joystick domain and outside the first control area;
  • point C is a touch point at the lower left of the joystick domain, inside the first control area.
  • in response to a third sliding operation from the second control area to the first control area, the joystick domain is translated from its current position in the direction of the current movement control vector until the joystick domain is inscribed in the first control area. This prevents the joystick domain from following the sliding operation over a large range, keeps the range of interactive control from becoming too large, and thereby improves the convenience of the movement control interaction.
  • the present disclosure also provides an implementation of the movement control method.
  • the step of, in response to a second sliding operation acting within the second control area, controlling the operating point of the joystick domain to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area, includes:
  • determining a connecting line from the touch point of the second sliding operation and the center of the first control area, and taking the intersection of this line with the edge of the first control area as the point at which the operating point and the chassis of the joystick domain, respectively, are inscribed in the first control area;
  • the second sliding operation acts within the second control area;
  • a connecting line is determined from the touch point of the second sliding operation and the center of the first control area;
  • the intersection of the connecting line with the edge of the first control area serves as the point at which the operating point and the chassis of the joystick domain, respectively, are inscribed in the first control area.
  • the operating point and the chassis of the joystick domain each remain inscribed in the first control area;
  • according to the angles formed by the lines connecting the different touch points of the second sliding operation with the center of the first control area, the chassis and the operating point are controlled to move through the same angle within the first control area.
  • point A is a touch point directly to the right of the joystick domain and outside the first control area;
  • point B is a touch point directly below the joystick domain and outside the first control area.
  • the chassis and operating point of the joystick domain move through 90 degrees within the first control area while remaining inscribed in the edge of the first control area.
  • embodiments of the present disclosure display the joystick domain in the first control area and move it accordingly in response to changes in the position of a control operation acting within the second control area. This enlarges the user's operable area and improves operating efficiency and fault tolerance.
  • the present disclosure also provides a movement control apparatus, in which a graphical user interface is provided through a terminal device.
  • the graphical user interface includes a preset first control area and a second control area located on the periphery of the first control area.
  • the apparatus 900 includes:
  • a joystick display module 901, configured to display the joystick domain through the first control area;
  • a first joystick following module 902, configured to, in response to a first sliding operation acting within the first control area, control the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface;
  • a second joystick following module 903, configured to, in response to a second sliding operation acting within the second control area, control the operating point of the joystick domain to move within the first control area according to the change in position of the touch point within the second control area;
  • a movement control module 904, configured to generate a movement control vector according to the relative position of the joystick domain and the operating point, and to perform movement control based on the movement control vector.
  • the first control area includes a first sub-area and a second sub-area located outside the first sub-area; the joystick display module is configured to, in response to
  • a first touch operation acting on the first sub-area in the first control area, use the touch position of the first touch operation as the center of the joystick domain and display the joystick domain; or
  • determine the center of the joystick domain based on the touch position of the second touch operation and the center of the first sub-area, and display the joystick domain.
  • the radius of the chassis of the joystick domain is a preset value;
  • the joystick display module is configured to determine a connecting line from the touch position of the touch operation and the center of the first sub-area;
  • a joystick domain with a radius of the preset value is generated in the first control area according to the connecting line, wherein the center of the joystick domain lies on the connecting line, the circle corresponding to the operating point of the joystick domain
  • is inscribed in the chassis, and the operating point is located at the touch position.
  • the first joystick following module is configured to respond to a first sliding operation acting within the first control area.
  • when the touch point of the first sliding operation lies within the joystick domain, the chassis of the joystick domain is controlled to remain stationary, and the operating point of the joystick domain is controlled to move following the touch point;
  • the second joystick following module is configured to fixedly display the chassis of the joystick domain within the first control area according to the starting point of the second sliding operation in the second control area;
  • the operating point of the joystick domain is controlled to remain inscribed in the chassis and to move along the chassis of the joystick domain.
  • the apparatus further includes:
  • a recovery control module, configured to, in response to a third sliding operation from the second control area to the first control area, control the joystick domain to translate from its current position in the direction of the current movement control vector until the joystick domain is inscribed in the second control area, wherein the operating point of the joystick domain is displayed at the touch position of the third sliding operation within the first control area.
  • the second joystick following module is configured to, in response to a second sliding operation acting within the second control area, determine a connecting line from the touch
  • point and the center of the first control area, with the intersection of this line and the edge of the first control area serving as the point at which the operating point and the chassis of the joystick domain, respectively, are inscribed in the first control area;
  • the angle through which the chassis and the operating point each move within the first control area is the same as the angle through which the touch point of the second sliding operation moves within the second control area.
  • FIG. 10 shows a schematic structural diagram of a computer system suitable for implementing a terminal device according to an embodiment of the present disclosure.
  • the computer system includes a central processing unit (CPU) that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) or a program loaded from a storage section into a random access memory (RAM).
  • various programs and data required for system operation are also stored in the RAM.
  • the CPU, ROM, and RAM are connected to one another through a bus.
  • the following components are connected to the input/output (I/O) interface: an input section including a keyboard, a mouse, etc.; an output section including a cathode ray tube (CRT), a liquid crystal display (LCD), etc., and a speaker, etc.; a storage section including a hard disk, etc.; and a communication section including a network interface card such as a LAN card, a modem, etc.
  • the communication section performs communication processing via a network such as the Internet.
  • a drive is also connected to the I/O interface as needed.
  • removable media, such as magnetic disks, optical disks, magneto-optical disks, semiconductor memories, etc., are mounted on the drive as needed, so that
  • the computer programs read from them can be installed into the storage section as needed.
  • embodiments of the present disclosure include a computer program product including a computer program carried on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
  • the computer program may be downloaded and installed from a network via the communication section, and/or installed from removable media.
  • when the computer program is executed by the central processing unit (CPU), the following method steps can be implemented:
  • a movement control method, in which a graphical user interface is provided through a terminal device.
  • the graphical user interface includes a preset first control area and a second control area located on the periphery of the first control area.
  • the method includes:
  • controlling the operating point of the joystick domain to move within the first control area.
  • generating a movement control vector according to the relative position of the joystick domain and the operating point, and performing movement control based on the movement control vector.
  • the first control area includes a first sub-area and a second sub-area located outside the first sub-area; the step of displaying the joystick domain through the first control area includes:
  • using the touch position of the first touch operation as the center of the joystick domain and displaying the joystick domain;
  • determining the center of the joystick domain based on the touch position of the second touch operation and the center of the first sub-area, and displaying the joystick domain.
  • the radius of the chassis of the joystick domain is a preset value; the step of determining the center of the joystick
  • domain based on the touch position of the second touch operation and the center of the first sub-area and displaying the joystick domain includes:
  • generating, according to the connecting line, a joystick domain with a radius of the preset value in the first control area, wherein the center of the joystick domain lies on the connecting line, the circle corresponding to the operating point of the joystick domain
  • is inscribed in the chassis, and the operating point is located at the touch position.
  • the step of, in response to a first sliding operation acting within the first control area, controlling the operating point of the joystick domain to follow the movement of
  • the touch point of the first sliding operation on the graphical user interface includes:
  • the step of controlling the operating point of the joystick domain to
  • move within the first control area includes:
  • fixedly displaying the chassis of the joystick domain in the first control area;
  • controlling the operating point of the joystick domain to remain inscribed in the chassis and to move along the chassis of the joystick domain.
  • the method further includes:
  • controlling the joystick domain to translate from its current position in the direction of the current movement control vector until the joystick domain is inscribed in the
  • second control area, wherein the operating point of the joystick domain is displayed at the touch position of the third sliding operation within the first control area.
  • the step of, in response to a second sliding operation acting within the second control area, according to the change in position of the touch point of the second sliding operation within the second control
  • area, controlling the operating point of the joystick domain to move within the first control area, includes:
  • determining a connecting line from the touch point of the second sliding operation and the center of the first control area, and taking
  • the intersection of this line with the edge of the first control area as the point at which the operating point and the chassis of the joystick domain, respectively, are inscribed in the first control area;
  • the angle through which the chassis and the operating point each move within the first control area is the same as the angle through which the touch point of the second sliding operation
  • moves within the second control area.
  • the joystick domain is displayed through the first control area; in response to a first sliding operation acting within the first control area, the operating point of the joystick domain is controlled to move following the movement of the touch point of the first sliding operation on the graphical user interface; in response to a second sliding operation within the second control area, the operating
  • point of the joystick domain is controlled to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area.
  • a movement control vector is generated based on the relative position of the joystick domain and the operating point, and movement control is performed based on the movement control vector.
  • since each functional module of the movement control apparatus corresponds to a step of the above exemplary embodiment of the movement control method,
  • for details and effects not disclosed in the apparatus embodiments of the present disclosure, please refer to the embodiments of the movement control method of the present disclosure.
  • the above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs), etc.
  • the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of invoking program code.
  • these modules may also be integrated together and implemented in the form of a system-on-a-chip (SOC).
  • the disclosed apparatuses and methods may be implemented in other ways.
  • the apparatus embodiments described above are only illustrative.
  • the division into units is only a division by logical function; in actual implementation there may be other ways of division.
  • multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through certain interfaces, and the indirect coupling or communication connection between apparatuses or units may be electrical, mechanical, or in other forms.
  • the units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in the various embodiments of the present disclosure may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • an integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium.
  • such a software functional unit is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform some of the steps of the methods of the various embodiments of the present disclosure.


Abstract

The present disclosure provides a movement control method, apparatus, storage medium, and electronic device, relating to the field of virtual interaction technology. A graphical user interface is provided through a terminal device, the graphical user interface including a preset first control area and a second control area located on the periphery of the first control area. The method includes: displaying a joystick domain through the first control area; in response to a first sliding operation acting within the first control area, controlling the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface; in response to a second sliding operation acting within the second control area, controlling the operating point of the joystick domain to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area; and generating a movement control vector according to the relative position of the joystick domain and the operating point, and performing movement control based on the movement control vector. The present disclosure can improve the convenience and interaction continuity of movement control. (FIG. 2)

Description

Method, apparatus, storage medium, and electronic device for movement control
Cross-reference to related applications
The present disclosure claims priority to Chinese patent application No. 202210886343.7, filed on July 26, 2022 and entitled "Method, apparatus, storage medium, and electronic device for movement control", the entire contents of which are incorporated herein by reference.
Technical field
The present disclosure relates to the field of virtual interaction technology, and specifically to a movement control method, a movement control apparatus, a computer-readable storage medium, and an electronic device.
Background
In everyday life it is often necessary to control the movement of one or more objects, for example steering aircraft and cars; in virtual scenes it is likewise often necessary to control the movement of virtual objects. Controlling a virtual object's movement through a joystick is a common control method.
Taking a virtual game as an example, the user controls a virtual character through a joystick provided on the interface. Current solutions fall into three types: first, a fixed joystick, where the user can only operate within a limited joystick region; second, a semi-fixed joystick, where the joystick is generated at a position determined by the initial operation and the user can still only operate within a limited joystick region until the next operation generates a new joystick region; third, a following joystick, where the joystick region continuously changes to follow the user's operation.
In the above solutions, either the user's operating range is highly restricted, or the continuity of operation is poor.
It should be noted that the information disclosed in this Background section is only intended to enhance understanding of the background of the present disclosure, and may therefore include information that does not constitute prior art known to a person of ordinary skill in the art.
Summary
An objective of embodiments of the present disclosure is to provide a movement control method, a movement control apparatus, an electronic device, and a computer-readable storage medium, thereby solving, at least to some extent, the problems of inconvenient and poorly continuous movement control operations, and improving the convenience and continuity of movement control.
According to one aspect of the present disclosure, a movement control method is provided, in which a graphical user interface is provided through a terminal device, the graphical user interface including a preset first control area and a second control area located on the periphery of the first control area, the method including:
displaying a joystick domain through the first control area;
in response to a first sliding operation acting within the first control area, controlling the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface;
in response to a second sliding operation acting within the second control area, controlling the operating point of the joystick domain to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area; and
generating a movement control vector according to the relative position of the joystick domain and the operating point, and performing movement control based on the movement control vector.
According to one aspect of the present disclosure, a movement control apparatus is provided, in which a graphical user interface is provided through a terminal device, the graphical user interface including a preset first control area and a second control area located on the periphery of the first control area, the apparatus including:
a joystick display module, configured to display a joystick domain through the first control area;
a first joystick following module, configured to, in response to a first sliding operation acting within the first control area, control the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface;
a second joystick following module, configured to, in response to a second sliding operation acting within the second control area, control the operating point of the joystick domain to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area; and
a movement control module, configured to generate a movement control vector according to the relative position of the joystick domain and the operating point, and to perform movement control based on the movement control vector.
According to one aspect of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored, the computer program, when executed by a processor, implementing any one of the methods described above.
According to one aspect of the present disclosure, an electronic device is provided, including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform any one of the methods described above by executing the executable instructions.
Exemplary embodiments of the present disclosure may have some or all of the following beneficial effects:
In the movement control method provided by the exemplary embodiments of the disclosure, a joystick domain is displayed through the first control area; in response to a first sliding operation acting within the first control area, the operating point of the joystick domain is controlled to move following the movement of the touch point of the first sliding operation on the graphical user interface; in response to a second sliding operation acting within the second control area, the operating point of the joystick domain is controlled to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area; and a movement control vector is generated according to the relative position of the joystick domain and the operating point, with movement control performed on the basis of that vector. By implementing the embodiments of the present disclosure, on the one hand, the user is no longer restricted to operating within a fixed or semi-fixed joystick domain, which enlarges the operable range and thus improves operating convenience; on the other hand, the joystick domain does not follow the user's operation at all times, which avoids the joystick domain moving to an inconvenient position or the operation having to be interrupted when the movement direction needs to change, thereby improving the continuity of the movement control interaction.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure. Obviously, the drawings described below are only some embodiments of the present disclosure; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 shows a schematic diagram of an exemplary system architecture to which a movement control method and apparatus according to embodiments of the present disclosure may be applied;
FIG. 2 schematically shows a flowchart of a movement control method according to an embodiment of the present disclosure;
FIG. 3 schematically shows an interface diagram of the first control area in an embodiment of the present disclosure;
FIG. 4 schematically shows an interface diagram of a fixedly displayed joystick domain in an embodiment of the present disclosure;
FIG. 5A schematically shows an interface diagram of displaying the joystick domain in response to a first touch operation acting on the first sub-area in an embodiment of the present disclosure;
FIG. 5B schematically shows an interface diagram of displaying the joystick domain in response to a second touch operation acting on the second sub-area in an embodiment of the present disclosure;
FIG. 6 schematically shows an interface diagram of controlling the operating point to follow in response to a first sliding operation acting within the first control area in an embodiment of the present disclosure;
FIG. 7 schematically shows an interface diagram of sliding from the second control area to the first control area in an embodiment of the present disclosure;
FIG. 8 schematically shows an interface diagram of performing control in the second control area in an embodiment of the present disclosure;
FIG. 9 schematically shows a structural block diagram of a movement control apparatus according to an embodiment of the present disclosure;
FIG. 10 schematically shows a structural diagram of a computer system suitable for implementing an electronic device according to an embodiment of the present disclosure.
Detailed description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, example embodiments can be implemented in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concepts of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of the embodiments of the present disclosure. However, those skilled in the art will recognize that the technical solutions of the present disclosure can be practiced while omitting one or more of the specific details, or that other methods, components, apparatuses, steps, etc. may be employed. In other instances, well-known technical solutions are not shown or described in detail to avoid them overshadowing and obscuring aspects of the present disclosure.
Furthermore, the drawings are only schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, so repeated description of them is omitted. Some of the block diagrams shown in the drawings are functional entities and do not necessarily correspond to physically or logically independent entities. These functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different network and/or processor apparatuses and/or microcontroller apparatuses.
FIG. 1 shows a schematic diagram of an exemplary application environment to which a movement control method and apparatus according to embodiments of the present disclosure may be applied.
As shown in FIG. 1, the system architecture 100 may include one or more of terminal devices 101, 102, and 103. The terminal devices 101, 102, 103 may be various electronic devices with display screens, including but not limited to desktop computers, portable computers, smartphones, tablet computers, and the like. A terminal device may install and run a virtual display program, a three-dimensional map program, a virtual game program, etc.
The movement control method in one of the embodiments of the present disclosure may run on a local terminal device or on a server. When the movement control method runs on a server, the method may be implemented and executed based on a cloud interaction system, where the cloud interaction system includes a server and a client device.
In an optional embodiment, various cloud applications, such as cloud games, may run under the cloud interaction system. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the entity that runs the game program and the entity that presents the game picture are separated; the storage and execution of the movement control method are completed on the cloud game server, and the client device is used for receiving and sending data and presenting the game picture. For example, the client device may be a display device with data transmission functions close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer; however, the information processing is performed by the cloud game server in the cloud. When playing, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to the operation instructions, encodes and compresses data such as the game picture, returns it to the client device over the network, and finally the client device decodes and outputs the game picture.
In an optional embodiment, taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through a graphical user interface, that is, the game program is conventionally downloaded, installed, and run through the terminal device. The local terminal device may provide the graphical user interface to the player in various ways; for example, it may be rendered and displayed on the display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen and a processor, the display screen being used to present the graphical user interface, which includes the game picture, and the processor being used to run the game, generate the graphical user interface, and control the display of the graphical user interface on the display screen.
The flowcharts and block diagrams in the drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, program segment, or part of code, which contains one or more executable instructions for implementing a specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams or flowcharts, and combinations of blocks therein, can be implemented with a dedicated hardware-based system that performs the specified functions or operations, or with a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented in software or in hardware, and the described units may also be provided in a processor. The names of these units do not in any case constitute a limitation on the units themselves.
It should be noted that the present disclosure may control the movement of a virtual object or virtual character in a virtual scene; for example, the virtual object may be a virtual character in a game, a virtual object in a three-dimensional map program, and so on. The virtual scene involved in the exemplary embodiments of the present disclosure may be a digitized scene rendered through digital technology by a smart terminal device such as a computer, mobile phone, or tablet. The virtual scene may include buildings or structures such as houses, towers, gardens, bridges, and pools, and may also include natural landscapes such as mountains, rivers, and lakes, as well as arbitrary virtual items or virtual props such as weapons, tools, and creatures. The virtual scene may be a simulation of the real world, a purely fictional virtual scene, or a partly simulated and partly fictional virtual scene; this exemplary embodiment imposes no particular limitation in this respect.
Referring to FIG. 2, the movement control method provides a graphical user interface through a terminal device, the graphical user interface including a preset first control area and a second control area located on the periphery of the first control area, and the method includes:
Step S210: displaying a joystick domain through the first control area;
Step S220: in response to a first sliding operation acting within the first control area, controlling the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface;
Step S230: in response to a second sliding operation acting within the second control area, controlling the operating point of the joystick domain to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area;
Step S240: generating a movement control vector according to the relative position of the joystick domain and the operating point, and performing movement control based on the movement control vector.
In the movement control method provided by this example embodiment, a joystick domain is displayed through the first control area; in response to a first sliding operation acting within the first control area, the operating point of the joystick domain is controlled to move following the movement of the touch point of the first sliding operation on the graphical user interface; in response to a second sliding operation acting within the second control area, the operating point of the joystick domain is controlled to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area; and a movement control vector is generated according to the relative position of the joystick domain and the operating point for movement control. By implementing the embodiments of the present disclosure, on the one hand, the user is no longer restricted to operating within a fixed or semi-fixed joystick domain, which enlarges the operable range and thus improves operating convenience; on the other hand, the joystick domain does not follow the user's operation at all times, which avoids the joystick domain moving to an inconvenient position or the operation having to be interrupted when the movement direction needs to change, thereby improving the continuity of the movement control interaction.
The above steps are described in more detail below.
In the present disclosure, a first control area is determined in the graphical user interface provided by the terminal device. The first control area is located where it is most convenient for the user to perform movement control operations; it may be at the lower left of the graphical user interface, or at other positions in different movement control scenarios. The first control area may be circular; the first sub-area is the smaller circle within the first control area, the second sub-area is a ring surrounding the first sub-area, and the combination of the two constitutes the first control area. The radius of the first sub-area may be half the radius of the first control area, and the width of the ring-shaped second sub-area may be equal to the radius of the first sub-area's circle. In different movement control scenarios, the shapes of the first control area, the first sub-area, and the second sub-area can be configured freely, and the above relationship between radius and width may also change; the embodiments of the present disclosure impose no limitation here.
It should be understood that the extents of the first control area and the second control area may be visible or invisible.
For example, as shown in FIG. 3, the first control area 301 is located at the lower left corner of a landscape-oriented graphical user interface, and outside the first control area 301 is the second control area, whose exact extent is not limited; the second control area may be a ring-shaped area surrounding the first control area 301. On the right there may be controls for having the virtual character release skills or perform actions. The first control area 301 includes a first sub-area corresponding to the smaller circle and a second sub-area corresponding to the ring around the smaller circle.
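As a non-limiting illustrative sketch of the concentric layout just described (not part of the claimed subject matter), a touch point can be classified against a circular first sub-area of radius r1, the surrounding ring out to the first control area's edge at r2, and the second control area beyond; the function name and concrete radii below are assumptions for illustration only.

```python
import math

def classify_touch(p, center, r1, r2):
    """Classify a touch point p against the layout described above:
    first sub-area (disc of radius r1), second sub-area (ring out to r2,
    the first control area's edge), second control area beyond it."""
    d = math.hypot(p[0] - center[0], p[1] - center[1])
    if d <= r1:
        return "first sub-area"
    if d <= r2:
        return "second sub-area"
    return "second control area"
```

With r2 = 2 * r1 this matches the example proportion in the text, where the ring's width equals the inner circle's radius.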
In step S210, a joystick domain is displayed through the first control area.
In the embodiments of the present disclosure, a joystick domain is displayed in the first control area; the joystick domain may include a chassis and an operating point. Before any operation starts, the operating point may default to the center of the joystick domain, or may not be displayed. When the operating point is configured to always lie within the joystick domain, the region occupied by the joystick domain on the graphical user interface can be the same as the region of its chassis.
In step S220, in response to a first sliding operation acting within the first control area, the operating point of the joystick domain is controlled to move following the movement of the touch point of the first sliding operation on the graphical user interface.
In the embodiments of the present disclosure, in response to a first sliding operation acting within the first control area, the position of the touch point of the first sliding operation on the graphical user interface is obtained, and the operating point of the joystick domain is controlled to move following the touch point. The touch position of the first sliding operation can be determined by listening for sliding and tap events, or by monitoring changes in screen pressure, after which the position coordinates of the touch point are obtained.
In step S230, in response to a second sliding operation acting within the second control area, the operating point of the joystick domain is controlled to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area.
In the embodiments of the present disclosure, the second sliding operation may be continuous with the first sliding operation, i.e., the user slides out of the first control area into the second control area; in that case the first touch point of the second sliding operation in the second control area is a point adjoining the first control area. The second sliding operation may also be independent of the first sliding operation, in which case the first touch point may be any point within the second control area.
In response to a second sliding operation acting within the second control area, which may be continuous with the first sliding operation (the user sliding out of the first control area into the second control area), first, according to the position of the touch point where the second sliding operation initially acts on the second control area, the chassis of the joystick domain can be fixedly displayed within the first control area.
When the joystick domain is fixedly displayed, its center may be determined on the line connecting the center of the first control area and the first touch point of the second sliding operation in the second control area, with the chassis of the joystick domain inscribed in the first control area; the inscribed position is where that connecting line intersects the first control area.
For example, as shown in FIG. 4, the chassis of the joystick domain is circular and its center lies at point A; point B may be a point directly to the right of and outside the joystick domain, and point C may be a point directly below and outside the joystick domain. While the touch point of the second sliding operation moves from point B to point C, the joystick domain is fixedly displayed and does not change position; only the movement control vector changes.
When determining the operating point of the joystick domain, a connecting line is determined from the position of the touch point and the center of the joystick domain, and the intersection of this line with the edge of the joystick domain is taken as the position where the operating point is inscribed in the chassis of the joystick domain. Still referring to FIG. 4, the operating point is visualized as a smaller circle on the chassis of the joystick domain. It should be understood, however, that the intersection of the connecting line and the edge of the joystick domain may also directly serve as the center of the circle corresponding to the operating point, in which case the circle lies on the edge of the first control area rather than being inscribed in it. The embodiments of the present disclosure impose no particular limitation here.
In step S240, a movement control vector is generated according to the relative position of the joystick domain and the operating point, and movement control is performed based on the movement control vector.
In the embodiments of the present disclosure, a movement control vector is generated according to the relative position of the joystick domain and the operating point. For example, the position of the operating point is determined from the touch point: if the operating point is in the right-hand part of the joystick domain, the movement control vector generated on that basis can control the virtual character to walk right; if the operating point is in the upper-left part, the generated vector can control the virtual character to walk toward the upper left.
It should be understood that, for a joystick domain at the same position, two operating points at different positions in the same direction can, besides both controlling the virtual character to walk in the same direction, also produce other distinct controls. For example, suppose the joystick domain is a circle with a radius of 3 units, with a first operating point 1 unit directly to the right of the circle's center and a second operating point 2 units directly to the right. The first operating point may correspond to a movement speed of 10 and the second to a movement speed of 20; alternatively, the first operating point may correspond to walking right and the second to running right. The embodiments of the present disclosure impose no limitation here.
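The vector generation described above can be sketched as follows: the direction comes from the operating point's offset from the joystick-domain center, and the offset length is mapped to a movement mode in the spirit of the 1-unit/2-unit example. The threshold value, mode names, and function signature are assumptions for illustration, not part of the disclosure.

```python
import math

def movement_control(joystick_center, operating_point, chassis_radius):
    """Generate a movement control vector (unit direction) from the
    relative position of the operating point and the joystick-domain
    center, and map the offset length to an illustrative movement mode."""
    dx = operating_point[0] - joystick_center[0]
    dy = operating_point[1] - joystick_center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0), "idle"          # touch at center: no vector
    direction = (dx / dist, dy / dist)     # direction of movement
    # Illustrative mapping: small offsets walk, larger offsets run.
    mode = "walk" if dist < chassis_radius * 2 / 3 else "run"
    return direction, mode
```

For a chassis of radius 3, an operating point 1 unit right of center yields walking right, and one 2 units right yields running right, mirroring the example in the text.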
The present disclosure also provides an implementation of the movement control method. The first control area includes a first sub-area and a second sub-area located outside the first sub-area; the step of displaying the joystick domain through the first control area includes:
in response to a first touch operation acting on the first sub-area in the first control area, using the touch position of the first touch operation as the center of the joystick domain and displaying the joystick domain; or
in response to a second touch operation acting on the second sub-area in the first control area, determining the center of the joystick domain based on the touch position of the second touch operation and the center of the first sub-area, and displaying the joystick domain.
In the embodiments of the present disclosure, the first control area includes a first sub-area and a second sub-area on its periphery; the first sub-area may be circular and the second sub-area may be a ring surrounding the first sub-area.
When the joystick domain is displayed in response to an operation acting on the first control area, the touch point may be in either the first sub-area or the second sub-area. In response to a first touch operation acting on the first sub-area, the touch position of the first touch operation is used as the center of the joystick domain and, combined with the preset value of the joystick domain's radius, the joystick domain is displayed.
For example, as shown in FIG. 5A, the circle in the middle is the first sub-area and the second sub-area is the ring. Point A is any point within the first sub-area; according to touch position A, the center of the joystick domain is determined and the joystick domain is displayed. Since the joystick domain is displayed based on the touch position, the touch position lies at the center of the joystick domain. At this time no movement control vector is generated, and the controlled virtual character may remain motionless or keep its original motion state without adjustment.
In response to a second touch operation acting on the second sub-area, the center of the joystick domain is determined according to the touch position of the second touch operation and the center of the first sub-area, and the joystick domain is displayed. Specifically, a connecting line is determined from the touch position and the center of the first sub-area, and the center of the joystick domain lies on this line. The touch position is taken as the center of the circle corresponding to the operating point, this circle being inscribed in the joystick domain; combined with the preset value of the radius of the joystick domain's chassis, the position of the joystick domain's center can be determined and the joystick domain displayed.
For example, as shown in FIG. 5B, the circle in the middle is the first sub-area and the second sub-area is the ring. Point B is any point within the second sub-area; according to touch position B, the center of the joystick domain is determined and the joystick domain is displayed. The specific process of determining the center of the joystick domain and displaying the joystick domain is as described above and is not repeated here.
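The geometry of the second-touch case above can be sketched as follows: the chassis center lies on the line from the touch position toward the first sub-area's center, at a distance of the chassis radius minus the operating-point radius, so that the operating-point circle (centered on the touch) is inscribed in the chassis. The names and tuple-based API are assumptions for illustration only.

```python
import math

def chassis_center_from_touch(touch, sub_center, chassis_r, op_r):
    """Place the joystick-domain chassis for a touch in the second
    sub-area: the chassis center lies on the touch-to-sub-area-center
    line, with the operating-point circle internally tangent to the
    chassis (centers chassis_r - op_r apart)."""
    vx, vy = sub_center[0] - touch[0], sub_center[1] - touch[1]
    n = math.hypot(vx, vy)
    if n == 0:                     # touch exactly at the sub-area center
        return touch
    k = (chassis_r - op_r) / n
    return (touch[0] + vx * k, touch[1] + vy * k)
```

The chassis is pulled toward the center of the first sub-area just far enough that the touch position sits on the domain's inner edge, consistent with FIG. 5B.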
The present disclosure also provides an implementation of the movement control method. The step of, in response to a first sliding operation acting within the first control area, controlling the operating point of the joystick domain to move following the movement of the touch point of the first sliding operation on the graphical user interface includes:
in response to a first sliding operation acting within the first control area, when the touch point of the first sliding operation lies within the joystick domain, controlling the chassis of the joystick domain to remain stationary, and controlling the operating point of the joystick domain to move following the touch point;
when the touch point of the first sliding operation lies in the part of the first control area outside the joystick domain, controlling the chassis and the operating point of the joystick domain to move following the touch point.
In the embodiments of the present disclosure, when the touch point of the first sliding operation lies within the current joystick domain, the chassis of the joystick domain keeps its position and only the operating point moves with the touch point. For example, as shown in FIG. 6, touch points A and B both lie within the joystick domain; while the first sliding operation moves from touch point A to touch point B, the display position of the joystick domain does not change. If the joystick domain corresponding to touch points A and B is taken as the initial joystick domain, then point C lies outside the initial joystick domain; therefore, when the touch point of the first sliding operation is point C, the chassis and operating point of the joystick domain are controlled to move following the touch point. Throughout this process, the circle corresponding to the operating point remains inscribed in the joystick domain.
By implementing the embodiments of the present disclosure, when the touch point of the first sliding operation lies within the joystick domain, the chassis of the joystick domain is controlled to remain stationary; when the touch point lies in the part of the first control area outside the joystick domain, the chassis and operating point are controlled to follow the touch point's movement. This prevents the operating range from becoming too large while still allowing the range of the user's movement control operations to be extended, improving the convenience and fault tolerance of the operation.
Since the first control area is in practice the area best suited to the user's movement control operations, during a movement operation, even if the user performs movement control in the second control area, they will to a large extent return to the first control area to continue movement control.
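The first-slide rule above can be sketched as a small update function (hypothetical names, not a definitive implementation): inside the joystick domain only the operating point follows; outside it, the chassis also translates so the operating-point circle stays inscribed.

```python
import math

def first_slide_update(touch, chassis_center, chassis_r, op_r):
    """Apply the first-sliding-operation rule: chassis fixed while the
    touch stays inside the joystick domain; otherwise chassis and
    operating point both follow the touch, keeping the operating-point
    circle inscribed in the chassis."""
    dx = touch[0] - chassis_center[0]
    dy = touch[1] - chassis_center[1]
    dist = math.hypot(dx, dy)
    if dist <= chassis_r:                 # touch inside the joystick domain
        return chassis_center, touch      # chassis fixed, point follows
    k = (chassis_r - op_r) / dist         # re-center: keep R - r separation
    return (touch[0] - dx * k, touch[1] - dy * k), touch
```

For touch point C outside the initial domain in FIG. 6, the chassis would be dragged along behind the touch point by this rule.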
On this basis, the present disclosure also provides an implementation of the movement control method. The method further includes:
in response to a third sliding operation sliding from the second control area into the first control area, controlling the joystick domain to translate from its current position in the direction of the current movement control vector until the joystick domain is inscribed in the first control area, wherein the operating point of the joystick domain is displayed at the touch position of the third sliding operation within the first control area.
In the embodiments of the present disclosure, first, the joystick domain is currently fixedly displayed within the first control area. In response to a third sliding operation sliding from the second control area into the first control area, the movement control vector at that moment is determined according to the position where the third sliding operation first touches the first control area, and the joystick domain is controlled to translate from its current position in the direction of the current movement control vector until the joystick domain is inscribed in the second control area; at this time, the position at which the circle corresponding to the operating point within the joystick domain is inscribed in the first control area coincides with the touch position of the third sliding operation.
For example, as shown in FIG. 7, point A is a touch point directly below the joystick domain and outside the first control area; point B is a touch point at the lower left of the joystick domain and outside the first control area; point C is a touch point at the lower left of the joystick domain, inside the first control area. As the third sliding operation slides from point A to point B, the position of the circle corresponding to the operating point in the joystick domain changes accordingly; in response to the third sliding operation sliding from point B to point C, the joystick domain moves until it is inscribed in the first control area.
By implementing the embodiments of the present disclosure, in response to a third sliding operation from the second control area to the first control area, the joystick domain is controlled to translate from its current position in the direction of the current movement control vector until the joystick domain is inscribed in the first control area. This prevents the joystick domain from following the sliding operation over a large range, keeps the range of interactive control from becoming too large, and thus improves the convenience of the movement control interaction.
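The translation step above can be expressed as a small root-finding problem: move the joystick-domain center along the unit direction d of the current movement control vector until its distance from the first control area's center equals Rf - R (internal tangency). The sketch below solves the resulting quadratic; the names are illustrative assumptions, not the claimed method.

```python
import math

def translate_until_inscribed(j, d, R, cf, Rf):
    """Translate joystick-domain center j along unit direction d until the
    domain (radius R) is inscribed in the first control area (center cf,
    radius Rf): find the smallest t >= 0 with |j + t*d - cf| = Rf - R."""
    px, py = j[0] - cf[0], j[1] - cf[1]
    b = px * d[0] + py * d[1]               # expand to t^2 + 2*b*t + c = 0
    c = px * px + py * py - (Rf - R) ** 2
    disc = b * b - c
    if disc < 0:
        return None                         # direction never reaches tangency
    t = -b - math.sqrt(disc)                # prefer the nearer root
    if t < 0:
        t = -b + math.sqrt(disc)
    if t < 0:
        return None
    return (j[0] + t * d[0], j[1] + t * d[1])
```

A domain of radius 2 starting at (5, 0) and translating toward a first control area centered at the origin with radius 5 stops once its center is 3 units from the origin.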
The present disclosure also provides an implementation of the movement control method. The step of, in response to a second sliding operation acting within the second control area, controlling the operating point of the joystick domain to move within the first control area according to the change in position of the touch point of the second sliding operation within the second control area, includes:
in response to a second sliding operation acting within the second control area, determining a connecting line from the touch point of the second sliding operation and the center of the first control area, and taking the intersection of this line with the edge of the first control area as the point at which the operating point and the chassis of the joystick domain, respectively, are inscribed in the first control area;
according to the change in position of the touch point within the second control area, controlling the operating point and the chassis of the joystick domain to move within the first control area while each remains inscribed in the first control area.
In the embodiments of the present disclosure, the second sliding operation acts within the second control area; a connecting line is determined from the touch point of the second sliding operation and the center of the first control area, and the intersection of this line with the edge of the first control area is taken as the point at which the operating point and the chassis of the joystick domain, respectively, are inscribed in the first control area.
While the second sliding operation continues within the second control area and the joystick domain moves within the first control area, the operating point and the chassis of the joystick domain each remain inscribed in the first control area. The chassis and the operating point are controlled to move through the same angle within the first control area as the angle formed by the lines connecting the different touch points of the second sliding operation with the center of the first control area.
For example, as shown in FIG. 8, point A is a touch point directly to the right of the joystick domain and outside the first control area, and point B is a touch point directly below the joystick domain and outside the first control area. When the touch point of the second sliding operation moves from point A to point B, the angle between the line from A to the center of the first control area and the line from B to that center is 90 degrees; correspondingly, the chassis and operating point of the joystick domain move through 90 degrees within the first control area while remaining inscribed in its edge.
By implementing the embodiments of the present disclosure, in response to changes in the position of a control operation acting within the second control area, the joystick domain is displayed in the first control area and moved accordingly. This enlarges the user's operable area and improves operating efficiency and fault tolerance.
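The angle-preserving behavior above can be sketched by placing the chassis center on the ray from the first control area's center cf through the touch point, at distance Rf - R, so that an angular move of the touch produces the same angular move of the still-inscribed chassis. The function below is an illustrative assumption, not the claimed method itself.

```python
import math

def chassis_for_second_slide(touch, cf, Rf, R):
    """Keep the joystick domain (radius R) inscribed in the first control
    area (center cf, radius Rf), on the ray from cf toward the current
    touch point of the second sliding operation."""
    ang = math.atan2(touch[1] - cf[1], touch[0] - cf[0])
    r = Rf - R                             # center-to-center distance
    return (cf[0] + r * math.cos(ang), cf[1] + r * math.sin(ang))
```

A 90-degree move of the touch point (e.g. from directly right of cf to directly below it) then rotates the chassis by 90 degrees, matching the FIG. 8 example.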
It should be noted that, although the steps of the method of the present disclosure are described in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order, or that all of the illustrated steps must be performed to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps, and so on.
进一步的,本公开中,还提供了一种移动控制的装置,通过终端设备提供图形用户界面,所述图形用户界面中包括一预设的第一控制区域和位于所述第一控制区域外围的第二控制区域。所述装置900包括:
摇杆显示模块901,被配置为执行通过所述第一控制区域显示摇杆域;
第一摇杆跟随模块902,被配置为执行响应作用于所述第一控制区域内的第一滑动操作,控制所述摇杆域的操作点跟随所述第一滑动操作在所述图形用户界面上的触控点的移动而移动;
第二摇杆跟随模块903,被配置为执行响应作用于所述第二控制区域内的第二滑动操作,根据所述第二滑动操作的触控点在所述第二控制区域内的位置变化,控制所述摇杆域的操作点在所述第一控制区域内移动。
移动控制模块904,被配置为执行根据所述摇杆域和所述操作点的相对位置生成移动控制向量,基于所述移动控制向量进行移动控制。
在本公开的一种示例性实施例中,所述第一控制区域包括第一子区域和位于所述第一子区域外围的第二子区域;所述摇杆显示模块,被配置为执行响应作用于所述第一控制区域中所述第一子区域的第一触控操作,将所述第一触控操作的触控位置作为摇杆域的中心并显示所述摇杆域;或
响应作用于所述第一控制区域中所述第二子区域的第二触控操作,根据所述第二触控操作的触控位置和所述第一子区域的中心确定摇杆域的中心并显示所述摇杆域。
在本公开的一种示例性实施例中,所述摇杆域的底盘的半径为预设值;摇杆显示模块,被配置为执行根据所述触控操作的触控位置和所述第一子区域的中心确定一连线;
根据所述连线在所述第一控制区域中生成半径为所述预设值的摇杆域,其中,所述摇杆域的中心位于所述连线上,且所述摇杆域的操作点对应的圆形与所述底盘内切,所述操作点位于所述触控位置。
在本公开的一种示例性实施例中,所述第一摇杆跟随模块,被配置为执行响应作用于所述第一控制区域内的第一滑动操作,当所述第一滑动操作的触控点位于所述摇杆域以内时,控制所述摇杆域的底盘保持不动,且控制所述摇杆域的操作点跟随所述触控点 的移动而移动;
当所述第一滑动操作的触控点位于所述第一控制区域内除所述摇杆域以外的区域时,控制所述摇杆域的底盘和操作点跟随所述触控点的移动而移动。
在本公开的一种示例性实施例中,第二摇杆跟随模块,被配置为执行根据所述第二滑动操作在所述第二控制区域的起始点,在所述第一控制区域内固定显示所述摇杆域的底盘;
确定所述第二滑动操作的触控点和所述摇杆域的中心的连线,将所述连线和所述摇杆域的边缘的交点确定为所述摇杆域的操作点与所述摇杆域的底盘的内切位置;
根据所述第二滑动操作的触控点在所述第二控制区域内的位置变化,控制所述摇杆域的操作点保持与所述底盘内切且在所述摇杆域的底盘上移动。
In an exemplary embodiment of the present disclosure, the apparatus further includes:
a return control module, configured to, in response to a third slide operation sliding from the second control area into the first control area, control the joystick domain to translate from its current position in the direction of the current movement control vector until the joystick domain is internally tangent to the second control area, wherein the operation point of the joystick domain is displayed at the touch position of the third slide operation within the first control area.
In an exemplary embodiment of the present disclosure, the second joystick-following module is configured to, in response to a second slide operation acting within the second control area, determine a connecting line from the touch point of the second slide operation and the center of the first control area, and take the intersections of the connecting line with the edge of the first control area as the points at which the operation point and the base of the joystick domain are respectively internally tangent to the first control area;
and to control the operation point and the base of the joystick domain to move within the first control area, each remaining internally tangent to the first control area, according to the change of position of the touch point within the second control area.
In an exemplary embodiment of the present disclosure, the angles through which the base and the operation point move within the first control area are each equal to the angle through which the touch point of the second slide operation moves within the second control area.
Fig. 10 is a schematic structural diagram of a computer system of a terminal device suitable for implementing the embodiments of the present disclosure.
It should be noted that the computer system of the terminal device shown in Fig. 10 is merely an example and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
As shown in Fig. 10, the computer system includes a central processing unit (CPU), which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) or a program loaded from a storage section into a random access memory (RAM). The RAM also stores the various programs and data required for system operation. The CPU, ROM, and RAM are connected to one another through a bus, to which an input/output (I/O) interface is also connected.
The following components are connected to the I/O interface: an input section including a keyboard, a mouse, and the like; an output section including a cathode ray tube (CRT) or liquid crystal display (LCD), a speaker, and the like; a storage section including a hard disk and the like; and a communication section including a network interface card such as a LAN card or a modem. The communication section performs communication processing via a network such as the Internet. A drive is also connected to the I/O interface as needed. A removable medium, such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory, is mounted on the drive as needed, so that a computer program read from it can be installed into the storage section as needed.
In particular, according to the embodiments of the present disclosure, the processes described below with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the methods shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section, and/or installed from a removable medium. When the computer program is executed by the central processing unit (CPU), the following method steps may be implemented:
A movement control method, in which a graphical user interface is provided through a terminal device, the graphical user interface including a preset first control area and a second control area located at the periphery of the first control area, the method comprising:
displaying a joystick domain through the first control area;
in response to a first slide operation acting within the first control area, controlling the operation point of the joystick domain to follow the movement of the touch point of the first slide operation on the graphical user interface;
in response to a second slide operation acting within the second control area, controlling the operation point of the joystick domain to move within the first control area according to the change of position of the touch point of the second slide operation within the second control area; and
generating a movement control vector according to the relative position of the joystick domain and the operation point, and performing movement control based on the movement control vector.
In an exemplary embodiment of the present disclosure, the first control area includes a first sub-area and a second sub-area located at the periphery of the first sub-area, and the step of displaying a joystick domain through the first control area includes:
in response to a first touch operation acting on the first sub-area of the first control area, taking the touch position of the first touch operation as the center of the joystick domain and displaying the joystick domain; or
in response to a second touch operation acting on the second sub-area of the first control area, determining the center of the joystick domain according to the touch position of the second touch operation and the center of the first sub-area, and displaying the joystick domain.
In an exemplary embodiment of the present disclosure, the radius of the base of the joystick domain is a preset value, and the step of determining the center of the joystick domain according to the touch position of the second touch operation and the center of the first sub-area and displaying the joystick domain includes:
determining a connecting line from the touch position of the touch operation and the center of the first sub-area; and
generating, according to the connecting line, a joystick domain of the preset radius within the first control area, wherein the center of the joystick domain lies on the connecting line, the circle corresponding to the operation point of the joystick domain is internally tangent to the base, and the operation point is located at the touch position.
In an exemplary embodiment of the present disclosure, the step of, in response to a first slide operation acting within the first control area, controlling the operation point of the joystick domain to follow the movement of the touch point of the first slide operation on the graphical user interface includes:
in response to the first slide operation acting within the first control area, when the touch point of the first slide operation is within the joystick domain, keeping the base of the joystick domain stationary and controlling the operation point of the joystick domain to follow the movement of the touch point; and
when the touch point of the first slide operation is within the first control area but outside the joystick domain, controlling both the base and the operation point of the joystick domain to follow the movement of the touch point.
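The two branches of this behavior can be sketched as a single update function. The sketch assumes a circular joystick domain tested against its base radius; names are illustrative only.

```python
import math

def first_slide(touch, base_center, op_center, base_radius):
    """Handle one touch-point update of the first slide operation.
    Inside the joystick domain: the base stays put and only the
    operation point follows the touch. Outside the domain (but still
    inside the first control area): base and operation point both
    translate with the touch.
    Returns (new_base_center, new_op_center)."""
    inside = math.hypot(touch[0] - base_center[0],
                        touch[1] - base_center[1]) <= base_radius
    if inside:
        return base_center, touch  # base fixed, operation point follows
    # translate the base by the same delta the operation point moves
    dx = touch[0] - op_center[0]
    dy = touch[1] - op_center[1]
    return (base_center[0] + dx, base_center[1] + dy), touch
```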
In an exemplary embodiment of the present disclosure, the step of controlling the operation point of the joystick domain to move within the first control area according to the change of position of the touch point of the second slide operation within the second control area includes:
displaying the base of the joystick domain at a fixed position within the first control area according to the starting point of the second slide operation in the second control area;
determining a connecting line from the touch point of the second slide operation and the center of the joystick domain, and determining the intersection of the connecting line and the edge of the joystick domain as the position at which the operation point of the joystick domain is internally tangent to the base of the joystick domain; and
controlling the operation point of the joystick domain to move on the base while remaining internally tangent to it, according to the change of position of the touch point of the second slide operation within the second control area.
In an exemplary embodiment of the present disclosure, the method further includes:
in response to a third slide operation sliding from the second control area into the first control area, controlling the joystick domain to translate from its current position in the direction of the current movement control vector until the joystick domain is internally tangent to the second control area, wherein the operation point of the joystick domain is displayed at the touch position of the third slide operation within the first control area.
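Translating the joystick until it becomes internally tangent to a circular boundary reduces to a ray–circle intersection: find the travel distance t along the movement direction at which the base center reaches distance (area radius − base radius) from the area center. This sketch assumes circular areas and a unit-length movement direction; all names are hypothetical.

```python
import math

def recenter_joystick(base_center, base_radius, move_dir, area_center, area_radius):
    """Translate the joystick base from its current position along the
    unit movement direction until the base circle is internally tangent
    to the bounding circular area, and return the new base center.
    Solves |base_center + t*move_dir - area_center| = area_radius - base_radius
    for the forward (largest) root t."""
    r = area_radius - base_radius  # allowed distance of base center from area center
    px = base_center[0] - area_center[0]
    py = base_center[1] - area_center[1]
    b = px * move_dir[0] + py * move_dir[1]
    c = px * px + py * py - r * r
    t = -b + math.sqrt(b * b - c)  # larger quadratic root: forward along move_dir
    return (base_center[0] + move_dir[0] * t,
            base_center[1] + move_dir[1] * t)
```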
In an exemplary embodiment of the present disclosure, the step of, in response to a second slide operation acting within the second control area, controlling the operation point of the joystick domain to move within the first control area according to the change of position of the touch point of the second slide operation within the second control area includes:
in response to the second slide operation acting within the second control area, determining a connecting line from the touch point of the second slide operation and the center of the first control area, and taking the intersections of the connecting line with the edge of the first control area as the points at which the operation point and the base of the joystick domain are respectively internally tangent to the first control area; and
controlling the operation point and the base of the joystick domain to move within the first control area, each remaining internally tangent to the first control area, according to the change of position of the touch point within the second control area.
In an exemplary embodiment of the present disclosure, the angles through which the base and the operation point move within the first control area are each equal to the angle through which the touch point of the second slide operation moves within the second control area.
The specific content of the embodiments of the movement control method executed in this embodiment likewise applies to the foregoing embodiments of the movement control method and is therefore not repeated here.
In the movement control method provided by the exemplary embodiments of the present disclosure, a joystick domain is displayed through the first control area; in response to a first slide operation acting within the first control area, the operation point of the joystick domain follows the movement of the touch point of the first slide operation on the graphical user interface; in response to a second slide operation acting within the second control area, the operation point of the joystick domain is moved within the first control area according to the change of position of the touch point of the second slide operation within the second control area; and a movement control vector is generated according to the relative position of the joystick domain and the operation point, with movement control performed on the basis of that vector. Implementing the embodiments of the present disclosure, on the one hand, frees the user from being restricted to operating within a fixed or semi-fixed joystick domain, enlarging the operable range and thereby improving ease of operation; on the other hand, it prevents the joystick domain from following the user's operation at all times, so that the operation need not be interrupted when the joystick domain moves to an inconvenient position or the movement direction needs to change, improving the continuity of the movement control interaction.
It should be noted that although several modules or units of the device for performing actions are mentioned in the detailed description above, this division is not mandatory. Indeed, according to the embodiments of the present disclosure, the features and functions of two or more of the modules or units described above may be embodied in a single module or unit; conversely, the features and functions of one module or unit described above may be further divided among multiple modules or units.
Since the functional modules of the movement control apparatus of the exemplary embodiments of the present disclosure correspond to the steps of the exemplary embodiments of the movement control method described above, for details and effects not disclosed in the apparatus embodiments of the present disclosure, reference is made to the embodiments of the movement control method described above.
The above modules may be one or more integrated circuits configured to implement the above methods, for example one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs). As another example, when one of the above modules is implemented in the form of a processing element scheduling program code, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor capable of invoking program code. As yet another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division into units is merely a division by logical function, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present disclosure may be integrated in one processing unit, each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present disclosure.
The above are merely specific embodiments of the present disclosure, but the scope of protection of the present disclosure is not limited thereto. Any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed herein shall fall within the scope of protection of the present disclosure. Therefore, the scope of protection of the present disclosure shall be that of the claims.

Claims (11)

  1. A movement control method, wherein a graphical user interface is provided through a terminal device, the graphical user interface comprising a preset first control area and a second control area located at the periphery of the first control area, the method comprising:
    displaying a joystick domain through the first control area;
    in response to a first slide operation acting within the first control area, controlling an operation point of the joystick domain to follow the movement of a touch point of the first slide operation on the graphical user interface;
    in response to a second slide operation acting within the second control area, controlling the operation point of the joystick domain to move within the first control area according to a change of position of a touch point of the second slide operation within the second control area; and
    generating a movement control vector according to a relative position of the joystick domain and the operation point, and performing movement control based on the movement control vector.
  2. The method according to claim 1, wherein the first control area comprises a first sub-area and a second sub-area located at the periphery of the first sub-area, and the step of displaying a joystick domain through the first control area comprises:
    in response to a first touch operation acting on the first sub-area of the first control area, taking a touch position of the first touch operation as a center of the joystick domain and displaying the joystick domain; or
    in response to a second touch operation acting on the second sub-area of the first control area, determining the center of the joystick domain according to a touch position of the second touch operation and a center of the first sub-area, and displaying the joystick domain.
  3. The method according to claim 2, wherein a radius of a base of the joystick domain is a preset value, and the step of determining the center of the joystick domain according to the touch position of the second touch operation and the center of the first sub-area and displaying the joystick domain comprises:
    determining a connecting line from the touch position of the touch operation and the center of the first sub-area; and
    generating, according to the connecting line, a joystick domain of the preset radius within the first control area, wherein the center of the joystick domain lies on the connecting line, a circle corresponding to the operation point of the joystick domain is internally tangent to the base, and the operation point is located at the touch position.
  4. The method according to claim 2, wherein the step of, in response to a first slide operation acting within the first control area, controlling an operation point of the joystick domain to follow the movement of a touch point of the first slide operation on the graphical user interface comprises:
    in response to the first slide operation acting within the first control area, when the touch point of the first slide operation is within the joystick domain, keeping the base of the joystick domain stationary and controlling the operation point of the joystick domain to follow the movement of the touch point; and
    when the touch point of the first slide operation is within the first control area but outside the joystick domain, controlling both the base and the operation point of the joystick domain to follow the movement of the touch point.
  5. The method according to claim 1, wherein the step of controlling the operation point of the joystick domain to move within the first control area according to the change of position of the touch point of the second slide operation within the second control area comprises:
    displaying the base of the joystick domain at a fixed position within the first control area according to a starting point of the second slide operation in the second control area;
    determining a connecting line from the touch point of the second slide operation and the center of the joystick domain, and determining an intersection of the connecting line and an edge of the joystick domain as a position at which the operation point of the joystick domain is internally tangent to the base of the joystick domain; and
    controlling the operation point of the joystick domain to move on the base while remaining internally tangent to the base, according to the change of position of the touch point of the second slide operation within the second control area.
  6. The method according to claim 5, further comprising:
    in response to a third slide operation sliding from the second control area into the first control area, controlling the joystick domain to translate from its current position in the direction of the current movement control vector until the joystick domain is internally tangent to the first control area, wherein the operation point of the joystick domain is displayed at a touch position of the third slide operation within the first control area.
  7. The method according to claim 1, wherein the step of, in response to a second slide operation acting within the second control area, controlling the operation point of the joystick domain to move within the first control area according to the change of position of the touch point of the second slide operation within the second control area comprises:
    in response to the second slide operation acting within the second control area, determining a connecting line from the touch point of the second slide operation and a center of the first control area, and taking intersections of the connecting line with an edge of the first control area as the points at which the operation point and the base of the joystick domain are respectively internally tangent to the first control area; and
    controlling the operation point and the base of the joystick domain to move within the first control area, each remaining internally tangent to the first control area, according to the change of position of the touch point within the second control area.
  8. The method according to claim 7, wherein the angles through which the base and the operation point move within the first control area are each equal to the angle through which the touch point of the second slide operation moves within the second control area.
  9. A movement control apparatus, wherein a graphical user interface is provided through a terminal device, the graphical user interface comprising a preset first control area and a second control area located at the periphery of the first control area, the apparatus comprising:
    a joystick display module, configured to display a joystick domain through the first control area;
    a first joystick-following module, configured to, in response to a first slide operation acting within the first control area, control an operation point of the joystick domain to follow the movement of a touch point of the first slide operation on the graphical user interface;
    a second joystick-following module, configured to, in response to a second slide operation acting within the second control area, control the operation point of the joystick domain to move within the first control area according to a change of position of a touch point of the second slide operation within the second control area; and
    a movement control module, configured to generate a movement control vector according to a relative position of the joystick domain and the operation point, and to perform movement control based on the movement control vector.
  10. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the movement control method according to any one of claims 1 to 8.
  11. An electronic device, comprising:
    a processor; and
    a memory for storing executable instructions of the processor;
    wherein the processor is configured to perform, by executing the executable instructions, the movement control method according to any one of claims 1 to 8.
PCT/CN2023/082259 2022-07-26 2023-03-17 Movement control method and apparatus, storage medium, and electronic device WO2024021635A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210886343.7 2022-07-26
CN202210886343.7A CN115129224B (zh) 2022-07-26 2022-07-26 Movement control method and apparatus, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2024021635A1 true WO2024021635A1 (zh) 2024-02-01

Family

ID=83386397



Also Published As

Publication number Publication date
CN115129224B (zh) 2023-08-04
CN115129224A (zh) 2022-09-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23844853

Country of ref document: EP

Kind code of ref document: A1