CN109542323B - Interaction control method and device based on virtual scene, storage medium and electronic equipment - Google Patents

Interaction control method and device based on virtual scene, storage medium and electronic equipment

Info

Publication number
CN109542323B
CN109542323B
Authority
CN
China
Prior art keywords
virtual
touch
control area
movement control
display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811409176.7A
Other languages
Chinese (zh)
Other versions
CN109542323A (en)
Inventor
张静雅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201811409176.7A
Publication of CN109542323A
Application granted
Publication of CN109542323B
Legal status: Active (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention provides an interaction control method and device based on a virtual scene, belonging to the technical field of human-computer interaction. The method comprises the following steps: at least partially displaying the virtual scene in a first touch display screen, and providing a first movement control area; when detecting that a first touch operation occurs in the first movement control area, controlling the virtual object to move in the virtual scene; when detecting that a second touch operation occurs at a first preset position in the first touch display screen, calling a virtual keyboard; and if the operation response area of the virtual keyboard at least partially covers the first movement control area and the operation response level of the virtual keyboard is higher than that of the first movement control area, providing a second movement control area in a second touch screen. The method solves the prior-art problem that a user cannot control the movement of a virtual object while inputting text, and improves the user experience.

Description

Interaction control method and device based on virtual scene, storage medium and electronic equipment
Technical Field
Embodiments of the invention relate to the technical field of human-computer interaction, and in particular to an interaction control method based on a virtual scene, an interaction control device based on a virtual scene, a computer-readable storage medium, and an electronic device.
Background
In a conventional MOBA (Multiplayer Online Battle Arena) game, players often need to communicate by inputting text. Although text offers the best accuracy and expressiveness of the available channels, the keyboard that pops up for input is generally displayed full-screen, so the player can only perform input operations and cannot move or otherwise operate the virtual object while typing. In a game, however, players constantly have displacement needs (reaching a destination or evading an enemy).
To work around this, conventional mobile games often simplify character input. For example, certain texts or signals can be preset in the game, and the player sends the corresponding content directly by tapping it. A specific example is shown in fig. 1, where the preset texts stored on the device terminal 101 may be, for example, those shown in the boxed region of fig. 1.
This approach has several drawbacks. On one hand, most of the stored shortcut messages are terse and stiff; the defaults are defined by the system, cannot satisfy most players' expressive needs, and lack personalization. On another hand, the player has no way to adjust the preset messages during the game; if a preset message does not meet the player's needs, the player must type manually, which interferes with control of the virtual object and degrades the player's experience. On yet another hand, the amount of shortcut information that can be preset in advance is limited and cannot anticipate every situation. Further, if too many shortcut messages are preset, the server is burdened and the responsiveness of the whole game suffers.
Therefore, it is desirable to provide a new interaction control method and apparatus.
It is to be noted that the information disclosed in the above background section is only for enhancing the understanding of the background of the present invention, and therefore may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present invention is directed to an interaction control method, an interaction control apparatus, a computer-readable storage medium, and an electronic device, thereby overcoming, at least to some extent, the problem that a user cannot control the movement of a virtual object while inputting text, a problem which arises from the limitations and disadvantages of the related art.
According to one aspect of the present disclosure, an interaction control method based on a virtual scene is provided, where the virtual scene at least includes a virtual object, and the method is applied to a device terminal having a first touch display screen and a second touch screen; the interaction control method comprises the following steps:
at least partially displaying the virtual scene in the first touch display screen, and providing a first movement control area;
when detecting that a first touch operation occurs in the first movement control area, controlling the virtual object to move in the virtual scene;
when detecting that a second touch operation occurs at a first preset position in the first touch display screen, calling a virtual keyboard;
if the operation response area of the virtual keyboard at least partially covers the first movement control area and the operation response level of the virtual keyboard is higher than that of the first movement control area, providing a second movement control area in the second touch screen;
wherein the second movement control area is configured to control the virtual object to move in the virtual scene according to a third touch operation acting on the second movement control area.
In an exemplary embodiment of the disclosure, after invoking the virtual keyboard, the method further comprises:
disabling the first movement control area.
In an exemplary embodiment of the present disclosure, invoking the virtual keyboard comprises:
judging whether the touch duration of the second touch operation reaches a first preset duration or not;
and if the touch duration of the second touch operation reaches the first preset duration, calling the virtual keyboard.
In an exemplary embodiment of the present disclosure, after invoking the virtual keyboard, the interactive control method further includes:
inputting a text through the virtual keyboard, and judging whether the text input is finished;
hiding the virtual keyboard if the text entry is complete.
In an exemplary embodiment of the present disclosure, determining whether text input is complete includes:
and judging whether the text input is finished according to whether the sending control in the virtual keyboard is touched.
In an exemplary embodiment of the disclosure, after hiding the virtual keyboard, the method further comprises:
disabling the second movement control region and activating the first movement control region.
In an exemplary embodiment of the present disclosure, the first touch display screen is a main touch display screen, and the second touch display screen is an auxiliary touch display screen;
the display area of the first touch display screen is larger than that of the second touch display screen.
According to one aspect of the present disclosure, an interaction control apparatus based on a virtual scene is provided, where the virtual scene at least includes a virtual object, and the apparatus is applied to a device terminal having a first touch display screen and a second touch screen; the interaction control device includes:
the first display module is used for at least partially displaying the virtual scene and providing a first movement control area;
the first detection module is used for controlling the virtual object to move in the virtual scene when detecting that the first touch operation occurs in the first movement control area;
the second detection module is used for calling the virtual keyboard when detecting that a second touch operation is performed at a first preset position in the first touch display screen;
the second display module is used for providing a second movement control area in the second touch screen if the operation response area of the virtual keyboard at least partially covers the first movement control area and the operation response level of the virtual keyboard is higher than that of the first movement control area;
wherein the second movement control area is configured to control the virtual object to move in the virtual scene according to a third touch operation acting on the second movement control area.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program, which when executed by a processor, implements the virtual scene-based interaction control method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to execute any one of the above-mentioned virtual scene-based interaction control methods via execution of the executable instructions.
The interaction control method and the interaction control device work as follows: when a second touch operation is detected at a first preset position of the first touch display screen, a virtual keyboard is called; if the operation response area of the virtual keyboard at least partially covers the first movement control area and the operation response level of the virtual keyboard is higher than that of the first movement control area, a second movement control area is displayed in the second touch screen, the second movement control area being configured to control the virtual object to move in the virtual scene according to a third touch operation acting on it. On one hand, because the virtual keyboard is called in the first touch display screen for text input while movement of the virtual object is controlled in the second touch screen, the prior-art problem that a user cannot control the movement of the virtual object while inputting text is solved, and the user experience is improved. On another hand, inputting text through the virtual keyboard lets users compose messages freely as needed, avoiding the stiffness of communicating only through preset shortcut messages and making communication between users more personal. On yet another hand, inputting text through the virtual keyboard removes the need to store a large amount of shortcut information in advance, reducing the load on the server while improving the response speed of the virtual object in the virtual scene.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 schematically shows an example view of a virtual scene.
Fig. 2 schematically shows a flow chart of a virtual scene based interaction control method.
Fig. 3(a) schematically shows an exemplary view of a virtual scene of an interactive control method.
Fig. 3(b) schematically shows an exemplary view of a virtual scene of another interactive control method.
Fig. 3(c) schematically shows an exemplary view of a virtual scene of another interactive control method.
Fig. 4 schematically shows a block diagram of an interaction control device based on a virtual scenario.
Fig. 5 schematically illustrates an example of an electronic device for implementing the above-described virtual scene-based interaction control method.
Fig. 6 schematically illustrates a computer-readable storage medium for implementing the above-described virtual scene-based interaction control method.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the invention.
Furthermore, the drawings are merely schematic illustrations of the invention and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The present exemplary embodiment first provides an interaction control method based on a virtual scene, where the virtual scene includes at least one virtual object. The method can run on a terminal device having a first touch display screen and a second touch screen; the terminal device may be a mobile device, such as a mobile phone or a tablet computer, or another terminal device having both a first touch display screen and a second touch screen, for example a PC, which is not limited in this example. Referring to fig. 2, the interaction control method may include the following steps:
step S210, displaying at least part of the virtual scene in the first touch display screen, and providing a first mobile control area.
Step S220, when detecting that a first touch operation occurs in the first movement control area, controlling the virtual object to move in the virtual scene.
Step S230, when a second touch operation is detected to occur at a first preset position in the first touch display screen, calling a virtual keyboard.
Step S240, if the operation response area of the virtual keyboard at least partially covers the first movement control area, and the operation response level of the virtual keyboard is higher than that of the first movement control area, providing a second movement control area in the second touch screen;
wherein the second movement control area is configured to control the virtual object to move in the virtual scene according to a third touch operation acting on the second movement control area.
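By way of illustration only, steps S210 to S240 can be viewed as routing touch input among three targets: the first movement control area, the first preset position, and, once the virtual keyboard is shown, the second touch screen. The following plain-Kotlin sketch shows one possible structure; every name in it (InteractionController, Rect, and so on) is an assumption made for this sketch, not part of the claimed method.

    // Illustrative routing of steps S210-S240; names are assumptions, not the patent's.
    data class Rect(val l: Float, val t: Float, val r: Float, val b: Float) {
        fun contains(x: Float, y: Float) = x in l..r && y in t..b
        fun intersects(o: Rect) = l < o.r && o.l < r && t < o.b && o.t < b
    }

    class InteractionController(
        private val firstMoveArea: Rect,   // first movement control area (S210)
        private val presetPosition: Rect,  // first preset position (S230)
        private val keyboardArea: Rect     // operation response area of the keyboard
    ) {
        private var keyboardShown = false
        private var secondAreaActive = false

        // Touches on the FIRST touch display screen.
        fun onFirstScreenTouch(x: Float, y: Float) {
            when {
                keyboardShown -> { /* first area is disabled; the keyboard handles input */ }
                presetPosition.contains(x, y) -> invokeKeyboard()   // S230
                firstMoveArea.contains(x, y) -> move(x, y)          // S220
            }
        }

        // Touches on the SECOND touch screen (the third touch operation, S240).
        fun onSecondScreenTouch(x: Float, y: Float) {
            if (secondAreaActive) move(x, y)
        }

        private fun invokeKeyboard() {
            keyboardShown = true
            // S240: the keyboard covers the first area and responds above it,
            // so movement control migrates to the second touch screen.
            secondAreaActive = keyboardArea.intersects(firstMoveArea)
        }

        private fun move(x: Float, y: Float) { /* update the virtual object's position */ }
    }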
In this interaction control method, on one hand, because the virtual keyboard is called in the first touch display screen for text input while movement of the virtual object is controlled in the second touch screen, the prior-art problem that a user cannot control the movement of the virtual object while inputting text is solved, and the user experience is improved; on another hand, inputting text through the virtual keyboard lets users compose messages freely as needed, avoiding the stiffness of communicating only through preset shortcut messages and making communication between users more personal; on yet another hand, inputting text through the virtual keyboard removes the need to store a large amount of shortcut information in advance, reducing the load on the server while improving the response speed of the virtual object in the virtual scene.
Hereinafter, each step in the above-described interaction control method based on a virtual scene in the present exemplary embodiment will be explained and explained in detail with reference to the drawings.
In step S210, the virtual scene is at least partially displayed on the first touch display screen, and a first movement control area is provided.
In the present exemplary embodiment, the first touch display screen may be a main display screen, for example the front display screen of a mobile phone. The virtual scene (game scene) can occupy the whole front display screen, or only part of it, for example when the game scene is shown in a split-screen layout. Further, the first movement control area may be provided inside the virtual scene, or at another location, such as the area outside the virtual scene in a split-screen display, which is not limited in this example.
In step S220, when it is detected that a first touch operation occurs in the first movement control area, the virtual object is controlled to move in the virtual scene.
In the present exemplary embodiment, when it is detected that the first touch operation occurs in the first movement control area, the virtual object may be controlled to move in the virtual scene. The first touch operation may be a sliding operation, in which case controlling the virtual object to move when the first touch operation is detected comprises: when a sliding operation is detected in the first movement control area, controlling the virtual object to move and/or rotate in the virtual scene according to the slide track of the sliding operation.
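As one concrete illustration of a possible mapping (the embodiment does not prescribe any particular one), the sketch below converts successive touch samples of a slide track into a displacement vector and a facing angle; the dead-zone and speed values are assumptions.

    import kotlin.math.atan2
    import kotlin.math.hypot

    // Converts a slide track inside the movement control area into displacement
    // and a facing angle. Dead zone and speed are illustrative assumptions.
    class SlideTracker(private val deadZone: Float = 8f, private val speed: Float = 0.05f) {
        private var lastX = 0f
        private var lastY = 0f
        private var tracking = false

        fun onDown(x: Float, y: Float) { lastX = x; lastY = y; tracking = true }

        /** Returns (dx, dy, headingRadians) for this sample, or null inside the dead zone. */
        fun onMove(x: Float, y: Float): Triple<Float, Float, Float>? {
            if (!tracking) return null
            val dx = x - lastX
            val dy = y - lastY
            if (hypot(dx, dy) < deadZone) return null  // ignore jitter near the last point
            lastX = x
            lastY = y
            return Triple(dx * speed, dy * speed, atan2(dy, dx))
        }

        fun onUp() { tracking = false }
    }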
In the present exemplary embodiment, the first movement control area may be a virtual joystick manipulation area, a direction control virtual key area, and the like, and the present exemplary embodiment is not particularly limited thereto.
In an embodiment of the present invention, the first movement control area is a virtual joystick control area. As shown in fig. 3(a), the virtual joystick control area is located at the lower left of the first touch display screen 301; the virtual object 303 is displaced and/or rotated in the virtual scene according to the first touch operation received by the virtual joystick control area, and a plurality of skill controls 305 are provided at the lower right of the first touch display screen to give the player controls for casting skills. In this embodiment, the player can therefore conveniently displace and rotate the virtual object with the left hand while casting skills through the skill controls with the right hand.
As an alternative embodiment, the first movement control area is an area with a visual indication: for example, it may have a bounding box, a filled color, a predetermined transparency, or some other treatment that makes it visually distinguishable. The virtual object is displaced and/or rotated in the virtual scene according to first touch operations, such as slides and taps, received by the first movement control area. A visually indicated first movement control area lets the user locate the area quickly and can reduce the operating difficulty for game beginners.
As another alternative, the first movement control area may be an area without a visual indication. Such an area does not cover or affect the game picture, providing a better visual effect and saving screen space, but precisely because it carries no visual indication it is not easily noticed by the player. As an improved implementation, a visual guidance control may be displayed in the first movement control area: for example, in an embodiment of the present invention in which a virtual joystick serves as the direction control scheme of the virtual object, the joystick may be displayed in the first movement control area to guide the player visually.
In step S230, when it is detected that a second touch operation occurs at a first preset position in the first touch display screen, a virtual keyboard is called.
In the present exemplary embodiment, the first preset position may be a specific position on the first touch display screen set aside for calling up the virtual keyboard. For example, to facilitate user operation, the first preset position may be a bottom position along the wide side of the first touch display screen of the terminal device, or a bottom position along the long side, or any position on the first touch display screen, which is not limited in this example. It can be understood that, in an embodiment of the present invention, an indication icon may be provided at the first preset position as a visual cue, so that the user can locate the position quickly and call up the virtual keyboard quickly, reducing the operating difficulty for game beginners. It should be noted that the second touch operation may be performed directly with a finger, or with another object, for example an everyday item, which is not limited in this example.
As shown in figs. 3(a) and 3(b), an indication icon 308 is provided on the first touch display screen 301, and when a second touch operation acting on the indication icon 308 is detected, the virtual keyboard 306 is invoked. The second touch operation may be a tap, a long press, a hard press, or the like.
Further, invoking the virtual keyboard may include: judging whether the touch duration of the second touch operation reaches a first preset duration or not; and if the touch duration of the second touch operation reaches the first preset duration, calling the virtual keyboard. In detail:
For example, when a second touch operation event is detected at the first preset position, whether the touch duration of the second touch operation reaches a first preset duration is judged in response to the second touch operation; the first preset duration may be, for example, 1 s, 1.5 s, or 2 s, which is not limited in this example. Referring to fig. 3(b), when the touch duration of the second touch operation reaches the first preset duration (for example, 1 s), the virtual keyboard 306 may be called, and text can then be input through it; when the touch duration does not reach the first preset duration (for example, only 0.8 s), the virtual keyboard is not called. In this way, the keyboard is not invoked by mistake when the user touches the first preset position without intending to call it up, avoiding erroneous in-game operations and improving the user experience. It should be added that the virtual keyboard may be rendered semi-transparently, so that while inputting text the user can still see the position of the virtual object in the virtual scene through the keyboard.
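The duration test just described needs nothing more than two timestamps. A minimal sketch, assuming a first preset duration of 1 s (the names are illustrative):

    // Calls the keyboard only when the touch at the first preset position is
    // held for the first preset duration (assumed here to be 1 s).
    class KeyboardInvoker(private val firstPresetMillis: Long = 1_000L) {
        private var downAt = -1L

        fun onTouchDown(nowMillis: Long) { downAt = nowMillis }

        /** True if the virtual keyboard should be called when the touch ends. */
        fun onTouchUp(nowMillis: Long): Boolean {
            val held = if (downAt >= 0) nowMillis - downAt else 0L
            downAt = -1L
            return held >= firstPresetMillis  // e.g. 1.0 s qualifies, 0.8 s does not
        }
    }

A production client would more likely show the keyboard the moment the threshold elapses, via a delayed callback, rather than on release; checking on release merely keeps the sketch independent of any UI-framework timer API.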
In step S240, if the operation response area of the virtual keyboard at least partially covers the first movement control area, and the operation response level of the virtual keyboard is higher than the first movement control area, providing a second movement control area in the second touch screen;
wherein the second movement control area is configured to control the virtual object to move in the virtual scene according to a third touch operation acting on the second movement control area.
In the present exemplary embodiment, the second touch screen may be a touch pad without a display function, or an auxiliary touch display screen, for example the rear touch display screen of a dual-screen mobile phone; in fig. 3(c), the second touch screen is indicated by 307. When the operation response area of the virtual keyboard partially or completely covers the first movement control area and the operation response level of the virtual keyboard floats above that of the first movement control area, a second movement control area can be provided in the second touch screen, where the second movement control area may be configured to control the virtual object to move in the virtual scene according to a third touch operation acting on it.
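The two conditions of step S240, partial coverage and a higher operation response level, can be modelled as a rectangle-overlap test plus an ordering comparison. In the sketch below, responseLevel and the other names are assumptions standing in for the operation response hierarchy:

    // Models step S240's gate. `responseLevel` stands in for the "operation
    // response level": higher values receive touches first.
    data class Bounds(val l: Float, val t: Float, val r: Float, val b: Float)

    fun overlaps(a: Bounds, b: Bounds) =
        a.l < b.r && b.l < a.r && a.t < b.b && b.t < a.b

    data class Layer(val bounds: Bounds, val responseLevel: Int)

    fun shouldProvideSecondArea(keyboard: Layer, firstArea: Layer) =
        overlaps(keyboard.bounds, firstArea.bounds) &&        // at least partial cover
            keyboard.responseLevel > firstArea.responseLevel  // keyboard responds above it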
In the present exemplary embodiment, the first movement control area is disabled after the virtual keyboard is invoked. That is, while the virtual keyboard is displayed, neither touches on the virtual keyboard itself nor touches on areas of the first touch display screen outside it produce any action on the virtual object. This scheme avoids erroneous operations in which touches outside the virtual keyboard inadvertently affect the virtual object, further improving the user experience.
In this exemplary embodiment, after the virtual keyboard is invoked, the method for controlling interaction based on a virtual scene may further include: inputting a text through the virtual keyboard, and judging whether the text input is finished; hiding the virtual keyboard if the text entry is complete. In detail:
in an embodiment of the present invention, a send button may be provided in the virtual keyboard, and when a fourth touch operation on the send button (Enter) is detected, it is determined that the text input is completed.
For example, a user can input a text required by the user through the virtual keyboard, then click a send button (Enter) to send, and the click of the send button is regarded as the completion of text input, so that the virtual keyboard is hidden; if the input is needed again, after the sending, step S230 may be executed again, and after the virtual keyboard is called, the needed text is input; when the text input is complete, the virtual keyboard may then be used.
In another embodiment of the present invention, a send button and a close button are provided in the virtual keyboard, and when a fifth touch operation acting on the send button is detected, a text is sent; and when a sixth touch operation acting on the close button is detected, judging that the text input is finished.
For example, the user inputs the required text through the virtual keyboard and then taps the send button to send it; if more text is needed, input can continue after the previous text has been sent. Once text input is finished, the user taps the close button to hide the virtual keyboard.
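The two completion models above, send-implies-done versus an explicit close control, differ only in which control ends the input session. A minimal sketch, with all names being illustrative assumptions:

    // Two ways to decide that text input is finished, matching the two
    // embodiments above. Names and structure are assumptions.
    sealed interface KeyboardEvent
    object SendTapped : KeyboardEvent   // fifth touch operation
    object CloseTapped : KeyboardEvent  // sixth touch operation

    /** Embodiment 1: tapping Send both sends the text and ends the input. */
    fun inputFinishedSendModel(e: KeyboardEvent) = e is SendTapped

    /** Embodiment 2: Send only sends; only Close ends the input. */
    fun inputFinishedCloseModel(e: KeyboardEvent) = e is CloseTapped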
In this exemplary embodiment, after hiding the virtual keyboard, the method for controlling interaction based on a virtual scene may further include: disabling the second movement control region and activating the first movement control region. For example:
referring to fig. 3(a), 3(b) and 3(c), after the virtual keyboard 306 is hidden, the second movement control region 307 is disabled and the first movement control region 304 is activated for the convenience of the user's operation, and at this time, the second movement control region 307 does not respond to the third touch operation but controls the movement of the virtual object 303 in the virtual scene 302 by the first touch operation acting on the first movement control region 304.
In this exemplary embodiment, after the virtual keyboard is invoked, the method for controlling interaction based on a virtual scene may further include: detecting whether a touch event exists at a second preset position in the first touch display screen; if yes, the virtual keyboard is hidden, the second mobile control area is forbidden, and the first mobile control area is activated.
In this example embodiment, whether a touch event occurs at a second preset position in the first touch display screen is detected. The second preset position can be a place not covered by the virtual keyboard, for example any position of the virtual scene in the first touch display screen that the virtual keyboard leaves uncovered. The second preset position may also be a specific position in the virtual keyboard: for example, a close button may be provided at the upper right corner of the virtual keyboard, and when the user taps the close button, the virtual keyboard is hidden, the second movement control area is disabled, and the first movement control area is activated.
In this example embodiment, if it is detected that the second preset position has the touch event, it is determined whether the touch duration of the touch event reaches a second preset duration, and if the touch duration reaches the second preset duration, the virtual keyboard is hidden, the second mobile control area is disabled, and the first mobile control area is activated.
Specifically, when a touch event is detected at the second preset position, whether the touch duration of the touch event reaches a second preset duration is judged; the second preset duration may be, for example, 0.3 s, 0.5 s, or 0.6 s, which is not limited in this example. If the touch duration of the touch event does not reach the second preset duration, the virtual keyboard remains shown and text input can continue.
If the touch duration of the touch event reaches the second preset duration, the virtual keyboard is hidden, the second movement control area is disabled, the first movement control area is activated, and the movement of the virtual object 303 in the virtual scene 302 is again controlled through the first movement control area 304. In this way, a stray touch on the first touch display screen cannot dismiss the keyboard and re-invoke the on-screen virtual controls before the user has finished entering text, further improving the in-game experience.
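Dismissal therefore mirrors invocation, with a shorter threshold. A one-line sketch, assuming a second preset duration of 0.5 s:

    // Touch at the second preset position: dismiss only if held for the second
    // preset duration (assumed 0.5 s); shorter touches leave the keyboard up.
    fun shouldDismissKeyboard(heldMillis: Long, secondPresetMillis: Long = 500L) =
        heldMillis >= secondPresetMillis

    // Example: shouldDismissKeyboard(600) -> true, shouldDismissKeyboard(200) -> false.
    // On dismissal the client would hide the keyboard, disable the second movement
    // control area, and re-activate the first one (cf. MovementAreaSwitch above).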
The present disclosure also provides an interaction control device based on a virtual scene, applied to a device terminal having a first touch display screen and a second touch screen. Referring to fig. 4, the interaction control device may include a first display module 410, a first detection module 420, a second detection module 430, and a second display module 440, wherein:
the first display module 410 may be configured to display the virtual scene at least partially and provide a first movement control area;
the first detection module 420 may be configured to control the virtual object to move in the virtual scene when detecting that the first touch operation occurs in the first movement control area;
the second detecting module 430 may be configured to invoke a virtual keyboard when detecting that a second touch operation occurs at a first preset position in the first touch display screen;
the second display module 440 may be configured to display a second movement control area in the second touch screen if the operation response area of the virtual keyboard at least partially covers the first movement control area and the operation response level of the virtual keyboard is higher than the first movement control area;
wherein the second movement control area is configured to control the virtual object to move in the virtual scene according to a third touch operation acting on the second movement control area.
In an example embodiment of the present disclosure, invoking the virtual keyboard comprises:
judging whether the touch duration of the second touch operation reaches a first preset duration or not;
and if the touch duration of the second touch operation reaches the first preset duration, calling the virtual keyboard.
In an example embodiment of the present disclosure, after invoking the virtual keyboard, the interactive control method further includes:
inputting a text through the virtual keyboard, and judging whether the text input is finished;
hiding the virtual keyboard if the text entry is complete.
In an example embodiment of the present disclosure, determining whether text input is complete includes:
and judging whether the text input is finished according to whether the sending control in the virtual keyboard is touched.
In an example embodiment of the present disclosure, the first touch display screen is a main touch display screen, and the second touch screen is an auxiliary touch display screen;
the display area of the first touch display screen is larger than that of the second touch display screen.
In an example embodiment of the present disclosure, controlling the virtual object to move in the virtual scene comprises:
and controlling the virtual object to move in the virtual scene through a virtual control in the virtual scene.
In an example embodiment of the present disclosure, controlling the virtual object to move in the virtual scene through a virtual control in the virtual scene includes:
detecting a fourth touch operation acting on a virtual control in the virtual scene;
and controlling the virtual object to move in the virtual scene according to the operation track of the fourth touch operation.
The specific details of each module in the above-mentioned interaction control device based on virtual scene have been described in detail in the corresponding interaction control method based on virtual scene, and therefore are not described herein again.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the invention, the features and functionality of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Moreover, although the steps of the methods of the present invention are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, a mobile terminal, or a network device, etc.) execute the method according to the embodiment of the present invention.
In an exemplary embodiment of the present invention, there is also provided an electronic device capable of implementing the above method.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 500 according to this embodiment of the invention is described below with reference to fig. 5. The electronic device 500 shown in fig. 5 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 5, the electronic device 500 is embodied in the form of a general purpose computing device. The components of the electronic device 500 may include, but are not limited to: the at least one processing unit 510, the at least one memory unit 520, a bus 530 connecting the various system components (including the memory unit 520 and the processing unit 510), and a display unit 540.
The storage unit stores program code executable by the processing unit 510, causing the processing unit 510 to perform the steps according to various exemplary embodiments of the present invention described in the above section "exemplary methods" of the present specification. For example, the processing unit 510 may perform step S210 as shown in fig. 2: at least partially displaying the virtual scene in the first touch display screen, and providing a first movement control area; step S220: when detecting that a first touch operation occurs in the first movement control area, controlling the virtual object to move in the virtual scene; step S230: when detecting that a second touch operation occurs at a first preset position in the first touch display screen, calling a virtual keyboard; step S240: if the operation response area of the virtual keyboard at least partially covers the first movement control area and the operation response level of the virtual keyboard is higher than that of the first movement control area, displaying a second movement control area in the second touch screen; wherein the second movement control area is configured to control the virtual object to move in the virtual scene according to a third touch operation acting on the second movement control area.
The memory unit 520 may include a readable medium in the form of a volatile memory unit, such as a random access memory unit (RAM) 5201 and/or a cache memory unit 5202, and may further include a read-only memory unit (ROM) 5203.
Storage unit 520 may also include a program/utility 5204 having a set (at least one) of program modules 5205, such program modules 5205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 530 may be one or more of any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 500 may also communicate with one or more external devices 600 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 500, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 500 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 550. Also, the electronic device 500 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 560. As shown, the network adapter 560 communicates with the other modules of the electronic device 500 over the bus 530. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 500, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) execute the method according to the embodiment of the present invention.
In an exemplary embodiment of the present invention, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 6, a program product 610 for implementing the above method according to an embodiment of the present invention is described; it may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present invention is not limited in this regard; in the present document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.

Claims (9)

1. An interaction control method based on a virtual scene, wherein the virtual scene at least comprises a virtual object, and is characterized in that the interaction control method is applied to a device terminal with a first touch display screen and a second touch screen; the interaction control method comprises the following steps:
at least partially displaying the virtual scene in the first touch display screen, and providing a first movement control area;
when detecting that a first touch operation occurs in the first movement control area, controlling the virtual object to move in the virtual scene;
when detecting that a second touch operation occurs at a first preset position in the first touch display screen, calling a virtual keyboard;
if the operation response area of the virtual keyboard at least partially covers the first movement control area and the operation response level of the virtual keyboard is higher than that of the first movement control area, providing a second movement control area in the second touch screen and disabling the first movement control area in the first touch display screen;
wherein the second movement control area is configured to control the virtual object to move in the virtual scene according to a third touch operation acting on the second movement control area.
2. The interactive control method of claim 1, wherein invoking the virtual keyboard comprises:
judging whether the touch duration of the second touch operation reaches a first preset duration or not;
and if the touch duration of the second touch operation reaches the first preset duration, calling the virtual keyboard.
3. The interactive control method of claim 1, wherein after invoking the virtual keyboard, the interactive control method further comprises:
inputting a text through the virtual keyboard, and judging whether the text input is finished;
hiding the virtual keyboard if the text entry is complete.
4. The interaction control method of claim 3, wherein determining whether text input is complete comprises:
and judging whether the text input is finished according to whether the sending control in the virtual keyboard is touched.
5. The interactive control method of claim 3, wherein after hiding the virtual keyboard, the method further comprises:
disabling the second movement control region and activating the first movement control region.
6. The interaction control method according to claim 1, wherein the first touch display screen is a main touch display screen, and the second touch display screen is an auxiliary touch display screen;
the display area of the first touch display screen is larger than that of the second touch display screen.
7. An interaction control device based on a virtual scene, wherein the virtual scene at least comprises a virtual object, and is characterized in that the interaction control device is applied to a device terminal with a first touch display screen and a second touch screen; the interaction control device includes:
the first display module is used for at least partially displaying the virtual scene and providing a first movement control area;
the first detection module is used for controlling the virtual object to move in the virtual scene when detecting that the first touch operation occurs in the first movement control area;
the second detection module is used for calling the virtual keyboard when detecting that a second touch operation is performed at a first preset position in the first touch display screen;
a second display module, configured to provide a second movement control area in the second touch screen and disable the first movement control area in the first touch display screen if the operation response area of the virtual keyboard at least partially covers the first movement control area and the operation response level of the virtual keyboard is higher than that of the first movement control area;
wherein the second movement control area is configured to control the virtual object to move in the virtual scene according to a third touch operation acting on the second movement control area.
8. A computer-readable storage medium, on which a computer program is stored, the computer program, when being executed by a processor, implementing the virtual scene-based interaction control method according to any one of claims 1 to 6.
9. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual scenario-based interaction control method of any one of claims 1-6 via execution of the executable instructions.
CN201811409176.7A 2018-11-23 2018-11-23 Interaction control method and device based on virtual scene, storage medium and electronic equipment Active CN109542323B (en)

Priority Applications (1)

Application Number: CN201811409176.7A (granted as CN109542323B)
Priority Date: 2018-11-23
Filing Date: 2018-11-23
Title: Interaction control method and device based on virtual scene, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number: CN201811409176.7A (granted as CN109542323B)
Priority Date: 2018-11-23
Filing Date: 2018-11-23
Title: Interaction control method and device based on virtual scene, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN109542323A CN109542323A (en) 2019-03-29
CN109542323B (en) 2021-03-02

Family

ID=65849869

Family Applications (1)

Application Number: CN201811409176.7A (status: Active; granted as CN109542323B)
Title: Interaction control method and device based on virtual scene, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN109542323B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110007840A * 2019-04-10 2019-07-12 Netease (Hangzhou) Network Co., Ltd. Object control method, apparatus, medium and electronic equipment
CN110882537B * 2019-11-12 2023-07-25 Beijing ByteDance Network Technology Co., Ltd. Interaction method, device, medium and electronic equipment
WO2021134358A1 * 2019-12-30 2021-07-08 Huawei Technologies Co., Ltd. Human-computer interaction method, device, and system
CN112764618A * 2021-01-22 2021-05-07 Vivo Mobile Communication Co., Ltd. Interface operation method, device, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294400A (en) * 2013-05-15 2013-09-11 成都理想境界科技有限公司 Touch keyboard, handheld mobile terminal and fast text type-in method
CN203759667U (en) * 2014-03-07 2014-08-06 天津大学 Double-touch-screen mobile terminal
CN104750364A (en) * 2015-04-10 2015-07-01 赵晓辉 Character and signal inputting method and device on intelligent electronic device
WO2016047855A1 (en) * 2014-09-25 2016-03-31 김용운 Smart device having application by which input/output screen is freely configured and sharing control method
CN107102806A (en) * 2017-01-25 2017-08-29 维沃移动通信有限公司 A kind of split screen input method and mobile terminal
CN108536497A (en) * 2017-12-28 2018-09-14 努比亚技术有限公司 Double-sided screen word input control method, equipment and computer readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2479756B (en) * 2010-04-21 2013-06-05 Realvnc Ltd Virtual interface devices
CN107645612A (en) * 2017-10-23 2018-01-30 珠海市魅族科技有限公司 A kind of information processing method and terminal device


Also Published As

Publication number Publication date
CN109542323A (en) 2019-03-29

Similar Documents

Publication Publication Date Title
CN109542323B (en) Interaction control method and device based on virtual scene, storage medium and electronic equipment
US11853523B2 (en) Display device and method of indicating an active region in a multi-window display
CN109164964B (en) Content sharing method and device, terminal and storage medium
US10528252B2 (en) Key combinations toolbar
US20150212730A1 (en) Touch event isolation method and related device and computer readable medium
CN109876439B (en) Game picture display method and device, storage medium and electronic equipment
CN107930119B (en) Information processing method, information processing device, electronic equipment and storage medium
CN110090444B (en) Game behavior record creating method and device, storage medium and electronic equipment
US20120266079A1 (en) Usability of cross-device user interfaces
US20140013276A1 (en) Accessing a Marine Electronics Data Menu
US20130293573A1 (en) Method and Apparatus for Displaying Active Operating System Environment Data with a Plurality of Concurrent Operating System Environments
CN109260713B (en) Virtual object remote assistance operation method and device, storage medium and electronic equipment
CN109460179B (en) Virtual object control method and device, electronic equipment and storage medium
JP2015022766A (en) Touchpad for user to vehicle interaction
US11099723B2 (en) Interaction method for user interfaces
CN111420395B (en) Interaction method and device in game, readable storage medium and electronic equipment
CN108553894A (en) Display control method and device, electronic equipment, storage medium
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
CN110075519B (en) Information processing method and device in virtual reality, storage medium and electronic equipment
CN108595010B (en) Interaction method and device for virtual objects in virtual reality
US20140013272A1 (en) Page Editing
US10019148B2 (en) Method and apparatus for controlling virtual screen
TW202026849A (en) Interaction method, apparatus and device
CN108434732A (en) Virtual object control method and device, storage medium, electronic equipment
CN114779977A (en) Interface display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant