WO2018177170A1 - Display control method and apparatus for game picture, storage medium and electronic device - Google Patents

Display control method and apparatus for game picture, storage medium and electronic device

Info

Publication number
WO2018177170A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
user interface
screen
game
graphical user
Prior art date
Application number
PCT/CN2018/079756
Other languages
French (fr)
Chinese (zh)
Inventor
吴志武
鲍慧翡
Original Assignee
网易(杭州)网络有限公司
Priority date
Filing date
Publication date
Priority claimed from CN201710188700.1A (granted as CN106975219B)
Application filed by 网易(杭州)网络有限公司 (NetEase (Hangzhou) Network Co., Ltd.)
Publication of WO2018177170A1


Classifications

    • A63F13/822: Strategy games; Role-playing games
    • A63F13/2145: Input arrangements for video game devices for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/5255: Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/803: Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • G06F3/04815: Interaction with three-dimensional environments, e.g. control of viewpoint to navigate in the environment
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for entering handwritten data, e.g. gestures, text
    • G06F3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the screen or tablet into independently controllable areas, e.g. virtual keyboards, menus
    • A63F2300/1075: Features of games using an electronically generated display, characterized by input arrangements for converting player-generated signals into game device control signals, specially adapted to detect the point of contact of the player on a surface, using a touch screen
    • A63F2300/65: Methods for processing data by generating or executing the game program for computing the condition of a game character

Abstract

A display control method and apparatus for a game picture, a storage medium and an electronic device. The method comprises: providing a first touch manipulation area (4) in a graphical user interface, and configuring a virtual character to move and/or rotate in a game scene picture according to a first touch operation received by the first touch manipulation area (4); providing a second touch manipulation area (5) in the graphical user interface, and, when a second touch operation on the second touch manipulation area (5) is detected, changing a presentation view of the game scene picture on the graphical user interface; and, when the end of the second touch operation is detected, controlling the presentation view of the game scene picture on the graphical user interface so that it is restored to the state prior to the second touch operation.

Description

Display Control Method and Apparatus for Game Screen, Storage Medium, and Electronic Device

Technical Field

The present disclosure relates to the field of computer interaction technologies, and in particular, to a display control method and apparatus for a game screen, a storage medium, and an electronic device.

Background Art

With the development of mobile smart terminals and the game industry, a large number of mobile games with different themes have emerged to meet the needs of users. In such games, the scene is by default observed from the virtual character's point of view. If the user wants to observe the virtual environment behind or around the virtual character, this can only be achieved by rotating the orientation of the virtual character.

However, on a mobile terminal (especially a mobile terminal using touch control), observing the surrounding environment by controlling the rotation of the virtual character has significant limitations: on the one hand, the operability is poor and inconvenient; on the other hand, rotating the character in order to observe the surroundings interrupts or changes the combat state of the virtual character, makes it impossible to switch the view during battle without affecting the battle, and therefore cannot meet the user's need for quick view switching, resulting in a poor user experience.

It is to be understood that the information disclosed in the background section above is only for enhancement of understanding of the background of the present disclosure, and therefore may include information that does not constitute related art already known to those of ordinary skill in the art.

Summary of the Invention

An object of the present disclosure is to provide a display control method and apparatus for a game screen, a storage medium, and an electronic device, and at least to some extent overcome one or more problems due to limitations and disadvantages of the related art.

According to an aspect of the present disclosure, there is provided a display control method for a game screen, the game screen including a graphical user interface obtained by executing a software application on a processor of a mobile terminal and rendering it on a display of the mobile terminal, the content presented by the graphical user interface including a game scene screen and at least partially including a virtual character, the method comprising:

Providing a first touch manipulation area at the graphical user interface, and configuring the virtual character to be displaced and/or rotated in the game scene screen according to a first touch operation received by the first touch manipulation area;

Providing a second touch manipulation area on the graphical user interface, and configuring a presentation visual field of the game scene screen on the graphical user interface to be changed according to a second touch operation received by the second touch manipulation area;

Detecting the second touch operation located in the second touch manipulation area, and changing a presentation view of the game scene picture on the graphic user interface according to the second touch operation;

The end of the second touch operation is detected, and the presentation visual field of the game scene screen on the graphical user interface is controlled to return to the state before the second touch operation.

In an exemplary embodiment of the present disclosure, the first touch manipulation area is a virtual joystick manipulation area.

In an exemplary embodiment of the present disclosure, the second touch operation is a touch sliding operation.

In an exemplary embodiment of the present disclosure, the changing a presentation view of the game scene screen on the graphical user interface according to the second touch operation includes:

Changing a rendering view of the game scene screen on the graphical user interface according to a sliding trajectory of the touch sliding operation.

In an exemplary embodiment of the present disclosure, the changing a presentation view of the game scene screen on the graphical user interface according to the second touch operation includes:

Changing the position of the virtual camera in the game scene to a preset position;

The direction of the virtual camera is changed according to a sliding trajectory of the touch sliding operation.

In an exemplary embodiment of the present disclosure, the game screen is a first person view game screen, and changing the presentation view of the game scene screen on the graphical user interface according to the second touch operation includes:

Switching the first person view game screen to a third person view game screen, and changing a direction of the presentation view of the game scene screen on the graphical user interface according to the sliding trajectory of the touch sliding operation.

In an exemplary embodiment of the present disclosure, the second touch operation is a touch click operation.

In an exemplary embodiment of the present disclosure, the changing a presentation view of the game scene screen on the graphical user interface according to the second touch operation includes:

Changing a presentation view of the game scene screen on the graphical user interface according to a position of a preset point in the second touch manipulation area and a click position of the touch click operation.

In an exemplary embodiment of the present disclosure, the changing a presentation view of the game scene screen on the graphical user interface according to the second touch operation includes:

Changing a presentation view of the game scene screen on the graphical user interface according to a position of a preset line in the second touch manipulation area and a click position of the touch click operation.

In an exemplary embodiment of the present disclosure, the providing the second touch manipulation area in the graphical user interface includes:

A preset touch operation at the graphical user interface is detected, and a second touch manipulation area is presented on the graphical user interface.

In an exemplary embodiment of the present disclosure, the preset touch operation includes any one of the following: a heavy press, a long press, or a double-tap.

In an exemplary embodiment of the present disclosure, the controlling the presentation visual field of the game scene screen on the graphical user interface to return to the state before the second touch operation comprises:

Controlling the presentation field of view of the game scene screen on the graphical user interface to return to the presentation field of view before the second touch operation; or

Controlling the rendering visual field of the game scene screen on the graphical user interface to return to a rendering visual field calculated according to the rendering-visual-field computing logic used before the second touch operation.

According to an aspect of the present disclosure, there is provided a display control apparatus for a game screen, the game screen including a graphical user interface obtained by executing a software application on a processor of a mobile terminal and rendering it on a display of the mobile terminal, the content presented by the graphical user interface including a game scene screen and at least partially including a virtual character, the apparatus comprising:

a first providing module, configured to provide a first touch manipulation area in the graphical user interface, and to configure the virtual character to be displaced and/or rotated in the game scene screen according to a first touch operation received by the first touch manipulation area;

a second providing module, configured to provide a second touch manipulation area on the graphical user interface, and to configure a presentation visual field of the game scene screen on the graphical user interface to be changed according to a second touch operation received by the second touch manipulation area;

a first detecting module, configured to detect the second touch operation located in the second touch manipulation area, and to change a presentation view of the game scene picture on the graphical user interface according to the second touch operation;

a second detecting module, configured to detect the end of the second touch operation, and to control the rendering visual field of the game scene screen on the graphical user interface to return to a state before the second touch operation.

According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program, the computer program being executed by a processor to implement a display control method of a game screen according to any one of the above.

According to an aspect of the present disclosure, an electronic device is provided, including:

Processor;

a memory, configured to store executable instructions of the processor;

The processor is configured to execute the display control method of the game screen according to any one of the above items by executing the executable instruction.

In the display control method for a game screen according to an exemplary embodiment of the present disclosure, a first touch manipulation area is provided in the graphical user interface, and the virtual character is controlled to be displaced and/or rotated in the game scene screen according to a first touch operation received by the first touch manipulation area; a second touch manipulation area is provided in the graphical user interface, and, when a second touch operation is detected in the second touch manipulation area, the presentation visual field of the game scene screen on the graphical user interface is changed; and, upon detecting the end of the second touch operation, the presentation visual field of the game scene screen on the graphical user interface is controlled to return to the state before the second touch operation. On the one hand, by providing the first touch manipulation area in the graphical user interface, the virtual character can be displaced and/or rotated in the game scene picture according to the first touch operation detected in the first touch manipulation area. On the other hand, by providing the second touch manipulation area in the graphical user interface, the presentation view of the game scene on the graphical user interface can be changed according to the second touch operation detected in the second touch manipulation area, and, when the second touch operation ends, the presentation view of the game scene on the graphical user interface returns to its original state. A second touch operation performed by the user in the second touch manipulation area therefore changes the presentation field of view of the game scene on the graphical user interface only temporarily, with the field of view returning to the state before the second touch operation when that operation ends. This provides the user with a convenient and quick way of adjusting the presentation visual field, satisfies the user's needs and improves the user experience.

DRAWINGS

The above and other features and advantages of the present disclosure will become more apparent from the following detailed description of exemplary embodiments with reference to the accompanying drawings. It is apparent that the drawings in the following description are only some of the embodiments of the present disclosure, and that other drawings may be obtained from these drawings by those of ordinary skill in the art without creative effort. In the drawings:

FIG. 1 is a flowchart of a display control method of a game screen according to the present disclosure;

FIG. 2 is a cross-sectional view of a game scene provided by an exemplary embodiment of the present disclosure;

FIG. 3 is a schematic diagram of changing a rendering field of view according to a sliding operation in an exemplary embodiment of the present disclosure;

FIG. 4 is a block diagram of a display control device for a game screen according to the present disclosure; and

FIG. 5 is a schematic block diagram of an electronic device in an exemplary embodiment of the present disclosure.

Detailed Description

Example embodiments will now be described more fully with reference to the accompanying drawings. However, the example embodiments can be embodied in a variety of forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concepts of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are set forth to provide a thorough understanding of the embodiments of the present disclosure. However, one skilled in the art will appreciate that the technical solution of the present disclosure may be practiced without one or more of the specific details, or that other methods, components, materials, devices, steps, and the like may be employed. In other instances, well-known technical solutions are not shown or described in detail to avoid obscuring aspects of the present disclosure.

In addition, the drawings are merely schematic illustrations of the present disclosure, and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and the repeated description thereof will be omitted.

First, the exemplary embodiment discloses a display control method of a game screen, the game screen including a graphical user interface obtained by executing a software application on a processor of a mobile terminal and rendering it on a display of the mobile terminal. The content presented by the graphical user interface includes a game scene screen and at least partially includes a virtual character. Referring to FIG. 1, the display control method of the game screen may include the following steps:

Step S110. Providing a first touch manipulation area in the graphical user interface, and configuring the virtual character to be displaced and/or rotated in the game scene screen according to a first touch operation received by the first touch manipulation area;

Step S120. Providing a second touch manipulation area on the graphical user interface, and configuring a presentation visual field of the game scene screen on the graphical user interface to be changed according to a second touch operation received by the second touch manipulation area;

Step S130. Detecting the second touch operation located in the second touch manipulation area, and changing a presentation view of the game scene picture on the graphic user interface according to the second touch operation;

Step S140. Detecting the end of the second touch operation, controlling the presentation visual field of the game scene screen on the graphical user interface to return to the state before the second touch operation.

With the display control method of the game screen in the present exemplary embodiment, on the one hand, a first touch manipulation area is provided in the graphical user interface, so that the virtual character can be displaced and/or rotated in the game scene screen according to the first touch operation detected in the first touch manipulation area; on the other hand, a second touch manipulation area is provided in the graphical user interface, so that the presentation view of the game scene on the graphical user interface is changed according to the second touch operation detected in the second touch manipulation area, and, when the second touch operation ends, the presentation view of the game scene on the graphical user interface returns to its original state. A second touch operation performed by the user in the second touch manipulation area therefore changes the presentation field of view of the game scene on the graphical user interface only temporarily, with the field of view returning to the state before the second touch operation when that operation ends. This provides the user with a convenient and quick way of adjusting the presentation visual field, satisfies the user's needs and improves the user experience.
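
For illustration only, the general flow described above may be sketched as follows. The class and function names (CameraView, SecondTouchArea) and the sensitivity value are assumptions of this sketch and are not part of the disclosed implementation.

```python
# Illustrative sketch only: a minimal state machine for the second touch
# manipulation area. Names and values are assumptions, not the patent's code.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class CameraView:
    yaw: float = 0.0    # horizontal presentation angle, degrees
    pitch: float = 0.0  # vertical presentation angle, degrees

class SecondTouchArea:
    def __init__(self, sensitivity=0.2):
        self.sensitivity = sensitivity
        self._saved_view = None  # view before the second touch operation

    def on_touch_begin(self, current_view):
        # Remember the presentation view so it can be restored later.
        self._saved_view = current_view
        return current_view

    def on_touch_move(self, current_view, dx, dy):
        # Change the presentation view according to the sliding trajectory;
        # the virtual character's orientation and crosshair are untouched.
        return replace(current_view,
                       yaw=current_view.yaw + dx * self.sensitivity,
                       pitch=current_view.pitch - dy * self.sensitivity)

    def on_touch_end(self, current_view):
        # Restore the presentation view to the state before the operation.
        restored, self._saved_view = self._saved_view, None
        return restored

# usage
area = SecondTouchArea()
view = CameraView()
view = area.on_touch_begin(view)
view = area.on_touch_move(view, dx=60, dy=-10)   # temporary look-around
view = area.on_touch_end(view)                   # back to CameraView(0.0, 0.0)
```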

Next, each step of the game screen display control method in the present exemplary embodiment will be further described.

In step S110, a first touch manipulation area is provided at the graphical user interface, and the virtual character is configured to be displaced in the game scene screen according to a first touch operation received by the first touch manipulation area. / or rotate.

The first touch manipulation area may be, for example, a virtual joystick manipulation area, a direction control virtual key area, and the like, which is not specifically limited in this exemplary embodiment.

In an optional embodiment, the first touch manipulation area is a virtual joystick manipulation area, and the first touch operation received according to the virtual joystick manipulation area controls the virtual character to be displaced and/or rotated in the game scene.

In an optional implementation manner, the first touch manipulation area is a virtual cross key area/virtual direction key (D-PAD) area, and the first touch operation received according to the virtual cross key area controls the virtual character to be performed in the game scene. Displacement and / or rotation.

In an optional implementation manner, the first touch manipulation area is a touch manipulation area having a visual indication, such as a touch manipulation area having a bounding box, a touch manipulation area filled with a color, a touch manipulation area having a predetermined transparency, or any other manipulation area capable of visually indicating the range of the first touch manipulation area. The virtual character is controlled to be displaced and/or rotated in the game scene according to a touch operation, such as a slide or a click, received in this touch manipulation area. A touch manipulation area with a visual indication enables the user to locate it quickly, which lowers the barrier for novice players.

In an alternative embodiment, the first touch manipulation area is a touch manipulation area in the graphical user interface that does not have a visual indication. A touch manipulation area without a visual indication does not cover or affect the game screen, provides a better picture effect, and saves screen space, which suits experienced players.

The displacement of the virtual character in the game scene refers to the change of the position of the virtual character in the game scene; the rotation of the virtual character in the game scene refers to the change of the orientation of the virtual character in the game scene.

By providing the first touch manipulation area in the graphical user interface and detecting the first touch operation occurring in the first touch manipulation area, the virtual character can be displaced and/or rotated in the game scene screen according to the first touch operation received by the first touch manipulation area.
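
For concreteness, a minimal sketch of such a virtual joystick manipulation area is given below; the Character and VirtualJoystick names, the screen-coordinate convention and the particular speed mapping are illustrative assumptions only.

```python
# Minimal sketch of a virtual-joystick first touch manipulation area.
# All names and values are illustrative assumptions, not the disclosed code.
import math
from dataclasses import dataclass

@dataclass
class Character:
    x: float = 0.0
    y: float = 0.0
    heading: float = 0.0  # orientation in radians

class VirtualJoystick:
    def __init__(self, center, radius, speed=5.0):
        self.center = center    # (x, y) screen position of the joystick
        self.radius = radius    # maximum stick deflection in pixels
        self.speed = speed      # character speed at full deflection

    def apply(self, character, touch_pos, dt):
        """Map the current touch position to displacement and rotation."""
        dx = touch_pos[0] - self.center[0]
        dy = touch_pos[1] - self.center[1]
        dist = math.hypot(dx, dy)
        if dist == 0:
            return
        # Clamp the deflection to the joystick radius and normalise it.
        scale = min(dist, self.radius) / self.radius
        character.heading = math.atan2(dy, dx)          # rotate toward the stick
        character.x += math.cos(character.heading) * self.speed * scale * dt
        character.y += math.sin(character.heading) * self.speed * scale * dt

# usage
joystick = VirtualJoystick(center=(150, 550), radius=80)
hero = Character()
joystick.apply(hero, touch_pos=(200, 550), dt=1 / 60)
```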

In step S120, a second touch manipulation area is provided in the graphical user interface, and a presentation visual field of the game scene screen on the graphical user interface is configured as a second touch operation received according to the second touch manipulation area. And change.

The second touch manipulation area may be a touch manipulation area with a visual indication in the graphical user interface, such as a touch manipulation area having a bounding box, a touch manipulation area filled with a color, a touch manipulation area having a predetermined transparency, or any other manipulation area capable of visually indicating the range of the second touch manipulation area. A touch manipulation area with a visual indication enables the user to locate it quickly, which lowers the barrier for novice players.

In an alternative embodiment, the second touch manipulation area is a touch manipulation area of the graphical user interface that does not have a visual indication. A touch manipulation area without a visual indication does not cover or affect the game screen, provides a better picture effect, and saves screen space, which suits experienced players.

The change in the presentation field of the game scene picture on the graphical user interface includes a change in the presentation range of the game scene picture on the graphical user interface and/or a change in the presentation angle of the game scene picture on the graphical user interface. When the presentation visual field of the game scene picture on the graphical user interface changes according to the second touch operation received by the second touch manipulation area, the orientation of the virtual character and the position of the crosshair do not change.

The following describes an example of a change in the presentation field of the game scene screen on the graphical user interface with an example.

FIG. 2 shows a cross-sectional view of a game scene in the XY coordinate plane, where the Z direction is perpendicular to the paper (the XY plane) and points outward; reference numeral 1 denotes the game scene, 2 denotes a virtual camera, and 3 denotes a hillside in the game scene. The virtual camera 2 is placed at point A, its shooting direction lies along line AO at an angle θ, and point O is the intersection of the shooting direction line passing through point A with the game scene 1. The content of the game scene rendered on the display of the mobile terminal corresponds to the scene content captured by the virtual camera 2, ranging from point B to point C.

When the virtual camera 2 advances toward the game scene 1 along the shooting direction line AO, the rendering range of the game scene screen on the graphical user interface becomes smaller while the rendering angle remains unchanged; conversely, when the camera retreats, the rendering range becomes larger and the rendering angle remains unchanged.

When the game scene is small, for example when its range is limited to the span from point E to point F, the virtual camera 2 can capture the full range of the game scene within a certain range of shooting angles. In that case, keeping the position A of the virtual camera 2 unchanged and changing the shooting angle θ within a certain range changes the presentation angle of the game scene screen on the graphical user interface without changing the presentation range.
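
The relationship between the camera position, the shooting angle θ and the rendered range can be illustrated with a simple two-dimensional calculation for a flat scene; the flat-ground assumption, the function name and the numeric values below are illustrative and are not taken from the disclosure.

```python
# Illustrative 2-D cross-section in the spirit of FIG. 2: a camera at height h
# above a flat scene, pitched down at an angle, with a vertical field of view.
# The width of the visible ground span shrinks as the camera advances along
# its shooting direction (and so loses height). Purely a geometric sketch.
import math

def visible_ground_span(cam_height, pitch_deg, fov_deg):
    """Return the near/far ground intersections of the camera frustum edges,
    measured from the point directly below the camera."""
    pitch = math.radians(pitch_deg)
    half_fov = math.radians(fov_deg) / 2
    near = cam_height / math.tan(pitch + half_fov)   # closer edge (like point C)
    far = cam_height / math.tan(pitch - half_fov)    # farther edge (like point B)
    return near, far

# Advancing along the shooting direction lowers the camera, so the rendered
# range becomes smaller while the viewing angle stays the same.
print(visible_ground_span(cam_height=10.0, pitch_deg=45, fov_deg=60))
print(visible_ground_span(cam_height=5.0, pitch_deg=45, fov_deg=60))
```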

In an alternative embodiment, a preset touch operation at the graphical user interface is detected, and a second touch manipulation area is provided on the graphical user interface.

For example, when a preset touch operation such as a heavy press, a long press, or a double-tap on the graphical user interface is detected, a second touch manipulation area is provided in the graphical user interface, and the presentation visual field of the game scene screen on the graphical user interface is configured to be changed according to a second touch operation received by the second touch manipulation area. In this way, the user can call up the second touch manipulation area as needed, which avoids accidental operation and saves screen space.
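
One possible, purely illustrative way of detecting such a preset touch operation is sketched below; the thresholds and class names are assumptions.

```python
# Illustrative detector for the preset touch operation that summons the
# second touch manipulation area. Thresholds and names are assumptions.
LONG_PRESS_SECONDS = 0.5
DOUBLE_TAP_SECONDS = 0.3
HEAVY_PRESS_FORCE = 0.8  # normalised force, if the touch screen reports it

class PresetTouchDetector:
    def __init__(self):
        self._last_tap_time = None

    def classify(self, press_duration, force, now):
        """Return the detected preset operation, or None."""
        if force is not None and force >= HEAVY_PRESS_FORCE:
            return "heavy_press"
        if press_duration >= LONG_PRESS_SECONDS:
            return "long_press"
        # A short press may be the second half of a double-tap.
        if (self._last_tap_time is not None
                and now - self._last_tap_time <= DOUBLE_TAP_SECONDS):
            self._last_tap_time = None
            return "double_tap"
        self._last_tap_time = now
        return None

detector = PresetTouchDetector()
print(detector.classify(press_duration=0.1, force=None, now=10.0))  # None
print(detector.classify(press_duration=0.1, force=None, now=10.2))  # double_tap
```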

In an optional embodiment, an option is provided in the settings of the game software application for the user to select, and whether the function of providing the second touch manipulation area in the graphical user interface is enabled depends on the content of that setting option.

In an optional implementation manner, the foregoing step S120 may be performed before step S110. That is, the above steps S110 and S120 do not have a limitation of the order.

In step S130, the second touch operation located in the second touch manipulation area is detected, and the presentation visual field of the game scene screen on the graphic user interface is changed according to the second touch operation.

The second touch operation is a touch sliding operation, and the presentation visual field of the game scene screen on the graphical user interface is changed according to the sliding trajectory of the touch sliding operation; the adjustment direction of the presentation visual field of the game scene screen on the graphical user interface is the same as the sliding direction, and, when the presentation field of the game scene picture changes, the orientation of the virtual character and the position of the crosshair are not changed.

As shown in FIG. 2 and FIG. 3, when the second touch manipulation area receives a touch sliding operation in the rightward direction, the presentation field of the game scene picture on the graphical user interface changes, which is equivalent to the virtual camera 2 rotating in the negative direction around the Z axis. The angle of rotation is determined by the sliding distance: the larger the sliding distance, the larger the angle of rotation.

As shown in FIG. 3, the user-controlled virtual character 6 is a tank, and both the tank orientation and the weapon sight 7 point to the reference object 8 (for example, a mountain). The user can control the displacement and/or rotation of the tank through a first touch manipulation area 4 (for example, a joystick area) located on the left side of the graphical user interface, and adjust the presentation field of the game scene picture through a second touch manipulation area 5 located on the right side of the graphical user interface (for example, the area having a bounding box on the right side in FIG. 3). When the finger slides left and right in the second touch manipulation area 5, the presentation visual field of the game scene screen is adjusted left and right accordingly, and during this adjustment the orientation of the virtual character 6 (the tank) and the weapon sight 7 remain directed at the reference object 8.
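
A minimal sketch of the behaviour shown in FIG. 3 follows; the gain value and the TankScene name are assumptions, and the point of the sketch is that only the presentation view changes while the tank orientation and the weapon sight do not.

```python
# Illustrative sketch for FIG. 3: sliding in the second touch area rotates
# only the presentation view; the tank's orientation and its weapon sight
# keep pointing at the reference object. The gain value is an assumption.
DEGREES_PER_PIXEL = 0.15

class TankScene:
    def __init__(self):
        self.tank_heading = 90.0   # degrees; pointing at the reference object
        self.sight_heading = 90.0  # weapon crosshair direction
        self.camera_yaw = 90.0     # presentation view direction

    def slide_in_second_area(self, dx_pixels):
        # A larger sliding distance gives a larger rotation of the virtual
        # camera around the Z axis; the tank and the sight are not touched.
        self.camera_yaw -= dx_pixels * DEGREES_PER_PIXEL

scene = TankScene()
scene.slide_in_second_area(dx_pixels=120)
print(scene.camera_yaw, scene.tank_heading, scene.sight_heading)
# -> 72.0 90.0 90.0  (only the presentation view changed)
```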

When a touch sliding operation toward the lower right is received, the presentation field of the game scene screen on the graphical user interface changes accordingly, which corresponds to the virtual camera 2 in FIG. 2 rotating in the negative direction around the Z axis and in the negative direction around the Y axis.

Likewise, a touch sliding operation received in any other direction changes the rendering field of view accordingly.

In an alternative embodiment, the direction of adjustment of the field of view of the game scene picture on the graphical user interface is opposite to the direction of the slide.

For example, as shown in FIG. 3, the user-controlled virtual character is a tank, with the tank orientation and the weapon sight pointing to the mountain. The user can control the displacement and/or rotation of the tank through the first touch manipulation area (for example, a virtual joystick area) located on the left side of the graphical user interface, and adjust the rendering view of the game screen through the second touch manipulation area located on the right side of the graphical user interface (the area with the bounding box on the right side in FIG. 3). When the finger slides to the right in the second touch manipulation area, the presentation field of the game scene screen is correspondingly adjusted to the left, which corresponds to the virtual camera 2 in FIG. 2 rotating in the positive direction around the Z axis.

In an optional embodiment, changing the rendering visual field of the game scene picture on the graphical user interface according to the sliding trajectory of the touch sliding operation is equivalent to changing the position A of the virtual camera 2 and changing its shooting direction.

For example, when the start of the touch sliding operation is detected, the position of the virtual camera 2 in the game scene is changed to a preset position, and the direction of the virtual camera is then changed according to the sliding trajectory of the touch sliding operation; that is, the direction of the presentation field of view of the game scene screen on the graphical user interface is changed according to the sliding trajectory of the touch sliding operation. For example, when the start of the touch sliding operation is detected, the first person view game screen is switched to a third person view game screen; at this time, the position of the virtual camera 2 is changed, and the direction of the field of view of the game scene picture on the graphical user interface is changed according to the sliding trajectory of the touch operation.
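
A possible, assumed implementation of this switch is sketched below: when the sliding operation starts, the camera jumps from the character's eye position to a preset third-person position, and subsequent sliding only changes its direction. The offset, gain and names are assumptions, and for simplicity the offset ignores the character's heading.

```python
# Illustrative sketch: switching from a first-person to a third-person view
# when the touch sliding operation starts. Offset, gain and names are
# assumptions; the preset offset is applied without regard to heading.
from dataclasses import dataclass

@dataclass
class Camera:
    x: float
    y: float
    z: float
    yaw: float
    pitch: float

THIRD_PERSON_OFFSET = (-4.0, 0.0, 2.0)  # assumed preset position behind and above

def on_slide_start(character_pos, character_yaw):
    """Move the virtual camera to the preset third-person position."""
    ox, oy, oz = THIRD_PERSON_OFFSET
    x, y, z = character_pos
    return Camera(x + ox, y + oy, z + oz, yaw=character_yaw, pitch=0.0)

def on_slide_move(camera, dx, dy, gain=0.2):
    """Only the camera direction follows the sliding trajectory."""
    camera.yaw += dx * gain
    camera.pitch -= dy * gain
    return camera

cam = on_slide_start(character_pos=(10.0, 5.0, 0.0), character_yaw=0.0)
cam = on_slide_move(cam, dx=50, dy=0)
print(cam)
```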

The second touch operation may also be a touch sliding operation that changes the position of the virtual camera according to the sliding trajectory of the touch sliding operation, thereby changing the presentation view of the game scene screen on the graphical user interface.

For example, in FIG. 2 and FIG. 3, when the finger slides left and right in the second touch manipulation area 5, the field of view presented by the game screen is adjusted left and right accordingly, which corresponds to the virtual camera 2 in FIG. 2 moving along the Z axis; when the finger slides up and down in the second touch manipulation area 5, the field of view presented by the game screen is adjusted up and down accordingly, which corresponds to the virtual camera 2 in FIG. 2 moving along the Y axis.
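
The alternative in which sliding translates the virtual camera instead of rotating it can be sketched as follows; the axis mapping follows FIG. 2, and the gain and sign conventions are assumptions.

```python
# Illustrative alternative: the sliding trajectory moves the virtual camera
# instead of rotating it. Axis mapping follows FIG. 2; the gain and the sign
# conventions (screen y grows downward) are assumptions.
UNITS_PER_PIXEL = 0.05

def translate_camera(camera_pos, slide_dx, slide_dy):
    """Left/right sliding moves the camera along the Z axis,
    up/down sliding moves it along the Y axis."""
    x, y, z = camera_pos
    z -= slide_dx * UNITS_PER_PIXEL   # slide right -> move toward -Z (assumed sign)
    y -= slide_dy * UNITS_PER_PIXEL   # slide up (negative dy) -> move toward +Y
    return (x, y, z)

print(translate_camera((0.0, 3.0, 0.0), slide_dx=40, slide_dy=-20))
```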

In an optional implementation manner, the second touch operation is a touch click operation, and the presentation visual field of the game scene screen on the graphical user interface is changed according to the position of a preset point in the second touch manipulation area and the click position of the touch click operation.

For example, the preset point is the center point of the second touch manipulation area; when the click position of the touch click operation is on the right side of the center point, the presentation field of view is turned to the right. Likewise, touch click operations at other orientations change the presentation field of view accordingly.

As another example, the preset point is the center point of the second touch manipulation area; when the click position of the touch click operation is on the right side of the center point, the position of the virtual camera is moved to the right. Likewise, touch click operations at other orientations change the presentation field of view accordingly.

In an optional implementation manner, the second touch operation is a touch click operation, and the rendering view of the game scene picture on the graphical user interface is changed according to the position of a preset line in the second touch manipulation area and the click position of the touch click operation. For example, the preset line is the center line of the second touch manipulation area in the horizontal direction: when the click position of the touch click operation is on the right side of the center line, the rendering field of view is turned to the right, and when the click position is on the left side of the center line, the rendering field of view is turned to the left. As another example, the preset line is the center line of the second touch manipulation area in the vertical direction: when the click position of the touch click operation is above the center line, the rendering field of view is turned upward, and when the click position is below the center line, the rendering field of view is turned downward.

For example, the preset line is the center line of the second touch manipulation area in the horizontal direction: when the click position of the touch click operation is on the right side of the center line, the position of the virtual camera is moved to the right, and when the click position is on the left side of the center line, the position of the virtual camera is moved to the left. As another example, the preset line is the center line of the second touch manipulation area in the vertical direction: when the click position of the touch click operation is above the center line, the position of the virtual camera is moved upward, and when the click position is below the center line, the position of the virtual camera is moved downward.
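
A compact sketch of the click-based variant is given below: the click position is compared with a preset point and with preset lines of the second touch manipulation area, and the presentation view is nudged accordingly. The step sizes, the screen-coordinate convention and the function name are assumptions.

```python
# Illustrative click-based adjustment: compare the click position with a
# preset point / preset lines of the second touch manipulation area and
# nudge the presentation view. Step sizes are assumptions.
YAW_STEP = 10.0    # degrees per click, assumed
PITCH_STEP = 5.0   # degrees per click, assumed

def adjust_view_by_click(view_yaw, view_pitch, click, preset_point):
    """Use the preset point for the left/right comparison and the horizontal
    preset line through it for the up/down comparison."""
    cx, cy = preset_point
    x, y = click
    if x > cx:
        view_yaw += YAW_STEP      # click right of the preset point: turn right
    elif x < cx:
        view_yaw -= YAW_STEP      # click left of the preset point: turn left
    if y < cy:
        view_pitch += PITCH_STEP  # click above the line (screen y grows downward)
    elif y > cy:
        view_pitch -= PITCH_STEP  # click below the line
    return view_yaw, view_pitch

print(adjust_view_by_click(0.0, 0.0, click=(130, 40), preset_point=(100, 60)))
# -> (10.0, 5.0): the view turns right and up
```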

In step S140, detecting the end of the second touch operation, controlling the presentation visual field of the game scene screen on the graphical user interface to return to the state before the second touch operation.

The end of the second touch operation refers to a finger or other touch object leaving the touch screen.

For example, when the second touch operation is a touch sliding operation and the user lifts the finger, the current presentation field of view is restored to the state before the touch sliding operation.

The game user can change the direction of the presentation field of view of the game scene picture on the graphical user interface through a touch sliding operation without changing the orientation of the virtual character or the direction of the weapon sight, and, after the touch sliding operation ends, the game screen presented on the terminal is quickly restored. This provides a convenient and quick way to adjust the field of view.

It should be noted that, in the present disclosure, returning the presentation visual field to the state before the second touch operation includes: controlling the rendering visual field of the game scene screen on the graphical user interface to return to the presentation visual field that existed before the second touch operation; or controlling the rendering visual field of the game scene screen on the graphical user interface to return to a rendering visual field calculated according to the rendering-visual-field computing logic used before the second touch operation.

Controlling the presentation visual field of the game scene screen on the graphical user interface to return to the presentation visual field before the second touch operation means restoring the presented visual field range to the state before the second touch operation: the absolute position and the absolute angle/direction of the virtual camera are restored to the state before the second touch operation. For example, if, before the second touch operation, the position of the virtual camera 2 is point A in the absolute coordinates of the game scene and the shooting direction is the direction vector AO, then the rendered field of view is restored absolutely on the basis of point A and direction AO; that is, the rendering view of the game scene screen on the graphical user interface is controlled on the basis of the position of the virtual camera in the absolute coordinates of the game scene and its shooting direction in those absolute coordinates before the second touch operation.

Controlling the rendering visual field of the game scene screen on the graphical user interface to return to a rendering visual field calculated according to the rendering-visual-field computing logic used before the second touch operation means returning the visual field to the control state that applied before the second touch operation. For example, before the second touch operation, the game may calculate the field of view according to predetermined computing logic (for example, the virtual camera is set at the head of the virtual character and rotates following the rotation of the virtual character); in such a case, restoring the field of view to the state before the second touch operation may also mean restoring the computing logic that was used before the second touch operation to calculate the field of view. For example, before the second touch operation, the position of the virtual camera 2 is point A in the relative coordinates associated with the virtual character (for example, a distance W behind the virtual character and a height H), and the shooting direction is the direction vector AO, which is related to the orientation of the virtual character and/or the direction of the weapon sight (for example, the projection of the direction vector AO in the horizontal direction is the same as the orientation of the virtual character in the horizontal direction). When restoring, the position of the virtual camera 2 is again at the point that is a distance W behind the virtual character and at a height H, and the shooting direction of the virtual camera 2 is again associated with the orientation of the virtual character and/or the direction of the weapon sight; that is, the rendering view of the game scene screen on the graphical user interface is controlled on the basis of the current position of the virtual character in the absolute coordinates of the game scene, the current orientation of the virtual character and/or the weapon sight direction of the virtual character, the positional relationship of the virtual camera relative to the virtual character in the game scene before the second touch operation, and the relationship, before the second touch operation, between the orientation of the virtual character and/or the weapon crosshair direction of the virtual character and the shooting direction of the virtual camera.
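
The two restoration behaviours described above can be sketched as follows: an absolute restore re-applies the saved camera position and shooting direction in scene coordinates, whereas a logic-based restore re-runs the follow-the-character computation that was in force before the second touch operation. The follow rule (distance W behind the character, height H) matches the example above, but the concrete code, values and names are assumptions.

```python
# Illustrative sketch of the two restoration modes. The follow rule (camera
# at distance W behind the character, height H, looking along the character's
# heading) and all names are assumptions used only for illustration.
import math
from dataclasses import dataclass

@dataclass
class CameraState:
    position: tuple   # (x, y, z) in scene coordinates
    direction: tuple  # shooting direction vector

def absolute_restore(saved_state):
    """Restore the exact camera position and shooting direction (point A,
    direction AO) that were recorded before the second touch operation."""
    return saved_state

def logic_restore(character_pos, character_heading, w=3.0, h=2.0):
    """Recompute the camera from the pre-operation computing logic:
    W behind the character, at height H, facing the character's heading."""
    cx, cy, cz = character_pos
    dx, dy = math.cos(character_heading), math.sin(character_heading)
    position = (cx - w * dx, cy - w * dy, cz + h)
    direction = (dx, dy, 0.0)
    return CameraState(position, direction)

saved = CameraState(position=(0.0, -3.0, 2.0), direction=(0.0, 1.0, 0.0))
print(absolute_restore(saved))
print(logic_restore(character_pos=(5.0, 5.0, 0.0), character_heading=math.pi / 2))
```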

The scope of the claimed invention should at least include both of the above.

In an optional embodiment, while the rendering view of the game scene picture is being changed through the second touch manipulation area, the displacement and/or rotation of the virtual character may still be changed through the first touch manipulation area. That is, the user can change the orientation of the virtual character and/or the direction of the weapon sight through the first touch operation in the first touch manipulation area while observing the game scene (for example, the enemy situation) through the touch manipulation in the second touch manipulation area, thereby realizing rapid observation and coordinated operation.

In an optional implementation manner, when the end of the second touch operation is detected and no touch operation is received in the second touch manipulation area within a predetermined duration, the presentation visual field of the game scene screen on the graphical user interface is controlled to return to the state before the second touch operation.
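
This delayed restoration can be sketched with a simple timestamp check, as below; the delay value and the names are assumptions.

```python
# Illustrative delayed restore: the view returns to its pre-operation state
# only if no touch has been received in the second touch manipulation area
# for a predetermined duration. The delay value is an assumption.
RESTORE_DELAY_SECONDS = 1.0

class DelayedRestore:
    def __init__(self):
        self.saved_view = None
        self.last_touch_end_time = None

    def on_second_touch_end(self, now):
        self.last_touch_end_time = now

    def on_second_touch(self):
        # Any new touch in the second area cancels the pending restore.
        self.last_touch_end_time = None

    def maybe_restore(self, current_view, now):
        if (self.saved_view is not None
                and self.last_touch_end_time is not None
                and now - self.last_touch_end_time >= RESTORE_DELAY_SECONDS):
            self.last_touch_end_time = None
            return self.saved_view
        return current_view

restorer = DelayedRestore()
restorer.saved_view = "view_before_operation"   # recorded when the touch began
restorer.on_second_touch_end(now=10.0)
print(restorer.maybe_restore("changed_view", now=10.5))  # still the changed view
print(restorer.maybe_restore("changed_view", now=11.2))  # restored view
```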

By providing the second touch manipulation area in the graphical user interface and detecting the second touch operation occurring in the second touch manipulation area, the rendering view of the game scene on the graphical user interface can be changed according to the second touch operation received by the second touch manipulation area, and, when the second touch operation ends, the rendering view of the game scene on the graphical user interface returns to its original state. The second touch operation performed by the user in the second touch manipulation area thus changes the presentation field of view of the game scene on the graphical user interface only temporarily, with the field of view returning to the state before the second touch operation when that operation ends. This provides the user with a convenient and quick way to adjust the visual field, satisfies the user's needs and improves the user experience. It is to be noted that the above drawings are merely illustrative of the processes included in the method according to the exemplary embodiments of the present disclosure and are not intended to be limiting. It is easy to understand that the processes shown in the above figures do not indicate or limit the chronological order of these processes. In addition, it is also easy to understand that these processes may be performed synchronously or asynchronously, for example, in a plurality of modules.

A display control device for a game screen is also disclosed in the exemplary embodiment. Referring to FIG. 4, the game screen includes a graphical user interface obtained by executing a software application on a processor of the mobile terminal and rendering it on a display of the mobile terminal, the content presented by the graphical user interface including a game scene image and at least partially including a virtual character. The display control device 100 of the game screen may include a first providing module 101, a second providing module 102, a first detecting module 103 and a second detecting module 104, wherein:

The first providing module 101 may be configured to provide a first touch manipulation area on the graphical user interface, and to configure the virtual character to be displaced and/or rotated in the game scene picture according to a first touch operation received by the first touch manipulation area;

The second providing module 102 may be configured to provide a second touch manipulation area on the graphical user interface, and to configure a presentation visual field of the game scene screen on the graphical user interface to be changed according to a second touch operation received by the second touch manipulation area;

The first detecting module 103 may be configured to detect the second touch operation located in the second touch manipulation area, and change a presentation view of the game scene screen on the graphic user interface according to the second touch operation;

The second detecting module 104 may be configured to detect the end of the second touch operation, and control a rendering visual field of the game scene screen on the graphical user interface to return to a state before the second touch operation.

The specific details of each module of the above display control device for a game screen have been described in detail in the corresponding display control method of the game screen, and therefore will not be repeated here.

It should be noted that although several modules or units of equipment for action execution are mentioned in the detailed description above, such division is not mandatory. Indeed, in accordance with embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one of the modules or units described above may be further divided into multiple modules or units.

In an exemplary embodiment of the present disclosure, there is also provided a computer readable storage medium having stored thereon a computer program that, when executed by a processor, implements the display control method of the game screen described above.

The computer readable storage medium may include a data signal propagated in baseband or as part of a carrier wave, carrying readable program code. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. The computer readable storage medium can transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.

Program code embodied in a computer readable storage medium may be transmitted using any suitable medium, including but not limited to wireless, wireline, optical cable, radio frequency, etc., or any suitable combination of the foregoing.

In an exemplary embodiment of the present disclosure, an electronic device is also provided. As shown in FIG. 5, the electronic device 200 includes a processing component 201, which may further include one or more processors, and memory resources represented by a memory 202 for storing instructions executable by the processing component 201, such as an application program. The application program stored in the memory 202 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 201 is configured to execute the instructions so as to perform the display control method of the game screen described above.

The electronic device 200 may further include: a power supply component configured to perform power management of the electronic device 200; a wired or wireless network interface 203 configured to connect the electronic device 200 to a network; and an input/output (I/O) interface 204. The electronic device 200 may operate based on an operating system stored in the memory 202, such as Android, iOS, Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.

Through the description of the above embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to an embodiment of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes a number of instructions to cause a computing device (which may be a personal computer, a server, an electronic device, a network device, etc.) to perform the method in accordance with an embodiment of the present disclosure.

Other embodiments of the present disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure described herein. The present application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include such departures from the present disclosure as come within common general knowledge or customary technical means in the art. The specification and examples are to be regarded as illustrative only.

It is to be understood that the present disclosure is not limited to the precise constructions described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the disclosure is to be limited only by the appended claims.

Claims (15)

  1. A display control method for a game screen, the game screen comprising a graphical user interface obtained by executing a software application on a processor of a mobile terminal and rendering it on a display of the mobile terminal, the content presented by the graphical user interface comprising a game scene screen and at least partially comprising a virtual character, the method comprising:
    Providing a first touch manipulation area at the graphical user interface, and configuring the virtual character to be displaced and/or rotated in the game scene screen according to a first touch operation received by the first touch manipulation area;
    Providing a second touch manipulation area on the graphical user interface, and configuring a presentation visual field of the game scene screen on the graphical user interface to be changed according to a second touch operation received by the second touch manipulation area;
    Detecting the second touch operation located in the second touch manipulation area, and changing the presentation visual field of the game scene screen on the graphical user interface according to the second touch operation;
    Detecting the end of the second touch operation, and controlling the presentation visual field of the game scene screen on the graphical user interface to return to the state before the second touch operation.
  2. The display control method of a game screen according to claim 1, wherein the first touch manipulation area is a virtual joystick manipulation area.
  3. The display control method of a game screen according to claim 1, wherein the second touch operation is a touch slide operation.
  4. The display control method of the game screen according to claim 3, wherein the changing the presentation visual field of the game scene screen on the graphical user interface according to the second touch operation comprises:
    Changing the presentation visual field of the game scene screen on the graphical user interface according to a sliding trajectory of the touch sliding operation.
  5. The display control method of the game screen according to claim 3, wherein the changing the presentation visual field of the game scene screen on the graphical user interface according to the second touch operation comprises:
    Changing the position of the virtual camera in the game scene to a preset position;
    Changing the direction of the virtual camera according to a sliding trajectory of the touch sliding operation.
  6. The display control method of a game screen according to claim 3, wherein the game screen is a first person perspective game screen, and the changing the presentation visual field of the game scene screen on the graphical user interface according to the second touch operation comprises:
    Switching the first person perspective game screen to a third person perspective game screen, and changing a direction of the presentation visual field of the game scene screen on the graphical user interface according to the sliding trajectory of the touch sliding operation.
  7. The display control method of a game screen according to claim 1, wherein the second touch operation is a touch click operation.
  8. The display control method of the game screen according to claim 7, wherein the changing the presentation visual field of the game scene screen on the graphical user interface according to the second touch operation comprises:
    Changing the presentation visual field of the game scene screen on the graphical user interface according to a position of a preset point in the second touch manipulation area and a click position of the touch click operation.
  9. The display control method of the game screen according to claim 7, wherein the changing the presentation visual field of the game scene screen on the graphical user interface according to the second touch operation comprises:
    Changing the presentation visual field of the game scene screen on the graphical user interface according to a position of a preset line in the second touch manipulation area and a click position of the touch click operation.
  10. The display control method of the game screen according to any one of claims 1 to 6, wherein the providing the second touch manipulation area in the graphical user interface comprises:
    Detecting a preset touch operation on the graphical user interface, and presenting the second touch manipulation area on the graphical user interface.
  11. The display control method of the game screen according to claim 10, wherein the preset touch operation comprises any one of the following: a heavy press, a long press, and a double-click.
  12. The display control method of the game screen according to claim 1, wherein the controlling the presentation visual field of the game scene screen on the graphical user interface to return to the state before the second touch operation comprises:
    Controlling the presentation visual field of the game scene screen on the graphical user interface to return to the presentation visual field in effect before the second touch operation; or
    Controlling the presentation visual field of the game scene screen on the graphical user interface to return to a presentation visual field calculated according to the presentation visual field calculation logic in effect before the second touch operation.
  13. A display control device for a game screen, the game screen comprising a graphical user interface obtained by executing a software application on a processor of a mobile terminal and rendering it on a display of the mobile terminal, the content presented by the graphical user interface comprising a game scene screen and at least partially comprising a virtual character, the device comprising:
    a first providing module, configured to provide a first touch manipulation area on the graphical user interface, and to configure the virtual character to perform displacement and/or rotation in the game scene screen according to a first touch operation received by the first touch manipulation area;
    a second providing module, configured to provide a second touch manipulation area on the graphical user interface, and to configure a presentation visual field of the game scene screen on the graphical user interface to change according to a second touch operation received by the second touch manipulation area;
    a first detecting module, configured to detect the second touch operation located in the second touch manipulation area, and to change the presentation visual field of the game scene screen on the graphical user interface according to the second touch operation;
    a second detecting module, configured to detect the end of the second touch operation, and to control the presentation visual field of the game scene screen on the graphical user interface to return to a state before the second touch operation.
  14. A computer readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the display control method of the game screen according to any one of claims 1 to 12.
  15. An electronic device comprising:
    a processor; and
    a memory configured to store executable instructions of the processor;
    wherein the processor is configured to perform the display control method of the game screen according to any one of claims 1 to 12 by executing the executable instructions.
PCT/CN2018/079756 2017-03-27 2018-03-21 Display control method and apparatus for game picture, storage medium and electronic device WO2018177170A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710188700.1 2017-03-27
CN201710188700.1A CN106975219B (en) 2017-03-27 2017-03-27 Display control method and device, storage medium, the electronic equipment of game picture

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019518993A JP2020504851A (en) 2017-03-27 2018-03-21 Game screen display control method, device, storage medium, and electronic device
US16/346,141 US20190299091A1 (en) 2017-03-27 2018-03-21 Display control method and apparatus for game screen, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2018177170A1 true WO2018177170A1 (en) 2018-10-04

Family

ID=59339039

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/079756 WO2018177170A1 (en) 2017-03-27 2018-03-21 Display control method and apparatus for game picture, storage medium and electronic device

Country Status (4)

Country Link
US (1) US20190299091A1 (en)
JP (1) JP2020504851A (en)
CN (2) CN108905212B (en)
WO (1) WO2018177170A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109847354A (en) * 2018-12-19 2019-06-07 网易(杭州)网络有限公司 The method and device of virtual lens control in a kind of game

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108905212B (en) * 2017-03-27 2019-12-31 网易(杭州)网络有限公司 Game screen display control method and device, storage medium and electronic equipment
CN107617213B (en) 2017-07-27 2019-02-19 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN107648847B (en) 2017-08-22 2020-09-22 网易(杭州)网络有限公司 Information processing method and device, storage medium and electronic equipment
CN107741819B (en) 2017-09-01 2018-11-23 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107715454B (en) * 2017-09-01 2018-12-21 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107694086B (en) * 2017-10-13 2018-11-23 网易(杭州)网络有限公司 Information processing method and device, storage medium, the electronic equipment of game system
CN107890664A (en) 2017-10-23 2018-04-10 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN107930105A (en) * 2017-10-23 2018-04-20 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN107930114A (en) * 2017-11-09 2018-04-20 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN107823882A (en) * 2017-11-17 2018-03-23 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107832001A (en) * 2017-11-17 2018-03-23 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107982916B (en) * 2017-11-17 2020-11-06 网易(杭州)网络有限公司 Information processing method, information processing device, electronic equipment and storage medium
CN107844197A (en) * 2017-11-28 2018-03-27 歌尔科技有限公司 Virtual reality scenario display methods and equipment
CN109833624A (en) * 2017-11-29 2019-06-04 腾讯科技(成都)有限公司 The display methods and device for line information of marching on virtual map
CN108211358B (en) * 2017-11-30 2020-02-28 腾讯科技(成都)有限公司 Information display method and device, storage medium and electronic device
CN109568956B (en) * 2019-01-10 2020-03-10 网易(杭州)网络有限公司 In-game display control method, device, storage medium, processor and terminal
CN110585707B (en) * 2019-09-20 2020-12-11 腾讯科技(深圳)有限公司 Visual field picture display method, device, equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104076986A (en) * 2014-07-25 2014-10-01 上海逗屋网络科技有限公司 Touch control method and equipment used for multi-touch screen terminal
CN105607851A (en) * 2015-12-18 2016-05-25 上海逗屋网络科技有限公司 Scene control method and device for touch terminal
JP2016171874A (en) * 2015-03-16 2016-09-29 株式会社バンダイナムコエンターテインメント Game device and program
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006304985A (en) * 2005-04-27 2006-11-09 Aruze Corp Game machine
EP2393000B1 (en) * 2010-06-04 2019-08-07 Lg Electronics Inc. Mobile terminal capable of providing multiplayer game and method of controlling operation of the mobile terminal
CN104436657B (en) * 2014-12-22 2018-11-13 青岛烈焰畅游网络技术有限公司 Game control method, device and electronic equipment
JP6018265B2 (en) * 2015-07-13 2016-11-02 株式会社Cygames GAME CONTROL PROGRAM, GAME CONTROL METHOD, AND GAME CONTROL DEVICE
CN105094920B (en) * 2015-08-14 2018-07-03 网易(杭州)网络有限公司 A kind of game rendering intent and device
CN105148520A (en) * 2015-08-28 2015-12-16 上海甲游网络科技有限公司 Method and device for automatic aiming of shooting games
CN105148514A (en) * 2015-09-06 2015-12-16 骆凌 Device and method for controlling game view angle
CN105094345B (en) * 2015-09-29 2018-07-27 腾讯科技(深圳)有限公司 A kind of information processing method, terminal and computer storage media
CN109432766A (en) * 2015-12-24 2019-03-08 网易(杭州)网络有限公司 A kind of game control method and device
CN106110659B (en) * 2016-07-15 2019-08-06 网易(杭州)网络有限公司 The processing method and processing device of game account
CN106296786B (en) * 2016-08-09 2019-02-15 网易(杭州)网络有限公司 The determination method and device of scene of game visibility region
CN106502670A (en) * 2016-10-20 2017-03-15 网易(杭州)网络有限公司 A kind of scene of game changing method and device
CN106492457A (en) * 2016-10-20 2017-03-15 北京乐动卓越科技有限公司 A kind of implementation method of full 3D actions mobile phone games fight interactive system and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104076986A (en) * 2014-07-25 2014-10-01 上海逗屋网络科技有限公司 Touch control method and equipment used for multi-touch screen terminal
JP2016171874A (en) * 2015-03-16 2016-09-29 株式会社バンダイナムコエンターテインメント Game device and program
CN105607851A (en) * 2015-12-18 2016-05-25 上海逗屋网络科技有限公司 Scene control method and device for touch terminal
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture

Also Published As

Publication number Publication date
CN108905212B (en) 2019-12-31
CN106975219B (en) 2019-02-12
CN106975219A (en) 2017-07-25
US20190299091A1 (en) 2019-10-03
CN108905212A (en) 2018-11-30
JP2020504851A (en) 2020-02-13

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18777028

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019518993

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18777028

Country of ref document: EP

Kind code of ref document: A1
