CN107982916B - Information processing method, information processing device, electronic equipment and storage medium - Google Patents

Information processing method, information processing device, electronic equipment and storage medium

Info

Publication number
CN107982916B
CN107982916B (application CN201711148063.1A)
Authority
CN
China
Prior art keywords
game scene, graphical user interface, visual field, touch operation
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN201711148063.1A
Other languages
Chinese (zh)
Other versions
CN107982916A (en)
Inventor
翟公望
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201711148063.1A
Publication of CN107982916A
Application granted
Publication of CN107982916B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/424: Processing input control signals involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • A63F 13/426: Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals involving aspects of the displayed game scene
    • A63F 13/525: Changing parameters of virtual cameras
    • A63F 13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F 13/53: Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537: Controlling the output signals using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/5375: Controlling the output signals using indicators for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
    • A63F 13/54: Controlling the output signals involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/837: Shooting of targets

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an information processing method, an information processing apparatus, an electronic device, and a storage medium. The method includes: when a sound within a preset range of a game scene is detected, acquiring sound source azimuth information of the sound and providing a first visual field control area in the graphical user interface; and when a third touch operation acting on the first visual field control area is detected, controlling the presentation field of view of the game scene on the graphical user interface according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presentation field of view of the game scene is oriented toward the sound source. The invention solves the technical problem that a sound source cannot be locked quickly with existing mobile-terminal interaction modes.

Description

Information processing method, information processing device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of games, in particular to an information processing method, an information processing device, electronic equipment and a storage medium.
Background
With the development of mobile intelligent terminals and the game industry, a large number of mobile games on different themes have emerged to meet the needs of players. In many shooting-type game applications, the player often needs to observe the surrounding environment in real time.
In existing shooting-type mobile games, movement is generally controlled with the left hand while the field of view of the game scene is adjusted with the right hand. This interaction mode does not allow a specific area to be observed quickly; it increases the player's action load and reduces operation efficiency, while also raising the operation threshold for novice players and degrading the game experience.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
An object of the present invention is to provide an information processing method, an information processing apparatus, an electronic device, and a computer-readable storage medium, which overcome one or more of the problems due to the limitations and disadvantages of the related art, at least to some extent.
Additional features and advantages of the invention will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present invention, there is provided an information processing method applied to a touch terminal capable of presenting a graphical user interface, where contents presented by the graphical user interface at least partially include a game scene and a virtual subject, the method including:
providing a movement control area, and, when a first touch operation acting on the movement control area is detected, controlling the virtual subject to move in the game scene according to the movement of the touch point of the first touch operation;
providing an orientation control area, and, when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual subject in the game scene according to the movement of the touch point of the second touch operation;
controlling a presentation field of view of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
when a sound within a preset range of the game scene is detected, acquiring sound source azimuth information of the sound, and providing a first visual field control area in the graphical user interface;
when a third touch operation acting on the first visual field control area is detected, controlling the presentation field of view of the game scene on the graphical user interface according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presentation field of view of the game scene is oriented toward the sound source.
Optionally, adjusting a rendering field of view of the game scene on the graphical user interface according to the position of the virtual subject in the game scene and the sound source orientation information, so as to orient the rendering field of view of the game scene towards the sound source, comprising:
and controlling the direction of the virtual camera corresponding to the graphical user interface according to the direction indicated by the sound source orientation information so as to enable the presenting visual field of the game scene to face the sound source.
Optionally, the method further comprises:
when the presentation field of view of the game scene on the graphical user interface is oriented toward the sound source, maintaining the orientation that the virtual subject had in the game scene before the third touch operation.
Optionally, the method further comprises:
when the presentation field of view of the game scene on the graphical user interface is oriented toward the sound source, continuing to control the virtual subject to move in the game scene according to the movement of the touch point of the first touch operation.
Optionally, the method further comprises:
and when the third touch operation acting on the first visual field control area is detected to be finished, controlling the presenting visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation.
Optionally, the method further comprises:
and when a fourth touch operation acting on a cancel operation area is detected, controlling the display visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation.
Optionally, controlling the presentation field of view of the game scene on the graphical user interface to be restored to the state before the third touch operation includes:
controlling the presentation field of view of the game scene on the graphical user interface to be restored to the presentation field of view before the third touch operation; or
controlling the presentation field of view of the game scene on the graphical user interface to be restored to the field of view calculated according to the presentation-field-of-view calculation logic in effect before the third touch operation.
Optionally, the method further comprises: hiding the first visual field control area.
Optionally, the method comprises:
while the presentation field of view of the game scene on the graphical user interface is oriented toward the sound source, when a preset action of the third touch operation is detected, adjusting the presentation field of view of the game scene on the graphical user interface according to the preset action.
Optionally, the preset action of the third touch operation is a touch sliding operation.
Optionally, adjusting the presentation field of view of the game scene on the graphical user interface according to the preset action includes:
and adjusting the presenting visual field of the game scene on the graphical user interface according to the sliding track of the touch sliding operation.
Optionally, the preset action of the third touch operation is a touch click operation.
Optionally, adjusting the presentation field of view of the game scene on the graphical user interface according to the preset action includes:
and changing the presentation visual field of the game scene on the graphical user interface according to the position of a preset point in the first visual field control area and the click position of the touch click operation.
Optionally, after obtaining the sound source orientation information of the sound, the method further comprises:
providing a first graphic identification according to the current position of the virtual main body and the azimuth information of the sound source;
the first graphical identification is used to indicate an azimuthal relationship between the sound source and the virtual subject position.
Optionally, after obtaining the sound source orientation information of the sound, the method further comprises:
providing a second graphic identification according to the current position and orientation of the virtual main body and the sound source azimuth information;
the second graphical identification is used to indicate a positional relationship between the sound source and the virtual subject orientation.
Optionally, the method further comprises:
providing a second visual field control area;
when a fifth touch operation acting on the second visual field control area is detected, changing the presentation visual field of the game scene on the graphical user interface according to the fifth touch operation;
and when the end of the fifth touch operation is detected, controlling the display visual field of the game scene on the graphical user interface to be restored to the state before the fifth touch operation.
According to a second aspect of the present invention, there is provided an information processing apparatus for a touch terminal capable of presenting a graphical user interface, the content presented by the graphical user interface at least partially including a game scene, and a virtual subject, the apparatus comprising:
the first interaction unit is used for providing a movement control area, and when a first touch operation acting on the movement control area is detected, the virtual main body is controlled to move in a game scene according to the movement of a touch point of the first touch operation;
the second interaction unit is used for providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, the orientation of the virtual main body in the game scene is controlled according to the movement of the touch point of the second touch operation;
the first control unit is used for controlling the presentation visual field of the game scene on the graphical user interface according to the position and the orientation of the virtual main body in the game scene;
the display unit is used for detecting sound in a preset range in a game scene, acquiring sound source azimuth information of the sound and providing a first visual field control area in a graphical user interface;
and the second control unit is used for controlling the presenting visual field of the game scene on the graphical user interface according to the position of the virtual main body in the game scene and the sound source azimuth information when detecting the third touch operation acting on the first visual field control area, so that the presenting visual field of the game scene faces to the sound source.
According to a third aspect of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information processing method of any one of the above.
According to a fourth aspect of the present invention, there is provided an electronic apparatus comprising: a processor; and a memory for storing executable instructions for the processor; wherein the processor is configured to perform the information processing method of any one of the above via execution of executable instructions.
In an information processing method, an information processing apparatus, an electronic device, and a computer-readable storage medium according to an exemplary embodiment of the present invention, when a sound within a preset range in a game scene is detected, sound source direction information of the sound is acquired, and a first visual field control area is provided in a graphical user interface; when a third touch operation acting on the first visual field control area is detected, the presenting visual field of the game scene on the graphical user interface is controlled according to the position of the virtual main body in the game scene and the sound source azimuth information, so that the presenting visual field of the game scene faces the sound source.
The method provided by the invention allows the user to switch conveniently and quickly from the original field-of-view control to a new one in order to observe the direction of the sound source, and to switch back to the original field-of-view control scheme when it is no longer needed. That is, the player can quickly look toward the sound source; for example, if the sound is gunfire, the player can quickly check the direction of the gunshots and take cover or adjust the game strategy in time.
Because the player does not need to spend excessive effort searching for the sound source through sliding operations, this convenient and efficient interaction improves operation efficiency and brings a smoother game experience; at the same time, it lowers the operation threshold for novice players and solves the technical problem that a sound source cannot be locked quickly with existing mobile-terminal interaction modes.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 schematically illustrates a flow chart of an information processing method in an exemplary embodiment of the disclosure;
FIG. 2 schematically illustrates a first graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 3 schematically illustrates a second graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a third graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 5 schematically illustrates a fourth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 6 schematically illustrates a fifth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 7 schematically illustrates a sixth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 8 schematically illustrates a seventh graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 9 schematically illustrates an eighth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with one embodiment of the present invention, there is provided an information processing method, wherein the steps shown in the flowchart of the figure may be executed in a computer system such as a set of computer executable instructions, and wherein, although a logical order is shown in the flowchart, in some cases, the steps shown or described may be executed in an order different from that shown.
The exemplary embodiment first discloses an information processing method, which is applied to a touch terminal capable of presenting a graphical user interface, where the touch terminal may be various electronic devices with touch screens, such as a mobile phone, a tablet computer, a notebook computer, a game machine, and a PDA. The graphical user interface may be obtained by executing a software application on a processor of the touch terminal and rendering on a display of the touch terminal, the content presented by the graphical user interface at least partially comprising a game scene and a virtual body.
As shown in fig. 1, the information processing method may include the following steps:
Step S110, providing a movement control area, and, when a first touch operation acting on the movement control area is detected, controlling the virtual subject to move in the game scene according to the movement of the touch point of the first touch operation;
Step S130, providing an orientation control area, and, when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual subject in the game scene according to the movement of the touch point of the second touch operation;
Step S150, controlling the presentation field of view of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
Step S170, when a sound within a preset range of the game scene is detected, acquiring sound source azimuth information of the sound, and providing a first visual field control area in the graphical user interface;
Step S190, when a third touch operation acting on the first visual field control area is detected, controlling the presentation field of view of the game scene on the graphical user interface according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presentation field of view of the game scene is oriented toward the sound source.
The method provided by the invention allows the user to switch conveniently and quickly from the original field-of-view control to a new one in order to observe the direction of the sound source, and to switch back to the original field-of-view control scheme when it is no longer needed. That is, the player can quickly look toward the sound source and observe the game scene near it; for example, if the sound is gunfire, the player can quickly check the direction of the gunshots and take cover or adjust the game strategy in time.
Because the player does not need to spend excessive effort searching for the sound source through sliding operations, this convenient and efficient interaction improves operation efficiency and brings a smoother game experience; at the same time, it lowers the operation threshold for novice players and solves the technical problem that a sound source cannot be locked quickly with existing mobile-terminal interaction modes.
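For illustration only, the interplay of steps S110 to S190 can be condensed into a short sketch before each step is walked through in detail. The class, its method names, and the flat 2-D geometry are all assumptions made for readability; the patent does not prescribe any concrete API:

```python
import math

class SoundAwareView:
    """Illustrative sketch of steps S110-S190; all names are hypothetical."""

    def __init__(self):
        self.subject_pos = [0.0, 0.0]   # virtual subject position in the scene
        self.subject_yaw = 0.0          # virtual subject orientation, radians
        self.camera_yaw = 0.0           # direction of the presentation field of view
        self.sound_source = None        # position of a detected nearby sound, if any

    def on_move(self, dx, dy):          # S110: first touch operation moves the subject
        self.subject_pos[0] += dx
        self.subject_pos[1] += dy

    def on_orient(self, d_yaw):         # S130: second touch operation turns the subject
        self.subject_yaw += d_yaw

    def update_view(self):              # S150: view follows the subject by default
        if self.sound_source is None:
            self.camera_yaw = self.subject_yaw

    def on_sound_detected(self, pos):   # S170: sound within the preset range;
        self.sound_source = pos         # here the first visual field control area appears

    def on_view_control_touch(self):    # S190: third touch operation aims the view
        dx = self.sound_source[0] - self.subject_pos[0]
        dy = self.sound_source[1] - self.subject_pos[1]
        self.camera_yaw = math.atan2(dy, dx)   # face the sound source

view = SoundAwareView()
view.on_sound_detected((10.0, 10.0))
view.on_view_control_touch()
print(round(math.degrees(view.camera_yaw)))    # 45: view now faces the source
```

Note that the third touch operation only re-aims the presentation field of view at the stored sound-source position; the virtual subject's own position and orientation are left untouched, mirroring the behavior described below.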
Next, the steps of the information processing method in the present exemplary embodiment are further described with reference to fig. 2 to 9.
In the exemplary embodiment, a software application is executed on a processor of the mobile terminal and rendered on a touch-sensitive display of the mobile terminal resulting in a graphical user interface 200, the content presented by the graphical user interface 200 at least partially comprising a game scene 210, and a virtual subject 220.
The content presented by the graphical user interface 200 may include all of the game scene 210 or may be a portion of the game scene 210. For example, in an embodiment of the present invention, as shown in fig. 2, since the game scene 210 is relatively large, the partial content of the game scene 210 is displayed on the graphic user interface 200 of the mobile terminal during the game.
Step S110, providing a movement control area, and when a first touch operation acting on the movement control area is detected, controlling the virtual body to move in the game scene according to the movement of the touch point of the first touch operation.
In the present exemplary embodiment, as shown in fig. 2, a movement control area 230 is provided in the graphical user interface 200, and when a first touch operation applied to the movement control area 230 is detected, the virtual body 220 is controlled to move in the game scene 210 according to the movement of the touch point of the first touch operation.
Specifically, the movement control area 230 may be an area having a visual indication effect in the graphic user interface 200, or may be an area having no visual indication effect; an operation area such as a virtual joystick or a direction control virtual key may be displayed in the movement control area 230, which is not limited in the present exemplary embodiment.
In an embodiment of the present invention, the movement control area 230 is a virtual joystick control area located at the lower left of the graphical user interface 200, and the virtual body 220 is controlled to move in the game scene 210 according to the first touch operation received by the virtual joystick control area.
It is understood that, in other embodiments, the movement control area 230 may also be a virtual cross key area/virtual direction key (D-PAD) area, and the virtual body 220 is controlled to move in the game scene 210 according to the first touch operation received by the virtual cross key area.
As an alternative embodiment, the movement control area 230 may be a visually indicated area of the graphical user interface 200, for example, the movement control area 230 may have a bounding box, or a range of fill colors, or a range of predetermined transparencies, or in some other manner that visually distinguishes the movement control area 230. The virtual body 220 is controlled to move in the game scene 210 according to the first touch operation received by the movement control area 230. The movement control area 230 with the visual indication enables the user to quickly locate the area, which can reduce the difficulty of operation by a novice game player.
As another alternative, the movement control area 230 may be an area of the graphical user interface 200 that has no visual indication. A movement control area 230 without a visual indication does not cover or affect the game screen, provides a better picture effect, and saves screen space. However, because it has no visual indication, it is not easily perceived by the player; as an improved embodiment, a visual guidance control may be displayed in the movement control area 230. For example, in an embodiment of the present invention, when a virtual joystick is used as the direction control scheme of the virtual body 220, a virtual joystick may be displayed in the movement control area 230 to visually guide the player.
Step S130, providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual body in the game scene according to the movement of the touch point of the second touch operation.
In the present exemplary embodiment, as shown in fig. 2, an orientation control area 240 is provided in the graphical user interface 200, and when a second touch operation applied to the orientation control area 240 is detected, the orientation of the virtual body 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation.
Specifically, the orientation control region 240 and the movement control region 230 are disposed on different sides of the graphical user interface 200; e.g., the orientation control region 240 may be located anywhere on the right side of the graphical user interface and the corresponding movement control region 230 anywhere on the left side. Preferably, in an embodiment of the present invention, as shown in fig. 2, the orientation control area 240 is disposed at a lower-right position of the graphical user interface 200 for controlling the orientation of the virtual body 220 in the game scene 210, and the movement control area 230 is disposed at a lower-left position for controlling the virtual body 220 to move in the game scene 210; thus, the user can control the movement of the virtual body 220 with the left hand and its orientation with the right hand.
The orientation control area 240 may be an area of the graphical user interface 200 having a visual indication effect or an area having no visual indication effect; an operation area such as a virtual joystick or a direction control virtual key may be displayed in the direction control area 240, which is not limited in the present exemplary embodiment.
In an embodiment of the present invention, the orientation control area 240 is a virtual joystick control area located at the lower right of the graphical user interface 200, and the orientation of the virtual body 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation received by the virtual joystick control area.
It is understood that, in other embodiments, the orientation control area 240 may also be a virtual cross key area/virtual direction key (D-PAD) area, and controls the orientation of the virtual body 220 in the game scene 210 according to the movement of the touch point of the second touch operation received by the virtual cross key area.
As an alternative embodiment, the orientation control area 240 may be a visually indicated area of the graphical user interface 200; for example, it may have a bounding box, a range of fill colors, a range of predetermined transparencies, or some other appearance that visually distinguishes it. The orientation of the virtual body 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation received by the orientation control area 240. An orientation control area 240 with a visual indication enables the user to locate the area quickly, which reduces the difficulty of operation for novice players.
As another alternative, the orientation control area 240 may be an area of the graphical user interface 200 that has no visual indication. An orientation control area 240 without a visual indication does not obscure or affect the game screen, provides a better picture effect, and saves screen space. However, because it has no visual indication, it is not easily perceived by the player; as an improved embodiment, a visual guidance control may be displayed in the orientation control area 240. For example, in an embodiment of the present invention, when a virtual joystick is used as the orientation control scheme of the virtual body 220, a virtual joystick may be displayed in the orientation control area 240 to visually guide the player.
Step S150, controlling the presentation field of view of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene.
In the present exemplary embodiment, the position and orientation of the virtual camera corresponding to the graphical user interface 200 are controlled according to the position and orientation of the virtual subject 220 in the game scene 210, thereby controlling the presentation field of view of the game scene 210 on the graphical user interface 200.
In a first-person game, the virtual camera acts as the user's "eye" in the game: it may be disposed on the head of the virtual subject, its orientation rotates as the virtual subject rotates, and the game scene content rendered on the display of the touch terminal corresponds to the scene content captured by the virtual camera. In a third-person game, the virtual camera may be disposed above and behind the virtual subject so that the game scene around the subject can be captured. A mapping relation may be set between the displacement vector of the virtual joystick and the rotation angle of the virtual camera in order to control the camera's rotation.
Specifically, in step S110 the virtual subject 220 is controlled to move in the game scene according to the movement of the touch point of the first touch operation, which in turn moves the virtual camera corresponding to the graphical user interface 200; meanwhile, in step S130 the orientation of the virtual subject 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation, which in turn turns that virtual camera. The real-time presentation field of view of the game scene 210 on the graphical user interface 200 is thus determined by the position of the virtual subject 220 in the game scene 210 and the resulting position and orientation of the virtual camera.
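The joystick-to-camera mapping mentioned above can be made concrete with a small sketch. The sensitivity constants below are assumptions for illustration; the patent does not specify any particular mapping:

```python
import math

MAX_STICK_RADIUS = 120.0             # joystick travel in screen pixels (assumed)
MAX_TURN_RATE = math.radians(180.0)  # turn rate at full deflection, rad/s (assumed)

def camera_yaw_delta(stick_dx, frame_dt):
    """Map the joystick's horizontal displacement to a camera yaw increment."""
    deflection = max(-1.0, min(1.0, stick_dx / MAX_STICK_RADIUS))
    return deflection * MAX_TURN_RATE * frame_dt

# Full right deflection over one 16 ms frame turns the camera about 2.9 degrees.
print(math.degrees(camera_yaw_delta(120.0, 0.016)))
```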
Step S170, when detecting a sound within a preset range in the game scene, acquiring sound source direction information of the sound, and providing a first view control area in the graphical user interface.
In the present exemplary embodiment, the preset range may be the auditory range of the virtual subject 220, or its visual range; the source of the sound may be located within the portion of the game scene 210 presented on the graphical user interface 200 or outside it. For example, as shown in fig. 3, when a sound signal within the hearing range of the virtual subject 220, originating in a part of the game scene 210 outside the graphical user interface 200, is detected, sound source azimuth information of the sound source 250 of the sound signal is acquired, and a first visual field control region 260 is provided in the graphical user interface 200.
In the present exemplary embodiment, the first visual field control region 260 and the movement control region 230 are disposed on different sides of the graphical user interface 200; e.g., the first visual field control region 260 may be located at any position on the right side of the graphical user interface 200 and the corresponding movement control region 230 at any position on the left side. Preferably, in an embodiment of the present invention, as shown in fig. 3, the first visual field control region 260 is disposed on the right side of the graphical user interface 200, above the orientation control region 240, for controlling the presentation field of view of the game scene 210; the movement control area 230 is disposed at a lower-left position of the graphical user interface 200 for controlling the virtual subject 220 to move in the game scene 210. Thus, the user can switch the presentation field of view of the game scene 210 with the right hand while controlling the virtual subject 220 to move in the game scene 210 with the left hand.
In the present exemplary embodiment, the first visual field control region 260 may be a region having a visual indication effect in the graphical user interface 200, or a region having no visual indication effect; an operation area such as a virtual joystick or a direction-control virtual key may be displayed in the first visual field control region 260, which is not limited in the present exemplary embodiment.
As a preferred embodiment, the first visual field control region 260 may be a region of the graphical user interface 200 having a visual indication; for example, it may have a bounding box, a range of fill colors, a range of predetermined transparencies, or some other appearance that visually distinguishes it. A first visual field control region 260 with a visual indication allows the user to locate it quickly, reducing the difficulty of operation for game novices.
In the present exemplary embodiment, the first visual field control region 260 may further carry an information indicator corresponding to the sound content, used to indicate the category of the sound content; the sound content may be, for example, a gunshot, footsteps, a door closing, or an explosion. The information indicator may consist of rendering the first visual field control region 260 as a particular pattern or shape whose content corresponds to the sound content; alternatively, the first visual field control region 260 may be rendered with an arbitrary pattern and a text description corresponding to the sound content added inside it.
The information indicator can be rendered into any pattern which can be associated with the sound content, and the first visual field control area 260 with the information indicator can enable a user to quickly identify the type of the detected sound, adjust the game strategy in real time and improve the operation efficiency.
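As a purely illustrative sketch of such an indicator, the detected sound category could drive a simple lookup table; the file names and labels below are hypothetical and follow only the category examples given in the text:

```python
# Hypothetical mapping from sound category to the indicator rendered in the
# first visual field control area.
SOUND_INDICATORS = {
    "gunshot":   {"icon": "icon_gunshot.png",   "label": "Gunfire"},
    "footstep":  {"icon": "icon_footsteps.png", "label": "Footsteps"},
    "door":      {"icon": "icon_door.png",      "label": "Door"},
    "explosion": {"icon": "icon_explosion.png", "label": "Explosion"},
}

def indicator_for(category):
    # Fall back to a generic pattern when the category is unknown.
    return SOUND_INDICATORS.get(category, {"icon": "icon_sound.png", "label": "Sound"})

print(indicator_for("gunshot")["label"])   # Gunfire
```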
Step S190, when a third touch operation acting on the first visual field control area is detected, controlling the presenting visual field of the game scene on the graphical user interface according to the position of the virtual main body in the game scene and the sound source azimuth information, so that the presenting visual field of the game scene faces to the sound source.
In the present exemplary embodiment, the third touch operation may be one or a combination of a click operation, a touch operation over a preset time, and a press operation over a preset pressure.
Taking the third touch operation as a click operation as an example, when a click operation acting on the first visual field control region 260 is detected, the presenting visual field of the game scene on the graphical user interface is controlled according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presenting visual field of the game scene faces the sound source.
Specifically, the method for adjusting the presentation visual field of the game scene on the graphical user interface according to the position of the virtual main body in the game scene and the sound source azimuth information so as to enable the presentation visual field of the game scene to be towards the sound source comprises the following steps: and controlling the direction of the virtual camera corresponding to the graphical user interface according to the direction indicated by the sound source orientation information so as to enable the presenting visual field of the game scene to face the sound source.
For example, fig. 4 shows the presentation field of view of the game scene 210 on the graphical user interface 200 when a click operation on the first visual field control region 260 is detected. Relative to fig. 3: when a sound signal within the hearing range of the virtual subject 220, originating in the part of the game scene 210 outside the graphical user interface 200, is detected, sound source azimuth information of the sound source 250 is acquired and the first visual field control region 260 is rendered on the graphical user interface 200. When a click operation on the first visual field control region 260 is then detected, the direction of the virtual camera corresponding to the graphical user interface 200 is controlled according to the direction indicated by the sound source azimuth information (the position of the virtual camera may be kept constant) so that the camera faces the sound source 250, and the presentation field of view of the game scene 210 on the graphical user interface 200 is thereby oriented toward the sound source. At this time, the virtual subject 220 may or may not be presented on the graphical user interface.
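A minimal sketch of this camera behavior, assuming a 2-D top-down coordinate system and a camera represented as a plain dictionary (the patent fixes neither convention): the camera position is left unchanged, the previous direction is saved so the view can be restored later, and only the direction is turned toward the sound source.

```python
import math

def redirect_view_to_source(camera, source_pos):
    """Point the virtual camera at the sound source, keeping its position fixed
    and remembering the previous direction for later restoration."""
    camera["saved_yaw"] = camera["yaw"]        # state before the third touch operation
    dx = source_pos[0] - camera["pos"][0]
    dy = source_pos[1] - camera["pos"][1]
    camera["yaw"] = math.atan2(dy, dx)

camera = {"pos": (0.0, 0.0), "yaw": 0.0, "saved_yaw": None}
redirect_view_to_source(camera, (1.0, 1.0))
print(math.degrees(camera["yaw"]))   # 45.0: the view now faces the source
```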
In the present exemplary embodiment, the rendering field of view of the game scene toward the sound source includes: the sound source is positioned within a rendered field of view of the game scene. That is, when the presentation field of view of the game scene 210 is directed to the sound source, the sound source may be displayed on the graphical user interface or may not be displayed on the graphical user interface (e.g., blocked by a game screen such as another virtual building or a virtual lawn). For example, the sound source is a virtual character shooting, if the virtual character stands in an open and unobstructed field, and when the rendering view of the game scene is oriented to the virtual character, the virtual character can be displayed in the graphical user interface, and the user can see the sound source; and if the virtual character crawls on the virtual grassland, or is positioned in a house or a rock mass, when the presenting view field of the game scene faces the virtual character, the virtual character is shielded by the shelter and is not displayed in the graphical user interface, and then the user cannot see the sound source.
Through the third touch operation acting on the first visual field control area, the user can react promptly to sounds in the surrounding game environment and quickly check the combat situation at the sound source, improving operation efficiency.
In the present exemplary embodiment, while the presentation field of view of the game scene on the graphical user interface is oriented toward the sound source, the orientation that the virtual subject had in the game scene before the third touch operation is maintained.
Specifically, before the third touch operation, the virtual body 220 is controlled to move in the game scene according to the movement of the touch point of the first touch operation in step 110; meanwhile, the orientation of the virtual body 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation in step 130.
As previously described, the orientation control area 240 and the movement control area 230 are disposed on different sides of the graphical user interface 200, and the first visual field control area 260 and the movement control area 230 are likewise disposed on different sides. That is, the orientation control area 240 and the first visual field control area 260 are on the same side of the graphical user interface 200, so the user's hand on that side can operate only one of them at a time.
When the third touch operation on the first visual field control area 260 is detected, the second touch operation on the orientation control area 240 has ended; at this point the presentation field of view of the game scene 210 on the graphical user interface 200 is controlled to face the sound source, while the orientation that the virtual subject 220 had in the game scene 210 at the end of step S130 is maintained. That is, the third touch operation does not change the orientation of the virtual subject 220 in the game scene 210.
In the present exemplary embodiment, the orientation of the virtual subject in the game scene is kept controlled in accordance with the movement of the touch point of the second touch operation while the presentation field of view of the game scene is directed to the sound source on the graphical user interface.
Specifically, when a second touch operation acting on the control area 240 is detected while the presentation field of view of the game scene is directed to the sound source on the graphical user interface, the orientation of the virtual body 220 in the game scene is controlled according to the movement of the touch point of the second touch operation; at the same time, the rendered field of view of the game scene is kept oriented towards the sound source. That is, the second touch operation does not change the presentation field of view of the game scene while the presentation field of view of the game scene is directed toward the sound source on the graphical user interface.
In the present exemplary embodiment, the movement of the virtual body in the game scene according to the movement of the touch point of the first touch operation is kept controlled while the presentation field of view of the game scene is directed to the sound source on the graphical user interface.
Specifically, when a first touch operation acting on the movement control area 230 is detected while a presentation field of the game scene is directed to a sound source on the graphical user interface, the virtual body 220 is controlled to move in the game scene according to the movement of the touch point of the first touch operation; at the same time, the rendered field of view of the game scene is kept oriented towards the sound source. That is, the first touch operation does not change the presentation field of view of the game scene while the presentation field of view of the game scene is directed toward the sound source on the graphical user interface.
It is to be understood that, in an embodiment of the present invention, when the presentation field of view of the game scene on the graphical user interface is directed to the sound source, an auxiliary display window for displaying the presentation field of view of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene may be further provided on the graphical user interface.
As shown in fig. 5, when the presentation field of view of the game scene on the graphical user interface is directed toward the sound source 250, an auxiliary display window 270 is provided on the graphical user interface 200; the auxiliary display window 270 displays the presentation field of view that would be controlled according to the position and orientation of the virtual subject 220 in the game scene 210.
In the present exemplary embodiment, when it is detected that the third touch operation applied to the first visual field control region is ended, the presentation visual field of the game scene on the graphical user interface is controlled to be restored to the state before the third touch operation. That is, when the third touch operation is detected, the presenting view of the game scene on the graphical user interface is controlled to face the sound source; and after the third touch operation is finished, controlling the display visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation.
As an alternative embodiment, a cancel operation area may be further set on the graphical user interface, and when a fourth touch operation acting on the cancel operation area is detected, the display view of the game scene on the graphical user interface is controlled to return to the state before the third touch operation.
The cancel operation area may be the first touch area or another area on the graphical user interface different from the first view control area. And when a fourth touch operation acting on the cancel operation area is detected, controlling the display visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation.
In this exemplary embodiment, controlling the rendering field of view of the game scene on the graphical user interface to return to the state before the third touch operation includes: controlling the display visual field of the game scene on the graphical user interface to recover to the display visual field before the third touch operation; or controlling the display visual field of the game scene on the graphical user interface to be restored to the display visual field calculated according to the display visual field calculation logic before the third touch operation.
Specifically, controlling the presentation field of view of the game scene on the graphical user interface to be restored to the presentation field of view before the third touch operation means restoring the visible range itself to its prior state: the position and angle/direction of the virtual camera in the game scene are restored to what they were before the third touch operation. That is, the presentation field of view of the game scene on the graphical user interface is controlled based on the position of the virtual camera in game scene coordinates and its shooting direction as they were before the third touch operation.
Controlling the presentation field of view of the game scene on the graphical user interface to be restored to the field of view calculated according to the calculation logic in effect before the third touch operation means restoring the prior control state rather than a fixed camera pose. For example, if before the third touch operation the game computed the field of view according to preset calculation logic (for example, the virtual camera is attached to the head of the virtual subject and rotates as the virtual subject rotates), then restoring the field of view to the state before the third touch operation can also mean resuming that calculation logic. That is, the presentation field of view of the scene on the graphical user interface is controlled based on the current position of the virtual subject in game scene coordinates, the current orientation of the virtual subject and/or the aiming direction of its weapon sight, and the pre-existing relationships between the virtual camera and the virtual subject (the camera's position relative to the subject, and the association between the subject's orientation and/or weapon sight direction and the camera's shooting direction).
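The two restoration strategies can be contrasted in a short sketch. Here restore_saved_pose reinstates the exact camera pose recorded before the third touch operation, while restore_by_logic re-runs an assumed follow-the-subject rule on the subject's current state; all names and structures are illustrative:

```python
def restore_saved_pose(camera):
    """Strategy 1: return to the exact presentation field of view recorded
    before the third touch operation (position and direction)."""
    camera["pos"] = camera["saved_pos"]
    camera["yaw"] = camera["saved_yaw"]

def restore_by_logic(camera, subject):
    """Strategy 2: resume the pre-existing view calculation logic, e.g. a
    camera attached to the subject that follows its current orientation."""
    camera["pos"] = subject["pos"]    # assumed first-person attachment
    camera["yaw"] = subject["yaw"]

subject = {"pos": (5.0, 2.0), "yaw": 1.2}
camera = {"pos": (9.0, 9.0), "yaw": 0.3,
          "saved_pos": (5.0, 2.0), "saved_yaw": 1.0}
restore_by_logic(camera, subject)     # follows where the subject is *now*
print(camera["yaw"])                  # 1.2
```

The difference matters when the subject has moved or turned during the third touch operation: strategy 1 snaps back to a stale pose, while strategy 2 resumes tracking the subject's current state.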
In the present exemplary embodiment, the first visual field control area is hidden when the presentation field of view of the game scene on the graphical user interface is restored to the state before the third touch operation.
Specifically, to save screen space and avoid blocking or affecting the game screen, when the presentation field of view of the game scene on the graphical user interface is restored to the state before the third touch operation, the first visual field control area is hidden and the presentation field of view of the game scene is once again controlled according to the position and orientation of the virtual subject in the game scene.
In the present exemplary embodiment, while the presentation field of view of the game scene on the graphical user interface is directed toward the sound source, when a preset action of the third touch operation is detected, the presentation field of view of the game scene on the graphical user interface is controlled according to that preset action.
Specifically, when the presenting view of the game scene on the graphical user interface is directed to the sound source, and the preset action of the third touch operation is detected, the presenting view of the game scene on the graphical user interface is controlled according to the preset action, so that the game scene in the preset range near the sound source can be observed. For example, as shown in fig. 6, the touch point of the preset action of the third touch operation applied to the first view control area is located in the 12 o 'clock direction of the first view control area, and the presentation view of the game scene in the graphical user interface moves to the 12 o' clock direction of the sound source 250, compared to fig. 4.
In an embodiment of the invention, the preset action of the third touch operation is a touch sliding operation.
Controlling the presentation field of view of the game scene on the graphical user interface according to the preset action comprises: controlling the presentation field of view of the game scene on the graphical user interface according to the slide track of the touch sliding operation.
Specifically, when the third touch operation is detected, the presentation field of view of the game scene on the graphical user interface is controlled to face the sound source; the third touch operation may be a touch, long-press, or hard-press operation acting on the first visual field control area. When a sliding movement of the third touch operation is detected, the orientation of the virtual camera is changed according to the slide track, thereby changing the presentation field of view of the game scene on the graphical user interface; the direction in which the presentation field of view rotates is the same as the sliding direction, allowing the game scene within a preset range near the sound source to be observed.
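A sketch of this sliding adjustment, assuming the drag delta is scaled linearly into yaw and pitch offsets (the sensitivity values are assumptions, not taken from the patent):

```python
import math

YAW_PER_PIXEL = math.radians(0.25)     # assumed sensitivity
PITCH_PER_PIXEL = math.radians(0.25)   # assumed sensitivity

def adjust_view_by_slide(camera, slide_dx, slide_dy):
    """Rotate the virtual camera with the slide track so the view pans the
    same way the finger moves, around the sound-source direction."""
    camera["yaw"] += slide_dx * YAW_PER_PIXEL
    camera["pitch"] -= slide_dy * PITCH_PER_PIXEL   # dragging up looks up
    camera["pitch"] = max(-math.pi / 2, min(math.pi / 2, camera["pitch"]))

camera = {"yaw": math.radians(45.0), "pitch": 0.0}
adjust_view_by_slide(camera, 40.0, 0.0)             # slide 40 px to the right
print(round(math.degrees(camera["yaw"]), 1))        # 55.0
```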
In another embodiment of the present invention, the preset action of the third touch operation is a touch click operation.
Adjusting the presentation visual field of the game scene on the graphical user interface according to the preset action includes: changing the presentation visual field of the game scene on the graphical user interface according to the position of a preset point in the first visual field control area and the click position of the touch click operation.
Specifically, when the third touch operation is detected, the presentation visual field of the game scene on the graphical user interface is controlled to face the sound source; the third touch operation may be a click operation acting on the first visual field control area, or an operation of leaving the touch screen after a long press or heavy press. When a touch click of the third touch operation is detected, a vector from the position of a preset point in the first visual field control area to the click position of the touch click operation is determined; the rotation angle of the virtual camera corresponding to the presentation visual field is then changed according to this vector to determine the orientation of the virtual camera, thereby controlling the presentation visual field of the game scene on the graphical user interface so that the game scene within a preset range near the sound source can be observed.
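A hedged sketch of the click variant: the vector from an assumed preset point (for example, the center of the first visual field control area) to the click position is scaled into a view rotation. The scale factor is an illustrative assumption.

```python
def on_third_touch_click(camera: "VirtualCamera",
                         preset_point: tuple,
                         click_point: tuple,
                         scale: float = 0.5) -> None:
    """Turn the preset-point-to-click vector into a rotation of the camera,
    and hence of the presentation visual field."""
    vx = click_point[0] - preset_point[0]
    vy = click_point[1] - preset_point[1]
    camera.yaw = (camera.yaw + vx * scale) % 360.0
    camera.pitch = max(-89.0, min(89.0, camera.pitch - vy * scale))
```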
In the present exemplary embodiment, after the sound source azimuth information of the sound is acquired, a first graphic identifier is provided according to the current position of the virtual subject and the sound source azimuth information; the first graphic identifier is used to indicate the azimuth relationship between the sound source and the position of the virtual subject.
The first graphic identifier may be a ruler, a compass, or another form capable of indicating an azimuth relationship. Taking a ruler as an example of the first graphic identifier, as shown in fig. 7, after the sound source azimuth information of the sound source 250 is acquired, a first graphic identifier 280 indicating the azimuth relationship between the sound source 250 and the position of the virtual subject 220 is provided based on the current position of the virtual subject 220 and the sound source azimuth information of the sound source 250. Specifically, a planar coordinate system centered on the virtual subject 220 is established in the game scene 210, with the 12 o'clock direction of the virtual subject 220 in the game scene set as true north, the 3 o'clock direction as true east, the 6 o'clock direction as true south, and the 9 o'clock direction as true west. The sound source azimuth information of the sound source 250 is acquired; in the present embodiment it is determined, from the position of the virtual subject 220 and the sound source azimuth information of the sound source 250, that the sound source 250 is located 45° east of true north relative to the virtual subject 220, and the first graphic identifier 280 indicating the azimuth relationship between the sound source 250 and the virtual subject 220 is provided accordingly.
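A minimal sketch of the azimuth computation in the planar coordinate system just described, where the subject's 12 o'clock direction is true north; the function name and the (east, north) coordinate convention are assumptions.

```python
import math

def sound_source_bearing(subject_pos: tuple, source_pos: tuple) -> float:
    """Bearing of the sound source in degrees clockwise from true north
    (the subject's 12 o'clock direction), 0 <= bearing < 360."""
    east = source_pos[0] - subject_pos[0]
    north = source_pos[1] - subject_pos[1]
    return math.degrees(math.atan2(east, north)) % 360.0

# A source 10 units east and 10 units north of the subject lies at 45
# degrees east of true north, matching the fig. 7 example.
assert abs(sound_source_bearing((0.0, 0.0), (10.0, 10.0)) - 45.0) < 1e-9
```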
In the present exemplary embodiment, after the sound source azimuth information of the sound is acquired, a second graphic identifier is provided according to the current position and orientation of the virtual subject and the sound source azimuth information; the second graphic identifier is used to indicate the positional relationship between the sound source and the orientation of the virtual subject.
The second graphic identifier may be text or another form capable of indicating the positional relationship. Taking text as an example of the second graphic identifier, as shown in fig. 8: the virtual subject 220 faces west; after the sound source azimuth information of the sound source 250 is acquired, it is determined, from the position and orientation of the virtual subject 220 and the sound source azimuth information of the sound source 250, that the sound source 250 is located directly behind the virtual subject 220, and the text "right rear" is rendered on the graphical user interface 200 as the second graphic identifier 281 indicating the positional relationship between the sound source 250 and the orientation of the virtual subject 220.
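One way such a label might be derived, sketched under the assumption of eight 45° sectors around the subject's facing direction; the sector labels are illustrative, and the computation builds on sound_source_bearing above.

```python
SECTOR_LABELS = ["front", "right front", "right", "right rear",
                 "rear", "left rear", "left", "left front"]

def relative_position_label(subject_facing_deg: float,
                            source_bearing_deg: float) -> str:
    """Map the angle between the subject's facing direction and the sound
    source bearing onto one of eight 45-degree sectors."""
    rel = (source_bearing_deg - subject_facing_deg) % 360.0
    return SECTOR_LABELS[int(((rel + 22.5) % 360.0) // 45.0)]

# Fig. 8 example: the subject faces west (270 deg) and the source lies due
# east (90 deg), so the source is directly behind the subject.
assert relative_position_label(270.0, 90.0) == "rear"
```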
In the present exemplary embodiment, a second visual field control area is provided; when a fifth touch operation acting on the second visual field control area is detected, the presentation visual field of the game scene on the graphical user interface is changed according to the fifth touch operation; and when the end of the fifth touch operation is detected, the presentation visual field of the game scene on the graphical user interface is controlled to be restored to the state before the fifth touch operation.
Specifically, as shown in fig. 9, a second visual field control area 290 is provided; when a fifth touch operation acting on the second visual field control area 290 is detected, the presentation visual field of the game scene on the graphical user interface is changed according to the fifth touch operation; when the end of the fifth touch operation is detected, the presentation visual field of the game scene on the graphical user interface is controlled to be restored to the state before the fifth touch operation, that is, it is again controlled according to the position and orientation of the virtual subject in the game scene.
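A sketch of the snapshot-and-restore behavior of the second visual field control area; the handler names and the choice to snapshot only yaw and pitch are assumptions.

```python
class SecondViewControlArea:
    """While the fifth touch operation lasts, the view follows the touch;
    when the operation ends, the view snaps back to its prior state."""

    def __init__(self, camera: "VirtualCamera"):
        self.camera = camera
        self._saved = None

    def on_touch_begin(self) -> None:
        # Snapshot the current view so it can be restored on release.
        self._saved = (self.camera.yaw, self.camera.pitch)

    def on_touch_move(self, dx: float, dy: float,
                      sensitivity: float = 0.25) -> None:
        self.camera.yaw = (self.camera.yaw + dx * sensitivity) % 360.0
        self.camera.pitch = max(-89.0,
                                min(89.0, self.camera.pitch - dy * sensitivity))

    def on_touch_end(self) -> None:
        # Restore: the view again follows the subject's position and orientation.
        if self._saved is not None:
            self.camera.yaw, self.camera.pitch = self._saved
            self._saved = None
```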
The method provided by the invention enables the user to switch conveniently and quickly from the original visual field operation to a new one in order to observe the direction of the sound source, and to switch back quickly to the original visual field operation when it is no longer needed. That is, the player can quickly look in the direction of the sound source; for example, if the sound is a gunshot, the player can quickly look toward it and take cover or adjust the corresponding game strategy in time.
Because the player does not need to spend excessive effort searching for the sound source through sliding operations, this convenient and efficient interaction method improves operation efficiency, brings a more convenient game experience to the player, and enriches game strategy; at the same time, it lowers the operation threshold for novice players and solves the technical problem that a sound source cannot be locked onto quickly in existing mobile terminal interaction modes.
According to an embodiment of the present invention, there is provided an information processing apparatus applied to a touch terminal capable of presenting a graphical user interface, where contents presented by the graphical user interface include a game scene and a virtual subject, the apparatus including:
the first interaction unit is used for providing a movement control area, and controlling the virtual main body to move in the game scene according to a first touch operation when the first touch operation acting on the movement control area is detected;
the second interaction unit is used for providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual main body in the game scene according to the movement of a touch point of the second touch operation;
the first control unit is used for controlling the presentation visual field of the game scene on the graphical user interface according to the position and the orientation of the virtual main body in the game scene;
the display unit is used for detecting sound in a preset range in a game scene, acquiring sound source azimuth information of the sound and providing a first visual field control area in a graphical user interface;
and the second control unit is used for, when a third touch operation acting on the first visual field control area is detected, controlling the presentation visual field of the game scene on the graphical user interface according to the position of the virtual main body in the game scene and the sound source azimuth information, so that the presentation visual field of the game scene faces the sound source.
The details of each information processing apparatus unit are already described in detail in the corresponding information processing method, and therefore are not described herein again.
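Purely as an illustration of how the above units might be composed in code (the patent defines them functionally, so this layout and the constructor-injection style are assumptions):

```python
class InformationProcessingApparatus:
    """Aggregates the functional units described above; each unit is any
    object exposing the behavior attributed to it in the text."""

    def __init__(self, first_interaction_unit, second_interaction_unit,
                 first_control_unit, display_unit, second_control_unit):
        self.first_interaction_unit = first_interaction_unit    # movement control area
        self.second_interaction_unit = second_interaction_unit  # orientation control area
        self.first_control_unit = first_control_unit            # default view calculation
        self.display_unit = display_unit                        # sound detection + first visual field control area
        self.second_control_unit = second_control_unit          # view toward the sound source
```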
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
There is further provided, according to an embodiment of the present invention, a computer-readable storage medium having stored thereon a program product capable of implementing the above-mentioned method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code; when the program product is run on a terminal device, the program code causes the terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the above-mentioned "exemplary methods" section of this specification. The program product may employ a portable compact disc read-only memory (CD-ROM) including the program code and be run on a terminal device such as a personal computer. However, the program product of the present invention is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to an embodiment of the present invention, there is also provided an electronic apparatus including: a processing component, which may further include one or more processors, and a memory resource, represented by a memory, for storing instructions executable by the processing component, such as an application program. The application program stored in the memory may include one or more modules, each corresponding to a set of instructions. Further, the processing component is configured to execute the instructions so as to perform the information processing method described above.
The electronic apparatus may further include: a power component configured to perform power management of the electronic apparatus; a wired or wireless network interface configured to connect the electronic apparatus to a network; and an input/output (I/O) interface. The electronic apparatus may operate based on an operating system stored in the memory, such as Android, iOS, Windows, Mac OS X, Unix, Linux, or FreeBSD.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into units may be a division by logical function, and an actual implementation may adopt another division: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, units, or modules, and may be electrical or take another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the method according to the embodiments of the present invention.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.

Claims (18)

1. An information processing method applied to a touch terminal capable of presenting a graphical user interface, wherein content presented by the graphical user interface at least partially includes a game scene and a virtual subject, the method comprising:
providing a movement control area, and when a first touch operation acting on the movement control area is detected, controlling the virtual main body to move in the game scene according to the movement of a touch point of the first touch operation;
providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual main body in the game scene according to the movement of a touch point of the second touch operation;
controlling a presentation field of view of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
when sound in a preset range in the game scene is detected, obtaining sound source azimuth information of the sound, and providing a first visual field control area in the graphical user interface;
when a third touch operation acting on the first visual field control area is detected, controlling the presenting visual field of the game scene on the graphical user interface according to the position of the virtual main body in the game scene and the sound source azimuth information, so that the presenting visual field of the game scene faces the sound source;
and when the presenting visual field of the game scene on the graphical user interface faces the sound source, continuing to control the virtual main body to move in the game scene according to the movement of the touch point of the first touch operation.
2. The method of claim 1, wherein the controlling the presenting visual field of the game scene on the graphical user interface according to the position of the virtual main body in the game scene and the sound source azimuth information, so that the presenting visual field of the game scene faces the sound source, comprises:
and controlling the direction of a virtual camera corresponding to the graphical user interface according to the direction indicated by the sound source orientation information so as to enable the presenting view field of the game scene to face the sound source.
3. The method of claim 1, wherein the method further comprises:
and when the presenting visual field of the game scene on the graphical user interface faces the sound source, maintaining the orientation that the virtual main body had in the game scene before the third touch operation.
4. The method of claim 1, wherein the method further comprises:
and when the end of the third touch operation acting on the first visual field control area is detected, controlling the presenting visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation.
5. The method of claim 1, wherein the method further comprises:
and when a fourth touch operation acting on a cancel operation area is detected, controlling the presenting visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation.
6. The method of claim 4 or 5, wherein the controlling the presenting visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation comprises:
controlling the presenting visual field of the game scene on the graphical user interface to be restored to the presenting visual field before the third touch operation; or,
and controlling the presentation visual field of the game scene on the graphical user interface to be restored to the presentation visual field calculated according to the presentation visual field calculation logic before the third touch operation.
7. The method of claim 4 or 5, wherein the method further comprises:
hiding the first vision control area.
8. The method of claim 1, wherein the method comprises:
and when the presenting visual field of the game scene on the graphical user interface faces the sound source and a preset action of the third touch operation is detected, adjusting the presenting visual field of the game scene on the graphical user interface according to the preset action.
9. The method of claim 8, wherein the preset action of the third touch operation is a touch sliding operation.
10. The method of claim 9, wherein the adjusting the presenting visual field of the game scene on the graphical user interface according to the preset action comprises:
and adjusting the presenting visual field of the game scene on the graphical user interface according to the sliding track of the touch sliding operation.
11. The method of claim 8, wherein the preset action of the third touch operation is a touch click operation.
12. The method of claim 11, wherein the adjusting the presenting visual field of the game scene on the graphical user interface according to the preset action comprises:
and changing the presentation visual field of the game scene on the graphical user interface according to the position of a preset point in the first visual field control area and the click position of the touch click operation.
13. The method of claim 1, wherein after the obtaining of the sound source azimuth information of the sound, the method further comprises:
providing a first graphic identifier according to the current position of the virtual main body and the sound source azimuth information;
the first graphic identifier is used to indicate an azimuth relationship between the sound source and the position of the virtual main body.
14. The method of claim 1, wherein after the obtaining of the sound source azimuth information of the sound, the method further comprises:
providing a second graphic identifier according to the current position and orientation of the virtual main body and the sound source azimuth information;
the second graphic identifier is used to indicate a positional relationship between the sound source and the orientation of the virtual main body.
15. The method of claim 1, wherein the method further comprises:
providing a second visual field control area;
when a fifth touch operation acting on the second visual field control area is detected, changing the presenting visual field of the game scene on the graphical user interface according to the fifth touch operation;
and when the end of the fifth touch operation is detected, controlling the presenting visual field of the game scene on the graphical user interface to be restored to the state before the fifth touch operation.
16. An information processing apparatus applied to a touch terminal capable of presenting a graphical user interface, wherein content presented by the graphical user interface at least partially includes a game scene and a virtual body, the apparatus comprising:
the first interaction unit is used for providing a movement control area, and when a first touch operation acting on the movement control area is detected, the virtual main body is controlled to move in the game scene according to the movement of a touch point of the first touch operation;
the second interaction unit is used for providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, the orientation of the virtual main body in the game scene is controlled according to the movement of the touch point of the second touch operation;
the first control unit is used for controlling the presenting visual field of the game scene on the graphical user interface according to the position and orientation of the virtual main body in the game scene;
the display unit is used for detecting sound in a preset range in the game scene, acquiring sound source azimuth information of the sound, and providing a first visual field control area in the graphical user interface;
a second control unit, configured to, when a third touch operation acting on the first visual field control area is detected, control the presenting visual field of the game scene on the graphical user interface according to the position of the virtual main body in the game scene and the sound source azimuth information, so that the presenting visual field of the game scene faces the sound source;
and a movement control unit, configured to, when the presenting visual field of the game scene on the graphical user interface faces the sound source, continue to control the virtual main body to move in the game scene according to the movement of the touch point of the first touch operation.
17. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing the information processing method of any one of claims 1 to 15.
18. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the information processing method of any one of claims 1 to 15 via execution of the executable instructions.
CN201711148063.1A 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium Active CN107982916B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711148063.1A CN107982916B (en) 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN107982916A CN107982916A (en) 2018-05-04
CN107982916B true CN107982916B (en) 2020-11-06

Family

ID=62031688

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109675311A (en) * 2019-01-10 2019-04-26 网易(杭州)网络有限公司 Display control method, device, storage medium, processor and terminal in game
CN112245913A (en) * 2020-10-22 2021-01-22 努比亚技术有限公司 Game visual angle control method, mobile terminal and computer storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
WO2017047078A1 (en) * 2015-09-16 2017-03-23 株式会社カプコン Game system, control method thereof, computer device-readable non-volatile recording medium
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture
CN107203321A (en) * 2017-03-27 2017-09-26 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8954890B2 (en) * 2012-04-12 2015-02-10 Supercell Oy System, method and graphical user interface for controlling a game


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant