CN107832001B - Information processing method, information processing device, electronic equipment and storage medium

Info

Publication number: CN107832001B
Application number: CN201711148849.3A
Authority: CN (China)
Prior art keywords: game scene, visual field, graphical user interface, virtual subject
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN107832001A
Inventor: 翟公望
Current Assignee: Netease Hangzhou Network Co Ltd
Original Assignee: Netease Hangzhou Network Co Ltd
Application filed by Netease Hangzhou Network Co Ltd; priority to CN201711148849.3A
Publication of application: CN107832001A
Publication of grant: CN107832001B

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
                • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                  • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
                • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                  • G06F3/0485 Scrolling or panning
            • G06F3/16 Sound input; sound output
              • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Abstract

The invention discloses an information processing method, an information processing device, electronic equipment and a storage medium. The method comprises the following steps: when a sound within a preset range in the game scene is detected, acquiring sound source azimuth information of the sound, and providing a first visual field control area in the graphical user interface; and when a third touch operation acting on the first visual field control area is detected, switching to a multi-window display state in which the graphical user interface comprises a main display window and an auxiliary display window, and controlling the presentation field of view of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information. The invention solves the technical problem that, in existing mobile terminal interaction modes, the environment around the sound source cannot be observed while the environment around the virtual subject is still being viewed.

Description

Information processing method, information processing device, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of games, in particular to an information processing method, an information processing device, electronic equipment and a storage medium.
Background
With the development of mobile intelligent terminals and the game industry, a large number of mobile games on different themes have emerged to meet the needs of players. In many shooting game applications, the player often needs to observe the surrounding environment in real time.
In existing shooting games on mobile phones, movement is generally controlled with the left hand while the right hand adjusts the field of view of the game scene to find a target or direction. This interaction mode cannot bring a specific area into view quickly, and while a specific area is being observed the virtual character cannot be controlled at the same time; this increases the player's action load, reduces operation efficiency, raises the operation threshold for novice players, and degrades the game experience.
For the above problems, no effective solution has yet been proposed.
Disclosure of Invention
An object of the present invention is to provide an information processing method, an information processing apparatus, an electronic device, and a computer-readable storage medium, which overcome, at least to some extent, one or more of the problems due to the limitations and disadvantages of the related art.
Additional features and advantages of the invention will be set forth in the detailed description which follows, will in part be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present invention, there is provided an information processing method applied to a touch terminal capable of presenting a graphical user interface, the content presented by the graphical user interface at least partially including a game scene and a virtual subject, the method comprising:
providing a movement control area, and when a first touch operation acting on the movement control area is detected, controlling the virtual subject to move in the game scene according to the movement of the touch point of the first touch operation;
providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual subject in the game scene according to the movement of the touch point of the second touch operation;
controlling the presentation field of view of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
when a sound within a preset range in the game scene is detected, acquiring sound source azimuth information of the sound, and providing a first visual field control area in the graphical user interface;
and when a third touch operation acting on the first visual field control area is detected, switching to a multi-window display state, wherein in the multi-window display state the graphical user interface comprises a main display window and an auxiliary display window, and the presentation field of view of the game scene in the auxiliary display window is controlled according to the position of the virtual subject in the game scene and the sound source azimuth information.
Optionally, in the multi-window display state, the method further includes:
controlling the presentation field of view of the game scene in the main display window on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
and controlling the presentation field of view of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presentation field of view of the game scene in the auxiliary display window faces the sound source.
Optionally, controlling the presentation field of view of the game scene in the main display window on the graphical user interface according to the position and orientation of the virtual subject in the game scene comprises:
controlling the direction of a first virtual camera corresponding to the main display window according to the orientation of the virtual subject, so that the presentation field of view of the game scene in the main display window changes as the virtual subject moves.
Optionally, controlling the presentation field of view of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information includes:
controlling the direction of a second virtual camera corresponding to the auxiliary display window according to the direction indicated by the sound source azimuth information, so that the presentation field of view of the game scene in the auxiliary display window faces the sound source.
Optionally, in the multi-window display state, the method further includes:
providing a visual field switching control, and switching the presentation fields of view of the game scene in the main display window and the auxiliary display window when a fourth touch operation acting on the visual field switching control is detected.
Optionally, the method further comprises:
when the end of the third touch operation acting on the first visual field control area is detected, canceling the auxiliary display window, and controlling the presentation field of view of the game scene on the graphical user interface to be restored to the state before the third touch operation.
Optionally, the method further comprises:
when a fifth touch operation acting on a cancel operation area is detected, canceling the auxiliary display window, and controlling the presentation field of view of the game scene on the graphical user interface to be restored to the state before the third touch operation.
Optionally, controlling the presentation field of view of the game scene on the graphical user interface to be restored to the state before the third touch operation includes:
controlling the presentation field of view of the game scene on the graphical user interface to be restored to the presentation field of view before the third touch operation; or,
controlling the presentation field of view of the game scene on the graphical user interface to be restored to the presentation field of view calculated according to the calculation logic in effect before the third touch operation.
Optionally, the method further comprises: hiding the first visual field control area.
Optionally, the method comprises:
in the multi-window display state, when a preset action of the third touch operation is detected, adjusting the presentation field of view of the game scene in the auxiliary display window on the graphical user interface according to the preset action.
Optionally, the preset action of the third touch operation is a touch sliding operation.
Optionally, adjusting the presentation field of view of the game scene in the auxiliary display window on the graphical user interface according to the preset action includes:
adjusting the presentation field of view of the game scene in the auxiliary display window on the graphical user interface according to the sliding track of the touch sliding operation.
Optionally, the preset action of the third touch operation is a touch click operation.
Optionally, adjusting the presentation field of view of the game scene in the auxiliary display window on the graphical user interface according to the preset action includes:
changing the presentation field of view of the game scene in the auxiliary display window on the graphical user interface according to the position of a preset point in the first visual field control area and the click position of the touch click operation.
Optionally, after acquiring the sound source azimuth information of the sound, the method further comprises:
providing a first graphical identifier according to the current position of the virtual subject and the sound source azimuth information;
the first graphical identifier is used to indicate the azimuth relationship between the sound source and the position of the virtual subject.
Optionally, after acquiring the sound source azimuth information of the sound, the method further comprises:
providing a second graphical identifier according to the current position and orientation of the virtual subject and the sound source azimuth information;
the second graphical identifier is used to indicate the positional relationship between the sound source and the orientation of the virtual subject.
Optionally, the method further comprises:
providing a second visual field control area;
when a sixth touch operation acting on the second visual field control area is detected, changing the presentation field of view of the game scene on the graphical user interface according to the sixth touch operation;
and when the end of the sixth touch operation is detected, controlling the presentation field of view of the game scene on the graphical user interface to be restored to the state before the sixth touch operation.
According to a second aspect of the present invention, there is provided an information processing apparatus applied to a touch terminal capable of presenting a graphical user interface, the content presented by the graphical user interface at least partially including a game scene and a virtual subject, the apparatus comprising:
a first interaction unit, configured to provide a movement control area and, when a first touch operation acting on the movement control area is detected, control the virtual subject to move in the game scene according to the movement of the touch point of the first touch operation;
a second interaction unit, configured to provide an orientation control area and, when a second touch operation acting on the orientation control area is detected, control the orientation of the virtual subject in the game scene according to the movement of the touch point of the second touch operation;
a first control unit, configured to control the presentation field of view of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
a display unit, configured to acquire sound source azimuth information of a sound when the sound within a preset range in the game scene is detected, and provide a first visual field control area in the graphical user interface;
and a second control unit, configured to switch to a multi-window display state when a third touch operation acting on the first visual field control area is detected, provide an auxiliary display window, and control the presentation field of view of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information.
According to a third aspect of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information processing method of any one of the above.
According to a fourth aspect of the present invention, there is provided an electronic apparatus comprising: a processor; and a memory for storing executable instructions for the processor; wherein the processor is configured to perform the information processing method of any one of the above via execution of executable instructions.
In the information processing method, information processing apparatus, electronic device, and computer-readable storage medium according to the exemplary embodiments of the present invention, when a sound within a preset range in the game scene is detected, sound source azimuth information of the sound is acquired and a first visual field control area is provided in the graphical user interface; when a third touch operation acting on the first visual field control area is detected, the presentation field of view of the game scene on the graphical user interface is controlled according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presentation field of view of the game scene faces the sound source.
With the method provided by the invention, on the one hand, the user can conveniently and quickly switch from the original view operation to a new view operation in order to observe the direction of the sound source, and can quickly switch back to the original view operation when it is no longer needed. On the other hand, by providing the auxiliary display window, the player can control the virtual subject while simultaneously observing the game environment around both the virtual subject and the sound source, which brings a richer game experience and a stronger sense of game strategy. That is, the player can quickly view the direction of the sound source; for example, if the sound is a gunshot, the player can quickly check the direction it came from and control the virtual subject to take cover or adjust the corresponding game strategy.
Because the player does not need to spend excessive effort searching for the sound source through sliding operations, and locking onto the sound source does not interfere with real-time control of the virtual character, this convenient and efficient interaction method improves operation efficiency, brings a more convenient game experience, and strengthens game strategy; at the same time, it lowers the operation threshold for novice players and solves the technical problem that, in existing mobile terminal interaction modes, the environment around the sound source cannot be observed while the environment around the virtual subject is still being viewed.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 schematically illustrates a flow chart of an information processing method in an exemplary embodiment of the disclosure;
FIG. 2 schematically illustrates a first graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 3 schematically illustrates a second graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a third graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 5 schematically illustrates a fourth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 6 schematically illustrates a fifth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 7 schematically illustrates a sixth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 8 schematically illustrates a seventh graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure;
FIG. 9 schematically illustrates an eighth graphical user interface of a mobile terminal in an exemplary embodiment of the disclosure.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with one embodiment of the present invention, there is provided an information processing method. The steps illustrated in the flowchart of the figures may be performed in a computer system, such as one executing a set of computer-executable instructions, and although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one shown.
The exemplary embodiment first discloses an information processing method, which is applied to a touch terminal capable of presenting a graphical user interface, where the touch terminal may be various electronic devices with touch screens, such as a mobile phone, a tablet computer, a notebook computer, a game machine, and a PDA. The graphical user interface may be obtained by executing a software application on a processor of the touch terminal and rendering on a display of the touch terminal, the content presented by the graphical user interface at least partially comprising a game scene and a virtual body.
As shown in fig. 1, the information processing method may include the steps of:
step S110, providing a movement control area, and controlling a virtual main body to move in a game scene according to the movement of a touch point of a first touch operation when the first touch operation acting on the movement control area is detected;
step S130, providing an orientation control area, and controlling the orientation of the virtual main body in the game scene according to the movement of the touch point of the second touch operation when the second touch operation acting on the orientation control area is detected;
step S150, controlling the display visual field of the game scene on the graphical user interface according to the position and the orientation of the virtual main body in the game scene;
step S170, when detecting a sound within a preset range in a game scene, acquiring sound source azimuth information of the sound, and providing a first visual field control area in a graphical user interface;
step S190, when a third touch operation acting on the first visual field control area is detected, switching to a multi-window display state, wherein in the multi-window display state, the graphical user interface comprises a main display window and an auxiliary display window, and controlling the display visual field of the game scene in the auxiliary display window according to the position of the virtual main body in the game scene and the source azimuth information.
With the method provided by the invention, the user can conveniently and quickly switch from the original view operation to a new view operation in order to observe the direction of the sound source, and can quickly switch back to the original view operation when it is no longer needed. That is, the player can quickly view the direction of the sound source and observe the game scene near it. For example, if the sound is a gunshot, the player can quickly check the direction it came from and take cover or adjust the corresponding game strategy in time.
Because the player does not need to spend excessive effort searching for the sound source through sliding operations, this convenient and efficient interaction method improves operation efficiency, brings a more convenient game experience, and strengthens game strategy; at the same time, it lowers the operation threshold for novice players and solves the technical problem that a sound source cannot be quickly located in existing mobile terminal interaction modes.
Next, the steps of the information processing method in the present exemplary embodiment are further described with reference to fig. 2 to 9.
In the present exemplary embodiment, a software application is executed on a processor of the mobile terminal and rendered on a touch-sensitive display of the mobile terminal to obtain a graphical user interface 200, the content presented by the graphical user interface 200 at least partially comprising a game scene 210 and a virtual subject 220.
The content presented by the graphical user interface 200 may include all of the game scene 210 or only a portion of it. For example, in an embodiment of the present invention, as shown in fig. 2, since the game scene 210 is relatively large, only part of the game scene 210 is displayed on the graphical user interface 200 of the mobile terminal during play.
Step S110, providing a movement control area, and when a first touch operation acting on the movement control area is detected, controlling the virtual subject to move in the game scene according to the movement of the touch point of the first touch operation.
In the present exemplary embodiment, as shown in fig. 2, a movement control area 230 is provided in the graphical user interface 200, and when a first touch operation acting on the movement control area 230 is detected, the virtual subject 220 is controlled to move in the game scene 210 according to the movement of the touch point of the first touch operation.
Specifically, the movement control area 230 may be an area of the graphical user interface 200 with a visual indication effect, or an area without one; an operation area such as a virtual joystick or virtual direction-control keys may be displayed in the movement control area 230, which is not limited in the present exemplary embodiment.
In an embodiment of the present invention, the movement control area 230 is a virtual joystick control area located at the lower left of the graphical user interface 200, and the virtual subject 220 is controlled to move in the game scene 210 according to the first touch operation received by the virtual joystick control area.
It is understood that, in other embodiments, the movement control area 230 may also be a virtual cross-key/virtual direction-key (D-PAD) area, with the virtual subject 220 controlled to move in the game scene 210 according to the first touch operation received by that area.
As an alternative embodiment, the movement control area 230 may be a visually indicated area of the graphical user interface 200; for example, it may have a bounding box, a filled color range, a range with a predetermined transparency, or any other appearance that visually distinguishes it. The virtual subject 220 is controlled to move in the game scene 210 according to the first touch operation received by the movement control area 230. A movement control area 230 with a visual indication lets the user locate it quickly, which reduces the difficulty of operation for novice players.
As another alternative embodiment, the movement control area 230 may be an area of the graphical user interface 200 without a visual indication. Such an area does not obscure or affect the game screen, provides a better picture effect, and saves screen space; however, it is not easily noticed by the player. As an improved embodiment, a visual guidance control may therefore be displayed in the movement control area 230; for example, in an embodiment of the present invention in which a virtual joystick is used as the direction control scheme of the virtual subject 220, a virtual joystick may be displayed in the movement control area 230 to visually guide the player. A sketch of the joystick-to-movement mapping follows.
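As a minimal illustration of how such a virtual joystick can drive movement, the following Python sketch maps the touch point of the first touch operation to a movement of the virtual subject. It is not taken from the patent; the names (`joystick_vector`, `MOVE_SPEED`) and the 2D coordinate model are assumptions.

```python
import math

MOVE_SPEED = 3.0  # assumed movement speed, scene units per second

def joystick_vector(center, touch_point, radius):
    """Map the touch point to a direction vector relative to the joystick
    center, with its magnitude clamped to [0, 1]."""
    dx, dy = touch_point[0] - center[0], touch_point[1] - center[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return (0.0, 0.0)
    scale = min(length, radius) / (radius * length)
    return (dx * scale, dy * scale)

def move_virtual_subject(position, center, touch_point, radius, dt):
    """Step S110: move the virtual subject in the game scene according to
    the movement of the touch point of the first touch operation."""
    vx, vy = joystick_vector(center, touch_point, radius)
    return (position[0] + vx * MOVE_SPEED * dt,
            position[1] + vy * MOVE_SPEED * dt)
```

Called once per frame with the latest touch point, this keeps the subject's speed proportional to how far the rocker is deflected, up to the joystick radius.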
Step S130, providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual subject in the game scene according to the movement of the touch point of the second touch operation.
In the present exemplary embodiment, as shown in fig. 2, an orientation control area 240 is provided in the graphical user interface 200, and when a second touch operation acting on the orientation control area 240 is detected, the orientation of the virtual subject 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation.
Specifically, the orientation control area 240 and the movement control area 230 are disposed on different sides of the graphical user interface 200; for example, the orientation control area 240 may be located anywhere on the right side of the graphical user interface while the corresponding movement control area 230 is located anywhere on the left side. Preferably, in an embodiment of the present invention, as shown in fig. 2, the orientation control area 240 is disposed at the lower right of the graphical user interface 200 and controls the orientation of the virtual subject 220 in the game scene 210, while the movement control area 230 is disposed at the lower left and controls the movement of the virtual subject 220 in the game scene 210; the user can thus control movement with the left hand and orientation with the right hand.
The orientation control area 240 may be an area of the graphical user interface 200 with a visual indication effect or an area without one; an operation area such as a virtual joystick or virtual direction-control keys may be displayed in the orientation control area 240, which is not limited in the present exemplary embodiment.
In an embodiment of the present invention, the orientation control area 240 is a virtual joystick control area located at the lower right of the graphical user interface 200, and the orientation of the virtual subject 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation received by the virtual joystick control area.
It is understood that, in other embodiments, the orientation control area 240 may also be a virtual cross-key/virtual direction-key (D-PAD) area, with the orientation of the virtual subject 220 in the game scene 210 controlled according to the movement of the touch point of the second touch operation received by that area.
As an alternative embodiment, the orientation control area 240 may be a visually indicated area of the graphical user interface 200; for example, it may have a bounding box, a filled color range, a range with a predetermined transparency, or any other appearance that visually distinguishes it. The orientation of the virtual subject 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation received by the orientation control area 240. An orientation control area 240 with a visual indication lets the user locate it quickly, which reduces the difficulty of operation for novice players.
As another alternative embodiment, the orientation control area 240 may be an area of the graphical user interface 200 without a visual indication. Such an area does not obscure or affect the game screen, provides a better picture effect, and saves screen space; however, it is not easily noticed by the player. As an improved embodiment, a visual guidance control may therefore be displayed in the orientation control area 240; for example, in an embodiment of the present invention in which a virtual joystick is used as the orientation control scheme of the virtual subject 220, a virtual joystick may be displayed in the orientation control area 240 to visually guide the player.
Step S150, controlling the presentation field of view of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene.
In the present exemplary embodiment, the position and direction of the virtual camera corresponding to the graphical user interface 200 are controlled according to the position and orientation of the virtual subject 220 in the game scene 210, thereby controlling the presentation field of view of the game scene 210 on the graphical user interface 200.
In a first-person game, the virtual camera acts as the user's "eye" in the game: it may be disposed on the head of the virtual subject, its direction rotates with the rotation of the virtual subject, and the game scene content rendered on the display of the touch terminal corresponds to the scene content captured by the virtual camera. In a third-person game, the virtual camera may be disposed above and behind the virtual subject so that the relevant game scene can be captured. A mapping relation can be set between the displacement vector of the virtual rocker control and the rotation angle of the virtual camera so as to control the camera's rotation, as sketched below.
Specifically, the virtual subject 220 is controlled to move in the game scene according to the movement of the touch point of the first touch operation in step S110, thereby moving the virtual camera corresponding to the graphical user interface 200; meanwhile, the orientation of the virtual subject 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation in step S130, thereby controlling the direction of that virtual camera. The real-time presentation field of view of the game scene 210 on the graphical user interface 200 is thus determined by the position of the virtual subject 220 in the game scene 210 together with the resulting position and direction of the virtual camera.
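The rocker-to-camera mapping mentioned above can be made concrete with a small sketch. The Python below is illustrative only, not the patent's implementation; `SENSITIVITY` and the yaw/pitch camera model are assumptions.

```python
SENSITIVITY = 0.4  # assumed degrees of camera rotation per unit of rocker displacement

class VirtualCamera:
    """Minimal camera model: yaw is measured clockwise from north,
    pitch is the elevation angle."""
    def __init__(self, yaw=0.0, pitch=0.0):
        self.yaw = yaw
        self.pitch = pitch

    def rotate_by_rocker(self, displacement):
        """Apply the mapping between the rocker's displacement vector
        and the camera's rotation angle (step S150)."""
        dx, dy = displacement
        self.yaw = (self.yaw + dx * SENSITIVITY) % 360.0
        # Clamp pitch so the view cannot flip over the vertical.
        self.pitch = max(-89.0, min(89.0, self.pitch + dy * SENSITIVITY))
```

In a first-person setup the camera pose would additionally be pinned to the head position of the virtual subject each frame.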
Step S170, when a sound within a preset range in the game scene is detected, acquiring sound source azimuth information of the sound, and providing a first visual field control area in the graphical user interface.
In the present exemplary embodiment, the preset range may be the auditory range of the virtual subject 220, or its visual range; the position of the sound source in the game scene 210 may lie inside or outside the portion shown by the graphical user interface 200. For example, as shown in fig. 3, when a sound signal within the hearing range of the virtual subject 220 is detected from a part of the game scene 210 outside the graphical user interface 200, sound source azimuth information of the sound source 250 of that signal is acquired, and a first visual field control area 260 is provided in the graphical user interface 200. A sketch of this detection step follows.
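A minimal sketch of the detection in step S170; the event representation and `HEARING_RANGE` are invented for illustration, not prescribed by the patent:

```python
import math

HEARING_RANGE = 50.0  # assumed auditory range of the virtual subject, scene units

def audible_sounds(subject_pos, sound_events):
    """Keep only the sound events within the preset range of the virtual
    subject; each audible event triggers acquisition of its azimuth and
    the display of the first visual field control area."""
    audible = []
    for event in sound_events:  # event: {"pos": (x, y), "category": str}
        dx = event["pos"][0] - subject_pos[0]
        dy = event["pos"][1] - subject_pos[1]
        if math.hypot(dx, dy) <= HEARING_RANGE:
            audible.append(event)
    return audible
```

Note that the check depends only on distance in the scene, so a source well outside the rendered view can still qualify, which is exactly the case illustrated in fig. 3.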
In the present exemplary embodiment, the first visual field control area 260 and the movement control area 230 are disposed on different sides of the graphical user interface 200; for example, the first visual field control area 260 may be located at any position on the right side of the graphical user interface 200 and the corresponding movement control area 230 at any position on the left side. Preferably, in an embodiment of the present invention, as shown in fig. 3, the first visual field control area 260 is disposed on the right of the graphical user interface 200, above the orientation control area 240, and is used for controlling the presentation field of view of the game scene 210, while the movement control area 230 is disposed at the lower left and is used for controlling the movement of the virtual subject 220 in the game scene 210. The user can thus switch the presentation field of view of the game scene 210 with the right hand while controlling the movement of the virtual subject 220 with the left hand.
In other embodiments, the first visual field control area 260 may also be disposed on the same side of the graphical user interface 200 as the movement control area 230.
In the present exemplary embodiment, the first visual field control area 260 may be an area of the graphical user interface 200 with a visual indication effect, or an area without one; an operation area such as a virtual joystick or virtual direction-control keys may be displayed in the first visual field control area 260, which is not limited in the present exemplary embodiment.
As a preferred embodiment, the first visual field control area 260 may be a visually indicated area of the graphical user interface 200; for example, it may have a bounding box, a filled color range, a range with a predetermined transparency, or any other appearance that visually distinguishes it. A first visual field control area 260 with a visual indication lets the user locate it quickly, which reduces the difficulty of operation for novice players.
In the present exemplary embodiment, the first visual field control area 260 may further carry an information indicator corresponding to the sound content, used to indicate the category of that content; the sound content may be a gunshot, footsteps, a door closing, an explosion, and so on. The information indicator may consist of rendering the first visual field control area 260 as a particular pattern or shape whose content corresponds to the sound content; alternatively, the first visual field control area 260 may be rendered with an arbitrary pattern and a text description corresponding to the sound content added inside it.
The information indicator can be rendered as any pattern that can be associated with the sound content. A first visual field control area 260 with an information indicator lets the user quickly identify the type of the detected sound, adjust the game strategy in real time, and improve operation efficiency; a sketch of such a category-to-indicator mapping follows.
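One plausible way to realize the indicator is a simple lookup from sound category to the pattern used when rendering the first visual field control area. The mapping below is purely illustrative; the category strings and icon names are assumptions.

```python
# Assumed mapping from detected sound categories to indicator patterns.
SOUND_INDICATORS = {
    "gunshot": "icon_gunshot",
    "footsteps": "icon_footsteps",
    "door_closing": "icon_door",
    "explosion": "icon_explosion",
}

def indicator_for(sound_category):
    """Pick the pattern used to render the first visual field control
    area; unknown categories fall back to a generic sound icon."""
    return SOUND_INDICATORS.get(sound_category, "icon_generic_sound")
```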
Step S190, when a third touch operation acting on the first visual field control area is detected, switching to a multi-window display state, wherein in the multi-window display state the graphical user interface comprises a main display window and an auxiliary display window, and controlling the presentation field of view of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information.
In the present exemplary embodiment, the third touch operation may be one or a combination of a click operation, a touch operation exceeding a preset duration, and a press operation exceeding a preset pressure.
Taking the third touch operation as a click operation as an example, when a click operation acting on the first visual field control area is detected, the interface switches to the multi-window display state. As shown in fig. 4, in the multi-window display state the graphical user interface includes a main display window 201 and an auxiliary display window 202, and the presentation field of view of the game scene in the auxiliary display window 202 is controlled according to the position of the virtual subject 220 in the game scene and the sound source azimuth information of the sound source 250.
In alternative embodiments, the area of the main display window 201 may be larger than that of the auxiliary display window 202, or equal to it; the auxiliary display window 202 may be non-overlapping with the main display window 201 or at least partially overlap it. For example, the main display window 201 may be provided on the graphical user interface and, after a third touch operation acting on the first visual field control area 260 is detected, the auxiliary display window 202 may be presented partially overlaid on the main display window 201.
In the present exemplary embodiment, in the multi-window display state, the presentation field of view of the game scene in the main display window on the graphical user interface is controlled according to the position and orientation of the virtual subject in the game scene, while the presentation field of view of the game scene in the auxiliary display window is controlled according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presentation field of view in the auxiliary display window faces the sound source. That is, the main display window 201 displays the game scene around the virtual subject 220, and the auxiliary display window 202 displays the game scene around the sound source.
Specifically, controlling the presentation field of view of the game scene in the main display window on the graphical user interface according to the position and orientation of the virtual subject in the game scene comprises: controlling the direction of the first virtual camera corresponding to the main display window according to the orientation of the virtual subject, so that the presentation field of view of the game scene in the main display window changes as the virtual subject moves.
As described above, the virtual subject 220 is controlled to move in the game scene according to the movement of the touch point of the first touch operation in step S110, thereby moving the first virtual camera corresponding to the main display window 201 on the graphical user interface 200; meanwhile, the orientation of the virtual subject 220 in the game scene 210 is controlled according to the movement of the touch point of the second touch operation in step S130, thereby controlling the direction of the first virtual camera corresponding to the main display window 201. The real-time presentation field of view of the game scene 210 in the main display window 201 is thus determined by the position and orientation of the virtual subject 220 in the game scene 210 together with the resulting position and direction of the first virtual camera.
Controlling the presentation field of view of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information comprises: controlling the direction of a second virtual camera corresponding to the auxiliary display window according to the direction indicated by the sound source azimuth information, so that the presentation field of view of the game scene in the auxiliary display window faces the sound source.
For example, fig. 4 shows the presentation field of view of the game scene 210 in the auxiliary display window 202 when a third touch operation on the first visual field control area 260 is detected. Relative to fig. 3: when a sound signal within the hearing range of the virtual subject 220 is detected from a part of the game scene 210 outside the graphical user interface 200, sound source azimuth information of the sound source 250 is acquired and the first visual field control area 260 is rendered on the graphical user interface 200; when the third touch operation acting on the first visual field control area 260 is then detected, the auxiliary display window 202 is provided, the direction of the second virtual camera corresponding to the auxiliary display window 202 is controlled according to the direction indicated by the sound source azimuth information of the sound source 250, and the second virtual camera is turned toward the sound source 250, so that the presentation field of view of the game scene 210 in the auxiliary display window 202 faces the sound source 250.
In the present exemplary embodiment, the presentation field of view of the game scene facing the sound source means that the sound source is positioned within that field of view. When the presentation field of view of the game scene 210 in the auxiliary display window 202 faces the sound source, the sound source itself may or may not be visible in the auxiliary display window 202 (it may, for example, be blocked by another virtual building, virtual grass, or similar scenery). For example, suppose the sound source is a virtual character that is shooting. If that character stands in an open, unobstructed field, it is displayed in the auxiliary display window 202 when the presentation field of view faces it, and the user can see the sound source; if the character is crawling through virtual grass, or is inside a house or behind a rock, it is occluded and not displayed in the auxiliary display window 202, and the user cannot see the sound source even though the view faces it. The control of the second virtual camera is sketched below.
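A minimal sketch of pointing the second virtual camera at the sound source, assuming a bearing-style azimuth (degrees clockwise from north); none of these names or conventions are prescribed by the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class SecondVirtualCamera:
    position: tuple = (0.0, 0.0)  # scene coordinates
    yaw: float = 0.0              # degrees clockwise from north
    pitch: float = 0.0

def aim_at_sound_source(camera, subject_pos, source_pos):
    """Turn the camera of the auxiliary display window so its presentation
    field of view faces the sound source (step S190)."""
    dx = source_pos[0] - subject_pos[0]
    dy = source_pos[1] - subject_pos[1]
    camera.position = subject_pos
    camera.yaw = math.degrees(math.atan2(dx, dy)) % 360.0
    # Occlusion is left to the renderer: the view faces the source even
    # when buildings or grass hide the source itself.
```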
Through the third touch operation acting on the first visual field control area, the user can respond in time to sounds in the surrounding game environment without giving up control of the virtual subject, quickly checking the conditions around the sound source and improving operation efficiency.
In the present exemplary embodiment, a view switching control is provided in the multi-window display state, and when a fourth touch operation acting on the view switching control is detected, the presentation fields of view of the game scene in the main display window and the auxiliary display window are switched.
Specifically, as shown in fig. 5, a view switching control 270 is provided in the multi-window display state, and when a fourth touch operation on the view switching control 270 is detected, the presentation fields of view of the game scene in the main display window 201 and the auxiliary display window 202 are exchanged. That is, after the switch, the presentation field of view of the game scene in the auxiliary display window 202 is controlled according to the position and orientation of the virtual subject 220 in the game scene, while the presentation field of view of the game scene in the main display window 201 is controlled according to the position of the virtual subject 220 in the game scene and the sound source azimuth information, so that the main display window 201 faces the sound source 250.
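If each window simply holds a reference to the virtual camera that feeds it, the fourth touch operation reduces to swapping those references. A sketch under that assumption (the `DisplayWindow` structure is invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class DisplayWindow:
    camera: object  # the virtual camera whose view this window renders

def on_view_switch(main_window, aux_window):
    """Fourth touch operation on the view switching control: exchange the
    presentation fields of view of the main and auxiliary windows."""
    main_window.camera, aux_window.camera = aux_window.camera, main_window.camera
```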
In the present exemplary embodiment, when the end of the third touch operation acting on the first visual field control area is detected, the auxiliary display window is cancelled and the presentation field of view of the game scene on the graphical user interface is controlled to be restored to the state before the third touch operation.
As an alternative embodiment, a cancel operation area may also be provided on the graphical user interface. The cancel operation area may be the first visual field control area itself or another area on the graphical user interface distinct from it. When a fifth touch operation acting on the cancel operation area is detected, the auxiliary display window is cancelled and the presentation field of view of the game scene on the graphical user interface is controlled to be restored to the state before the third touch operation.
In the present exemplary embodiment, controlling the presentation field of view of the game scene on the graphical user interface to be restored to the state before the third touch operation includes: controlling it to be restored to the presentation field of view as it was before the third touch operation; or controlling it to be restored to the presentation field of view calculated by the calculation logic that was in effect before the third touch operation.
In the first case, the presentation field of view itself is restored, i.e. the position and angle/direction of the virtual camera in the game scene return to the state before the third touch operation: the presentation field of view of the game scene on the graphical user interface is controlled based on the position of the virtual camera in game scene coordinates and its shooting direction as they were before the third touch operation.
In the second case, what is restored is the control state, i.e. the view-calculation logic used before the third touch operation is applied again. For example, if before the third touch operation the game calculated the field of view according to preset calculation logic (say, a virtual camera fixed to the head of the virtual subject that rotates as the subject rotates), then restoring means resuming that calculation: the presentation field of view is computed from the current position of the virtual subject in game scene coordinates, the current orientation of the virtual subject and/or the direction of its weapon sight, and the pre-operation relationships between the virtual camera and the virtual subject (the camera's relative position, and the association between the subject's orientation or weapon-sight direction and the camera's shooting direction). The two restore modes are sketched below.
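A compact way to implement both restore modes is to snapshot, at the start of the third touch operation, both the camera pose and the view-calculation rule then in force. The structure below is an assumption for illustration, not the patent's implementation:

```python
import copy

class ViewSnapshot:
    """Taken when the third touch operation begins."""
    def __init__(self, camera_pose, view_logic):
        self.camera_pose = copy.deepcopy(camera_pose)  # position + direction
        self.view_logic = view_logic  # e.g. a follow-the-head rule: subject -> pose

def restore_view(snapshot, camera, subject, mode="pose"):
    """Restore the presentation field of view after the third touch
    operation ends or a fifth touch operation cancels it."""
    if mode == "pose":
        # Mode 1: put the camera back exactly where it was.
        camera.pose = snapshot.camera_pose
    else:
        # Mode 2: re-run the pre-operation calculation logic against the
        # subject's *current* position and orientation.
        camera.pose = snapshot.view_logic(subject)
```

The difference matters when the virtual subject kept moving during the observation: mode 1 returns to a stale viewpoint, while mode 2 snaps to wherever the old logic would place the camera now.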
In the present exemplary embodiment, the first visual field control area is hidden when the presentation field of view of the game scene on the graphical user interface is restored to the state before the third touch operation.
Specifically, to save screen space and avoid obscuring or affecting the game screen, when the presentation field of view of the game scene on the graphical user interface is restored to the state before the third touch operation, the first visual field control area is hidden, and the presentation field of view of the game scene on the graphical user interface is again controlled according to the position and orientation of the virtual subject in the game scene.
In the present exemplary embodiment, in the multi-window display state, when a preset action of the third touch operation is detected, the presentation field of view of the game scene in the auxiliary display window on the graphical user interface is adjusted according to the preset action.
Specifically, this makes it possible to observe the game scene within a preset range near the sound source. For example, as shown in fig. 6, when the touch point of the preset action of the third touch operation lies in the 12 o'clock direction of the first visual field control area, the presentation field of view of the game scene in the auxiliary display window moves toward the 12 o'clock direction of the sound source 250, relative to fig. 4.
In an embodiment of the invention, the preset action of the third touch operation is a touch sliding operation.
Adjusting the presentation field of view of the game scene in the auxiliary display window on the graphical user interface according to the preset action then comprises: adjusting it according to the sliding track of the touch sliding operation.
Specifically, when the third touch operation is detected (it may be a touch operation, a long-press operation, or a heavy-press operation acting on the first visual field control area), the presentation field of view of the game scene in the auxiliary display window is controlled to face the sound source; when a sliding operation continuous with the third touch operation is detected, the direction of the second virtual camera is changed according to the sliding track, thereby changing the presentation field of view of the game scene in the auxiliary display window, and the view rotates in the same direction as the slide so that the game scene within a preset range near the sound source can be observed, as in the sketch below.
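A sketch of the slide handling, reusing the assumed yaw/pitch camera model from the earlier sketches; the sensitivity value and screen-coordinate convention are assumptions:

```python
SLIDE_SENSITIVITY = 0.3  # assumed degrees of rotation per pixel of sliding

def on_touch_slide(camera, prev_point, curr_point):
    """Rotate the second virtual camera along the sliding track of the
    third touch operation, in the same direction as the slide."""
    dx = curr_point[0] - prev_point[0]
    dy = curr_point[1] - prev_point[1]
    camera.yaw = (camera.yaw + dx * SLIDE_SENSITIVITY) % 360.0
    # Screen y grows downward, so sliding up raises the view.
    camera.pitch = max(-89.0, min(89.0, camera.pitch - dy * SLIDE_SENSITIVITY))
```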
In another embodiment of the present invention, the preset action of the third touch operation is a touch click operation.
In this case, adjusting the presentation visual field of the game scene on the graphical user interface according to the preset action comprises: changing the presentation visual field of the game scene in the auxiliary display window on the graphical user interface according to the position of a preset point in the first visual field control area and the click position of the touch click operation.
Specifically, when the third touch operation is detected, the presentation visual field of the game scene in the auxiliary display window on the graphical user interface is controlled to face the sound source; the third touch operation can be a click operation acting on the first visual field control area, or an operation of leaving the touch screen after a long press or a heavy press. When a touch click following the third touch operation is detected, a vector from the preset point in the first visual field control area to the click position of the touch click operation is determined; the rotation angle of the second virtual camera corresponding to the presentation visual field is then changed according to this vector to determine the orientation of the second virtual camera, thereby controlling the presentation visual field of the game scene in the auxiliary display window on the graphical user interface so that the game scene within the preset range near the sound source can be observed.
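As an illustration of the vector-based control just described, the sketch below derives a camera yaw from the vector between a preset point and the click position. The screen-coordinate conventions and all names are assumptions of the sketch, not the patent's implementation.

    import math

    def yaw_from_click(preset_xy: tuple, click_xy: tuple) -> float:
        """Yaw in degrees derived from the vector preset point -> click position.

        12 o'clock on the control area is treated as 0 degrees and angles grow
        clockwise; both conventions are assumptions of this sketch.
        """
        vx = click_xy[0] - preset_xy[0]
        vy = click_xy[1] - preset_xy[1]  # screen y grows downward
        # atan2(vx, -vy): 0 degrees at 12 o'clock, 90 degrees at 3 o'clock
        return math.degrees(math.atan2(vx, -vy)) % 360.0

    # A click directly above the preset point yields 0 degrees:
    #   yaw_from_click((100, 100), (100, 40)) -> 0.0
    # A click to the right of the preset point yields 90 degrees:
    #   yaw_from_click((100, 100), (160, 100)) -> 90.0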
In the present exemplary embodiment, after the sound source azimuth information of the sound is acquired, a first graphic identifier is provided according to the current position of the virtual subject and the sound source azimuth information; the first graphic identifier is used to indicate the azimuth relationship between the sound source and the position of the virtual subject.
The first graphic identifier may be a ruler, a compass, or any other form capable of indicating an azimuth relationship. Taking a ruler as an example of the first graphic identifier, as shown in fig. 7, after the sound source azimuth information of the sound source 250 is acquired, a first graphic identifier 280 indicating the azimuth relationship between the sound source 250 and the position of the virtual subject 220 is provided based on the current position of the virtual subject 220 and the sound source azimuth information of the sound source 250. Specifically, a plane coordinate system is established in the game scene 210 with the virtual subject 220 as the origin, the 12 o'clock direction of the virtual subject 220 in the game scene being set as due north, the 3 o'clock direction as due east, the 6 o'clock direction as due south, and the 9 o'clock direction as due west. In the present embodiment, it is determined from the position of the virtual subject 220 and the sound source azimuth information of the sound source 250 that the sound source 250 lies at a bearing of 45° from due north relative to the virtual subject 220, and the first graphic identifier 280 is provided accordingly.
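The bearing computation behind such an identifier can be illustrated as follows. The coordinate conventions mirror the plane coordinate system just described; the function name and axis assignments are assumptions of this sketch.

    import math

    def bearing_deg(subject_xy: tuple, source_xy: tuple) -> float:
        """Compass bearing of the sound source from the virtual subject.

        0 degrees = due north (the subject's 12 o'clock), 90 degrees = due east;
        +x is assumed to point east and +y north.
        """
        dx = source_xy[0] - subject_xy[0]
        dy = source_xy[1] - subject_xy[1]
        return math.degrees(math.atan2(dx, dy)) % 360.0

    # A source to the north-east of the subject sits at a 45 degree bearing,
    # matching the example in the text: bearing_deg((0, 0), (10, 10)) -> 45.0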
In the present exemplary embodiment, after the sound source azimuth information of the sound is acquired, a second graphic identifier is provided according to the current position and orientation of the virtual subject and the sound source azimuth information; the second graphic identifier is used to indicate the positional relationship between the sound source and the orientation of the virtual subject.
The second graphic identifier may be text or any other form capable of indicating a positional relationship. Taking text as an example of the second graphic identifier, as shown in fig. 8, suppose the virtual subject 220 faces west. After the sound source azimuth information of the sound source 250 is acquired, it is determined from the position and orientation of the virtual subject 220 and the sound source azimuth information of the sound source 250 that the sound source 250 is located directly behind the virtual subject 220, and the text "directly behind" is rendered on the graphical user interface 200 as the second graphic identifier 281 indicating the positional relationship between the sound source 250 and the orientation of the virtual subject 220.
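One way to derive such a text label is sketched below. The eight-way label set is an assumption of this sketch, since the embodiment only requires that the positional relationship be indicated.

    LABELS = ["ahead", "front-right", "right", "rear-right",
              "behind", "rear-left", "left", "front-left"]

    def relative_label(facing_deg: float, source_bearing_deg: float) -> str:
        """Text label for the source's position relative to the subject's facing.

        Both angles are compass bearings in degrees (0 = north, 90 = east).
        """
        rel = (source_bearing_deg - facing_deg) % 360.0
        return LABELS[int((rel + 22.5) // 45) % 8]

    # Subject facing due west (270 degrees) with the source due east of it
    # (90 degrees): relative_label(270.0, 90.0) -> "behind", as in fig. 8.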
In the present exemplary embodiment, a second visual field control area is provided; when a fifth touch operation acting on the second visual field control area is detected, the presentation visual field of the game scene on the graphical user interface is changed according to the fifth touch operation; and when the end of the fifth touch operation is detected, the presentation visual field of the game scene on the graphical user interface is controlled to be restored to the state before the fifth touch operation.
Specifically, as shown in fig. 9, a second visual field control area 290 is provided. When a fifth touch operation acting on the second visual field control area 290 is detected, the presentation visual field of the game scene on the graphical user interface is changed according to the fifth touch operation; when the end of the fifth touch operation is detected, the presentation visual field of the game scene on the graphical user interface is restored to the state before the fifth touch operation, that is, it is again controlled according to the position and orientation of the virtual subject in the game scene.
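The save-and-restore behaviour of the second visual field control area can be illustrated with the following sketch; the ViewState model and all handler names are assumptions, not details of the disclosed method.

    from dataclasses import dataclass, replace
    from typing import Optional

    @dataclass(frozen=True)
    class ViewState:
        yaw_deg: float
        pitch_deg: float

    class SecondAreaController:
        """Tracks the presentation visual field while the fifth touch operation lasts."""

        def __init__(self, view: ViewState):
            self.view = view
            self._saved: Optional[ViewState] = None

        def on_touch_begin(self) -> None:
            # remember the presentation visual field before the fifth touch operation
            self._saved = self.view

        def on_touch_move(self, d_yaw: float, d_pitch: float) -> None:
            self.view = replace(self.view,
                                yaw_deg=(self.view.yaw_deg + d_yaw) % 360.0,
                                pitch_deg=self.view.pitch_deg + d_pitch)

        def on_touch_end(self) -> None:
            if self._saved is not None:
                self.view = self._saved  # restore the pre-touch state
                self._saved = None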
With the method provided by the invention, on the one hand, a user can conveniently and quickly switch from the original visual field to a new visual field in order to observe the direction of the sound source, and can just as quickly switch back to the original view-control scheme when it is no longer needed. On the other hand, by providing the auxiliary display window, the player can control the virtual subject and observe the game environment around both the virtual subject and the sound source at the same time, which brings a richer game experience and a stronger sense of game strategy. For example, if the sound is a gunshot, the player can quickly view the direction from which it came and control the virtual subject to take cover or adjust the corresponding game strategy.
Because the player does not need to spend excessive effort searching for the sound source through sliding operations, and real-time control of the virtual character is not affected while the sound source is being tracked, this convenient and efficient interaction method improves operation efficiency, brings a smoother game experience to the player, and enhances the strategic depth of the game. At the same time, it lowers the operation threshold for novice players and solves the technical problem in existing mobile terminal interaction modes that the environment around the sound source cannot be observed without losing sight of the environment around the virtual subject.
According to an embodiment of the present invention, there is provided an information processing apparatus applied to a touch terminal capable of presenting a graphical user interface, the content presented by the graphical user interface including a game scene and a virtual subject, the apparatus comprising:
the first interaction unit is used for providing a movement control area, and controlling the virtual subject to move in the game scene according to a first touch operation when the first touch operation acting on the movement control area is detected;
the second interaction unit is used for providing an orientation control area, detecting a second touch operation acting on the orientation control area, and controlling the orientation of the virtual subject in the game scene according to the movement of a touch point of the second touch operation;
the first control unit is used for controlling the presentation visual field of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
the display unit is used for detecting a sound within a preset range in the game scene, acquiring sound source azimuth information of the sound, and providing a first visual field control area in the graphical user interface;
and the second control unit is used for switching to a multi-window display state when a third touch operation acting on the first visual field control area is detected, providing an auxiliary display window, and controlling the presentation visual field of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information.
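For illustration, the sketch below wires the five units into a single touch-event dispatcher. All class, method, and attribute names, as well as the owns/handle interface, are assumptions of this sketch rather than details of the disclosed apparatus.

    class InformationProcessingApparatus:
        """Illustrative wiring of the five units described above."""

        def __init__(self, first_interaction, second_interaction,
                     first_control, display_unit, second_control):
            self.first_interaction = first_interaction    # movement control area
            self.second_interaction = second_interaction  # orientation control area
            self.first_control = first_control            # main presentation visual field
            self.display_unit = display_unit              # sound detection, first area
            self.second_control = second_control          # auxiliary display window

        def on_touch(self, event):
            # Dispatch the touch to whichever unit owns the touched region
            # (owns/handle is an assumed interface, not the patent's API).
            for unit in (self.first_interaction, self.second_interaction,
                         self.second_control):
                if unit.owns(event.position):
                    unit.handle(event)
                    return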
The details of each unit of the information processing apparatus have already been described in detail in the corresponding information processing method, and are therefore not repeated here.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
There is further provided, according to an embodiment of the present invention, a computer-readable storage medium on which is stored a program product capable of implementing the above-mentioned method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code; when the program product is run on a terminal device, the program code causes the terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the above-mentioned "exemplary methods" section of the present specification. The program product may employ a portable compact disc read-only memory (CD-ROM), include program code, and be run on a terminal device such as a personal computer. However, the program product of the present invention is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
According to an embodiment of the present invention, there is also provided an electronic apparatus comprising: a processing component, which may further include one or more processors, and a memory resource, represented by a memory, for storing instructions executable by the processing component, such as an application program. The application program stored in the memory may include one or more modules each corresponding to a set of instructions. Further, the processing component is configured to execute the instructions so as to perform the information processing method described above.
The electronic apparatus may further include a power supply component configured to perform power management of the electronic apparatus, a wired or wireless network interface configured to connect the electronic apparatus to a network, and an input/output (I/O) interface.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into units may be merely a division by logical function, and an actual implementation may adopt another division: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or of another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the method according to the embodiments of the present invention.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.

Claims (19)

1. An information processing method applied to a touch terminal capable of presenting a graphical user interface, wherein content presented by the graphical user interface at least partially includes a game scene and a virtual subject, the method comprising:
providing a movement control area, and when a first touch operation acting on the movement control area is detected, controlling the virtual subject to move in the game scene according to the movement of a touch point of the first touch operation;
providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual subject in the game scene according to the movement of a touch point of the second touch operation;
controlling a presentation visual field of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
when sound in a preset range in the game scene is detected, obtaining sound source azimuth information of the sound, and providing a first visual field control area corresponding to the sound in the graphical user interface;
when a third touch operation acting on the first visual field control area is detected, switching to a multi-window display state, wherein in the multi-window display state, the graphical user interface comprises a main display window and an auxiliary display window, the presentation visual field of the game scene in the main display window on the graphical user interface is controlled according to the position and orientation of the virtual subject in the game scene, and the presentation visual field of the game scene in the auxiliary display window is controlled according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presentation visual field of the game scene in the auxiliary display window faces the sound source.
2. The method of claim 1, wherein said controlling the presentation visual field of the game scene in the main display window on the graphical user interface according to the position and orientation of the virtual subject in the game scene comprises:
controlling the direction of a first virtual camera corresponding to the main display window according to the orientation of the virtual subject, so that the presentation visual field of the game scene in the main display window changes along with the movement of the virtual subject.
3. The method of claim 1, wherein said controlling the presentation visual field of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information comprises:
controlling the direction of a second virtual camera corresponding to the auxiliary display window according to the direction indicated by the sound source azimuth information, so that the presentation visual field of the game scene in the auxiliary display window faces the sound source.
4. The method of claim 1, wherein in the multi-window display state, the method further comprises:
providing a visual field switching control, and when a fourth touch operation acting on the visual field switching control is detected, switching the display contents of the main display window and the auxiliary display window.
5. The method of claim 1, wherein the method further comprises:
when the end of the third touch operation acting on the first visual field control area is detected, canceling the auxiliary display window, and controlling the presentation visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation.
6. The method of claim 1, wherein the method further comprises:
when a fifth touch operation acting on a cancel operation area is detected, canceling the auxiliary display window, and controlling the presentation visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation.
7. The method of claim 5 or 6, wherein said controlling the presentation visual field of the game scene on the graphical user interface to be restored to the state before the third touch operation comprises:
controlling the presentation visual field of the game scene on the graphical user interface to be restored to the presentation visual field before the third touch operation; or,
controlling the presentation visual field of the game scene on the graphical user interface to be restored to the presentation visual field calculated according to the presentation visual field calculation logic before the third touch operation.
8. The method of claim 5 or 6, further comprising:
hiding the first visual field control area.
9. The method of claim 1, wherein the method comprises:
in the multi-window display state, when a preset action of the third touch operation is detected, adjusting the presentation visual field of the game scene in the auxiliary display window on the graphical user interface according to the preset action.
10. The method of claim 9, wherein the preset action of the third touch operation is a touch sliding operation.
11. The method of claim 10, wherein said adjusting the presentation visual field of the game scene in the auxiliary display window on the graphical user interface according to the preset action comprises:
adjusting the presentation visual field of the game scene in the auxiliary display window on the graphical user interface according to the sliding track of the touch sliding operation.
12. The method of claim 9, wherein the preset action of the third touch operation is a touch click operation.
13. The method of claim 12, wherein said adjusting the presentation visual field of the game scene in the auxiliary display window on the graphical user interface according to the preset action comprises:
changing the presentation visual field of the game scene in the auxiliary display window on the graphical user interface according to the position of a preset point in the first visual field control area and the click position of the touch click operation.
14. The method of claim 1, wherein after the obtaining of the sound source bearing information of the sound, the method further comprises:
providing a first graphic identifier according to the current position of the virtual subject and the sound source azimuth information;
the first graphic identifier is used to indicate the azimuth relationship between the sound source and the position of the virtual subject.
15. The method of claim 1, wherein after the obtaining of the sound source bearing information of the sound, the method further comprises:
providing a second graphic identifier according to the current position and orientation of the virtual subject and the sound source azimuth information;
the second graphic identifier is used to indicate the positional relationship between the sound source and the orientation of the virtual subject.
16. The method of claim 1, wherein the method further comprises:
providing a second visual field control area;
when a sixth touch operation acting on the second visual field control area is detected, changing the presentation visual field of the game scene on the graphical user interface according to the sixth touch operation;
when the end of the sixth touch operation is detected, controlling the presentation visual field of the game scene on the graphical user interface to be restored to the state before the sixth touch operation.
17. An information processing apparatus applied to a touch terminal capable of presenting a graphical user interface, wherein the content presented by the graphical user interface at least partially includes a game scene and a virtual subject, the apparatus comprising:
the first interaction unit is used for providing a movement control area, and when a first touch operation acting on the movement control area is detected, controlling the virtual subject to move in the game scene according to the movement of a touch point of the first touch operation;
the second interaction unit is used for providing an orientation control area, and when a second touch operation acting on the orientation control area is detected, controlling the orientation of the virtual subject in the game scene according to the movement of a touch point of the second touch operation;
the first control unit is used for controlling the presentation visual field of the game scene on the graphical user interface according to the position and orientation of the virtual subject in the game scene;
the display unit is used for acquiring sound source azimuth information of sound when the sound in a preset range in the game scene is detected, and providing a first visual field control area corresponding to the sound in the graphical user interface;
the second control unit is used for switching to a multi-window display state when a third touch operation acting on the first visual field control area is detected, providing an auxiliary display window, controlling the presentation visual field of the game scene in the main display window on the graphical user interface according to the position and orientation of the virtual subject in the game scene, and controlling the presentation visual field of the game scene in the auxiliary display window according to the position of the virtual subject in the game scene and the sound source azimuth information, so that the presentation visual field of the game scene in the auxiliary display window faces the sound source.
18. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the information processing method according to any one of claims 1 to 16.
19. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the information processing method of any one of claims 1 to 16 via execution of the executable instructions.
CN201711148849.3A 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium Active CN107832001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711148849.3A CN107832001B (en) 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN107832001A CN107832001A (en) 2018-03-23
CN107832001B true CN107832001B (en) 2020-07-10

Family

ID=61652081

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711148849.3A Active CN107832001B (en) 2017-11-17 2017-11-17 Information processing method, information processing device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN107832001B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110448904B (en) * 2018-11-13 2023-04-25 网易(杭州)网络有限公司 Game view angle control method and device, storage medium and electronic device
CN111991800A (en) 2019-02-22 2020-11-27 网易(杭州)网络有限公司 Game role control method, device, equipment and storage medium
CN110170170A (en) * 2019-05-30 2019-08-27 维沃移动通信有限公司 A kind of information display method and terminal device
CN110354506B (en) * 2019-08-20 2023-11-21 网易(杭州)网络有限公司 Game operation method and device
CN110624248A (en) * 2019-09-18 2019-12-31 网易(杭州)网络有限公司 Game control method, device, electronic equipment and storage medium
CN116419118A (en) * 2021-12-31 2023-07-11 华为技术有限公司 Input detection device, system and related equipment thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105103457A (en) * 2013-03-28 2015-11-25 三星电子株式会社 Portable terminal, hearing aid, and method of indicating positions of sound sources in the portable terminal
CN105688409A (en) * 2016-01-27 2016-06-22 网易(杭州)网络有限公司 Game control method and device
JP2016206740A (en) * 2015-04-16 2016-12-08 株式会社コロプラ User interface program
WO2017047078A1 (en) * 2015-09-16 2017-03-23 株式会社カプコン Game system, control method thereof, computer device-readable non-volatile recording medium
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture
CN107203321A (en) * 2017-03-27 2017-09-26 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006343447A (en) * 2005-06-08 2006-12-21 Konami Digital Entertainment:Kk Virtual space sharing system and control method of same
US8954890B2 (en) * 2012-04-12 2015-02-10 Supercell Oy System, method and graphical user interface for controlling a game
US20160063728A1 (en) * 2015-11-10 2016-03-03 Mediatek Inc. Intelligent Nanny Assistance
CN205460941U (en) * 2016-02-19 2016-08-17 上海盟云移软网络科技股份有限公司 Developments virtual reality drives recreation system
CN107050851B (en) * 2017-03-27 2020-12-08 熊庆生 Sound enhancement method and system for game content effect

Also Published As

Publication number Publication date
CN107832001A (en) 2018-03-23

Similar Documents

Publication Publication Date Title
CN107832001B (en) Information processing method, information processing device, electronic equipment and storage medium
CN107913520B (en) Information processing method, information processing device, electronic equipment and storage medium
US10765947B2 (en) Visual display method for compensating sound information, computer readable storage medium and electronic device
US11565181B2 (en) Virtual object control method and apparatus, computer device, and storage medium
CN107823882B (en) Information processing method, information processing device, electronic equipment and storage medium
US10716995B2 (en) Information processing method and apparatus, storage medium, and electronic device
CN107913516B (en) Information processing method, information processing device, electronic equipment and storage medium
CN108355354B (en) Information processing method, device, terminal and storage medium
CN107469354B (en) Visible sensation method and device, storage medium, the electronic equipment of compensating sound information
JP6875346B2 (en) Information processing methods and devices, storage media, electronic devices
US10191612B2 (en) Three-dimensional virtualization
CN111530073B (en) Game map display control method, storage medium and electronic device
CN109224439A (en) The method and device of game aiming, storage medium, electronic device
CN107583271A (en) The exchange method and device of selection target in gaming
CN109589605B (en) Game display control method and device
CN110448904B (en) Game view angle control method and device, storage medium and electronic device
CN109407959B (en) Virtual object control method, device and storage medium in virtual scene
CN108854063A (en) Method of sight, device, electronic equipment and storage medium in shooting game
JP2023139033A (en) Method, apparatus, device, terminal, and computer program for rotation of view point
CN108744513A (en) Method of sight, device, electronic equipment in shooting game and storage medium
CN107982916B (en) Information processing method, information processing device, electronic equipment and storage medium
CN113680054A (en) Game interaction method and device based on computer vision library
CN112965773A (en) Method, apparatus, device and storage medium for information display
CN112987924A (en) Method, apparatus, device and storage medium for device interaction
CN113559501B (en) Virtual unit selection method and device in game, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant