CN107930119B - Information processing method, information processing device, electronic equipment and storage medium - Google Patents


Publication number
CN107930119B
CN107930119B
Authority
CN
China
Prior art keywords
target object
information processing
operation interface
information
processing method
Prior art date
Legal status
Active
Application number
CN201711167726.4A
Other languages
Chinese (zh)
Other versions
CN107930119A (en)
Inventor
魏一明
胡校膑
黄亚
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201711167726.4A
Publication of CN107930119A
Application granted
Publication of CN107930119B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5372: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/306: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for displaying a marker associated to an object or location in the game field

Abstract

The disclosure provides an information processing method, an information processing apparatus, an electronic device, and a computer-readable storage medium, and relates to the technical field of human-computer interaction. The method comprises the following steps: judging whether the state of at least one target object in a preset area of the game scene meets a trigger condition; when the state of the target object meets the trigger condition, acquiring orientation information of the target object; and adjusting the display parameters of at least one function control in the corresponding orientation of the operation interface according to the orientation information of the target object. The orientation information of the target object can be quickly conveyed by adjusting the display parameters of the function control, thereby improving the user experience.

Description

Information processing method, information processing device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of human-computer interaction, and in particular, to an information processing method, an information processing apparatus, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of mobile communication technology, a large number of game applications have emerged. In many client-side game applications, a user can usually hear the footsteps of an enemy; in a mobile-phone game application, however, if the user is not wearing earphones and has not enabled the phone's speaker, the motion state of the enemy, such as its current position and moving direction, is difficult to perceive.
In the related art, as shown in fig. 1, icons or symbols around the edge of the screen directly indicate or suggest the approximate position of an enemy, providing the user with visual information about the enemy's direction and distance; in addition, the related art shown in fig. 2 marks the enemy's position and distance directly on a mini-map in the upper-left corner of the screen.
However, in actual game applications, both methods have the following problems. First, an icon on the screen can only indicate the enemy's direction; the distance between the enemy and the virtual object and the enemy's moving speed cannot be judged, so the enemy's motion state cannot be accurately determined. Second, because the guide icon is displayed on the screen, it occupies part of the operation interface and blocks other scene information, lowering screen utilization. Third, because the terminal screen is small, the coordinate information on the mini-map is hard to read and does not directly convey the enemy's motion state, resulting in a poor user experience.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an information processing method and an information processing apparatus, an electronic device, and a computer-readable storage medium, which overcome one or more of the problems due to the limitations and disadvantages of the related art, at least to some extent.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, there is provided an information processing method applied to a terminal capable of presenting an operation interface, where the operation interface at least partially includes a game scene and at least includes a virtual object, the method including:
judging whether the state of at least one target object in a preset area of the game scene meets a trigger condition;
when the state of the target object meets the trigger condition, acquiring orientation information of the target object; and
adjusting the display parameters of at least one function control in the corresponding orientation of the operation interface according to the orientation information of the target object.
In an exemplary embodiment of the present disclosure, the trigger condition includes that the state of the target object is a motion state, and the motion state includes one or more of movement, jumping, boarding a vehicle, shooting, and throwing.
In an exemplary embodiment of the present disclosure, the display parameters of the functionality control include one or more of brightness, size, color, and dynamic effect.
In an exemplary embodiment of the present disclosure, the method further comprises:
and acquiring the motion information of the target object, and adjusting the display parameters of the function control according to the motion information.
In an exemplary embodiment of the present disclosure, the motion information of the target object includes a moving direction and a moving speed of the target object.
In an exemplary embodiment of the present disclosure, adjusting the display parameter of the functionality control according to the motion information includes:
and providing a virtual sound wave by taking the target object as a center, and driving the function control to vibrate according to the vibration parameters corresponding to the motion information through the virtual sound wave so as to adjust the display parameters according to the vibration parameters.
In an exemplary embodiment of the present disclosure, the vibration parameter includes one or more of a vibration transmission direction, a vibration frequency, and a vibration amplitude.
In an exemplary embodiment of the present disclosure, the vibration transfer direction of the function control coincides with a moving direction of the target object.
In an exemplary embodiment of the present disclosure, the vibration frequency of the function control is positively correlated with a moving speed of the target object.
In an exemplary embodiment of the present disclosure, the vibration amplitude of the functionality control is related to a property parameter of the target object and/or a distance between the target object and the virtual object.
In an exemplary embodiment of the disclosure, the vibration amplitude of the functionality control is attenuated in proportion to the number of vibrations.
In an exemplary embodiment of the present disclosure, the method further comprises:
and canceling the adjustment of the display parameters of the function control when the state of the target object does not meet the trigger condition or the target object and the virtual object are both in the visual field range presented by the operation interface.
In an exemplary embodiment of the present disclosure, the method further comprises:
and when the target object and the virtual object are both in the visual field range presented by the operation interface, providing a prompt mark on the operation interface.
According to an aspect of the present disclosure, there is provided an information processing apparatus applied to a terminal capable of presenting an operation interface, the operation interface at least partially including a game scene and at least including a virtual object, including:
the trigger judgment module is used for judging whether the state of at least one target object in a preset area in the game scene meets a trigger condition or not;
the azimuth acquisition module is used for acquiring azimuth information of the target object when the state of the target object meets the trigger condition;
and the display adjusting module is used for adjusting the display parameters of at least one functional control in the corresponding direction of the operation interface according to the direction information of the target object.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information processing method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform any one of the information processing methods described above via execution of the executable instructions.
In the information processing method, information processing apparatus, electronic device, and computer-readable storage medium provided by the exemplary embodiments of the present disclosure, it is determined whether the state of at least one target object in a preset area of the game scene meets a trigger condition; when the state of the target object meets the trigger condition, orientation information of the target object is acquired; and the display parameters of at least one function control in the corresponding orientation of the operation interface are adjusted according to the orientation information of the target object. On the one hand, when the state of the target object meets the trigger condition, the orientation information of the target object can be quickly conveyed by adjusting the display parameters of the function control, improving the efficiency of information transfer; on the other hand, the orientation information of the target object can be presented solely through the function controls already on the operation interface, reducing the number of controls used and improving screen utilization.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
fig. 1 is a schematic view of a first operation interface for displaying target object information in the related art;
FIG. 2 is a schematic diagram of a second operation interface for displaying target object information in the related art;
FIG. 3 schematically illustrates an information processing method in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a preset area schematic in an exemplary embodiment of the present disclosure;
FIG. 5 is a schematic diagram illustrating an operation interface of a functionality control corresponding to each target object in an exemplary embodiment of the present disclosure;
FIG. 6 is a schematic diagram schematically illustrating the vibration transmission direction of the functionality control in an exemplary embodiment of the disclosure;
FIG. 7 is another diagram schematically illustrating the vibration transmission direction of the functionality control in an exemplary embodiment of the present disclosure;
fig. 8 schematically shows a detailed flowchart of an information processing method in an exemplary embodiment of the present disclosure;
fig. 9 schematically shows a configuration diagram of an information processing apparatus in an exemplary embodiment of the present disclosure;
FIG. 10 schematically illustrates a block diagram of an electronic device in an exemplary embodiment of the disclosure;
FIG. 11 schematically illustrates a program product in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The exemplary embodiment first discloses an information processing method that can be applied to a terminal capable of presenting an operation interface, where the operation interface at least partially includes a game scene and at least includes a virtual object. The terminal may be a touch terminal or a non-touch terminal, for example a mobile phone, a tablet computer, a notebook computer, a game console, a PDA, or another electronic device. By executing a software application on the processor of the terminal and rendering on its display, an operation interface can be obtained on which a virtual object, a plurality of function controls, a virtual battle scene, a virtual natural environment, and the like in a game application can be presented. The operation interface may occupy the whole screen or only part of it, which is not particularly limited in this exemplary embodiment.
Referring to fig. 4, the virtual object 400 is a game character manipulated by the user and is displayed in the operation interface as part or all of that character. In a first-person perspective, the content of the operation interface is presented from the user's own viewpoint, enhancing immersion, and only part of the game character, such as a hand or a foot, is visible in the interface. In a third-person perspective, the game character can be presented in the operation interface as a whole, reducing the user's operating difficulty and motion sickness and emphasizing the sense of action on the screen. Which mode is used can be chosen by the developer according to the game content, and the invention is not limited thereto.
In this example, referring to fig. 3, the information processing method may include the steps of:
step S310, judging whether the state of at least one target object in a preset area in the game scene meets a trigger condition;
s320, when the state of the target object meets the trigger condition, acquiring azimuth information of the target object;
s330, adjusting display parameters of at least one function control in the corresponding direction of the operation interface according to the direction information of the target object.
According to the information processing method in the present exemplary embodiment, on one hand, when the state of the target object meets the trigger condition, the orientation information of the target object can be quickly conveyed by adjusting the display parameter of the function control, so that the information conveying efficiency is improved; on the other hand, the azimuth information of the target object can be presented only through the original function control on the operation interface, so that the use of the control is reduced, and the screen utilization rate is improved.
Next, each step in the information processing method is further explained with reference to fig. 3 to 8.
In step S310, it is determined whether a state of at least one target object in a preset area in the game scene satisfies a trigger condition.
In the present exemplary embodiment, the preset area may be the range outside the field of view presented by the camera for the virtual object, that is, the part of the game scene not displayed on the operation interface. Its size may be set by the developer according to the game content: it may be the entire range outside the virtual object's field of view, or a partial range determined by the virtual object's auditory range, such as the preset area 411 shown in fig. 4.
The target object may be an enemy unit or hostile object relative to the virtual object controlled by the user. The trigger condition may include that the state of the target object is a motion state, where the motion state includes at least one of moving, jumping, boarding a vehicle, shooting, and throwing, although other motion states may also be included. When the target object is in any of these states, the trigger condition may be considered satisfied. If no target object satisfies the trigger condition, other interaction functions can proceed normally, and whether the state of a target object meets the trigger condition can be checked again after those interactions complete.
In addition, before determining whether the target object satisfies the trigger condition, an effective sensing range may be defined within which the motion of the target object can affect the virtual object or the game scene presented on the current operation interface, such as range 412 shown in fig. 4. It should be noted that the distances at which target objects in different orientations meet the trigger condition may be the same or different.
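The note that trigger distances may differ per orientation can be sketched as follows. The sector split and the radii are hypothetical values, chosen only to illustrate the idea that, say, hearing may reach farther ahead than behind:

```python
# Hypothetical per-orientation sensing radii (not values from the patent).
SECTOR_RADII = {"front": 60.0, "side": 45.0, "back": 30.0}

def in_sensing_range(rel_angle_deg, distance):
    """rel_angle_deg: target bearing relative to the virtual object's facing (0 = ahead)."""
    a = abs((rel_angle_deg + 180) % 360 - 180)  # fold into [0, 180]
    if a <= 60:
        sector = "front"
    elif a <= 120:
        sector = "side"
    else:
        sector = "back"
    return distance <= SECTOR_RADII[sector]
```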
In step S320, when the state of the target object satisfies the trigger condition, the azimuth information of the target object is acquired.
In the present exemplary embodiment, when the target object satisfies the trigger condition, i.e. is in a motion state, the orientation information of the target object within the effective sensing range may be acquired to determine its influence on the virtual object. The orientation of the target object relative to the virtual object may be obtained through a coordinate system or a position-acquisition module. Because the area outside the virtual object's line of sight is relatively large and contains much information that is irrelevant or has little influence on the virtual object, target objects far from the virtual object can be ignored, reducing the amount of processing and thus improving processing speed.
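Acquiring the orientation through a coordinate system, with distant targets culled, can be sketched as below. The cutoff constant and tuple layout are assumptions for illustration:

```python
import math

IGNORE_BEYOND = 100.0  # hypothetical cutoff: farther targets are dropped

def orientation_info(virtual_obj, target):
    """Return (bearing_deg, distance) of target relative to the virtual object,
    or None when the target is too far away to matter."""
    dx = target[0] - virtual_obj[0]
    dy = target[1] - virtual_obj[1]
    dist = math.hypot(dx, dy)
    if dist > IGNORE_BEYOND:
        return None  # skipping distant targets reduces the processing load
    return math.degrees(math.atan2(dy, dx)) % 360, dist
```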
S330, adjusting display parameters of at least one function control in the corresponding direction of the operation interface according to the direction information of the target object.
The function control may be a function control located at an edge of the operation interface, and may include, for example, a player information control 401, a security area prompt control 402, a recording control 403, a weapon control 404, and the like as shown in fig. 4, and may adjust display parameters of one or more function controls corresponding to the orientation and/or distance of each target object according to the orientation information of the target object. It should be noted that the function control herein does not include a virtual joystick control or other control for controlling the movement of the virtual object or controlling the adjustment state of the virtual object, so as to avoid affecting the state of the virtual object.
The display parameters may include one or more of a size, color, brightness, and vibration effect of the functionality control. The display parameters of the corresponding function control can be adjusted through the azimuth information of each target object so as to be displayed distinctively.
The at least one function control in the corresponding orientation of the operation interface may be a function control in the same or a similar orientation as the target object. For example, as shown in fig. 5, when a target object A is detected to the upper left of the virtual object's operation interface, a target object B to the lower left, or a target object C directly to the left, the object name control 401 or the equipment control 405 in control area 1 may be displayed in highlighted form; when a target object D is detected directly ahead, the safety-zone prompt control 402 in area 2 may be enlarged; when a target object E is detected to the upper right or a target object F to the right, the color of the recording control 403 in area 3 may be changed; and when a target object G is detected directly below, the weapon control 404 in area 4 may be made to vibrate. When no function control exists at the relative position corresponding to the target object, the function control closest in orientation and/or distance may serve as the control corresponding to that target object.
It should be noted that the number of function controls in each region may be the same or different, and when one target object is detected to be moving, the display parameters of one or more function controls in the corresponding orientation may be adjusted at the same time. In addition, if the target object moves back and forth along one direction, the display parameters of all function controls in that direction change in sequence according to the target object's moving direction.
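The zone mapping described around fig. 5 can be sketched as a bearing-to-zone lookup. The four-zone split, the bearing convention (0 = screen-right, counterclockwise), and the control names are illustrative assumptions, not the patent's exact layout:

```python
def zone_for_bearing(bearing):
    """Map a target bearing (degrees) to a hypothetical edge-control zone:
    1 = left side, 2 = top, 3 = right, 4 = bottom."""
    b = bearing % 360
    if 135 <= b < 225:
        return 1  # left: e.g. player-info / equipment controls
    if 45 <= b < 135:
        return 2  # top: e.g. safety-zone prompt control
    if b < 45 or b >= 315:
        return 3  # right: e.g. recording control
    return 4      # bottom: e.g. weapon control

def adjust(controls_by_zone, bearing, effect):
    """Apply a display effect (highlight/enlarge/recolor/vibrate) to the matching zone."""
    zone = zone_for_bearing(bearing)
    for ctrl in controls_by_zone.get(zone, []):
        ctrl["effect"] = effect
    return zone
```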
In this example, the orientation information of a target object outside the visible field of view can be conveyed quickly through the function controls already present on the operation interface, without adding other controls or visual information. Compared with displaying target-object information through icons on the operation interface, this avoids blocking other information on the interface, reduces the user's visual load, and improves the utilization of the function controls; it also avoids the inconvenience of obtaining target-object information from a mini-map, improving convenience.
Specifically, the method may comprise: and acquiring the motion information of the target object, and adjusting the display parameters of the function control according to the motion information.
The motion information of the target object may include either or both of its moving direction and moving speed. For example, only the moving direction, only the moving speed, or both may be obtained; which motion information is needed can be set according to actual requirements.
Further, adjusting the display parameter of the functionality control according to the motion information may include:
and providing a virtual sound wave by taking the target object as a center, and driving the function control to vibrate according to the vibration parameters corresponding to the motion information through the virtual sound wave so as to adjust the display parameters according to the vibration parameters.
In this example, the virtual sound wave acts like a real sound wave, i.e. it can drive the corresponding object to vibrate. The vibration parameters correspond to the motion information of each target object and may include, for example, one or more of a vibration transfer direction, a vibration frequency, and a vibration amplitude. In addition, the vibration parameters may be associated with attribute parameters of the target object, which may include, for example, the target object's height, backpack weight, or armed weapon weight. In a game application, if the target object carries a backpack full of props or a heavy weapon, its weight increases accordingly, which is represented in the game scene as heavier footsteps; correspondingly, a stronger visual prompt can be provided on the operation interface, so that the user can acquire the attribute parameters of the target object in real time and prepare to respond quickly.
It should be noted that only one vibration frequency and one vibration amplitude are provided for each moving direction: when a plurality of target objects are detected within the effective sensing range and move in the same direction, only one vibration amplitude and one vibration frequency need to be selected.
On this basis, specific relationships between the respective vibration parameters and the target object motion information and the attribute parameters are provided in the present example. The vibration transmission direction of the functional control can be controlled through the moving direction of the target object, the vibration frequency of the functional control can be controlled through the moving speed or the moving frequency of the target object, and the vibration amplitude and/or the vibration frequency of the functional control can be controlled through the attribute parameters of the target object or the distance between the target object and the virtual object.
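The three relationships above can be sketched in one mapping function. This is an assumption-laden illustration, not the patent's implementation: the function name, the linear speed-to-frequency rule, and the `attr_weight / distance` amplitude rule are all hypothetical choices.

```python
def vibration_params(move_dir, speed, attr_weight, distance,
                     base_amp=1.0, max_amp=3.0):
    """Derive vibration parameters from a target's motion and attributes:
      - transfer direction follows the moving direction,
      - frequency grows with moving speed,
      - amplitude grows with the attribute parameters and shrinks with
        the distance to the virtual object (capped at max_amp)."""
    frequency = speed  # positive correlation, e.g. 4 m/s -> 4 Hz
    amplitude = min(max_amp, base_amp * attr_weight / max(distance, 1.0))
    return {"transfer_direction": move_dir,
            "frequency": frequency,
            "amplitude": amplitude}
```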
Specifically, in this example, the direction of vibration transmission of the function control coincides with the direction of movement of the target object.
In the present exemplary embodiment, the moving direction of the target object may be indicated by the vibration transfer direction of the function control. Further, in order to enable the user to quickly recognize and judge the moving direction of the target object, the vibration transmission direction of the function control can be represented by a direction identifier, the direction identifier can be in the form of an arrow, a triangle or a pointer, for example, and the color of the direction identifier can be set according to actual requirements. Wherein the pointing direction of the direction indicator changes following the moving direction of each target object.
Referring to fig. 6, for example, if the target object A is detected moving from the current position I to the position J in the game scene, the movement direction presented in the interface is I'-J', which is the vibration transfer direction of the function controls included in region 1; on the operation interface this appears as the control 405 and the control 401 vibrating in sequence. Similarly, if the moving direction of the target object C in the interface is K'-L', the controls 401 and 402 vibrate in sequence. In addition, the vibration of a function control can follow the X-Y axes of the interface: for example, in fig. 7 the vibration direction of the controls 401 and 405 may be set along the X axis and that of the control 402 along the Y axis. The specific direction in which a control vibrates is determined mainly by the layout of the controls on the operation interface and a reasonable spacing between them, so as to feed the visual vibration effect back to the user more effectively.
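The sequential vibration of controls along the transfer direction can be reproduced by ordering each region's controls by their projection onto the direction vector. The helper name and the screen coordinates below are hypothetical; only the ordering idea comes from the example above.

```python
def vibration_order(controls, transfer_direction):
    """Order function controls so they vibrate one after another along the
    virtual sound wave's transfer direction. `controls` maps control ids to
    (x, y) screen positions; controls with a smaller projection onto the
    direction vector vibrate first."""
    dx, dy = transfer_direction
    return sorted(controls, key=lambda c: controls[c][0] * dx + controls[c][1] * dy)
```

With positions chosen so that control 405 lies before 401 along I'-J', the function yields the 405-then-401 sequence described above.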
In addition, the vibration frequency of the function control is positively correlated with the moving speed of the target object: for the same target object, the faster it moves, the higher the vibration frequency of the function control. For example, if the moving speed of a certain target object is 4 m/s, the vibration frequency of the corresponding function control may be 4 times per second, i.e. 4 Hz; when the target object uses equipment or otherwise increases its moving speed, for example by riding a vehicle in the game such as a motorcycle or an automobile so that the speed rises to 8 m/s, the vibration frequency of the corresponding function control may be adjusted to 8 Hz accordingly.
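The speed-to-frequency example above amounts to a simple positive mapping. The `scale` and `max_hz` knobs below are illustrative additions (a cap keeps very fast targets from producing visual flicker), not part of the patent:

```python
def vibration_frequency(moving_speed_mps, scale=1.0, max_hz=20.0):
    """Map moving speed (m/s) to a control vibration frequency (Hz),
    positively correlated as described. With scale=1.0 this reproduces
    the example: 4 m/s -> 4 Hz, and 8 m/s after boarding a vehicle -> 8 Hz."""
    return min(max_hz, scale * moving_speed_mps)
```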
Further, the vibration amplitude of the function control is related to the property parameter of the target object and/or the distance between the target object and the virtual object.
In the present exemplary embodiment, the attribute parameters of the target object may include height, weight, backpack weight, or armed weapon weight of the target object, and the like. The vibration amplitude of the function control can be positively correlated with the attribute parameters of the target object, and can be negatively correlated with the distance between the target object and the virtual object. In this example, the motion information, the orientation information, and the like of the target object can be determined more intuitively by the vibration amplitude.
Specifically, for a target object A and a target object B having the same moving speed, if all the attribute parameters of the target object A, including height, backpack weight, and equipped weapon weight, are greater than those of the target object B, the vibration amplitude of the function control corresponding to the target object A is greater than that corresponding to the target object B.
In addition, for the target object C and the target object D with the same moving speed, if it is detected that the distance between the current position of the target object C and the position of the virtual object is smaller than the distance between the current position of the target object D and the position of the virtual object, the vibration amplitude of the function control in the direction corresponding to the target object C is greater than the vibration amplitude of the function control in the direction corresponding to the target object D.
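The two correlations (positive with the attribute parameters, negative with the distance) can be combined into one amplitude formula. The weighted-sum form and the weights below are one hypothetical choice; as noted later in the text, a developer may set the weights and final effect freely.

```python
def vibration_amplitude(attr_weight, distance, w_attr=0.6, w_dist=0.4, eps=1.0):
    """Amplitude grows with the target's summed attribute parameters
    (height, backpack weight, weapon weight -> attr_weight) and shrinks
    as the distance to the virtual object grows; eps avoids division by
    zero when the target is extremely close."""
    return w_attr * attr_weight + w_dist / (distance + eps)
```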
It should be added that if a plurality of target objects satisfy the trigger condition and move in the same direction, only one vibration amplitude and one vibration frequency need to be selected for feedback; in this example, the maximum vibration amplitude and the maximum vibration frequency may be selected according to the attribute parameters of the target objects and their distances to the virtual object. However, across different target objects, the influences of the attribute parameters and of the distance on the vibration amplitude of the function control are not superposed on each other. For a single target object, the influences of its attribute parameters and of its distance to the virtual object on the vibration amplitude may be superposed, and a developer can set their respective weights and the final effect, though the invention is not limited to this.
For example, if the vibration amplitude of the function control corresponding to the target object a is smaller than the vibration amplitude of the function control corresponding to the target object C, the vibration amplitude of the function control corresponding to the target object C is taken as the vibration amplitude of the function control corresponding to the moving direction. Through the vibration amplitude of the function control in the example, the game scene outside the visual field range of the virtual object and the specific situation of the target object can be represented more accurately and more quickly, the position information of the enemy does not need to be displayed through a small map, and convenience and user experience are improved.
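The max-selection rule in the example above can be sketched as follows, assuming each target's amplitude and frequency have already been computed (the dict shape is an assumption of this sketch):

```python
def feedback_for_direction(targets):
    """When several triggering targets move in the same direction, pick a
    single (amplitude, frequency) pair to feed back -- here the maxima
    across the targets, per the example above."""
    return (max(t["amplitude"] for t in targets),
            max(t["frequency"] for t in targets))
```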
Further, in this example, the vibration amplitude of the function control is attenuated proportionally to the vibration times.
In order to simulate a real environment more realistically and improve the experience of a game application, the virtual sound wave provided by taking the target object as the center is attenuated due to absorption or diffusion in the propagation process, so that the vibration process of the function control at the edge of the screen driven by the virtual sound wave is also an attenuation process. In this example, when the target object is at the same position and does not move, the vibration amplitude of the function control may be proportionally attenuated with the vibration times. The attenuation factor can be specifically set according to actual requirements.
For example, when the number of vibrations is 1, the function control vibrates once at the amplitude remaining when the virtual sound wave first arrives; when the number of vibrations is 2, the second vibration amplitude is 50% of the first; when the number of vibrations is 3, the third vibration amplitude is 25% of the first.
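This 100% / 50% / 25% progression is a constant-factor decay per vibration, which a short sketch makes explicit (the function name and default decay factor are illustrative; the text says the attenuation factor is configurable):

```python
def attenuated_amplitude(initial_amplitude, vibration_count, decay=0.5):
    """Amplitude of the n-th vibration for a stationary target: each
    successive vibration is attenuated by a constant factor, reproducing
    the example (1st: 100%, 2nd: 50%, 3rd: 25% with decay=0.5)."""
    return initial_amplitude * decay ** (vibration_count - 1)
```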
In addition, in the present exemplary embodiment, when the state of the target object does not satisfy the trigger condition or the target object and the virtual object are both within the visual field presented by the operation interface, the adjustment of the display parameter of the function control is cancelled.
In the present exemplary embodiment, when the target object is not within the effective sensing range, the function control is not triggered to vibrate even if the target object is in a motion state. In addition, referring to fig. 4, when the target object and the virtual object are within the same operation interface 413, that is, the target object has entered the game scene currently displayed on the operation interface, the motion information of the target object can be obtained accurately without roughly representing it by driving the function controls with the virtual sound wave; therefore the control of the function controls, and with it the adjustment of their display parameters, can be cancelled.
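The two cancellation conditions can be captured in a small predicate. This is a sketch under the assumption that the virtual object is always within its own operation interface, so only the target's visibility needs checking; the boolean-flag interface is hypothetical.

```python
def should_adjust_controls(target_triggering, target_in_view):
    """Adjust the function controls' display parameters only while the
    target satisfies the trigger condition AND is still outside the view
    presented by the operation interface; otherwise cancel the adjustment."""
    if not target_triggering:
        return False  # state no longer satisfies the trigger condition
    if target_in_view:
        return False  # target visible alongside the virtual object
    return True
```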
In this example, when the target object and the virtual object are both within the visual field presented by the operation interface, a prompt identifier may be provided on the operation interface.
When one or more target objects enter the visual field presented by the operation interface where the virtual object is located, the distance to the virtual object is short enough that the footsteps or other sounds of the target object's movement can be considered audible; at this time a prompt identifier can be provided on the operation interface to indicate information such as the target object's current position and its distance to the virtual object. The prompt identifier may appear anywhere on the operation interface and may, for example, be consistent with the orientation of the target object. It may take the form of a sound-wave aperture, which can be displayed distinctively by highlighting or flashing.
It should be added that when no function control exists on the operation interface, if the state of the target object does not satisfy the trigger condition, the operation interface keeps its default state and displays no information; if the target object satisfies the trigger condition and passes through or approaches the effective sensing range, a function control is provided at the position on the operation interface corresponding to the target object's current position. This control may be a sound control, for example in the form of a sound bar, or in other forms; its vibration logic is the same as that of the function controls described in the steps above and is not repeated here.
In summary, in this example, the orientation information of a target object outside the sight range can be conveyed quickly by adjusting the display parameters of the function controls on the screen, improving information transmission efficiency compared with the prior art. Moreover, information such as the motion state, the attribute parameters, and the distance between the target object and the virtual object can be conveyed using only the function controls already on the operation interface, without adding other controls, which reduces the number of controls used and improves their utilization rate. Furthermore, since the virtual sound wave drives the function controls to vibrate according to the corresponding vibration parameters, the user can perceive the movement and potential risk of surrounding target objects even without earphones and beyond the sight range, which strengthens immersion in the game application and improves the user's in-game experience.
In an exemplary embodiment of the present disclosure, there is also provided an information processing apparatus 500, applied to a terminal capable of presenting an operation interface, where the operation interface at least partially includes a game scene and at least includes a virtual object, and referring to fig. 9, the apparatus 500 may include:
the trigger determining module 501 may be configured to determine whether a state of at least one target object in a preset area in the game scene meets a trigger condition;
a direction obtaining module 502, configured to obtain direction information of the target object when the state of the target object meets the trigger condition;
the display adjusting module 503 may be configured to adjust a display parameter of at least one function control in a direction corresponding to the operation interface according to the direction information of the target object.
The details of each functional module of the information processing apparatus 500 have been described in detail in the corresponding information processing method, and therefore are not described herein again.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 10. The electronic device 600 shown in fig. 10 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 10, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: the at least one processing unit 610, the at least one memory unit 620, a bus 630 connecting different system components (including the memory unit 620 and the processing unit 610), and a display unit 640.
Wherein the storage unit stores program code that is executable by the processing unit 610 to cause the processing unit 610 to perform steps according to various exemplary embodiments of the present invention as described in the above section "exemplary methods" of the present specification. For example, the processing unit 610 may perform the steps as shown in fig. 3: step S310, judging whether the state of at least one target object in a preset area in the game scene meets a trigger condition; s320, when the state of the target object meets the trigger condition, acquiring azimuth information of the target object; s330, adjusting display parameters of at least one function control in the corresponding direction of the operation interface according to the direction information of the target object.
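The three steps S310-S330 can be outlined as one per-frame routine. Everything concrete here (the function name, the injected callables, the `highlight` display parameter) is a hypothetical sketch, since the patent leaves these operations abstract:

```python
def process_frame(targets, controls, in_preset_area, satisfies_trigger,
                  orientation_of, region_for):
    """Sketch of steps S310-S330: for each target in the preset area that
    satisfies the trigger condition, obtain its orientation and adjust the
    display parameters of the function controls in the matching region."""
    for target in targets:
        if not (in_preset_area(target) and satisfies_trigger(target)):  # S310
            continue
        orientation = orientation_of(target)                            # S320
        for control_id in region_for(orientation):                      # S330
            controls[control_id]["highlight"] = True
    return controls
```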
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 11, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (15)

1. An information processing method applied to a terminal capable of presenting an operation interface, wherein the operation interface at least partially comprises a game scene and at least comprises a virtual object, the method comprising the steps of:
judging whether the state of at least one target object in a preset area in the game scene meets a trigger condition or not;
when the state of the target object meets the trigger condition, acquiring azimuth information of the target object;
and adjusting display parameters of at least one function control in the corresponding position of the operation interface according to the position information of the target object, wherein the function control is an original control on the operation interface, and the display parameters of the function control comprise one or more of brightness, size, color and dynamic effect.
2. The information processing method according to claim 1, wherein the trigger condition includes that the state of the target object is a motion state, and the motion state includes one or more of movement, jumping, boarding a vehicle, shooting, and throwing.
3. The information processing method according to claim 1, characterized by further comprising:
and acquiring the motion information of the target object, and adjusting the display parameters of the function control according to the motion information.
4. The information processing method according to claim 3, wherein the motion information of the target object includes a moving direction and a moving speed of the target object.
5. The information processing method according to claim 3, wherein adjusting the display parameter of the functionality control according to the motion information comprises:
and providing a virtual sound wave by taking the target object as a center, and driving the function control to vibrate according to the vibration parameters corresponding to the motion information through the virtual sound wave so as to adjust the display parameters according to the vibration parameters.
6. The information processing method according to claim 5, wherein the vibration parameter includes one or more of a vibration transmission direction, a vibration frequency, and a vibration amplitude.
7. The information processing method according to claim 6, wherein the vibration transmission direction of the function control coincides with a movement direction of the target object.
8. The information processing method according to claim 6, wherein the vibration frequency of the function control is positively correlated with a moving speed of the target object.
9. The information processing method according to claim 6, wherein a vibration amplitude of the function control is related to a property parameter of the target object and/or a distance between the target object and the virtual object.
10. The information processing method according to claim 6, wherein the vibration amplitude of the function control is attenuated in proportion to the number of vibrations.
11. The information processing method according to claim 1, characterized by further comprising:
and canceling the adjustment of the display parameters of the function control when the state of the target object does not meet the trigger condition or the target object and the virtual object are both in the visual field range presented by the operation interface.
12. The information processing method according to claim 11, characterized by further comprising:
and when the target object and the virtual object are both in the visual field range presented by the operation interface, providing a prompt mark on the operation interface.
13. An information processing apparatus applied to a terminal capable of presenting an operation interface, the operation interface at least partially including a game scene and at least including a virtual object, the information processing apparatus comprising:
the trigger judgment module is used for judging whether the state of at least one target object in a preset area in the game scene meets a trigger condition or not;
the azimuth acquisition module is used for acquiring azimuth information of the target object when the state of the target object meets the trigger condition;
and the display adjusting module is used for adjusting display parameters of at least one functional control in the corresponding direction of the operation interface according to the direction information of the target object, wherein the functional control is an original control on the operation interface, and the display parameters of the functional control comprise one or more of brightness, size, color and dynamic effect.
14. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing the information processing method of any one of claims 1 to 12.
15. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the information processing method of any one of claims 1 to 12 via execution of the executable instructions.
CN201711167726.4A 2017-11-21 2017-11-21 Information processing method, information processing device, electronic equipment and storage medium Active CN107930119B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711167726.4A CN107930119B (en) 2017-11-21 2017-11-21 Information processing method, information processing device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711167726.4A CN107930119B (en) 2017-11-21 2017-11-21 Information processing method, information processing device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN107930119A CN107930119A (en) 2018-04-20
CN107930119B true CN107930119B (en) 2021-04-06

Family

ID=61930555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711167726.4A Active CN107930119B (en) 2017-11-21 2017-11-21 Information processing method, information processing device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN107930119B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108837504B (en) * 2018-06-05 2021-07-27 网易(杭州)网络有限公司 Information processing method, device, equipment and storage medium in racing game
CN109126120B (en) * 2018-08-17 2022-06-03 Oppo广东移动通信有限公司 Motor control method and related product
CN109788130A (en) * 2018-12-27 2019-05-21 努比亚技术有限公司 Terminal and its orientation based reminding method and computer readable storage medium
CN109745702B (en) * 2018-12-28 2022-04-22 超级魔方(北京)科技有限公司 Information prompting method and device
CN109966738B (en) * 2019-02-22 2020-07-31 网易(杭州)网络有限公司 Information processing method, processing device, electronic device, and storage medium
CN109908574B (en) * 2019-02-22 2020-09-08 网易(杭州)网络有限公司 Game role control method, device, equipment and storage medium
WO2020258225A1 (en) * 2019-06-28 2020-12-30 瑞声声学科技(深圳)有限公司 Gamepad and gamepad vibration method and apparatus
CN113645502B (en) * 2020-04-27 2022-09-16 海信视像科技股份有限公司 Method for dynamically adjusting control and display device
CN112767430A (en) * 2021-01-06 2021-05-07 维沃移动通信有限公司 Information prompting method and device
CN113546424A (en) * 2021-08-04 2021-10-26 网易(杭州)网络有限公司 Virtual resource use control method and device, computer equipment and storage medium
CN113769404A (en) * 2021-09-15 2021-12-10 网易(杭州)网络有限公司 Game movement control method and device and electronic equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4707039B2 (en) * 2008-09-18 2011-06-22 株式会社スクウェア・エニックス Network game system
CN106730827A (en) * 2016-12-06 2017-05-31 腾讯科技(深圳)有限公司 Method and terminal device that a kind of object shows

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CH明明's funny commentary on "Knives Out": winning with 15 kills, junglers have their spring too #酷玩堂#; CH明明; 《https://v.youku.com/v_show/id_XMzEzNjk2MDY5Mg==.html?sharekey=9d9b31ab5b081c7774b768ce07b5c6df9》; 20171106; 0:00:00-0:24:23 *
Sound-based position judgment of footsteps and gunfire: a must-learn skill for battle-royale experts; 李远洋; 《https://zhuanlan.zhihu.com/p/29588477?utm_source=wechat_session&utm_medium=social&utm_oi=1108280735575912448&from=singlemessage&s_s_i=6rf7gElPvc%2Fz%2F1K8SeYlc456DQtzPlc2Z%2BjRiKFoOdg%3D&s_r=1》; 20170922; entire document *
NetEase's second battle-royale mobile game: hands-on with Terminator 2; 古月妹纸i; 《https://b23.tv/BV1px411j7EE》; 20171105; 0:00:00-0:24:15 *

Also Published As

Publication number Publication date
CN107930119A (en) 2018-04-20

Similar Documents

Publication Publication Date Title
CN107930119B (en) Information processing method, information processing device, electronic equipment and storage medium
US10765947B2 (en) Visual display method for compensating sound information, computer readable storage medium and electronic device
US10661172B2 (en) Visual display method and apparatus for compensating sound information, storage medium and device
CN107913520B (en) Information processing method, information processing device, electronic equipment and storage medium
CN107038112B (en) Application interface debugging method and device
CN107890672B (en) Visible sensation method and device, storage medium, the electronic equipment of compensating sound information
US9937422B2 (en) Voxel-based, real-time acoustic adjustment
US20190201789A1 (en) Method and apparatus for obtaining virtual resource of virtual object
CN109032464B (en) Interface switching method, device, equipment and storage medium in application program
CN108694073B (en) Control method, device and equipment of virtual scene and storage medium
KR20150079471A (en) Systems and methods for a haptically-enabled projected user interface
CN109260713B (en) Virtual object remote assistance operation method and device, storage medium and electronic equipment
CN112044065B (en) Virtual resource display method, device, equipment and storage medium
CN108211350B (en) Information processing method, electronic device, and storage medium
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
CN109542323B (en) Interaction control method and device based on virtual scene, storage medium and electronic equipment
CN108905198A (en) A kind of control method and device of virtual objects jump
JP2022531372A (en) How to preview behavior during a match in an out-of-competition environment, devices, terminals, and computer programs
CN108153475B (en) Object position switching method and mobile terminal
US9665232B2 (en) Information-processing device, storage medium, information-processing method, and information-processing system for enlarging or reducing an image displayed on a display device
CN110215686B (en) Display control method and device in game scene, storage medium and electronic equipment
CN112965773A (en) Method, apparatus, device and storage medium for information display
CN113797527B (en) Game processing method, device, equipment, medium and program product
KR102204599B1 (en) Method for outputting screen and display device for executing the same
KR20210005832A (en) Method for outputting screen and display device for executing the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant