CN107174824B - Special effect information processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN107174824B
CN107174824B (application CN201710367692.7A)
Authority
CN
China
Prior art keywords
special effect
mobile terminal
effect information
physiological parameter
view mode
Prior art date
Legal status
Active
Application number
CN201710367692.7A
Other languages
Chinese (zh)
Other versions
CN107174824A (en)
Inventor
孙卉
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201710367692.7A
Publication of CN107174824A
Application granted
Publication of CN107174824B


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213: Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/219: Input arrangements for video game devices characterised by their sensors, purposes or types for aiming at specific areas on the display, e.g. light-guns
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets

Abstract

The present disclosure provides a special effect information processing method, a special effect information processing apparatus, an electronic device, and a computer-readable storage medium. The method includes: triggering the view mode of a game scene picture on a graphical user interface to be adjusted from a first view mode to a second view mode; in the second view mode, acquiring one or more physiological parameters of a current user of the mobile terminal through the mobile terminal; and enabling special effect information corresponding to the one or more physiological parameters. The present disclosure improves the user's sense of immersion in the game.

Description

Special effect information processing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a special effect information processing method, a special effect information processing apparatus, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of mobile communication technology, a large number of game applications have emerged on mobile terminals. In a first-person shooter (FPS) or third-person shooter (TPS) game, effective aiming and shooting operations are essential for the user.
For example, when a user performs an aiming-and-shooting operation with a virtual sniper rifle, the game application displays, in full screen, elements such as a virtual scope interface, a sight, and a scale for adjusting the sighting distance, an interaction manner different from the single sight of other firearms. However, for operations that demand both technical skill and psychological composure, current shooting games cannot convey the distinctive character of the shooting game or the realistic sense of substitution of a sniper rifle, so the user experience is poor.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide a special effect information processing method, a special effect information processing apparatus, an electronic device, and a computer-readable storage medium, thereby overcoming, at least to some extent, one or more of the problems due to the limitations and disadvantages of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, there is provided a special effect information processing method, wherein a graphical user interface is obtained by executing a software application on a processor of a mobile terminal and rendering it on a touch display of the mobile terminal, and the content presented by the graphical user interface includes a game scene picture. The method includes:
triggering the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode;
in the second view mode, acquiring one or more physiological parameters of a current user of the mobile terminal through the mobile terminal;
enabling special effect information corresponding to the one or more physiological parameters.
In an exemplary embodiment of the present disclosure, triggering the adjustment of the view mode of the game scene screen on the graphical user interface from the first view mode to the second view mode includes:
detecting whether a trigger operation acting on a virtual aiming control area on the graphical user interface is received;
after detecting that the trigger operation acting on the virtual aiming control area is received, detecting whether the current user contacts an image sensor of the mobile terminal;
when the current user is detected to contact with the image sensor of the mobile terminal, the view mode of the game scene picture on the graphical user interface is triggered to be adjusted from a first view mode to a second view mode.
In an exemplary embodiment of the present disclosure, triggering the adjustment of the view mode of the game scene screen on the graphical user interface from the first view mode to the second view mode includes:
detecting whether the current user contacts an image sensor of the mobile terminal;
when the current user is detected to contact with the image sensor of the mobile terminal, the view mode of the game scene picture on the graphical user interface is triggered to be adjusted from a first view mode to a second view mode.
In an exemplary embodiment of the present disclosure, the obtaining one or more physiological parameters of the current user of the mobile terminal comprises:
acquiring image information of a contact part of the current user and the image sensor through the image sensor, and acquiring one or more physiological parameters related to the image information according to the change of the image information.
In an exemplary embodiment of the present disclosure, enabling special effect information corresponding to the physiological parameter includes:
comparing the acquired physiological parameter with a plurality of pre-stored physiological parameter intervals and judging whether the physiological parameter is matched with one of the pre-stored physiological parameter intervals;
and when the physiological parameter is judged to be matched with one of the pre-stored physiological parameter intervals, enabling one or more pieces of special effect information corresponding to the pre-stored physiological parameter intervals.
In an exemplary embodiment of the present disclosure, the effect information includes a display effect and/or a sound effect; wherein the display special effect comprises a shaking amplitude variation effect of a virtual aiming device sight on the graphical user interface.
In an exemplary embodiment of the present disclosure, the image sensor is a rear camera or a front camera of the mobile terminal, and the physiological parameter includes a heartbeat parameter.
In an exemplary embodiment of the present disclosure, the image sensor is a front camera of the mobile terminal, and the physiological parameter includes a pupil diameter.
According to an aspect of the present disclosure, there is provided a special effect information processing apparatus, wherein a graphical user interface is obtained by executing a software application on a processor of a mobile terminal and rendering it on a touch display of the mobile terminal, and the content presented by the graphical user interface includes a game scene picture. The apparatus includes:
the view mode adjusting module is used for triggering the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode;
the physiological parameter acquisition module is used for acquiring, in the second view mode, one or more physiological parameters of a current user of the mobile terminal through the mobile terminal;
and the special effect information enabling module is used for enabling special effect information corresponding to the one or more physiological parameters.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform any one of the above-described special effect information processing methods via execution of the executable instructions.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the special effects information processing method of any one of the above.
In the special effect information processing method, the special effect information processing apparatus, the electronic device, and the computer-readable storage medium provided in an embodiment of the present disclosure, when the game scene picture on the graphical user interface is triggered into the second view mode, one or more physiological parameters of a current user of the mobile terminal are acquired and special effect information corresponding to the physiological parameters is enabled. On the one hand, obtaining one or more physiological parameters of the current user through the mobile terminal and applying them to the game scene makes the physiological parameters convenient to obtain and increases the diversity and interest of the game special effects; on the other hand, the special effect information corresponding to the physiological parameters can be adjusted in real time according to those parameters, so that the usage scenario of the shooting game is simulated more realistically and the user's sense of immersion in the game is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
fig. 1 schematically illustrates a diagram of a special effect information processing method in an exemplary embodiment of the present disclosure;
FIG. 2 schematically illustrates a diagram of a graphical user interface in a first mode of view in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates a diagram of a graphical user interface in a second mode of view in an exemplary embodiment of the present disclosure;
fig. 4 is a schematic diagram schematically illustrating a block diagram of a special effect information processing apparatus in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a block diagram of an electronic device in an exemplary embodiment of the disclosure;
fig. 6 schematically illustrates a program product for special effects information processing in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The present exemplary embodiment first discloses a special effect information processing method that can be applied to various game scenes, such as shooting games of various kinds. The game scene can be presented through a graphical user interface obtained by executing a software application on a processor of a mobile terminal and rendering the game scene on a touch display of the mobile terminal; the content presented by the graphical user interface may or may not include a virtual shooting object. The graphical user interface may occupy the entire displayable area of the mobile terminal, i.e., a full-screen display. The mobile terminal can be a tablet computer, a notebook computer, a game machine, or a portable electronic device such as a smartphone; a game system is executed on the mobile terminal by, for example, installing a game application or other software application, and the mobile terminal at least comprises a memory for storing data and a processor for processing data. Referring to fig. 1, the special effect information processing method may include the following steps:
step S110, triggering the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode;
step S120, in the second view mode, acquiring one or more physiological parameters of a current user of the mobile terminal through the mobile terminal;
step S130, enabling special effect information corresponding to the one or more physiological parameters.
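The three steps above can be condensed into a minimal control-flow sketch. This is an illustrative assumption only; the class and function names (`MobileTerminal`, `GameUI`, `process_special_effects`) and the canned sensor value are not part of the disclosure:

```python
# Minimal sketch of steps S110-S130; all names and values are illustrative.

FIRST_VIEW = "panoramic"   # first view mode: full battle scene
SECOND_VIEW = "local"      # second view mode: scoped local view

class MobileTerminal:
    """Stand-in for the mobile terminal; returns canned sensor data."""
    def read_physiological_parameters(self):
        return {"heart_rate": 92}   # beats per minute (illustrative)

class GameUI:
    def __init__(self):
        self.view_mode = FIRST_VIEW
        self.enabled_effects = []
    def enable_effect(self, name, value):
        self.enabled_effects.append((name, value))

def process_special_effects(terminal, ui):
    # S110: trigger adjustment from the first to the second view mode
    ui.view_mode = SECOND_VIEW
    # S120: in the second view mode, acquire physiological parameters
    params = terminal.read_physiological_parameters()
    # S130: enable special effect information for each parameter
    for name, value in params.items():
        ui.enable_effect(name, value)
    return params

ui = GameUI()
process_special_effects(MobileTerminal(), ui)
```

The disclosure leaves the concrete parameter set and effect bindings open; the loop in S130 simply stands for "enable whatever effects correspond to each acquired parameter."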
In the special effect information processing method provided by an embodiment of the present disclosure, when the game scene picture is triggered to be in the second visual field mode on the graphical user interface, one or more physiological parameters of a current user of the mobile terminal are acquired and special effect information corresponding to the physiological parameters is enabled. On one hand, one or more physiological parameters of the current user are obtained through the mobile terminal and applied to a game scene, so that the physiological parameters can be conveniently obtained, and the diversity and interestingness of the game special effect are increased; on the other hand, the special effect information corresponding to the physiological parameters can be adjusted in real time according to the physiological parameters, so that the use scene of the shooting game is simulated more truly, and the immersion sense of the user in the game is improved.
Next, a special effect information processing method in the present exemplary embodiment will be further explained.
In step S110, the view mode of the game scene picture on the graphical user interface is triggered to be adjusted from the first view mode to the second view mode.
In the present exemplary embodiment, as shown in fig. 2, the graphical user interface may include a virtual shooting control 101, a virtual shooting subject 103, and a virtual aiming control 104. In addition, the graphical user interface may include a virtual shooting object 102, a virtual battle scene, a virtual natural environment, a map control area, an information panel control area, blank areas outside the control areas, and the like. The virtual shooting subject 103 may take various forms, such as a virtual firearm controlled by a virtual object, and may also be a rifle, a mortar, or the like controlled by a virtual object; the virtual firearm is taken as the example for specific description in the present exemplary embodiment.
The first view mode can be a panoramic view, that is, a virtual battle scene can be observed in all directions. In the first view mode, the display view of the graphical user interface can be correspondingly adjusted and changed according to the touch operation acting on the preset area. The preset area may be a touch area with a visual indication, for example, a touch area with a border frame, or a touch area filled with a color, or a touch area with a preset transparency, or other areas capable of visually indicating the range of the touch area. The touch operation may be one or any combination of multiple of clicking, sliding and pressing, for example, a presentation field of view of the graphical user interface may be adjusted according to a track of the sliding operation.
The second view mode may be a local view, in which a local scene within a certain range can be clearly observed. When the user needs to shoot a distant virtual shooting object 102 with a virtual sniper rifle, the view mode of the game scene picture on the graphical user interface can be triggered to be adjusted from the first view mode to the second view mode; that is, the view of the game scene picture on the graphical user interface is switched from a panoramic view to a local view.
In addition, in this example embodiment, triggering the view mode of the game scene screen on the graphical user interface to be adjusted from the first view mode to the second view mode may include:
detecting whether a trigger operation acting on a virtual aiming control area on the graphical user interface is received;
after detecting that the trigger operation acting on the virtual aiming control area is received, detecting whether the current user contacts an image sensor of the mobile terminal;
when the current user is detected to contact with the image sensor of the mobile terminal, the view mode of the game scene picture on the graphical user interface is triggered to be adjusted from a first view mode to a second view mode.
In this example embodiment, when the view mode of the game scene picture on the graphical user interface is adjusted from the first view mode to the second view mode, a double-verification method may be used. Specifically, it may first be detected whether a trigger operation is received on the graphical user interface. The trigger operation may be a single operation, such as a click, double-click, translation, press, or slide performed on the touch interface by a finger, or any combination of such operations. When it is detected that any region of the graphical user interface has received a trigger operation, it can be judged whether the coordinates of the position where the trigger operation occurred fall within the coordinate range of the virtual aiming control. The virtual aiming control can be anywhere on the graphical user interface, such as the lower-left or lower-right corner, and the trigger operation may occur at any location of the virtual aiming control.
After detecting that the trigger operation acting on the virtual aiming control area has been received, whether the current user is contacting the image sensor of the mobile terminal can be detected through a distance sensor or a light sensor. Here, contact is understood to mean that the current user touches or shields the image sensor of the mobile terminal with any body part, and may also be a press or other operation. For example, the current user may contact or block the image sensor of the mobile terminal with a hand or the face; contact can also be understood as the distance between the current user and the image sensor of the mobile terminal being within a preset distance, which may be a small value such as 5 cm. The image sensor may adopt a charge-coupled device (CCD) chip or a complementary metal-oxide-semiconductor (CMOS) chip.
When it is detected that the current user is touching the image sensor of the mobile terminal, the current user can be triggered to operate a virtual sniper rifle in the shooting game to aim at and shoot a virtual shooting object at a certain position. At the same time, the view mode of the game scene picture on the graphical user interface can be triggered to be adjusted from the first view mode to the second view mode, that is, switched from the panoramic view mode to a local view mode, where the local view may cover the position range of any target shooting object locked by the user. In addition, in both the first view mode and the second view mode, the user may change the direction in which the game scene picture presents its view on the graphical user interface through a sliding operation without changing the direction in which the virtual aiming control 104 points. Referring to fig. 3, in the second view mode, other virtual controls and element identifiers on the graphical user interface may be hidden, and only the graphical user interface within the local view range is retained and displayed in full screen. After exiting the second view mode, all element identifiers originally displayed on the graphical user interface may be restored.
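The double-verification flow described above can be sketched as follows. The function names, screen coordinates, and rectangle layout are illustrative assumptions; the disclosure does not specify an implementation:

```python
# Hypothetical sketch of the double verification: a touch must land inside
# the virtual aiming control area AND the user must contact (cover) the
# image sensor before the view mode switches from "first" to "second".

def inside(rect, x, y):
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def try_enter_second_view(aim_control_rect, touch, sensor_covered, view_mode):
    """Return the (possibly updated) view mode."""
    # Check 1: was a trigger operation received on the aiming control area?
    if touch is None or not inside(aim_control_rect, *touch):
        return view_mode
    # Check 2: is the current user contacting the image sensor?
    if not sensor_covered:
        return view_mode
    # Both checks passed: adjust from the first to the second view mode.
    return "second"

# Illustrative layout: the aiming control sits in the lower-right corner
# of a 1920x1080 screen.
rect = (1600, 800, 1920, 1080)
assert try_enter_second_view(rect, (1700, 900), True, "first") == "second"
assert try_enter_second_view(rect, (100, 100), True, "first") == "first"
assert try_enter_second_view(rect, (1700, 900), False, "first") == "first"
```

The single-verification variant described next simply drops check 1 and keys the switch on sensor contact alone.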
In addition, in this example embodiment, triggering the view mode of the game scene screen on the graphical user interface to be adjusted from the first view mode to the second view mode may include:
detecting whether the current user contacts an image sensor of the mobile terminal;
when the current user is detected to contact with the image sensor of the mobile terminal, the view mode of the game scene picture on the graphical user interface is triggered to be adjusted from a first view mode to a second view mode.
In this exemplary embodiment, whether to switch the view mode may be determined solely by detecting whether the current user is touching the image sensor of the mobile terminal. The specific determination method is the same as above and is not repeated here. When a virtual sniper rifle needs to be used in the shooting game, the current user naturally shields the image sensor of the mobile terminal with a finger, so that the view mode of the game scene picture is adjusted from the first view mode to the second view mode. In this process, the existing image sensor of the mobile terminal is fully utilized to switch the view mode, no additional hardware cost is imposed on the mobile terminal, and the view mode of the game picture can be adjusted more conveniently.
In step S120, in the second view mode, one or more physiological parameters of a current user of the mobile terminal are acquired through the mobile terminal.
In the exemplary embodiment, after the first view mode has been adjusted to the second view mode, whether by the double-verification method of detecting a trigger operation and a contact operation or by the verification method of only detecting whether there is a contact operation, the current physiological parameters of the user may be acquired through the mobile terminal. The physiological parameters may include one or more of heartbeat, blood pressure, pulse, iris parameters, and facial feature parameters. The current user of the mobile terminal may be anyone; the mobile terminal needs to include one or more of a front-facing camera and a rear-facing camera. In the present exemplary embodiment, the one or more physiological parameters of the current user may be obtained through application software similar to existing software that measures a user's heartbeat with a mobile terminal. For example, such application software can measure physiological parameters such as heart rate through a finger pressed against the rear camera. On a similar principle, one or more physiological parameters of the current user of the mobile terminal can be obtained and applied to a mobile game scene to increase the user's immersion in the game.
Furthermore, in this example embodiment, the obtaining one or more physiological parameters of the current user of the mobile terminal may include:
acquiring image information of a contact part of the current user and the image sensor through the image sensor, and acquiring one or more physiological parameters related to the image information according to the change of the image information.
In the present exemplary embodiment, in the second view mode, image information of the contact portion between the current user and the image sensor may be acquired through the image sensor of the mobile terminal; the image information may be continuously acquired or photographed within a preset time, and changes in the image information may be analyzed. The contact portion may be any portion that can make contact, such as a hand or the face; the present exemplary embodiment takes capturing hand image information as the example. Note that the fingerprint chip in a current mobile terminal is used to collect the original features of a fingerprint, and the large number of disturbance signals generated while acquiring the fingerprint information are typically filtered out as noise to ensure the accuracy of the acquired fingerprint information; the hand image information collected in the exemplary embodiment, by contrast, does not contain fingerprint information.
After the image information is acquired, one or more physiological parameters associated with the image information can be acquired according to changes in the image information. Acquiring heart rate parameters from hand image information is taken as the example. The image sensor may adopt any one of optical recognition, capacitive recognition, and radio-frequency recognition. The change in the image information can be the surface texture of the finger or a change in the surface smoothness of the finger. Specifically, optical recognition can measure heart rate parameters based on the principle of measuring heart rate with a camera, or by observing the regular fine deformation of the concave-convex finger surface produced by each heartbeat; capacitive recognition can measure heart rate parameters by sensing the regular capacitance changes that occur with the user's heartbeat; and radio-frequency recognition can transmit a trace radio-frequency signal through the epidermis of the finger to obtain the texture of the inner layer of the finger and thereby measure heart rate parameters. Besides these, the physiological parameter may be a pulse parameter, a facial feature parameter, or the like.
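As one hedged illustration of the optical approach: blood flow modulates the brightness of finger images captured at the contact portion, so a heart rate can be estimated by counting periodic peaks in mean frame brightness. The sketch below uses a synthetic signal and a naive peak counter; it is an assumption for illustration, not the disclosure's prescribed algorithm:

```python
import math

def estimate_heart_rate(brightness, fps):
    """Count local maxima in the per-frame mean-brightness signal and
    convert the peak count to beats per minute."""
    peaks = 0
    for i in range(1, len(brightness) - 1):
        if brightness[i] > brightness[i - 1] and brightness[i] >= brightness[i + 1]:
            peaks += 1
    duration_s = len(brightness) / fps
    return peaks * 60.0 / duration_s

# Synthetic 10-second brightness trace at 30 fps with a 1.5 Hz pulse
# (i.e. 90 beats per minute), standing in for real camera frames.
fps = 30
signal = [100 + 5 * math.sin(2 * math.pi * 1.5 * t / fps) for t in range(fps * 10)]
rate = estimate_heart_rate(signal, fps)  # -> 90.0 on this clean signal
```

Real camera signals are noisy, so a production implementation would filter the trace before peak detection; the point here is only the brightness-change principle the paragraph describes.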
In step S130, special effect information corresponding to the physiological parameter is enabled according to one or more of the physiological parameters.
In this example embodiment, the special effect information may include one or more of a display special effect and a sound special effect. For example, the sound special effect may include a gunshot sound effect, a hit-cue sound effect, and the like; the display special effect may include shaking of the muzzle during aiming, shaking of the screen, a muzzle-flash special effect during shooting, and the like. In the present exemplary embodiment, the special effect information may include, but is not limited to, shaking of the on-screen sight and a heartbeat sound among the background sound effects. One physiological parameter can have one or more pieces of special effect information corresponding to it; for example, the sight shaking corresponding to the heart rate parameter and the heartbeat sound in the background sound effect can be enabled simultaneously. Further, the magnitude of the physiological parameter can be positively or negatively correlated with the change of the special effect information; when positively correlated, the larger the physiological parameter, the stronger the special effect feedback in the game. For example, the faster the heart rate, the larger the shaking amplitude of the on-screen sight and the faster and more urgent the heartbeat sound in the background sound effect, which strengthens the shooting game's sense of substitution for the user.
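A minimal sketch of such a positive correlation follows; the linear mapping and its constants (`base_amplitude`, `gain`) are illustrative assumptions, since the disclosure specifies only the direction of the correlation, not a formula:

```python
# Hypothetical positive correlation from heart rate to effect intensity:
# faster heart rate -> larger crosshair shake amplitude and a shorter
# interval between heartbeat sound effects.

def crosshair_shake_px(heart_rate, base_amplitude=2.0, gain=0.1):
    """Sight shake amplitude in pixels, growing linearly with heart rate."""
    return base_amplitude + gain * heart_rate

def heartbeat_interval_s(heart_rate):
    """Interval between heartbeat sound effects, in seconds."""
    return 60.0 / heart_rate

assert crosshair_shake_px(60) < crosshair_shake_px(120)      # stronger shake
assert heartbeat_interval_s(120) < heartbeat_interval_s(60)  # tighter beats
```

A negatively correlated effect would simply invert the mapping (e.g. subtract the gain term), which the disclosure also allows.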
Furthermore, in this example embodiment, enabling special effect information corresponding to the physiological parameter may include:
comparing the acquired physiological parameter with a plurality of pre-stored physiological parameter intervals and judging whether the physiological parameter matches one of the pre-stored physiological parameter intervals;
and when the physiological parameter is judged to match one of the pre-stored physiological parameter intervals, enabling one or more pieces of special effect information corresponding to that interval.
In this exemplary embodiment, the system may divide each physiological parameter into a plurality of physiological parameter intervals in advance according to preset conditions, obtain the physiological parameter of the current user in real time through the physiological parameter obtaining module, compare the real-time value with the pre-stored intervals one by one, and determine whether it falls within one of them; this comparison may be implemented with a loop or a function. For example, the heart rate parameter can be divided by beats per minute, from small to large, into a first, second, third, and fourth interval according to an accepted standard or a customized one. Special effect information specific to each interval can also be set, for example a gunshot effect for the first interval, a voice effect for the second interval, and so on. Given the positive correlation between the heart rate parameter and the change of the special effect information, the in-game special effect feedback is strongest when the heart rate parameter is in the fourth interval, and the shaking amplitude of the screen sight is smallest when it is in the first interval. The same correlation can also hold within each interval: within a single interval, the larger the heart rate parameter, the larger the shaking amplitude of the screen sight, or the louder and denser the heartbeat in the background sound effect.
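The interval-matching loop described above can be sketched as follows. The interval boundaries and effect names are illustrative assumptions — the patent does not fix concrete values — but the structure follows the description: compare the measured parameter with each pre-stored interval one by one and enable the effects of the matching interval.

```python
# Hypothetical pre-stored heart-rate intervals (beats per minute), each
# mapped to one or more pieces of special effect information to enable.
HEART_RATE_INTERVALS = [
    ((0, 80),    ["mild_sight_shake"]),                       # first interval
    ((80, 100),  ["sight_shake", "gunshot_effect"]),          # second interval
    ((100, 120), ["sight_shake", "voice_effect"]),            # third interval
    ((120, 250), ["strong_sight_shake", "heartbeat_sound"]),  # fourth interval
]

def enable_effects(heart_rate):
    """Return the special effects of the first interval the parameter matches."""
    for (low, high), effects in HEART_RATE_INTERVALS:
        if low <= heart_rate < high:
            return effects
    return []  # no interval matched: enable nothing
```

With sorted, non-overlapping intervals the linear scan could be replaced by a binary search, but a loop like this matches the comparison process the embodiment describes.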
Further, in the present exemplary embodiment, the special effect information includes a display special effect and/or a sound special effect, where the display special effect includes a shaking-amplitude variation effect of the sight of a virtual aiming device on the graphical user interface.
In this exemplary embodiment, the special effect information may include only a display special effect, only a sound special effect, or both. The display special effect may include, but is not limited to, a shaking-amplitude variation effect of the sight of a virtual aiming device on the graphical user interface, where the virtual aiming device may be the sighting telescope of a virtual sniper rifle; the sound special effect may be a heartbeat sound effect in the background sound effects. The sight of the virtual aiming device can shake vertically or horizontally; together with the background heartbeat sound effect, this shaking creates a tense game atmosphere and increases the user's immersion in the game. The larger the acquired heart rate parameter, the larger the shaking amplitude of the sight, and at the same time the faster and louder the heartbeat in the background sound effect.
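The positive correlation just described — larger heart rate, larger sight shake, faster and louder heartbeat — can be sketched as one mapping from the measured rate to effect parameters. The rest rate, maximum rate, and pixel amplitudes below are illustrative assumptions, not values taken from the patent.

```python
def effect_parameters(heart_rate, rest_rate=60.0, max_rate=180.0):
    """Map a heart rate (bpm) to display/sound effect parameters."""
    # Normalise the rate into [0, 1] relative to the assumed range.
    t = (heart_rate - rest_rate) / (max_rate - rest_rate)
    t = max(0.0, min(1.0, t))
    return {
        # Sight shake grows from 2 px at rest to 20 px at the assumed maximum.
        "shake_amplitude_px": 2.0 + 18.0 * t,
        # Heartbeat sound plays once per beat, so its interval shrinks
        # (the beat gets denser) as the rate rises.
        "heartbeat_interval_s": 60.0 / max(heart_rate, 1.0),
        # Volume rises with the rate as well.
        "heartbeat_volume": 0.3 + 0.7 * t,
    }
```

The game loop would read these parameters each frame and feed them to the rendering and audio layers.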
In addition, in this example embodiment, the image sensor may be a rear camera or a front camera of the mobile terminal, and the physiological parameter includes a heart rate parameter.
In this exemplary embodiment, the image sensor is an important component of a digital camera. The current user of the mobile terminal can touch or cover the front or rear camera with a finger; the subtle light differences then reveal changes in blood color, from which the heartbeat is sensed and the heart rate parameter obtained. Alternatively, the front camera can collect and analyze changes in the light reflected from the user's face and thereby compute the heart rate. The physiological parameters may also include blood pressure, pulse, respiratory rate, and the like.
For another example, in this example embodiment, the image sensor may be a front camera of the mobile terminal, and the physiological parameter includes a pupil diameter.
In this example embodiment, the heart rate parameter of the current user can be acquired through the front camera, and special effect control can then be realized in combination with the other steps above. Alternatively, facial image information of the current user can be collected through the front camera of the mobile terminal and the pupil diameter obtained from it; the special effect information corresponding to the pupil diameter can then be enabled according to its change. For example, the larger the detected pupil diameter of the user, the larger the shaking amplitude of the sight in the game scene, the faster the heartbeat in the background sound effect, and so on.
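For the pupil-driven variant, per-frame pupil-diameter readings from the front camera are noisy, so a plausible sketch averages recent samples before mapping the diameter onto an effect intensity in [0, 1]. The diameter range used here is an illustrative assumption about typical pupil sizes, not a value from the patent.

```python
def pupil_effect_intensity(diameters_mm, min_d=2.0, max_d=8.0):
    """Smooth recent pupil-diameter samples and map them to a [0, 1] intensity.

    diameters_mm: recent pupil-diameter measurements in millimetres
    min_d, max_d: assumed physiological range of pupil diameters
    """
    if not diameters_mm:
        return 0.0  # no face detected: apply no pupil-driven effect
    # Simple moving average to suppress per-frame detection noise.
    mean_d = sum(diameters_mm) / len(diameters_mm)
    t = (mean_d - min_d) / (max_d - min_d)
    return max(0.0, min(1.0, t))
```

The resulting intensity could scale the same sight-shake and heartbeat parameters used for the heart-rate case.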
It should be noted that although the present exemplary embodiment adds a step of acquiring one or more physiological parameters of the current user through the mobile terminal, acquiring either the heart rate parameter or the pupil diameter parameter requires no extra operation from the user and changes the user's operating habits very little, which keeps the user's learning cost low.
In an exemplary embodiment of the present disclosure, there is also provided a special effect information processing apparatus, and as shown with reference to fig. 4, the apparatus 200 may include:
the view mode adjusting module 201 may be configured to trigger the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode;
a physiological parameter obtaining module 202, configured to obtain, by the mobile terminal, one or more physiological parameters of a current user of the mobile terminal in the second visual field mode;
a special effect information enabling module 203, configured to enable special effect information corresponding to the physiological parameter according to one or more of the physiological parameters.
The specific details of each module in the special effect information processing apparatus have been described in detail in the corresponding special effect information processing method, and therefore, the details are not described herein again.
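The three-module apparatus above can be sketched structurally as plain classes. The class and method names are hypothetical; a real device would wire these modules to the game engine's input, camera, rendering, and audio APIs.

```python
class SpecialEffectApparatus:
    """Sketch of the apparatus 200: view mode adjusting module (201),
    physiological parameter obtaining module (202), and special effect
    information enabling module (203)."""

    def __init__(self, sensor_read, interval_table):
        self.view_mode = "first"         # current view mode state
        self.sensor_read = sensor_read   # callable returning a parameter value
        self.interval_table = interval_table  # [((low, high), effects), ...]

    def adjust_view_mode(self, aiming_touched, sensor_touched):
        """Module 201: switch to the second view mode only after a trigger
        on the virtual aiming control area AND contact with the image sensor."""
        if aiming_touched and sensor_touched:
            self.view_mode = "second"
        return self.view_mode

    def acquire_parameter(self):
        """Module 202: read the physiological parameter, only in the
        second view mode."""
        return self.sensor_read() if self.view_mode == "second" else None

    def enable_effects(self, value):
        """Module 203: enable the effects of the matching pre-stored interval."""
        for (low, high), effects in self.interval_table:
            if value is not None and low <= value < high:
                return effects
        return []
```

A usage sketch: construct the apparatus with a camera-backed reader, call `adjust_view_mode` from the touch handler, then poll `acquire_parameter` and `enable_effects` each frame.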
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 5. The electronic device 600 shown in fig. 5 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 5, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one storage unit 620, a bus 630 connecting different system components (including the storage unit 620 and the processing unit 610), and a display unit 640.
Wherein the storage unit stores program code that is executable by the processing unit 610 to cause the processing unit 610 to perform steps according to various exemplary embodiments of the present invention as described in the above section "exemplary methods" of the present specification.
The storage unit 620 may include readable media in the form of volatile memory, such as a random access memory (RAM) 6201 and/or a cache memory 6202, and may further include a read-only memory (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may represent one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 6, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (7)

1. A special effect information processing method is characterized in that a graphical user interface is obtained by executing a software application on a processor of a mobile terminal and rendering the software application on a touch display of the mobile terminal, and content presented by the graphical user interface comprises a game scene picture, and the method comprises the following steps:
triggering the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode; triggering the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode comprises: detecting whether a trigger operation acting on a virtual aiming control area on the graphical user interface is received; after detecting that the trigger operation acting on the virtual aiming control area is received, detecting whether a current user contacts an image sensor of the mobile terminal; when the current user is detected to contact an image sensor of the mobile terminal, triggering the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode;
acquiring one or more physiological parameters of a current user of the mobile terminal through the mobile terminal in the second visual field mode; acquiring one or more physiological parameters of a current user of the mobile terminal comprises: acquiring image information of a contact part of the current user and an image sensor through the image sensor, and acquiring one or more physiological parameters related to the image information according to the change of the image information; the physiological parameters comprise one or more of pupil diameter, heartbeat, blood pressure, pulse, iris parameters and facial feature parameters;
enabling special effect information corresponding to the physiological parameters according to one or more physiological parameters; one physiological parameter corresponds to one or more special effect information; wherein the magnitude of the physiological parameter is positively or negatively correlated with the change of the special effect information; enabling special effect information corresponding to the physiological parameter includes: comparing the acquired physiological parameter with a plurality of pre-stored physiological parameter intervals and judging whether the physiological parameter is matched with one of the pre-stored physiological parameter intervals; and when the physiological parameter is judged to be matched with one of the pre-stored physiological parameter intervals, enabling one or more pieces of special effect information corresponding to the pre-stored physiological parameter intervals.
2. The special effect information processing method according to claim 1, wherein the special effect information includes a display special effect and/or a sound special effect; wherein the display special effect comprises a shaking amplitude variation effect of the collimation center of the virtual aiming device on the graphical user interface.
3. The special effect information processing method according to claim 1, wherein the image sensor is a rear camera or a front camera of the mobile terminal, and the physiological parameter includes a heartbeat parameter.
4. The special effect information processing method according to claim 1, wherein the image sensor is a front camera of the mobile terminal.
5. A special effect information processing apparatus, wherein a graphical user interface is obtained by executing a software application on a processor of a mobile terminal and rendering on a touch display of the mobile terminal, content presented by the graphical user interface including a game scene screen, the apparatus comprising:
the visual field mode adjusting module is used for triggering the visual field mode of the game scene picture on the graphical user interface to be adjusted from a first visual field mode to a second visual field mode; triggering the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode comprises: detecting whether a trigger operation acting on a virtual aiming control area on the graphical user interface is received; after detecting that the trigger operation acting on the virtual aiming control area is received, detecting whether a current user contacts an image sensor of the mobile terminal; when the current user is detected to contact an image sensor of the mobile terminal, triggering the view mode of the game scene picture on the graphical user interface to be adjusted from a first view mode to a second view mode;
the physiological parameter acquisition module is used for acquiring one or more physiological parameters of a current user of the mobile terminal through the mobile terminal in the second visual field mode; acquiring one or more physiological parameters of a current user of the mobile terminal comprises: acquiring image information of a contact part of the current user and an image sensor through the image sensor, and acquiring one or more physiological parameters related to the image information according to the change of the image information; the physiological parameters comprise one or more of pupil diameter, heartbeat, blood pressure, pulse, iris parameters and facial feature parameters;
the special effect information enabling module is used for enabling special effect information corresponding to the physiological parameters according to one or more physiological parameters; one physiological parameter corresponds to one or more special effect information; wherein the magnitude of the physiological parameter is positively or negatively correlated with the change of the special effect information; enabling special effect information corresponding to the physiological parameter includes: comparing the acquired physiological parameter with a plurality of pre-stored physiological parameter intervals and judging whether the physiological parameter is matched with one of the pre-stored physiological parameter intervals; and when the physiological parameter is judged to be matched with one of the pre-stored physiological parameter intervals, enabling one or more pieces of special effect information corresponding to the pre-stored physiological parameter intervals.
6. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the special effects information processing method of any one of claims 1-4 via execution of the executable instructions.
7. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing the special effect information processing method according to any one of claims 1 to 4.
CN201710367692.7A 2017-05-23 2017-05-23 Special effect information processing method and device, electronic equipment and storage medium Active CN107174824B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710367692.7A CN107174824B (en) 2017-05-23 2017-05-23 Special effect information processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN107174824A CN107174824A (en) 2017-09-19
CN107174824B true CN107174824B (en) 2021-01-15

Family

ID=59832348


Country Status (1)

Country Link
CN (1) CN107174824B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109710255B (en) * 2018-12-24 2022-07-12 网易(杭州)网络有限公司 Special effect processing method, special effect processing device, electronic device and storage medium
CN110170168B (en) * 2019-05-30 2022-05-27 腾讯科技(深圳)有限公司 Virtual object shooting control method and device, electronic equipment and storage medium
CN110215690B (en) * 2019-07-11 2023-02-03 网易(杭州)网络有限公司 Visual angle switching method and device in game scene and electronic equipment
CN110879850B (en) * 2019-11-14 2021-02-09 腾讯科技(深圳)有限公司 Method, device and equipment for acquiring jitter parameters and storage medium
CN114870395B (en) * 2021-02-05 2023-09-15 腾讯科技(深圳)有限公司 Terminal vibration detection method, device, medium and equipment of game scene
CN113244612B (en) * 2021-05-26 2024-02-23 网易(杭州)网络有限公司 Lock mark display method, device, storage medium and electronic equipment
CN113457150B (en) * 2021-07-16 2023-10-24 腾讯科技(深圳)有限公司 Information prompting method and device, storage medium and electronic equipment

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1317352A (en) * 2000-04-10 2001-10-17 科乐美股份有限公司 Game system and memorizer able to be read by computer
CN1713938A (en) * 2002-11-20 2005-12-28 世嘉股份有限公司 Game image display control program, game device, and recording medium
CN201171897Y (en) * 2006-09-13 2008-12-31 株式会社万代南梦宫游戏 Game controller, charging bracket, game machine
CN102081458A (en) * 2009-11-27 2011-06-01 致伸科技股份有限公司 Keyboard-mouse set and computer system applying same
CN102946954A (en) * 2010-05-10 2013-02-27 美国索尼电脑娱乐有限责任公司 Polymorphic firearm controller
CN203208653U (en) * 2013-02-07 2013-09-25 马卡里 Accurate locating device in laser shooting confrontation game
CN104147781A (en) * 2014-07-29 2014-11-19 京东方科技集团股份有限公司 Electronic device, electronic system and electronic device control method
CN105159687A (en) * 2015-09-29 2015-12-16 腾讯科技(深圳)有限公司 Information processing method, terminal and computer storage medium
CN105457276A (en) * 2015-11-19 2016-04-06 上海曼恒智能科技有限公司 FPS game simulation system
CN105498213A (en) * 2015-12-09 2016-04-20 杭州无端科技有限公司 Aiming method and device in shooting game
CN105597310A (en) * 2015-12-24 2016-05-25 网易(杭州)网络有限公司 Game control method and device
CN106528020A (en) * 2016-10-26 2017-03-22 腾讯科技(深圳)有限公司 View mode switching method and terminal
CN108355348A (en) * 2015-10-10 2018-08-03 腾讯科技(成都)有限公司 Information processing method, terminal and computer storage media

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5656160B2 (en) * 2010-02-10 2015-01-21 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD


Also Published As

Publication number Publication date
CN107174824A (en) 2017-09-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant