CN112221118A - Human-computer interaction perception processing method and device and electronic equipment - Google Patents
- Publication number
- CN112221118A (application number CN202011240927.4A)
- Authority
- CN
- China
- Prior art keywords
- scene
- type
- information
- perception
- perception information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application provides a perception processing method and apparatus for human-computer interaction, an electronic device, and a computer-readable storage medium. The method includes: displaying an interaction process in a human-computer interaction interface; detecting scenes that appear during the interaction process; when a scene appearing during the interaction process is a marked scene for which perception information has been configured by a custom operation, obtaining a perception information configuration file corresponding to the marked scene; and outputting the perception information corresponding to the perception information configuration file while the scene proceeds. With the method and apparatus, the peripherals of the electronic device can be fully utilized to achieve flexible customization of interaction effects.
Description
Technical Field
The present application relates to human-computer interaction technologies for computers, and in particular, to a perception processing method and apparatus for human-computer interaction, an electronic device, and a computer-readable storage medium.
Background
Electronic devices in the related art have abundant peripherals, which expand the channels through which perceptible information can be presented. For example, an incoming call can be indicated by a ringtone output through an audio peripheral, by light output through a visual peripheral, or by vibration output through a haptic peripheral. An application client running on the electronic device can be authorized during its operation to call the peripheral interfaces of the electronic device, so that a rich human-computer interaction experience can be provided to the user through diverse ways of presenting perception information.
However, in the related art the presentation of perception information is controlled by a small number of developers; the available presentation manners are highly uniform and limited in number, so it is difficult to satisfy users' demands for diversity and to fully and effectively utilize the hardware resources of the electronic device.
Disclosure of Invention
Embodiments of the present application provide a perception processing method and apparatus for human-computer interaction, an electronic device, and a computer-readable storage medium, which can make full use of the peripherals of the electronic device to achieve customization of perception information.
The technical solutions of the embodiments of the present application are implemented as follows:
An embodiment of the present application provides a perception processing method for human-computer interaction, including the following steps:
displaying an interaction process in a human-computer interaction interface;
detecting scenes appearing in the interaction process;
when a scene appearing in the interaction process is a marked scene for which perception information has been configured by a custom operation, obtaining a perception information configuration file corresponding to the marked scene;
and outputting the perception information corresponding to the perception information configuration file while the scene proceeds.
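As a purely illustrative aid, a minimal Python sketch of the four steps above is given below. The helper names (detect_scene, output_perception), the scene identifier, and the profile contents are assumptions introduced for illustration only and are not part of the disclosure.

```python
# Minimal sketch of the claimed flow; all names and values are illustrative assumptions.
from typing import Optional

# perception information configuration files, keyed by marked-scene identifier
PROFILES = {
    "grenade_explosion": {"haptic": {"intensity": 0.8, "duration_ms": 5000},
                          "light":  {"color": "blue", "state": "on"}},
}

def detect_scene(event: str) -> Optional[str]:
    """Step 2: map a raw interaction event to a scene identifier (stub)."""
    return event if event in PROFILES else None

def output_perception(profile: dict) -> None:
    """Step 4: hand the configured parameters to the peripherals (stub)."""
    print("outputting perception info:", profile)

def on_interaction_event(event: str) -> None:
    """Steps 1-4 for a single event in the displayed interaction process."""
    scene = detect_scene(event)     # step 2: detect the scene
    if scene is None:
        return                      # not a marked scene, nothing to output
    profile = PROFILES[scene]       # step 3: fetch the configuration file
    output_perception(profile)      # step 4: output the configured perception information

on_interaction_event("grenade_explosion")
```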
An embodiment of the present application provides a perception processing apparatus for human-computer interaction, including:
a display module, configured to display an interaction process in a human-computer interaction interface;
a detection module, configured to detect scenes appearing in the interaction process;
an acquisition module, configured to obtain, when a scene appearing in the interaction process is a marked scene for which perception information has been configured by a custom operation, a perception information configuration file corresponding to the marked scene;
and an output module, configured to output the perception information corresponding to the perception information configuration file while the scene proceeds.
In the above solution, the types of perception information configured by the perception information configuration file include haptic feedback information, and the haptic feedback information includes at least one of the following characteristics: frequency, intensity, duration, and direction. The output module is further configured to output, according to the type of an event included in the scene, haptic feedback information configured to be associated with the type of the event.
In the foregoing solution, the output module is further configured to perform at least one of the following operations: in a marked scene including a user operation event, outputting haptic feedback information synchronized with the user operation event; in a marked scene including sound, outputting at least one of: haptic feedback information adapted to the type of the sound, haptic feedback information consistent with the degree of excitement of the sound, or haptic feedback information consistent with the direction of the sound source; in a marked scene including a collision event, outputting at least one of: haptic feedback information synchronized with the collision event, or haptic feedback information consistent with the severity of the collision event; and in a marked scene including a notification event, outputting haptic feedback information synchronized with the notification event.
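Purely as a non-authoritative illustration of associating haptic feedback characteristics with event types in a marked scene, a lookup table such as the following could be used; the event names and parameter fields are assumptions, not part of the disclosure.

```python
# Hypothetical mapping from event types in a marked scene to haptic feedback
# characteristics (frequency, intensity, duration, direction).
HAPTIC_BY_EVENT = {
    "user_operation": {"frequency_hz": 180, "intensity": 0.4, "duration_ms": 30,  "direction": None},
    "sound":          {"frequency_hz": 120, "intensity": 0.6, "duration_ms": 200, "direction": "left"},
    "collision":      {"frequency_hz": 60,  "intensity": 0.9, "duration_ms": 500, "direction": None},
    "notification":   {"frequency_hz": 150, "intensity": 0.3, "duration_ms": 80,  "direction": None},
}

def haptic_for(event_type: str) -> dict:
    """Return the haptic feedback characteristics associated with an event type."""
    return HAPTIC_BY_EVENT.get(event_type, {})

print(haptic_for("collision"))
```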
In the above solution, the types of perception information configured by the perception information configuration file include visual feedback information, and the visual feedback information includes at least one of the following characteristics: state, frequency, color, and brightness. The output module is further configured to output, according to the type of an event included in the scene, visual feedback information configured to be associated with the type of the event.
In the foregoing solution, the output module is further configured to perform at least one of the following operations: in a marked scene including a notification event, outputting visual feedback information synchronized with the notification event; in a marked scene including sound, outputting at least one of: visual feedback information adapted to the type of the sound, or visual feedback information adapted to the rhythm of the sound; in a marked scene including a progress event, outputting visual feedback information representing the progress of the progress event; and in a marked scene including a result event, outputting visual feedback information representing the result of the result event.
In the foregoing solution, before the perception information configuration file corresponding to the marked scene is obtained, the apparatus further includes a configuration module, configured to: determine the marked scene; display the types of perception information that can be configured for the marked scene and the characteristics that can be configured in each type, where the types of perception information include tactile feedback information, visual feedback information, olfactory feedback information, and auditory feedback information; and, in response to the custom operation for the marked scene, obtain the parameters configured by the custom operation for the characteristics in at least one type, and generate the perception information configuration file corresponding to the marked scene based on the obtained parameters.
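As a hedged example of what a perception information configuration file generated from the custom operation might contain, a JSON-style structure keyed by the marked scene could be serialized as follows; all field names and values are illustrative assumptions.

```python
# Hypothetical serialization of a perception information configuration file for
# one marked scene: per perception type, the configured characteristic parameters.
import json

profile = {
    "marked_scene": "grenade_explosion",
    "perception": {
        "haptic": {"frequency_hz": 90, "intensity": 0.8, "duration_ms": 5000, "direction": None},
        "visual": {"state": "steady_on", "color": "blue", "brightness": 0.7, "frequency_hz": 0},
    },
}

print(json.dumps(profile, indent=2))
```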
In the foregoing solution, the configuration module is further configured to: display a historical interaction process in the human-computer interaction interface, where the historical interaction process is earlier than the interaction process; and record a target scene as the marked scene in response to a marking operation for the target scene in the historical interaction process.
In the foregoing solution, the configuration module is further configured to: before responding to the custom operation for the marked scene, play a screen recording file of the historical interaction process; when the marked scene appears during playback of the screen recording file, display an entry for customizing perception information for the marked scene; and display a custom interface that is used to receive the custom operation and that includes the types of perception information configurable for the marked scene and the characteristics configurable in each type.
In the foregoing solution, the configuration module is further configured to record a target scene as the marked scene in response to a marking operation for the target scene in the interaction process.
In the foregoing solution, the configuration module is further configured to: after the marked scene is determined and before the custom operation for the marked scene is responded to, display an entry for customizing perception information for the marked scene; and display a custom interface that is used to receive the custom operation and that includes the types of perception information configurable for the marked scene and the characteristics configurable in each type.
In the above solution, the characteristics in each type have preconfigured default parameters, and the configuration module is further configured to: in response to a modification of the default parameter of a characteristic in at least one type by the custom operation, update the default parameter to the parameter set by the custom operation.
In the foregoing solution, the configuration module is further configured to: display a plurality of preconfigured modes of the marked scene, where each mode includes at least one type of perception information associated with the marked scene and preconfigured default parameters of the characteristics in that type; and obtain the at least one type of perception information associated with the marked scene and the preconfigured default parameters of the characteristics in that type included in a target mode, where the target mode is the mode selected by the custom operation from the plurality of preconfigured modes.
In the foregoing solution, the configuration module is further configured to: obtain candidate parameters for a characteristic in at least one type; perform score prediction processing on the candidate parameters through a first neural network model to obtain a prediction score for each candidate parameter; and take the candidate parameter with the highest prediction score as the parameter of the characteristic in the at least one type, where the first neural network model is trained based on candidate parameter samples and the real scores corresponding to the candidate parameter samples.
In the foregoing solution, the configuration module is further configured to: acquire operation behavior data in the human-computer interaction interface; perform preference prediction processing on the operation behavior data through a second neural network model to obtain the operation preference represented by the operation behavior data; and obtain the parameters of the characteristics in at least one type that are pre-associated with the operation preference, where the second neural network model is trained based on operation data samples and the operation preferences represented by the operation data samples.
In the foregoing solution, the output module is further configured to: before outputting the perception information corresponding to the perception information configuration file, query the types of perception information supported by the electronic device and the characteristics in those types that support customization; and, according to the query result, filter out from the perception information configuration file the types of perception information and the characteristics in those types that do not support customization, so as to update the perception information configuration file.
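A minimal sketch of this capability filtering is given below, under the assumption that the device exposes a queryable capability table; the table, its keys, and the profile layout are illustrative and do not correspond to any real device API.

```python
# Hypothetical device capability table: supported perception types and the
# characteristics that support customization on this electronic device.
DEVICE_CAPS = {
    "haptic": {"frequency_hz", "intensity", "duration_ms"},  # e.g. no direction control
    "visual": {"state", "color", "brightness"},
}

def filter_profile(profile: dict) -> dict:
    """Drop perception types / characteristics the device cannot customize."""
    filtered = {}
    for ptype, params in profile["perception"].items():
        caps = DEVICE_CAPS.get(ptype)
        if not caps:
            continue  # the whole perception type is unsupported
        filtered[ptype] = {k: v for k, v in params.items() if k in caps}
    return {**profile, "perception": filtered}

profile = {"marked_scene": "grenade_explosion",
           "perception": {"haptic": {"intensity": 0.8, "direction": "left"},
                          "olfactory": {"scent": "smoke"}}}
print(filter_profile(profile))
```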
In the foregoing solution, the output module is further configured to: obtain, from the perception information configuration file, the types of perception information configured by the custom operation and the parameters of the characteristics in those types; and send a call request to an application program interface opened to the human-computer interaction interface, so as to call the peripheral corresponding to each type to output perception information of that type, such that the output perception information of the corresponding type conforms to the parameters of the characteristics in that type.
An embodiment of the present application provides a perception processing apparatus for human-computer interaction, including:
a configuration module, configured to receive a custom operation for a marked scene in a human-computer interaction interface;
the configuration module being further configured to generate, in response to the custom operation, a perception information configuration file corresponding to the marked scene according to the types of perception information configured by the custom operation and the parameters of the characteristics in those types;
a display module, configured to display an interaction process in the human-computer interaction interface;
and an output module, configured to, when a scene that appears again in the interaction process is the marked scene, output, according to the perception information configuration file and while the scene proceeds, perception information of the types configured by the custom operation, where the output perception information of each type conforms to the parameters of the characteristics in that type.
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and a processor, configured to implement the perception processing method for human-computer interaction provided by the embodiments of the present application when executing the executable instructions stored in the memory.
An embodiment of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the perception processing method for human-computer interaction provided by the embodiments of the present application.
The embodiments of the present application have the following beneficial effects:
The perception information of a marked scene is flexibly configured through a custom operation, so that when the same marked scene recurs, the corresponding perception information configuration file can be reused to call hardware resources of the electronic device to output the perception information. This presents an atmosphere matched to the scene, creates an immersive personalized interaction experience, and ultimately makes effective use of the hardware resources of the electronic device.
Drawings
FIGS. 1A-1B are schematic structural diagrams of a human-computer interaction perception processing system provided by an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
FIGS. 3A-3D are schematic flow diagrams of a perception processing method for human-computer interaction provided by an embodiment of the present application;
FIG. 4 is a schematic flowchart of a perception processing method for human-computer interaction provided by an embodiment of the present application;
FIG. 5 is a schematic application diagram of a perception processing method for human-computer interaction provided by an embodiment of the present application;
FIG. 6 is a schematic application diagram of a perception processing method for human-computer interaction provided by an embodiment of the present application;
FIG. 7 is a schematic application diagram of a perception processing method for human-computer interaction provided by an embodiment of the present application;
FIG. 8 is a schematic application diagram of a perception processing method for human-computer interaction provided by an embodiment of the present application;
FIG. 9 is a protocol diagram of a perception processing method for human-computer interaction provided by an embodiment of the present application;
FIGS. 10A-10B are interface diagrams of a perception processing method for a human-computer interaction interface provided by an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be regarded as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
In the following description, "some embodiments" describes a subset of all possible embodiments; "some embodiments" may refer to the same subset or different subsets of all possible embodiments, and the embodiments may be combined with one another where no conflict arises.
In the following description, the terms "first", "second", and the like are used only to distinguish similar objects and do not denote a particular order or importance. Where permissible, the specific order or sequence may be interchanged, so that the embodiments of the present application described herein can be practiced in orders other than those illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before the embodiments of the present application are described in further detail, the terms and expressions used in the embodiments of the present application are explained as follows.
1) Intelligent terminal: also called a mobile intelligent terminal, an electronic device that offers rich human-computer interaction modes, has Internet access capability, is usually equipped with an operating system, and has strong processing capability; examples include smartphones, tablet computers, vehicle-mounted terminals, and handheld game consoles.
2) Tactile sense: the sensation produced when a mechanical stimulus contacts the tactile receptors of the skin; in an intelligent terminal, vibration can be generated by a motor or similar component to produce a tactile sensation.
3) Light effect: light strips can be arranged on the front, sides, and back of an intelligent terminal, and different light-effect atmospheres are produced through characteristics of the light strips such as on/off state, brightness, and color.
4) "In response to": indicates the condition or state on which a performed operation depends. When the condition or state is satisfied, the one or more operations performed may be executed in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which the operations are performed.
5) Client: an application program running in the terminal to provide various services, such as a game client or a military exercise simulation client.
Smart terminals in the related art have very rich peripherals, such as vibration motors and ambient lights, which can bring a personalized experience to users, and clients (for example, game clients and video clients) likewise have rich interactive scenes. Generally, the smart terminal is preset with output effects (such as vibration or light effects) and the client calls these preset output effects in specific scenes: the smart terminal predefines several types of output effects and opens an interface to upper-layer clients such as games, and during the interaction process of the client (for example, during game play), when the client enters a specific scene it calls the preset interface of the smart terminal to trigger the corresponding vibration or light effect, thereby outputting the corresponding haptic feedback information (vibration) and visual feedback information (light effect). However, because the number of predefined output effects is limited, the applicant found, when implementing the embodiments of the present application, that the related art lacks a complete solution for combining rich interactive scenes (on the client side) with the personalized experiences that peripherals can provide to users. Consequently, a customization function combining the two cannot be offered to users, the peripherals of the smart terminal running the client are difficult to utilize fully, an atmosphere adapted to the interactive scenes in the client is difficult to present, and an immersive interaction experience cannot be created.
Embodiments of the present application provide a perception processing method and apparatus for human-computer interaction, an electronic device, and a computer-readable storage medium, which can make full use of peripherals of the electronic device to implement flexible customization of an interaction effect, and an exemplary application of the electronic device provided in the embodiments of the present application is described below.
In an implementation scenario, referring to FIG. 1A, FIG. 1A is a schematic structural diagram of a human-computer interaction perception processing system provided by this embodiment. To support a game client 410-1, a terminal 400-1 is connected to a server 200 through a network 300, where the network 300 may be a wide area network, a local area network, or a combination of the two. The terminal 400-1, in response to a custom operation of the user, performs the associated configuration of a marked scene in the game client 410-1 with a peripheral output effect (perception information). The perception information configuration file can be stored locally in the terminal 400-1, and a prompt indicating successful configuration is presented. When the user uses the game client 410-1 running in the terminal 400-1 (sending application interaction operations and receiving application interaction results), the terminal 400-1 presents the interaction process between the user and the client 410-1. If the marked scene recurs in the game client 410-1, the perception information configuration file is obtained locally from the terminal 400-1 and the associated peripheral output effect (perception information) is output. For example, suppose the marked scene configured by the custom operation is a grenade exploding on impact in the game, and the associated peripheral output effect (perception information) is that a blue light stays on and the device vibrates for 5 seconds. While the user uses the game client 410-1 (with the interaction process presented at the same time), if a grenade-exploding-on-impact scene occurs in the game client 410-1, the associated peripheral output effect corresponding to that scene is output; that is, the terminal 400-1 used by the user controls the output of the effect "blue light stays on and the device vibrates for 5 seconds".
In some embodiments, the perception information profile generated based on the performed association configuration may be sent to the server 200 (corresponding to the server of the client), i.e., the perception information profile is stored in the server 200 corresponding to the client 410-1.
In an implementation scenario, referring to FIG. 1B, FIG. 1B is a schematic structural diagram of a human-computer interaction perception processing system provided by this embodiment. To support a video client, a terminal 400-2 and a terminal 400-3 are connected to a server 200 through a network 300, where the network 300 may be a wide area network, a local area network, or a combination of the two. The terminal 400-2, in response to a custom operation of the user, performs the associated configuration of a marked scene in the video client 410-2 with a peripheral output effect (perception information), generates a perception information configuration file based on the performed configuration, and sends it to the server 200 (the server corresponding to the client). The terminal 400-2 that receives the user's custom operation may be different from the terminal 400-3 on which the video client is subsequently used; that is, after receiving the custom operation, the terminal 400-2 uploads the generated perception information configuration file to the server 200 and returns a prompt indicating successful configuration, and when the user later uses the video client 410-3 running on the terminal 400-3, the perception information configuration file is obtained from the server 200 and the associated peripheral output effect (perception information) is output. When the user uses the video client 410-3 on the terminal 400-3, the terminal 400-3 presents the interaction process between the user and the client 410-3 (sending application interaction operations and receiving application interaction results). If the marked scene recurs in the video client 410-3, the perception information configuration file is obtained from the server 200 and the associated peripheral output effect is output. For example, suppose the marked scene of the associated configuration is the completion of video playback, and the associated peripheral output effect (perception information) is that a red light stays on and the device vibrates for 2 seconds. While the user uses the video client 410-3, if a video-playback-finished scene occurs in the video client 410-3, the associated peripheral output effect corresponding to that scene is output; that is, the terminal 400-3 used by the user controls the output of the effect "red light stays on and the device vibrates for 2 seconds".
In some embodiments, the terminal 400-1, the terminal 400-2, and the terminal 400-3 may implement the perception processing method for human-computer interaction provided by the embodiments of the present application by running a computer program. For example, the computer program may be a native program or a software module in an operating system; it may be a native application (APP), i.e., a program that needs to be installed in the operating system to run, such as a game APP (i.e., the game client 410-1, the video client 410-2, or the video client 410-3 described above); it may be an applet, i.e., a program that only needs to be downloaded into a browser environment to run; or it may be an applet that can be embedded into any APP. In general, the computer program may be any form of application, module, or plug-in.
The embodiments of the present application may be implemented by means of cloud technology, which refers to a hosting technology that unifies resources such as hardware, software, and networks in a wide area network or a local area network to implement the computation, storage, processing, and sharing of data. Cloud technology is a general term for the network technologies, information technologies, integration technologies, management platform technologies, application technologies, and the like that are applied based on the cloud computing business model; these can form a resource pool that is used on demand and is flexible and convenient. Cloud computing technology will become an important support, because the background services of technical network systems require a large amount of computing and storage resources.
As an example, the server 200 may be an independent physical server, may be a server cluster or a distributed system formed by a plurality of physical servers, and may also be a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a web service, cloud communication, a middleware service, a domain name service, a security service, a CDN, and a big data and artificial intelligence platform. The terminals 400-1, 400-2, 400-3 may be, but are not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal 400-1, the terminal 400-2, the terminal 400-3 and the server 200 may be directly or indirectly connected through wired or wireless communication, and the embodiment of the present application is not limited thereto.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device provided in an embodiment of the present application, and a terminal 400-1 shown in fig. 2 includes: at least one processor 410, memory 450, at least one network interface 420, and a user interface 430. The various components in the terminal 400-1 are coupled together by a bus system 440. It is understood that the bus system 440 is used to enable communications among the components. The bus system 440 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 440 in fig. 2.
The processor 410 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components; the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 430 includes one or more output devices 431 that enable the presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 430 also includes one or more input devices 432, including user interface components that facilitate user input, such as a keyboard, a mouse, a microphone, a touch-screen display, a camera, and other input buttons and controls.
The memory 450 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 450 optionally includes one or more storage devices physically located remote from processor 410.
The memory 450 includes either volatile memory or nonvolatile memory, and may include both volatile and nonvolatile memory. The nonvolatile Memory may be a Read Only Memory (ROM), and the volatile Memory may be a Random Access Memory (RAM). The memory 450 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 450 is capable of storing data, examples of which include programs, modules, and data structures, or a subset or superset thereof, to support various operations, as exemplified below.
An operating system 451, including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
a network communication module 452, configured to communicate with other computing devices via one or more (wired or wireless) network interfaces 420; exemplary network interfaces 420 include Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), and the like;
a presentation module 453 for enabling presentation of information (e.g., user interfaces for operating peripherals and displaying content and information) via one or more output devices 431 (e.g., display screens, speakers, etc.) associated with user interface 430;
an input processing module 454 for detecting one or more user inputs or interactions from one of the one or more input devices 432 and translating the detected inputs or interactions.
In some embodiments, the human-computer interaction perception processing apparatus 455 provided by the embodiments of the present application may be implemented in software. FIG. 2 shows the human-computer interaction perception processing apparatus 455 stored in the memory 450, which may be software in the form of programs and plug-ins and includes the following software modules: a display module 4551, a detection module 4552, an acquisition module 4553, an output module 4554, a configuration module 4555, a configuration module 4556, a display module 4557, and an output module 4558. These modules are logical, and thus may be arbitrarily combined or further split according to the functions implemented; the functions of the respective modules are described below.
In the following, the terminal 400-1 in FIG. 1A performing the perception processing method for human-computer interaction provided by the embodiments of the present application is taken as an example for description. Referring to FIG. 3A, FIG. 3A is a schematic flowchart of a perception processing method for human-computer interaction provided by an embodiment of the present application, and the method is described with reference to the steps shown in FIG. 3A.
In step 101, an interactive process is displayed in a human-computer interaction interface.
By way of example, the interaction process displayed in the human-computer interaction interface may be an interaction process for any client, for example, an interaction process for a game client, an interaction process for a video client, an interaction process for an e-commerce client, an interaction process for a social client, and the like.
As an example, the interaction process in step 101 may be a human-computer interaction process with a single user: in response to any operation by the user on the human-computer interaction interface, the interface presents an interaction result for that operation. For example, if the operation is a click on a certain task menu, the interaction result includes displaying the task menu, and the interaction process, displayed in the human-computer interaction interface, runs from receiving the click operation to displaying the interaction result. The interaction process in step 101 may also be a human-computer interaction process with multiple interacting users, and may include: the process in which multiple objects controlled by multiple users in a game or military simulation perform attack operations and defense operations during a battle and obtain the corresponding operation response results. Such an interaction process is presented in the form of a battle process.
In step 102, scenes present during the interaction are detected.
As an example, a scene may be any situation that occurs during the interaction, including but not limited to: a scene in which an instant notification pops up in the client; a scene in which the user clicks a menu or button in the client; a scene accompanied by sound; a scene in which a physical collision occurs; a scene in which progress changes, such as the downloading of a resource package; and a scene in which a special event occurs in the client, such as winning a game or reaching a certain level.
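For illustration only, the scene categories listed above could be represented with an enumeration like the following; the names are assumptions introduced here, not terms from the disclosure.

```python
from enum import Enum, auto

class SceneType(Enum):
    """Illustrative categories of scenes that may occur during interaction."""
    INSTANT_NOTIFICATION = auto()   # a notice pops up in the client
    MENU_OR_BUTTON_CLICK = auto()   # the user clicks a menu or button
    SOUND = auto()                  # a scene accompanied by sound
    PHYSICAL_COLLISION = auto()     # a physical collision occurs
    PROGRESS_CHANGE = auto()        # e.g. downloading of a resource package
    SPECIAL_EVENT = auto()          # e.g. winning a game, reaching a certain level
```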
In step 103, when a scene appearing in the interactive process is a marked scene configured with perception information by the user-defined operation, a perception information configuration file corresponding to the marked scene is obtained.
As an example, before step 103 is executed, a custom configuration process for a scene needs to be performed. In the custom configuration process, in response to a custom operation of the user for any marked scene, the parameters of the perception information configured for the marked scene by the custom operation are obtained, and a perception information configuration file is generated based on these parameters, so that when the marked scene recurs, the corresponding perception information configuration file can be obtained to output the corresponding perception information.
In step 104, the perception information corresponding to the perception information configuration file is output while the scene proceeds.
As an example, outputting the perception information corresponding to the perception information configuration file in step 104 may be implemented through an interaction control protocol. Specifically, the types of perception information configured by the custom operation and the parameters of the characteristics in those types are obtained from the perception information configuration file, and a call request is sent to an application program interface opened to the human-computer interaction interface, so as to call the peripheral corresponding to each type to output perception information of that type, such that the output perception information conforms to the parameters of the characteristics in that type.
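A hedged sketch of this dispatch step is shown below: the configured types and characteristic parameters are read from the configuration file and one call per type is issued to an opened application program interface. The PeripheralApi class is a hypothetical stand-in for whatever interface the terminal actually opens to the human-computer interaction interface; it is not a real API.

```python
# Hypothetical application program interface opened to the human-computer interaction interface.
class PeripheralApi:
    def call(self, perception_type: str, params: dict) -> None:
        # On a real terminal this would drive the vibration motor, light strip, etc.
        print(f"calling peripheral for {perception_type}: {params}")

def output_from_profile(profile: dict, api: PeripheralApi) -> None:
    """Step 104 sketch: one call request per configured perception type."""
    for perception_type, params in profile["perception"].items():
        api.call(perception_type, params)   # output conforms to the configured parameters

api = PeripheralApi()
output_from_profile({"perception": {"haptic": {"intensity": 0.8, "duration_ms": 5000},
                                    "visual": {"color": "blue", "state": "steady_on"}}}, api)
```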
The following describes in detail, with reference to FIG. 3B, a perception information customization process involved in the perception processing method for human-computer interaction provided by the embodiments of the present application.
Referring to FIG. 3B, FIG. 3B is a schematic flowchart of a perception processing method for human-computer interaction provided by an embodiment of the present application. Before the perception information configuration file corresponding to the marked scene is obtained in step 103, the following steps 105 to 107 may also be performed. Steps 105 to 107 are shown in FIG. 3B as being performed after step 102, but the actual implementation is not strictly limited to this order and may be adjusted adaptively.
In step 105, a marked scene is determined.
As an example, the marked scene is obtained by recording, in response to a marking operation of the user, the target scene of the marking operation. The obtained marked scene is used to associate the parameters of the characteristics of each type of perception information in the subsequent custom operation, so as to form different perception information configuration files adapted to different marked scenes.
In step 106, types of perception information that can be configured for the markup scene and characteristics that can be configured in each type are displayed.
As examples, the types of perceptual information include: tactile feedback information; visual feedback information; olfactory feedback information; auditory feedback information.
As an example, the custom configuration process mainly includes two stages. In the first stage, the scene that needs to be customized is determined, that is, the marked scene requiring customization is determined from the multiple scenes in the interaction process; in the second stage, the configured parameters are obtained for the marked scene with the customization requirement. The two stages may be performed immediately one after the other, or separately in the time dimension.
In some embodiments, the determining the marking scene in step 105 may be implemented by the following technical solutions: displaying a historical interaction process in a human-computer interaction interface, wherein the historical interaction process is earlier than the interaction process; and recording the target scene as a marked scene in response to a marking operation for the target scene in the history interaction process.
In some embodiments, the displaying of the types of perception information configurable for the marked scene and the characteristics configurable in each type in step 106 may be implemented as follows: playing a screen recording file of the historical interaction process; when the marked scene appears during playback of the screen recording file, displaying an entry for customizing perception information for the marked scene; and, in response to a trigger operation on the entry, displaying a custom interface that is used to receive the custom operation and that includes the types of perception information configurable for the marked scene and the characteristics configurable in each type.
As an example, if the first stage (the implementation of step 105 above) and the second stage (the implementation of step 106 above) are performed separately in the time dimension, the marked scene is determined by marking during any historical interaction process while that process is recorded to a screen recording file; after the historical interaction process is completed, the screen recording file can be played back, and the second stage is carried out for the marked scene during playback.
As an example, a historical interaction process is displayed in the human-computer interaction interface of a battle-type game client. The historical interaction process is earlier than the interaction process and may be a certain historical battle, which includes various operation inputs, the response results of those operations, and so on; it proceeds in real time with respect to its own historical point of occurrence. In response to a marking operation on a target scene in the historical interaction process, the target scene (the scene selected by the marking operation) is recorded as a marked scene. For example, if two objects collide during the historical interaction process and the user wishes to customize this collision scene, the collision scene is the target scene of the marking operation: when the collision scene appears, the user's marking operation for the collision scene is received and the collision scene is recorded as a marked scene. After the historical interaction process is completed, i.e., after the historical battle ends, the screen recording file of the historical interaction process can be played at any time. This is equivalent to the user recording the screen during a historical battle and then playing the screen recording file after the battle ends in order to customize specific parameters. When the marked scene appears during playback of the screen recording file, an entry for customizing perception information for the marked scene is displayed; the entry may be presented in the form of prompt information, i.e., a triggerable control on which information prompting the user that the marked scene can be customized is displayed. In response to a trigger operation on the entry, a custom interface is displayed that is used to receive the custom operation and that includes the types of perception information configurable for the marked scene and the characteristics configurable in each type; after the trigger operation on the entry, the display switches to the custom interface. The custom interface includes the configurable types of perception information and at least one characteristic of each type; the types include at least one of visual feedback information or haptic feedback information, and each type has different characteristics (for example, the visual feedback information includes an intensity characteristic, a time characteristic, and the like). The custom interface is also used to receive the custom operation, and the parameters entered by the custom operation are specific to the configured types of perception information and the at least one characteristic of each type.
Through this embodiment, the user can mark the scene that needs to be customized while using the client in real time and record it as a marked scene, so that the target scene requiring customization is recorded accurately. However, in an interaction process with a high real-time requirement, it is difficult for the user to customize the marked scene immediately after marking it. For example, during a game battle, marking a target scene does not take much time, so the target scene can be marked anytime and anywhere during the battle, but the customization process takes more time and would affect the player's progress in the battle. Therefore, customization can instead be performed when the target scene appears during the subsequent playback of the screen recording, achieving an effective balance between real-time game interaction experience and fine-grained customization.
In some embodiments, the determining the marking scene in step 105 may be implemented by the following technical solutions: and recording the target scene as a marked scene in response to a marking operation for the target scene in the interactive process.
In some embodiments, the displaying of the types of perception information configurable for the marked scene and the characteristics configurable in each type in step 106 may be implemented as follows: displaying an entry for customizing perception information for the marked scene; and, in response to a trigger operation on the entry, displaying a custom interface that is used to receive the custom operation and that includes the types of perception information configurable for the marked scene and the characteristics configurable in each type.
As an example, the first stage (the implementation of step 105 above) and the second stage (the implementation of step 106 above) are not required to be performed separately in the time dimension; that is, the two stages may be performed consecutively.
In some embodiments, referring to FIG. 10A, FIG. 10A is an interface schematic diagram of a perception processing method for a human-computer interaction interface provided by an embodiment of the present application. In the human-computer interaction interface 1001A, in response to a marking operation on a target scene during the interaction process of a development-type game client, the target scene is recorded as a marked scene. For example, when the scene "a virtual object becomes the female lead" appears during the interaction process of the development-type game, the scene 1002A "the virtual object becomes the female lead" is displayed in the human-computer interaction interface 1001A; in response to a marking operation on this scene 1002A, it is recorded as a marked scene, and an entry 1003A for customizing perception information for the marked scene is displayed. In response to a trigger operation on the entry 1003A, a custom interface 1004A for receiving the custom operation is displayed. The entry for customizing perception information may be presented in the form of prompt information, i.e., a triggerable control on which information prompting the user that the marked scene can be customized is displayed. In response to a trigger operation on the entry, a custom interface is displayed that is used to receive the custom operation and that includes the types of perception information configurable for the marked scene and the characteristics configurable in each type; after the trigger operation on the entry, the display switches to the custom interface. The custom interface includes the configurable types of perception information and at least one characteristic of each type; the types include at least one of visual feedback information or haptic feedback information, and each type has different characteristics (for example, the visual feedback information includes an intensity characteristic, a time characteristic, and the like). The custom interface is also used to receive the custom operation, and the parameters entered by the custom operation are specific to the configured types of perception information and the at least one characteristic of each type.
Through this implementation, the user can mark the scene that needs to be customized while using the client in real time and record it as a marked scene, so that the target scene requiring customization is recorded accurately. In an interaction process with a low real-time requirement, the user can customize the marked scene immediately after marking it. For example, in a development-type game, marking a target scene does not take much time, so marking can be done anytime and anywhere during the game; although the customization process takes more time, the real-time requirement of a development-type game is low, so the interaction process can be paused at the marked scene for customization. This effectively ensures the immediacy of the custom operation and avoids the situation where the user forgets to customize the marked scene later, which would lower the utilization of this function.
In some embodiments, real-time requirements may be measured by marking the client. For example, a battle-type game client is marked as a client with a high real-time requirement, and a development-type game client is marked as a client with a low real-time requirement, and the two different implementations above are adopted for clients with different real-time markers. The marking of clients can be completed during the development stage and queried in subsequent application; the implementation corresponding to each client is used as its default implementation, and the default implementation is adjusted to the implementation selected by the user in response to the user's adjustment operation.
In some embodiments, the displaying of the types of perception information configurable for the marked scene and the characteristics configurable in each type in step 106 may be implemented as follows: displaying a plurality of preconfigured modes of the marked scene, where each mode includes at least one type of perception information associated with the marked scene and preconfigured default parameters of the characteristics in that type.
In step 107, in response to the custom operation for the marked scene, the parameters configured by the custom operation for the characteristics in at least one type are obtained, and a perception information configuration file corresponding to the marked scene is generated based on the obtained parameters of the characteristics in the at least one type.
In some embodiments, the obtaining of the parameters configured by the custom operation for the characteristics in at least one type in step 107 may be implemented as follows: acquiring the at least one type of perception information associated with the marked scene and the preconfigured default parameters of the characteristics in that type included in a target mode, where the target mode is the mode selected by the custom operation from the plurality of preconfigured modes.
As an example, referring to FIG. 10B, FIG. 10B is an interface schematic diagram of a perception processing method for a human-computer interaction interface provided by an embodiment of the present application. In the human-computer interaction interface 1001B, a plurality of preconfigured modes 1002B of the marked scene are displayed in the form of controls. Each mode includes at least one type of perception information associated with the marked scene and preconfigured default parameters of the characteristics in that type. If the custom operation in step 107 does not involve modifying the parameters, then the at least one type of perception information associated with the marked scene in the selected mode, together with the preconfigured default parameters of the characteristics in that type, is obtained as the parameters configured for the characteristics in the at least one type.
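The plurality of preconfigured modes could, for illustration only, be stored as a table of defaults from which the selected target mode is copied; the mode names, perception types, and parameter values below are all assumptions.

```python
# Hypothetical preconfigured modes for a marked scene: each mode associates at
# least one perception type with preconfigured default parameters.
MODES = {
    "subtle":  {"haptic": {"intensity": 0.2, "duration_ms": 100},
                "visual": {"color": "white", "brightness": 0.3}},
    "intense": {"haptic": {"intensity": 0.9, "duration_ms": 800},
                "visual": {"color": "red", "brightness": 1.0}},
}

def select_mode(target: str) -> dict:
    """Return the default parameters of the mode selected by the custom operation."""
    return MODES[target]

print(select_mode("intense"))
```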
The following describes in detail, with reference to FIG. 3C, a custom parameter modification process involved in the human-computer interaction perception processing method provided by the embodiments of the present application.
Referring to FIG. 3C, FIG. 3C is a schematic flowchart of a perception processing method for human-computer interaction provided by an embodiment of the present application, where the characteristics in each type have preconfigured default parameters; the obtaining of the parameters configured by the custom operation for the characteristics in at least one type in step 107 may be implemented through the following step.
In step 1071, in response to the custom operation modifying the default parameter of a characteristic in at least one type, the default parameter is updated to the parameter set by the custom operation.
As an example, the preconfigured modes carry default parameters, a default mode exists among the preconfigured modes, and the default mode likewise carries default parameters. Unlike the case in which the default parameters are acquired directly as the parameters configured for the characteristics in the at least one type, here the parameters set by the custom operation (that is, parameters obtained by modifying the default parameters) are acquired as the parameters configured for the characteristics in the at least one type.
In some embodiments, obtaining the parameters configured by the custom operation for the characteristics in at least one type in step 107 may be implemented by the following technical solutions: obtaining candidate parameters for a characteristic in at least one type; carrying out score prediction processing on the candidate parameters through a first neural network model to obtain a prediction score of each candidate parameter; taking the candidate parameter with the highest predictive score as the parameter of the characteristic in at least one type; the first neural network model is obtained based on the candidate parameter samples and the real scores corresponding to the candidate parameter samples.
In some embodiments, obtaining the parameters configured by the custom operation for the characteristics in at least one type in step 107 may be implemented by the following technical solutions: acquiring operation behavior data in a human-computer interaction interface; performing preference prediction processing on the operation behavior data through a second neural network model to obtain operation preference represented by the operation data; obtaining parameters of characteristics in at least one type pre-associated with the operating preferences; and the second neural network model is trained based on the operation data samples and the operation preference characterized by the operation data samples.
By way of example, both of the above embodiments apply the idea of machine learning, in which a trained model predicts the parameters of the characteristics in the at least one type, and the custom operation may be an operation that triggers the running of the first neural network model and/or the second neural network model. The first neural network model takes user scores as its prediction basis, and once trained it can predict parameters of the characteristics in at least one type of perception information that match the user's expectations; the second neural network model takes operation behavior data as its prediction basis, and once trained it can predict parameters of the characteristics in at least one type of perception information that are pre-associated with the operation preference. Through the above embodiments, the parameters configured by the custom operation for the characteristics in the at least one type can be obtained rapidly.
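For illustration only, the following Python sketch shows the first of these two ideas: scoring candidate parameter sets with a small neural network and adopting the highest-scoring candidate. The architecture, feature layout, and values are assumptions; in practice the model would be trained on candidate parameter samples and their real scores as described above.

```python
import torch
import torch.nn as nn

class ScoreModel(nn.Module):
    # Small scoring network over one parameter set per candidate
    # (e.g. intensity, frequency, duration, orientation for haptic feedback).
    def __init__(self, num_characteristics: int = 4):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(num_characteristics, 16), nn.ReLU(), nn.Linear(16, 1))

    def forward(self, candidates: torch.Tensor) -> torch.Tensor:
        return self.net(candidates).squeeze(-1)      # one predicted score per candidate

def pick_parameters(model: ScoreModel, candidates: torch.Tensor) -> torch.Tensor:
    # Score every candidate parameter set and keep the one with the highest prediction.
    with torch.no_grad():
        scores = model(candidates)
    return candidates[scores.argmax()]

# Three candidate parameter sets (intensity, frequency, duration_ms, orientation index).
candidates = torch.tensor([[80.0, 60.0, 300.0, 1.0],
                           [40.0, 30.0, 150.0, 2.0],
                           [95.0, 90.0, 500.0, 0.0]])
model = ScoreModel()   # untrained here; would be trained on scored samples in practice
print(pick_parameters(model, candidates))
```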
In some embodiments, the type of sensory information used by the sensory information profile to configure comprises haptic feedback information comprising at least one of the following characteristics: frequency, intensity, duration, orientation; the step 104 of outputting the perception information corresponding to the perception information configuration file may be implemented by the following technical solutions: according to the type of the event included in the scene, haptic feedback information configured to be associated with the type of the event is output.
In some embodiments, the above outputting the haptic feedback information configured to be associated with the type of the event according to the type of the event included in the scene may be implemented by: performing at least one of the following operations: in a marking scene comprising a user operation event, synchronously outputting tactile feedback information synchronized with the user operation event; in a markup scene including sound, outputting at least one of: tactile feedback information adapted to the type of sound, tactile feedback information in accordance with the degree of excitement of the sound, tactile feedback information in accordance with the orientation of the sound source of the sound; in a marked scenario including a collision event, outputting at least one of: haptic feedback information synchronized with the collision event, haptic feedback information consistent with a severity of the collision event; in a markup scenario including a notification event, haptic feedback information synchronized with the notification event is output.
As an example, the types of events for which associated haptic feedback information is configured in the scene include at least one of the following. For a user operation event, haptic feedback information is output when the user clicks a menu or a button in the game, and a click operation received during play can likewise trigger the output of haptic feedback information. For a sound event, the game client has many built-in sound effects for creating atmosphere, and vibration can follow the rhythm of the sound effect, so that the user perceives the interactive feedback more intuitively. For a collision event, the game client contains many scenes with physical collision events, such as hitting an obstacle while driving; feeding these back to the user by outputting haptic feedback information makes the interactive feedback more intuitive and lets the user concentrate on the game content. For a notification event, haptic feedback information may be output for a scene that includes an instantaneous notification event; the notification event is not limited to an ordinary pop-up notification and may be the output of any information, for example enemy-direction information, which may be either displayed or output in another way; assuming an enemy is at the front left, a motor disposed at the front left of the terminal vibrates to inform the player that the enemy is at the front left.
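For illustration only, the Python sketch below maps the event types named above to haptic feedback parameters; the event names and parameter values are assumptions used to show the association, not values defined by the application.

```python
from typing import Optional

# Illustrative association between event types in a scene and haptic feedback information.
HAPTIC_BY_EVENT = {
    "user_operation": {"intensity": 30, "frequency": 120, "duration_ms": 40},   # short click pulse
    "sound":          {"intensity": 50, "frequency": 60,  "duration_ms": 200},  # follows the sound rhythm
    "collision":      {"intensity": 90, "frequency": 40,  "duration_ms": 150},  # stronger for severe impacts
    "notification":   {"intensity": 60, "frequency": 80,  "duration_ms": 100,
                       "orientation": "front_left"},                            # e.g. enemy-direction cue
}

def haptic_for_event(event_type: str) -> Optional[dict]:
    # Return the haptic feedback information associated with the event type, if any.
    return HAPTIC_BY_EVENT.get(event_type)

print(haptic_for_event("collision"))
```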
As an example, the characteristics of the haptic feedback information (vibration) include at least one of the following. Intensity is the strength of the haptic feedback information, for example the vibration intensity, and is usually graded; for instance, the intensity may be marked with a number from 0 to 100, different intensities produce different vibration expressions, and since haptic feedback information can be output for scenes differing in severity, the vibration (haptic) intensity can be used to feed back the severity of a collision scene or a sound scene. Frequency is the frequency of the haptic feedback information, such as the vibration frequency, in hertz (vibrations per second); different frequencies correspond to different vibration effects, which can be presented by varying the frequency. Duration is how long the haptic feedback information lasts, for example the vibration duration, and different durations likewise produce different vibration effects. Orientation simulates the direction and source of a sound or collision through the vibration position, generating haptic feedback aimed at the source and direction of objects in the scene; many scenes in a game need to distinguish the source and direction of a sound (such as enemy footsteps or an explosion), and this can be fed back clearly through touch, for example when the explosion sound comes from the player's front left, only the upper-left part of the terminal vibrates to identify the source of the explosion, so that the user's visual attention stays focused on the game picture.
In some embodiments, the type of sensory information used by the sensory information profile to configure comprises visual feedback information comprising at least one of the following characteristics: state, frequency, color, brightness; the step 104 of outputting the perception information corresponding to the perception information configuration file may be implemented by the following technical solutions: according to the type of the event included in the scene, visual feedback information configured to be associated with the type of the event is output.
In some embodiments, the outputting of the visual feedback information configured to be associated with the type of the event according to the type of the event included in the scene may be implemented by: performing at least one of the following operations: outputting visual feedback information synchronized with the notification event in a marking scene including the notification event; in a markup scene including sound, outputting at least one of: visual feedback information adapted to the type of sound, visual feedback information adapted to the rhythm of the sound; outputting visual feedback information for representing the progress of the progress event in a marked scene comprising the progress event; in a tagged scene that includes a resultant event, visual feedback information characterizing the result of the resultant event is output.
As an example, the types of events for which associated visual feedback information is configured in the scene include at least one of the following. For a notification event (for example, an in-game notification), the user can be reminded through visual feedback information such as a customized light flash; that is, visual feedback information synchronized with the notification event is output. For a sound event, the music rhythm can be expressed through visual feedback information, and in-game sound effects and music rhythm can be displayed visually with light, which is particularly suitable for music games; that is, at least one of visual feedback information adapted to the type of the sound and visual feedback information adapted to the rhythm of the sound is output. For a progress event, the game client contains scenes with events that require a long wait, such as downloading a resource package; a progress bar is simulated through visual feedback information (a light strip), so that the user can judge the current download progress from the light without repeatedly re-entering the game client to check, and visual feedback information representing the progress of the progress event is output. For a result event, a special event within the game, such as winning a match or reaching a level, can be displayed with light, and visual feedback information characterizing the result of the result event is output.
As an example, the characteristics of the visual feedback information (light effect) include at least one of the following. State (the on and off of the light) presents different shapes and effects by controlling the on and off of lights at different positions. Flicker frequency: switching the light on and off rapidly achieves a flicker effect, the number of switches per second is the flicker frequency, different flicker frequencies achieve different effects, and switching at different positions and times achieves further effects. Color: controlling the color of each display unit realizes various color effects such as gradients and transitions. Brightness: brightness not only presents light and dark effects but can also be combined with color, flicker frequency, and state to achieve richer visual effects.
The authorization query process involved in the human-computer interaction perception processing method provided by the embodiment of the present application is described in detail below with reference to fig. 3D.
Referring to fig. 3D, fig. 3D is a flowchart illustrating the human-computer interaction perception processing method provided in the embodiment of the present application. Before step 104 is executed to output the perception information corresponding to the perception information configuration file, the following steps 108 and 109 may also be executed.
In step 108, the type of the perception information that the electronic device supports customization and the characteristics that support customization in the corresponding type are queried.
In step 109, according to the query result, the type of the perception information that does not support the customization and the characteristics in the corresponding type are filtered out from the perception information configuration file to update the perception information configuration file.
As an example, before the client controls the electronic device to output the corresponding perception information, an authorization query of the electronic device needs to be performed through a communication protocol. First, the client initiates a connection establishment request, and the operating system of the electronic device (hereinafter simply the electronic device) establishes the connection in response to the request. The client then queries the types of perception information supported by the electronic device and the characteristics supported for each type: the client queries the supported modalities (visual feedback information and haptic feedback information), and the electronic device returns a list of supported effects (whether visual feedback information is supported, whether haptic feedback information is supported). The client queries the characteristics supported for visual feedback information (the query is initiated only if the preceding result indicates that visual feedback information is supported, otherwise it is not initiated), and the electronic device returns the characteristics for which authorized control of visual feedback information is supported; the client queries the characteristics supported for haptic feedback information (the query is initiated only if the preceding result indicates that haptic feedback information is supported, otherwise it is not initiated), and the electronic device returns the characteristics for which authorized control of haptic feedback information is supported. Based on the query results, the types of perception information that do not support customization, and the characteristics in the corresponding types, can be filtered out of the perception information configuration file to update the perception information configuration file.
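For illustration only, a minimal Python sketch of steps 108 to 109 is given below: the device's supported perception types and characteristics are queried, and the configuration file is filtered accordingly. The device query interface and field names are assumptions; only the filtering logic follows the description above.

```python
def query_supported(device) -> dict:
    # Step 108: collect the supported perception types and, per type, the characteristics
    # that support customization (device interface is a hypothetical stand-in).
    return {ptype: set(device.supported_characteristics(ptype))
            for ptype in device.supported_types()}

def filter_profile(profile: dict, supported: dict) -> dict:
    # Step 109: drop unsupported types and characteristics to update the configuration file.
    filtered = {}
    for ptype, params in profile.get("perception", {}).items():
        if ptype not in supported:
            continue                                        # unsupported perception type
        kept = {k: v for k, v in params.items() if k in supported[ptype]}
        if kept:
            filtered[ptype] = kept                          # unsupported characteristics removed
    return {**profile, "perception": filtered}

class _StubDevice:
    # Stand-in for the electronic device's query interface (illustrative only).
    def supported_types(self):
        return ["haptic"]
    def supported_characteristics(self, ptype):
        return ["intensity", "duration_ms"] if ptype == "haptic" else []

profile = {"scene": "boss_fight",
           "perception": {"haptic": {"intensity": 80, "frequency": 60, "duration_ms": 300},
                          "visual": {"color": "#FF0000"}}}
print(filter_profile(profile, query_supported(_StubDevice())))
```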
In some embodiments, the technical solutions described in steps 108 to 109 are implemented after the configuration process and before the perception information is output, in view of the possibility that the terminal is replaced. That is, referring to the application scenario described in fig. 1B, the terminal that receives the custom operation is different from the terminal that outputs the perception information when the scene recurs, and it is difficult for different terminals to guarantee that the types of perception information, and the characteristics of the corresponding types, for which they support and authorize interface calls are identical. Implementing steps 108 to 109 after the configuration process and before the output therefore saves computing and response resources and prevents queries with no practical meaning from occupying these limited resources.
In some embodiments, the technical solutions described in steps 108 to 109 may be implemented before the configuration process, in view of the possibility that the terminal is used over a long period. That is, referring to the application scenario described in fig. 1A, the terminal that receives the custom operation is the same as the terminal that outputs the perception information when the scene recurs, so it can be guaranteed that the types of perception information, and the characteristics of the corresponding types, for which interface calls are supported and authorized are the same in the configuration process and in the controlled output process. Because steps 108 to 109 are implemented before the configuration process, the user obtains a consistent experience throughout, and the situation in which an effect configured during customization cannot be output on the terminal is avoided.
Referring to fig. 4, fig. 4 is a schematic flowchart of a method for processing human-computer interaction perception provided in an embodiment of the present application, and will be described with reference to the steps shown in fig. 4.
In step 201, a user-defined operation for a marked scene is received at a human-computer interaction interface.
In step 202, in response to the custom operation, a sensing information configuration file corresponding to the marked scene is generated according to the type of the sensing information configured by the custom operation and the parameters of the characteristics in the corresponding type.
The implementation of steps 201 and 202 can refer to the implementation of steps 105 to 107 in the previous embodiment.
In step 203, displaying an interaction process on a human-computer interaction interface;
in step 204, when the scene appearing again in the interaction process is a marked scene, in the proceeding process of the scene, the sensing information of the type configured by the custom operation is output according to the sensing information configuration file, and the output sensing information of the corresponding type conforms to the parameter of the characteristic in the corresponding type.
The implementation of steps 203 and 204 can refer to the implementation of steps 101-104 in the previous embodiment.
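For illustration only, the following minimal Python sketch ties steps 201 to 204 together: a perception information configuration file is generated in response to the custom operation and is output when the marked scene recurs. The storage layout and the output callback are assumptions.

```python
import json

PROFILES = {}   # marked scene id -> perception information configuration file (illustrative storage)

def on_custom_operation(scene_id: str, config: dict) -> None:
    # Steps 201-202: persist the types and characteristic parameters chosen by the custom operation.
    PROFILES[scene_id] = config

def on_scene_detected(scene_id: str, output_fn) -> None:
    # Steps 203-204: if the scene appearing again is a marked scene, output the configured
    # perception information while the scene proceeds.
    profile = PROFILES.get(scene_id)
    if profile is not None:
        output_fn(profile)

on_custom_operation("boss_fight", {"haptic": {"intensity": 80, "duration_ms": 300}})
on_scene_detected("boss_fight", lambda p: print("output:", json.dumps(p)))
```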
In the following, an exemplary application of the embodiment of the present application in a practical scenario is described, taking the client as a game client as an example. The human-computer interaction perception processing method provided by the embodiment of the application allows the user to mark scenes during the game and, after the game ends, to customize perception information of types such as touch (haptic feedback information) and light effect (visual feedback information) by playing back the game screen-recording file. The user can flexibly adjust the parameters of each characteristic of these types of perception information and thereby create a game atmosphere exclusive to the user, while the game client and the electronic device (such as a terminal) finely control the hardware related to touch and light effects in the terminal through an agreed protocol, so that the user-defined effect is presented faithfully on the terminal.
In some embodiments, the touch sense of the terminal is mainly achieved by vibration of a motor; vibration is used to remind the user or to let the user experience various in-game scenes more realistically through touch, such as hitting an enemy in a shooting game, enemy footsteps, a bumpy motor-vehicle ride, a collision, or a scene with sound effects. The marking scenes for specific haptic feedback information (vibration) include the following cases: 1. the client controls the terminal to trigger instantaneous vibration (single or multiple), informing the user of event triggers, button clicks, and the like through the tactile perception brought by the vibration; for example, clicking a menu or button in the game triggers this haptic feedback, and a click operation during play can also trigger it; 2. haptic feedback can be generated for sound scenes: many sound effects create atmosphere in the game, and vibration can follow the rhythm of the sound effect (music), so that the user perceives the interactive feedback more intuitively; 3. haptic feedback can be generated for physical collision scenes, of which there are many in games, such as hitting obstacles while driving or a bumpy road surface; feeding the collision information back through touch makes the interactive feedback more intuitive and lets the user concentrate on the game content; 4. haptic feedback can be generated for the source and direction of objects in a scene: many game scenes need to distinguish the source and direction of a sound (such as enemy footsteps or an explosion), and direction and source can be fed back clearly through touch, for example when the explosion comes from the player's front left, only the upper-left part of the terminal vibrates to identify its source, so that the user's visual attention stays on the game picture; 5. haptic feedback can be generated for scenes differing in severity, and the vibration (haptic) intensity can be used to feed back the severity of a collision scene or a sound scene.
In some embodiments, the characteristics of the haptic feedback information (vibration) include the following: 1. intensity, the strength of the haptic feedback information, such as the vibration intensity, usually graded, for example marked with a number from 0 to 100, with different intensities producing different vibration expressions; 2. frequency, the frequency of the haptic feedback information, such as the vibration frequency, in hertz (vibrations per second), with different frequencies corresponding to different vibration effects that can be presented by varying the frequency; 3. duration, how long the haptic feedback information lasts, such as the vibration duration, with different durations likewise producing different vibration effects; 4. orientation, simulating the direction and source of a sound or collision through the vibration position.
In some embodiments, the marking scenes for specific visual feedback information (light effect) include the following cases: 1. visual feedback information can be generated for an event-notification scene; when a notification event exists (for example, an in-game notification), the user can be reminded through visual feedback information such as a breathing light effect or a customized light flash; 2. for scenes with a music rhythm, the rhythm can be expressed externally through visual feedback information, and in-game sound effects and music rhythm can be displayed visually with light, which is particularly suitable for music games; 3. visual feedback information can be generated for scenes with progress changes; games contain scenes requiring a long wait, such as downloading a resource package, and a progress bar can be simulated through visual feedback information (a light strip) so that the user can judge the current download progress from the light without repeatedly entering the game to check; 4. visual feedback can be generated for scenes where a game event occurs, and special in-game events, such as winning a match or reaching a certain level, can be displayed with a light effect.
In some embodiments, the characteristics of the visual feedback information (light effect) include the following: 1. state (the on and off of the light), presenting different shapes and effects by controlling the on and off of lights at different positions; 2. flicker frequency, achieved by switching the light on and off rapidly, where the number of switches per second is the flicker frequency, different flicker frequencies achieve different effects, and switching at different positions and times achieves further effects; 3. color, where controlling the color of each display unit realizes various color effects such as gradients and transitions; 4. brightness, which not only presents light and dark effects but can also be combined with color, flicker frequency, and state to achieve richer visual effects.
In some embodiments, referring to fig. 5, fig. 5 is an application schematic diagram of the human-computer interaction perception processing method provided in the embodiments of the present application. Parameters of the characteristics of default haptic feedback information and of default visual feedback information are generally preset in a game. A trigger operation of the user on a screen-recording control 502 can be received in a setting interface 501, and in response to the trigger operation on the screen-recording control 502, the game process is recorded, so that the corresponding scene can later be located by playing back the screen-recording file and these effects can be customized.
Referring to fig. 6, fig. 6 is an application schematic diagram of the human-computer interaction perception processing method provided in this embodiment of the present application. During the game playback displayed on a human-computer interaction interface 601, when a marked scene appears, a customizable prompt message 602 is displayed on the human-computer interaction interface 601. The prompt may indicate that the scene can be customized and may further indicate the types of perception information that can be customized; at the same time, the terminal may output perception information of the customizable type as a prompt. For example, if the scene can be customized with haptic feedback information, haptic feedback information (vibration) may be output to prompt the user accordingly; alternatively, the terminal may output visual feedback information and/or haptic feedback information to prompt the user to perform the customized configuration of perception information for the scene without specifically indicating which types can be configured. In response to a trigger operation (click operation) on the prompt message 602, a customization interface can be displayed for the user to customize the perception information.
In some embodiments, for each scene the game has a preset default effect (default perception information output), and the user can fine-tune this default perception information to obtain an exclusive effect. For the customization of haptic feedback information, the content the user needs to customize is the parameter of each characteristic defined for haptic feedback information in the above embodiments; see fig. 7, which is an application schematic diagram of the human-computer interaction perception processing method provided by an embodiment of the present application, specifically an example of a haptic effect editor. The instantaneous state at each time point can be adjusted along a timeline: the user can adjust the parameters of the haptic feedback information (orientation, frequency, and intensity) as a whole through an adjustment operation on the waveform, or define the haptic parameters of each time point separately through the three sub-items of orientation, frequency, and intensity. For the customization of visual feedback information, the content the user needs to customize is the parameter of each characteristic defined for visual feedback information in the above embodiments; see fig. 8, which is an application schematic diagram of the human-computer interaction perception processing method provided in the embodiment of the present application, specifically an example of a light effect editing interface 801 (customization interface), in which the instantaneous state at each time point can likewise be adjusted along a timeline. Similar to the parameter adjustment of haptic feedback information, the user can adjust the parameters of the visual feedback information as a whole through the waveform, or define the visual parameters of each time point through the four sub-items of frequency, brightness, color, and state, each sub-item having a corresponding control 802; that is, the user can control the display effect at each time point, and if there are several light strips, each light strip can be controlled independently, with the on/off state, flicker frequency, brightness, and color (including gradients) of the light strip freely adjustable.
In some embodiments, referring to fig. 9, fig. 9 is a protocol diagram of the human-computer interaction perception processing method provided in an embodiment of the present application. Before the client can control the terminal peripherals, connection and authorization query need to be established based on the following communication protocol: 1. the game initiates a connection establishment request; 2. the terminal establishes the connection in response to the request; 3. the game queries the modalities supported by the terminal (visual feedback information and haptic feedback information); 4. the terminal returns a list of supported effects (whether visual feedback information is supported, whether haptic feedback information is supported); 5. the game queries the characteristics supported for visual feedback information (the query is initiated only if the preceding result indicates that visual feedback information is supported, otherwise it is not initiated); 6. the terminal returns the characteristics for which authorized control of visual feedback information is supported; 7. the game queries the characteristics supported for haptic feedback information (the query is initiated only if the preceding result indicates that haptic feedback information is supported, otherwise it is not initiated); 8. the terminal returns the characteristics for which authorized control of haptic feedback information is supported.
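For illustration only, a Python sketch of this handshake is given below. The message names and the send/receive transport are assumptions; only the ordering of the eight exchanges follows fig. 9 as described above.

```python
def authorization_query(send, recv) -> dict:
    send({"msg": "connect_request"})                      # 1. game requests a connection
    recv()                                                # 2. terminal confirms the connection
    send({"msg": "query_modalities"})                     # 3. query supported modalities
    effects = recv()                                      # 4. e.g. {"visual": True, "haptic": True}
    caps = {}
    for modality in ("visual", "haptic"):                 # 5-8. per-modality characteristic query,
        if effects.get(modality):                         #      initiated only if supported
            send({"msg": "query_characteristics", "modality": modality})
            caps[modality] = recv()                       # characteristics authorized for control
    return caps

# Toy in-memory terminal used to exercise the sketch.
_responses = iter([{"ok": True},
                   {"visual": True, "haptic": True},
                   ["state", "frequency", "color", "brightness"],
                   ["intensity", "frequency", "duration", "orientation"]])
print(authorization_query(lambda m: None, lambda: next(_responses)))
```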
In some embodiments, the haptic effect file (vibration) in the perception information configuration file is defined as follows: the file is defined as a time sequence with time intervals in milliseconds, and each time point carries the following data: the intensity (vibration strength) of the haptic feedback information, an integer from 0 to 100 where a larger number means a higher intensity (another integer range may be chosen as actually needed); the frequency (vibration frequency) of the haptic feedback information, in times per second; the duration (vibration duration) of the haptic feedback information, in milliseconds, representing how long a given vibration lasts; and the orientation (vibration position) of the haptic feedback information, which takes the upper-left corner of the terminal as the origin of the coordinate system and marks a region with four coordinate points as the vibration region.
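For illustration only, a possible haptic effect file following this definition is sketched below as a Python structure; the concrete field names and coordinate values are assumptions, while the fields themselves (time sequence, intensity, frequency, duration, four-point vibration region) come from the description above.

```python
haptic_effect = {
    "timeline": [
        {"time_ms": 0,
         "intensity": 80,            # 0-100, larger is stronger
         "frequency": 60,            # vibrations per second
         "duration_ms": 150,         # how long this vibration lasts
         # vibration region: four corner points, origin at the terminal's upper-left corner
         "region": [[0, 0], [540, 0], [540, 600], [0, 600]]},
        {"time_ms": 300,
         "intensity": 40, "frequency": 30, "duration_ms": 80,
         "region": [[0, 0], [1080, 0], [1080, 2400], [0, 2400]]},
    ]
}
```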
In some embodiments, the visual effect file (light effect) in the perception information configuration file is defined as follows: the file is defined as a time sequence with time intervals in milliseconds, and the data of each time point is a list in which the minimum light-emitting unit is one piece of data, with the following content: the duration of the minimum light-emitting unit, in milliseconds; the state of the light strip (which may be the minimum light-emitting unit), expressed as a list in which each element marks the state of each illuminable point in the light strip, namely: a coordinate, marking each point on the light strip, numbered from 0, with different numbers marking different light-emitting points; the state of each light-emitting point (lit, off, or flashing); the color, coded in RGB format (other codings may also be used); the flicker frequency, in times per second, a field that needs to be set when the state is flashing; and the brightness, represented by an integer from 0 to 100 (another integer range may be used as actually needed).
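For illustration only, a possible visual effect file following this definition is sketched below as a Python structure; the concrete field names are assumptions, while the structure (time sequence, minimum light-emitting unit duration, per-point coordinate, state, color, flicker frequency, brightness) mirrors the description above.

```python
light_effect = {
    "timeline": [
        {"time_ms": 0,
         "units": [                                   # one entry per minimum light-emitting unit
             {"duration_ms": 500,
              "strip": [                              # state of each illuminable point on the strip
                  {"point": 0, "state": "on",    "color": "#FF0000", "brightness": 90},
                  {"point": 1, "state": "blink", "color": "#00FF00",
                   "blink_hz": 4, "brightness": 60},  # flicker frequency set when state is "blink"
                  {"point": 2, "state": "off"},
              ]},
         ]},
    ]
}
```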
In some embodiments, the issuing and playing process of the effect files is as follows: the input of the PlayHaptics function is the haptic effect file defined in the above embodiment, its output is success or failure, and its function is to play the haptic effect (vibration) defined in the haptic effect file; the input of the PlayLampEffect function is the visual effect file defined in the above embodiment, its output is success or failure, and its function is to play the visual effect (light effect) defined in the visual effect file.
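For illustration only, the Python sketch below shows how the two functions named above might be invoked when a marked scene recurs. Only the names PlayHaptics and PlayLampEffect, their inputs, and their success/failure outputs come from the description; the wrapper, the terminal object, and the stub are assumptions.

```python
def play_scene_effects(terminal, haptic_file=None, light_file=None) -> bool:
    # Issue the effect files for the current scene; returns overall success.
    ok = True
    if haptic_file is not None:
        ok &= terminal.PlayHaptics(haptic_file)      # plays the vibration defined in the file
    if light_file is not None:
        ok &= terminal.PlayLampEffect(light_file)    # plays the light effect defined in the file
    return bool(ok)

class _StubTerminal:
    # Stand-in implementing the two calls; a real terminal would drive the motor and lights.
    def PlayHaptics(self, effect): return True
    def PlayLampEffect(self, effect): return True

print(play_scene_effects(_StubTerminal(), haptic_file={"timeline": []}))
```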
In some embodiments, the customization functions for touch and light effects can also be opened to development and design users, so that the game ships with rich and vivid default effects and provides ordinary users with a uniform default interactive experience (the aforementioned default modes).
The present application defines a complete solution that lets the user independently select scenes in the game and customize the peripheral effects through a supporting tool, so that the user can customize an exclusive experience effect and the terminal hardware can perform at its best. It can achieve the following: the user is allowed to customize the touch and light effects in the game, so that the user-defined effect is triggered every time the same scene occurs, bringing the user an exclusive game experience; and through the interaction control protocol between the game and the terminal (the operating system of the terminal), the game controls the terminal's touch (vibration) and visual (light effect) components, so that the user-defined effects can be realized on the terminal. As users increasingly demand a personalized game experience, the human-computer interaction perception processing method provided by the embodiments of the present application can make full use of terminal hardware to create a personalized game atmosphere; providing a user-defined function lets the user participate more deeply in customizing the game experience effect, thereby improving human-computer interaction efficiency, and the interaction protocol established between the game and the terminal allows a deeper combination of the terminal's touch and light effects with the game, so that the user-defined effects are presented more richly.
Continuing with the exemplary structure of the human-computer interaction perception processing device 455 provided in the embodiments of the present application as software modules, in some embodiments, as shown in fig. 2, the software modules stored in the human-computer interaction perception processing device 455 of the memory 450 may include: a display module 4551, configured to display an interaction process in the human-computer interaction interface; a detection module 4552, configured to detect a scene occurring in the interaction process; an obtaining module 4553, configured to obtain a sensing information configuration file corresponding to a marked scene when the scene appearing in the interaction process is the marked scene configured with the sensing information by the user-defined operation; an output module 4554, configured to output the perception information corresponding to the perception information configuration file during the scene process.
In some embodiments, the type of sensory information used by the sensory information profile to configure comprises haptic feedback information comprising at least one of the following characteristics: frequency, intensity, duration, orientation; an output module 4554, further configured to: according to the type of the event included in the scene, haptic feedback information configured to be associated with the type of the event is output.
In some embodiments, the output module 4554 is further configured to: performing at least one of the following operations: in a marking scene comprising a user operation event, synchronously outputting tactile feedback information synchronized with the user operation event; in a markup scene including sound, outputting at least one of: tactile feedback information adapted to the type of sound, tactile feedback information in accordance with the degree of excitement of the sound, tactile feedback information in accordance with the orientation of the sound source of the sound; in a marked scenario including a collision event, outputting at least one of: haptic feedback information synchronized with the collision event, haptic feedback information consistent with a severity of the collision event; in a markup scenario including a notification event, haptic feedback information synchronized with the notification event is output.
In some embodiments, the type of sensory information used by the sensory information profile to configure comprises visual feedback information comprising at least one of the following characteristics: state, frequency, color, brightness; an output module 4554, further configured to: according to the type of the event included in the scene, visual feedback information configured to be associated with the type of the event is output.
In some embodiments, the output module 4554 is further configured to: performing at least one of the following operations: outputting visual feedback information synchronized with the notification event in a marking scene including the notification event; in a markup scene including sound, outputting at least one of: visual feedback information adapted to the type of sound, visual feedback information adapted to the rhythm of the sound; outputting visual feedback information for representing the progress of the progress event in a marked scene comprising the progress event; in a tagged scene that includes a resultant event, visual feedback information characterizing the result of the resultant event is output.
In some embodiments, before obtaining the perceptual information profile corresponding to the marked scene, the apparatus further comprises: a configuration module 4555 configured to: determining a marked scene; displaying types of perception information configurable for a markup scene and characteristics configurable in each type; wherein the types of perception information include: tactile feedback information; visual feedback information; olfactory feedback information; auditory feedback information; in response to the custom operation aiming at the marked scene, parameters configured by the custom operation aiming at the characteristics in the at least one type are obtained, and a perception information configuration file corresponding to the marked scene is generated based on the obtained parameters of the characteristics in the at least one type.
In some embodiments, the module 4555 is further configured to: displaying a historical interaction process in a human-computer interaction interface, wherein the historical interaction process is earlier than the interaction process; and recording the target scene as a marked scene in response to a marking operation for the target scene in the history interaction process.
In some embodiments, the module 4555 is further configured to: playing a screen recording file of a historical interaction process before responding to the self-defining operation aiming at the marked scene; when a marked scene appears in the playing process of the screen recording file, displaying an inlet of custom perception information aiming at the marked scene; and displaying the types which are used for receiving the custom operation and comprise the perception information which can be configured for the mark scene and the characteristics which can be configured in each type.
In some embodiments, the module 4555 is further configured to: and recording the target scene as a marked scene in response to a marking operation for the target scene in the interactive process.
In some embodiments, the module 4555 is further configured to: after determining the marked scene and before responding to the custom operation for the marked scene, displaying an entry of custom perception information for the marked scene; and displaying the types which are used for receiving the custom operation and comprise the perception information which can be configured for the mark scene and the characteristics which can be configured in each type.
In some embodiments, the properties in each type have pre-configured default parameters; a configuration module 4555, further configured to: in response to a modification of the default parameters of the custom operation for a property in the at least one type, the default parameters are updated to the parameters set by the custom operation.
In some embodiments, the module 4555 is further configured to: displaying a plurality of preconfigured modes of a markup scene; wherein each mode comprises at least one of a type of perceptual information associated with the markup scene and a default parameter in which a characteristic in the type is preconfigured; acquiring at least one type of perception information related to a marked scene and included in a target mode and default parameters with pre-configured characteristics in the type; wherein the target mode is a selected mode of the custom operation among a plurality of preconfigured modes.
In some embodiments, the module 4555 is further configured to: obtaining candidate parameters for a characteristic in at least one type; carrying out score prediction processing on the candidate parameters through a first neural network model to obtain a prediction score of each candidate parameter; taking the candidate parameter with the highest predictive score as the parameter of the characteristic in at least one type; the first neural network model is obtained based on the candidate parameter samples and the real scores corresponding to the candidate parameter samples.
In some embodiments, the module 4555 is further configured to: acquiring operation behavior data in a human-computer interaction interface; performing preference prediction processing on the operation behavior data through a second neural network model to obtain operation preference represented by the operation data; obtaining parameters of characteristics in at least one type pre-associated with the operating preferences; and the second neural network model is trained based on the operation data samples and the operation preference characterized by the operation data samples.
In some embodiments, the output module 4554 is further configured to: inquiring the type of the perception information which is supported by the electronic equipment and the characteristic which supports self-definition in the corresponding type before outputting the perception information corresponding to the perception information configuration file; and according to the query result, filtering out the type of the perception information which does not support self-definition and the characteristics in the corresponding type in the perception information configuration file so as to update the perception information configuration file.
In some embodiments, the output module 4554 is further configured to: acquiring the type of the perception information configured by the user-defined operation and parameters of characteristics in the corresponding type from the perception information configuration file; and sending a calling request to an application program interface opened to the man-machine interaction interface so as to call the peripheral equipment corresponding to the type to output the perception information of the corresponding type, so that the output perception information of the corresponding type conforms to the parameters of the characteristics of the corresponding type.
In some embodiments, as shown in fig. 2, the software modules stored in the human-computer interaction perception processing device 455 of the memory 450 may include: a configuration module 4556, configured to receive a custom operation for a marked scene in a human-computer interaction interface; the configuration module 4556 is further configured to, in response to the custom operation, generate a perceptual information configuration file corresponding to the marked scene according to the type of the perceptual information configured by the custom operation and the parameter of the characteristic in the corresponding type; the display module 4557 is used for displaying an interaction process on a human-computer interaction interface; the output module 4558 is configured to, when a scene appearing again in the interaction process is a marked scene, output the sensing information of the type configured by the custom operation according to the sensing information configuration file in the proceeding process of the scene, and the output sensing information of the corresponding type conforms to the parameter of the characteristic in the corresponding type.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the perception processing method for human-computer interaction described in the embodiments of the present application.
Embodiments of the present application provide a computer-readable storage medium storing executable instructions, which when executed by a processor, will cause the processor to perform a perception processing method for human-computer interaction provided by embodiments of the present application, for example, the perception processing method for human-computer interaction shown in fig. 3A-3D and fig. 4.
In some embodiments, the computer-readable storage medium may be a memory such as an FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to a file in a file system, and may be stored in a portion of a file that holds other programs or data, for example in one or more scripts in a HyperText Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In summary, the perception information of the marked scene is flexibly configured through the user-defined operation of the embodiment of the application, so that when the same marked scene is reproduced, the corresponding perception information configuration file can be reused to call the hardware resource in the electronic equipment to output the perception information, the atmosphere adapted to the scene is presented, the immersive personalized interactive experience is created, and the effective utilization of the hardware resource of the electronic equipment is finally realized.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.
Claims (21)
1. A perception processing method for human-computer interaction is characterized by comprising the following steps:
displaying an interaction process in a human-computer interaction interface;
detecting scenes appearing in the interaction process;
when the scene appearing in the interaction process is a marked scene configured with perception information by self-defined operation, obtaining a perception information configuration file corresponding to the marked scene;
and outputting the perception information corresponding to the perception information configuration file in the scene proceeding process.
2. The method of claim 1,
the type of sensory information that the sensory information profile is configured to configure comprises haptic feedback information comprising at least one of the following characteristics: frequency, intensity, duration, orientation;
the outputting the perception information corresponding to the perception information configuration file includes:
outputting haptic feedback information configured to be associated with a type of an event included in the scene according to the type of the event.
3. The method of claim 2, wherein outputting, according to a type of an event included in the scene, haptic feedback information configured to be associated with the type of the event comprises:
performing at least one of the following operations:
in a marking scene comprising a user operation event, synchronously outputting tactile feedback information synchronized with the user operation event;
in a markup scene including sound, outputting at least one of: haptic feedback information adapted to a type of the sound, haptic feedback information consistent with a degree of excitement of the sound, haptic feedback information consistent with an orientation of a sound source of the sound;
in a marked scenario including a collision event, outputting at least one of: haptic feedback information synchronized with the collision event, haptic feedback information consistent with a severity of the collision event;
in a markup scenario including a notification event, haptic feedback information synchronized with the notification event is output.
4. The method of claim 1,
the type of perception information that the perception information profile is used to configure comprises visual feedback information comprising at least one of the following characteristics: state, frequency, color, brightness;
the outputting the perception information corresponding to the perception information configuration file includes:
outputting, according to a type of an event included in the scene, visual feedback information configured to be associated with the type of the event.
5. The method of claim 4, wherein outputting, according to a type of an event included in the scene, visual feedback information configured to be associated with the type of the event comprises:
performing at least one of the following operations:
in a markup scene including a notification event, outputting visual feedback information synchronized with the notification event;
in a markup scene including sound, outputting at least one of: visual feedback information adapted to the type of sound, visual feedback information adapted to the rhythm of the sound;
outputting visual feedback information for representing the progress of the progress event in a marked scene comprising the progress event;
in a tagged scene that includes a result event, visual feedback information characterizing a result of the result event is output.
6. The method of claim 1, wherein prior to obtaining the perceptual information profile corresponding to the marked scene, the method further comprises:
determining the marked scene;
displaying types of perception information configurable for the markup scene and characteristics configurable in each type;
wherein the type of the perception information comprises: tactile feedback information; visual feedback information; olfactory feedback information; auditory feedback information;
in response to the custom operation aiming at the marked scene, parameters configured by the custom operation aiming at the characteristics in at least one type are obtained, and a perception information configuration file corresponding to the marked scene is generated based on the obtained parameters of the characteristics in at least one type.
7. The method of claim 6,
the determining the marked scene comprises:
displaying a historical interaction process in the human-computer interaction interface, wherein the historical interaction process is earlier than the interaction process;
and recording the target scene as a marked scene in response to a marking operation for the target scene in the historical interaction process.
8. The method of claim 7, wherein displaying the types of perceptual information configurable for the markup scene and the configurable characteristics in each type comprises:
playing a screen recording file of the historical interaction process;
when the marked scene appears in the playing process of the screen recording file, displaying an inlet of custom perception information aiming at the marked scene;
and displaying a custom interface which is used for receiving the custom operation and comprises types of perception information which can be configured for the mark scene and characteristics which can be configured in each type.
9. The method of claim 6, wherein the determining the marked scene comprises:
and recording the target scene as a marked scene in response to a marking operation for the target scene in the interactive process.
10. The method of claim 6, wherein displaying the types of perceptual information configurable for the markup scene and the configurable characteristics in each type comprises:
an entry displaying custom perceptual information for the marked scene;
and displaying a custom interface which is used for receiving the custom operation and comprises types of perception information which can be configured for the mark scene and characteristics which can be configured in each type.
11. The method of claim 6,
the characteristics in each type have default parameters which are configured in advance;
the obtaining the parameters configured by the custom operation for the characteristics in at least one of the types comprises:
updating the default parameters to the parameters set by the custom operation in response to modification of the default parameters by the custom operation for at least one characteristic in the type.
12. The method of claim 6,
the displaying of types of perception information configurable for the markup scene and configurable characteristics in each type includes:
displaying a plurality of preconfigured modes of the markup scene;
wherein each of the modes includes a type of at least one of perceptual information associated with the markup scene and a default parameter in which a characteristic in the type is preconfigured;
the obtaining the parameters configured by the custom operation for the characteristics in at least one of the types comprises:
acquiring at least one type of perception information associated with the marked scene and included in a target mode and default parameters with pre-configured characteristics in the type;
wherein the target mode is a selected mode of the custom operation among the plurality of preconfigured modes.
13. The method of claim 6, wherein obtaining the parameters configured by the custom operation for the characteristics in at least one of the types comprises:
obtaining candidate parameters for a characteristic in at least one of the types;
performing score prediction processing on the candidate parameters through a first neural network model to obtain a prediction score of each candidate parameter;
taking the candidate parameter with the highest prediction score as the parameter of the characteristic in at least one of the types;
wherein the first neural network model is trained based on candidate parameter samples and real scores corresponding to the candidate parameter samples.
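The "first neural network model" of claim 13 is, in effect, a learned regressor over candidate parameter vectors. A minimal PyTorch sketch of such a scorer, assuming candidates are encoded as fixed-length feature vectors (e.g. amplitude, frequency, duration); the class and function names are hypothetical, not the patent's:

```python
import torch
import torch.nn as nn

class ParameterScorer(nn.Module):
    """Hypothetical scorer: maps one candidate parameter vector to one predicted score."""
    def __init__(self, num_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, 32),
            nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)  # one score per candidate

def pick_best_candidate(model: ParameterScorer, candidates: torch.Tensor) -> int:
    # candidates: [num_candidates, num_features], e.g. (amplitude, frequency_hz, duration_ms)
    with torch.no_grad():
        scores = model(candidates)
    return int(torch.argmax(scores))  # index of the highest-scoring candidate

# Training against the "real scores" (e.g. user ratings) would use a regression loss:
# loss = nn.MSELoss()(model(candidate_samples), real_scores)
```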
14. The method of claim 6, wherein obtaining the parameters configured by the custom operation for the characteristics in at least one of the types comprises:
acquiring operation behavior data in the human-computer interaction interface;
performing preference prediction processing on the operation behavior data through a second neural network model to obtain an operation preference characterized by the operation behavior data;
obtaining parameters of characteristics in at least one of the types pre-associated with the operation preference;
wherein the second neural network model is trained based on operation behavior data samples and the operation preferences characterized by the operation behavior data samples.
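Claim 14's "second neural network model" can be read as a classifier from aggregated behavior features to a preference category, followed by a lookup of parameters pre-associated with that category. An illustrative PyTorch sketch; the preference classes, the preference-to-parameter table, and all names are assumptions made for this example:

```python
import torch
import torch.nn as nn

PREFERENCE_CLASSES = ["subtle", "standard", "intense"]  # hypothetical categories

PREFERENCE_TO_PARAMS = {  # hypothetical pre-associated parameters
    "subtle":   {"vibration_amplitude": 0.3, "vibration_duration_ms": 40},
    "standard": {"vibration_amplitude": 0.6, "vibration_duration_ms": 80},
    "intense":  {"vibration_amplitude": 1.0, "vibration_duration_ms": 150},
}

class PreferenceClassifier(nn.Module):
    """Maps aggregated operation-behavior features to preference-class logits."""
    def __init__(self, num_behavior_features: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_behavior_features, 64),
            nn.ReLU(),
            nn.Linear(64, len(PREFERENCE_CLASSES)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def infer_parameters(model: PreferenceClassifier, behavior: torch.Tensor) -> dict:
    # behavior: aggregated interaction features (tap rate, session length, ...)
    with torch.no_grad():
        logits = model(behavior.unsqueeze(0))
    preference = PREFERENCE_CLASSES[int(torch.argmax(logits, dim=-1))]
    return PREFERENCE_TO_PARAMS[preference]
```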
15. The method of claim 6, wherein before outputting the perception information corresponding to the perception information configuration file, the method further comprises:
querying the types of perception information supported by the electronic device and the characteristics supporting customization in the corresponding types;
and filtering out, from the perception information configuration file according to the query result, the types of perception information that do not support customization and the characteristics in the corresponding types, so as to update the perception information configuration file.
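Claim 15 is a capability check: the profile is trimmed down to what the device's peripherals can actually render. A sketch of that filtering step, reusing the hypothetical profile layout from the earlier example; the capability-query shape (type name to set of customizable characteristics) is likewise an assumption:

```python
# Illustrative filter: the profile layout and the capability-query interface are assumptions,
# not the patent's actual data structures.
def filter_profile(profile: dict, device_capabilities: dict) -> dict:
    """Drop perception types and characteristics the device cannot customize."""
    filtered = {"scene_id": profile["scene_id"], "perception": {}}
    for ptype, characteristics in profile["perception"].items():
        supported = device_capabilities.get(ptype)
        if not supported:
            continue  # no peripheral for this type (e.g. no scent emitter)
        filtered["perception"][ptype] = {
            name: value for name, value in characteristics.items() if name in supported
        }
    return filtered

# Example: a phone that supports haptics (amplitude only) and audio, but no olfactory output.
capabilities = {"haptic": {"amplitude"}, "audio": {"volume", "clip"}}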
16. The method according to any one of claims 1-15, wherein outputting the perception information corresponding to the perception information configuration file comprises:
acquiring, from the perception information configuration file, the types of perception information configured by the custom operation and the parameters of the characteristics in the corresponding types;
and sending a call request to an application program interface exposed to the human-computer interaction interface, so as to call the peripheral corresponding to each type to output perception information of the corresponding type, such that the output perception information of the corresponding type conforms to the parameters of the characteristics of the corresponding type.
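Claim 16 reduces to a dispatch step: read the type and parameters from the profile and route each to the matching peripheral call. A sketch of that routing, where the PeripheralApi protocol and the vibrate()/play_sound() calls are hypothetical stand-ins for whatever platform interface is actually exposed:

```python
from typing import Protocol

class PeripheralApi(Protocol):
    def vibrate(self, amplitude: float, duration_ms: int) -> None: ...
    def play_sound(self, clip: str, volume: float) -> None: ...

def output_perception(profile: dict, api: PeripheralApi) -> None:
    # Route each configured perception type to the peripheral that can render it.
    for ptype, params in profile["perception"].items():
        if ptype == "haptic":
            api.vibrate(params.get("amplitude", 0.5), params.get("duration_ms", 80))
        elif ptype == "audio":
            api.play_sound(params.get("clip", ""), params.get("volume", 1.0))
        # Other types (visual, olfactory) would route to their own peripherals here.
```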
17. A perception processing method for human-computer interaction is characterized by comprising the following steps:
receiving a custom operation for a marked scene in a human-computer interaction interface;
in response to the custom operation, generating a perception information configuration file corresponding to the marked scene according to the types of perception information configured by the custom operation and the parameters of the characteristics in the corresponding types;
displaying an interaction process in the human-computer interaction interface;
and when a scene reappearing in the interaction process is the marked scene, outputting, while the scene proceeds, perception information of the types configured by the custom operation according to the perception information configuration file, wherein the output perception information of the corresponding type conforms to the parameters of the characteristics in the corresponding type.
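Claim 17 splits the method into a configuration phase and a playback phase keyed by the same scene identifier. A compact sketch of that control flow; the in-memory store, the scene identifiers, and the output callback are all illustrative, and scene detection is assumed to be provided by the application:

```python
from typing import Callable

profiles: dict[str, dict] = {}  # scene_id -> perception information profile

def configure_phase(scene_id: str, custom_params: dict) -> None:
    # Phase 1: the custom operation produces a profile for the marked scene.
    profiles[scene_id] = {"scene_id": scene_id, "perception": custom_params}

def playback_phase(detected_scene_id: str, output: Callable[[dict], None]) -> None:
    # Phase 2: when the marked scene reappears, replay the configured perception output.
    profile = profiles.get(detected_scene_id)
    if profile is not None:
        output(profile)  # e.g. the dispatch sketch shown under claim 16
```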
18. A perception processing device for human-computer interaction is characterized by comprising:
a display module, configured to display an interaction process in a human-computer interaction interface;
a detection module, configured to detect scenes appearing in the interaction process;
an acquisition module, configured to acquire, when a scene appearing in the interaction process is a marked scene configured with perception information by a custom operation, a perception information configuration file corresponding to the marked scene;
and an output module, configured to output the perception information corresponding to the perception information configuration file while the scene proceeds.
19. A perception processing device for human-computer interaction is characterized by comprising:
a configuration module, configured to receive a custom operation for a marked scene in a human-computer interaction interface;
the configuration module being further configured to generate, in response to the custom operation, a perception information configuration file corresponding to the marked scene according to the types of perception information configured by the custom operation and the parameters of the characteristics in the corresponding types;
a display module, configured to display an interaction process in the human-computer interaction interface;
and an output module, configured to output, when a scene reappearing in the interaction process is the marked scene, perception information of the types configured by the custom operation according to the perception information configuration file while the scene proceeds, wherein the output perception information of the corresponding type conforms to the parameters of the characteristics in the corresponding type.
20. An electronic device, comprising:
a memory for storing executable instructions;
a processor for implementing the human-computer interaction perception processing method of any one of claims 1 to 16 or claim 17 when executing executable instructions stored in the memory.
21. A computer-readable storage medium storing executable instructions for implementing the human-computer interaction perception processing method of any one of claims 1 to 16 or claim 17 when executed by a processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011240927.4A CN112221118A (en) | 2020-11-09 | 2020-11-09 | Human-computer interaction perception processing method and device and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112221118A true CN112221118A (en) | 2021-01-15 |
Family
ID=74122290
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011240927.4A Pending CN112221118A (en) | 2020-11-09 | 2020-11-09 | Human-computer interaction perception processing method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112221118A (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114870395A (en) * | 2021-02-05 | 2022-08-09 | 腾讯科技(深圳)有限公司 | Terminal vibration detection method, device, medium and equipment for game scene |
CN114870395B (en) * | 2021-02-05 | 2023-09-15 | 腾讯科技(深圳)有限公司 | Terminal vibration detection method, device, medium and equipment of game scene |
CN113010933A (en) * | 2021-03-11 | 2021-06-22 | 中国美术学院 | Efficient multi-user online interactive design system |
CN113010933B (en) * | 2021-03-11 | 2024-01-09 | 中国美术学院 | Efficient multi-user online interaction design system |
CN113207039A (en) * | 2021-05-08 | 2021-08-03 | 腾讯科技(深圳)有限公司 | Video processing method and device, electronic equipment and storage medium |
CN113207039B (en) * | 2021-05-08 | 2022-07-29 | 腾讯科技(深圳)有限公司 | Video processing method and device, electronic equipment and storage medium |
CN113438303A (en) * | 2021-06-23 | 2021-09-24 | 南京孩乐康智能科技有限公司 | Remote auxiliary work system and method, electronic device and storage medium |
CN113663333A (en) * | 2021-08-24 | 2021-11-19 | 网易(杭州)网络有限公司 | Game control method and device, electronic equipment and storage medium |
CN113821105A (en) * | 2021-09-17 | 2021-12-21 | 北京百度网讯科技有限公司 | Prompting method and device, electronic equipment and storage medium |
CN113821105B (en) * | 2021-09-17 | 2024-10-01 | 北京百度网讯科技有限公司 | Prompting method, prompting device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112221118A (en) | Human-computer interaction perception processing method and device and electronic equipment | |
JP6502589B1 (en) | Lighting for video games | |
US10092827B2 (en) | Active trigger poses | |
CN111913624B (en) | Interaction method and device for objects in virtual scene | |
CN108803993B (en) | Application program interaction method, intelligent terminal and computer readable storage medium | |
JP6067905B1 (en) | Robot control program generation system | |
CN112569599B (en) | Control method and device for virtual object in virtual scene and electronic equipment | |
CN112959998B (en) | Vehicle-mounted human-computer interaction method and device, vehicle and electronic equipment | |
TW202227172A (en) | Method of presenting virtual scene, device, electrical equipment, storage medium, and computer program product | |
JP2022531372A (en) | How to preview behavior during a match in an out-of-competition environment, devices, terminals, and computer programs | |
US20210093967A1 (en) | Automatic Multimedia Production for Performance of an Online Activity | |
CN113763568A (en) | Augmented reality display processing method, device, equipment and storage medium | |
CN110822645B (en) | Air conditioner, control method and device thereof and readable storage medium | |
JP2016202357A (en) | Game machine | |
CN113741847B (en) | Light control method, system and storage medium | |
CN117008713A (en) | Augmented reality display method and device and computer readable storage medium | |
CN115576611A (en) | Service processing method and device, computer equipment and storage medium | |
CN112800252B (en) | Method, device, equipment and storage medium for playing media files in virtual scene | |
CN115220625B (en) | Audio playing method and device, electronic equipment and computer readable storage medium | |
CN110351412B (en) | Terminal shell and terminal shell control method, device and system | |
US12128308B2 (en) | Game environment customized generation of gaming music | |
CN117942571A (en) | Data communication method, device, electronic equipment and storage medium | |
CN117046086A (en) | Somatosensory game operation method based on apple watch | |
US20240024776A1 (en) | Game environment customized generation of gaming music | |
CN116747515A (en) | Running somatosensory game operation method based on apple watch |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
REG | Reference to a national code |
Ref country code: HK Ref legal event code: DE Ref document number: 40038135 Country of ref document: HK |
|
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |