CN116570914B - Intelligent game interaction method and system - Google Patents

Intelligent game interaction method and system

Info

Publication number
CN116570914B
CN116570914B (application CN202310540842.5A)
Authority
CN
China
Prior art keywords
face image
game
face
chat
player
Prior art date
Legal status
Active
Application number
CN202310540842.5A
Other languages
Chinese (zh)
Other versions
CN116570914A (en)
Inventor
陈洁婷
梁利娟
陈艺夫
Current Assignee
Guangzhou Anluo Network Co ltd
Original Assignee
Guangzhou Anluo Network Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Anluo Network Co ltd
Priority to CN202310540842.5A
Publication of CN116570914A
Application granted
Publication of CN116570914B

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A63F13/85 Providing additional services to players
    • A63F13/87 Communicating with other players during game play, e.g. by e-mail or chat
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 Features of games using an electronically generated display characterized by details of game servers
    • A63F2300/57 Features of games using an electronically generated display characterized by details of game services offered to the player
    • A63F2300/572 Communication between players during game play of non game information, e.g. e-mail, chat, file transfer, streaming of audio and streaming of video
    • A63F2300/80 Features of games using an electronically generated display specially adapted for executing a specific type of game
    • A63F2300/807 Role playing or strategy games

Abstract

The invention provides an intelligent game interaction method and system. The method comprises the following steps: S1, the game terminal matches and acquires corresponding cutscene data according to the current game progress of a first player user, and displays the matched cutscene data in a first display area; S2, after matching the cutscene data, the game terminal connects to the chat server corresponding to the cutscene's identity information; S3, the game terminal receives chat content sent by each player user in the chat server, generates a real-time barrage from the acquired chat content, and displays the generated real-time barrage in a second display area; S4, the game terminal receives an operation instruction of the first player user for the real-time barrage and sends it to the game server, and the game server sends corresponding interaction information to the second player user corresponding to the real-time barrage according to the received operation instruction. The invention helps improve the game experience of players.

Description

Intelligent game interaction method and system
Technical Field
The invention relates to the technical field of game interaction, in particular to an intelligent game interaction method and system.
Background
At present, with the rapid development of computer and internet technology, new games emerge endlessly. In RPG (role-playing) games, when a player completes certain game content or the game progress reaches a certain stage, the game plays a corresponding cutscene. The cutscene helps the player understand the scenario, supports further play and subsequent branch selection, and deepens the player's immersion in the game's story.
However, in the prior art, a player's discussion of the game scenario presented by a cutscene usually lags behind the game itself. Players lack effective interaction with other players while viewing cutscenes, even when a player wants to discuss the scenario shown in the cutscene with other players, or to find like-minded players for the subsequent course of the game. This degrades the player's experience of the game scenario.
Disclosure of Invention
Aiming at the technical problems, the invention aims to provide an intelligent game interaction method and system.
The aim of the invention is realized by adopting the following technical scheme:
the invention provides an intelligent game interaction method which is characterized by comprising the following steps:
s1, matching and acquiring corresponding cut scene data by the game terminal according to the current game progress of a first player user; displaying the matched cut scene data in a first display area;
s2, after matching the cut scene data, the game terminal is connected to a chat server corresponding to the cut scene identity information according to the cut scene identity information;
s3, the game terminal receives chat contents sent by each player user in the chat server, generates a real-time barrage according to the acquired chat contents, and displays the generated real-time barrage in a second display area;
and S4, the game terminal receives an operation instruction of the first player user for the real-time barrage and sends the operation instruction to the game server; the game server sends corresponding interaction information to a second player user corresponding to the real-time barrage according to the received operation instruction.
Preferably, in step S1, matching and acquiring corresponding cutscene data includes:
acquiring corresponding cutscene data from a game server and displaying the acquired cutscene data in real time in the first display area; and/or,
matching and retrieving corresponding cutscene data from the game storage medium, and displaying the obtained cutscene data in the first display area.
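The two retrieval paths above (local storage medium and server download) can be sketched as follows. This is a minimal illustration under stated assumptions: all names (`LOCAL_STORE`, `fetch_from_server`, `get_cutscene`, the progress keys) are hypothetical, not anything specified by the patent.

```python
# Hypothetical sketch of step S1: match cutscene data to the current game
# progress, preferring a local copy and falling back to a server download.

LOCAL_STORE = {"chapter_1_intro": b"<local animation bytes>"}

def fetch_from_server(cutscene_id: str) -> bytes:
    # Stand-in for a real network download of the cutscene data.
    return f"<downloaded {cutscene_id}>".encode()

# Progress milestones mapped to cutscene IDs (illustrative).
PROGRESS_TO_CUTSCENE = {
    "chapter_1_start": "chapter_1_intro",
    "chapter_2_start": "chapter_2_intro",
}

def get_cutscene(progress: str) -> tuple[str, bytes]:
    cutscene_id = PROGRESS_TO_CUTSCENE[progress]
    data = LOCAL_STORE.get(cutscene_id)
    if data is None:                          # not on the local storage medium
        data = fetch_from_server(cutscene_id)  # online game: stream from server
    return cutscene_id, data

cid, data = get_cutscene("chapter_1_start")
print(cid)   # chapter_1_intro (served from local store)
```

The "and/or" in the claim is preserved here by the fallback: a terminal with local data never touches the server, while a web-game terminal always takes the download branch.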
Preferably, the cut scene data is configured to carry a corresponding unique ID as identity information, wherein a corresponding chat server or chat server unit is configured for each cut scene;
in step S2, a communication connection is established with a corresponding chat server or chat server unit according to the identity information of the cut scene data currently matched by the first player user.
Preferably, step S3 further includes that the game terminal sends chat content of the first player user to the chat server, and the chat server forwards the chat content to the connected second player user in real time; the chat content carries identity information of the first player user;
after receiving the chat content of the second player user forwarded by the server, the game terminal generates a real-time barrage according to the chat content, and displays the generated real-time barrage in a second display area;
the real-time barrage carries the user identity information of the second player sending the barrage.
Preferably, the method further comprises:
and S6, the game terminal acquires the identity information of the first player user.
Preferably, in step S4, the game terminal receives an operation instruction of the player user for the real-time barrage, and sends the operation instruction to the game server, where the operation instruction carries player user identity information corresponding to the real-time barrage; the game server sends interaction information corresponding to the operation instruction to the corresponding player user according to the received operation instruction;
the operation instructions comprise team invitation, private chat invitation and the like; the interaction information corresponding to the operation instruction includes group invitation information, private chat invitation information, and the like.
Preferably, the method further comprises:
and S5, after the scene cut animation is played, the game terminal is disconnected from the communication connection with the chat server.
In a second aspect, the present invention provides an intelligent game interaction system, including a game terminal, a chat server, and a game server;
the game terminal is used for matching and acquiring corresponding cut scene data according to the current game progress of the first player user; displaying the matched cut scene data in a first display area;
the game terminal is used for connecting to a chat server corresponding to the identity information of the cutscene according to the identity information of the cutscene after matching the cutscene data;
the game terminal is used for receiving chat contents sent by each player user in the chat server, generating a real-time barrage according to the acquired chat contents, and displaying the generated real-time barrage in a second display area;
the game terminal is used for receiving an operation instruction of a first player user of the game for the real-time barrage, sending the operation instruction to the game server, and sending the corresponding interaction information to a second player user corresponding to the real-time barrage by the game server according to the obtained operation instruction.
The beneficial effects of the invention are as follows: when a player user watches a cutscene during the game, players watching the same cutscene are gathered into the same chat server according to the cutscene information, so they can send real-time barrages to discuss the scenario or content of that cutscene. This gives each player the opportunity to interact with other players in subsequent game content based on the real-time barrage, improving the game experience.
Drawings
The invention will be further described with reference to the accompanying drawings. The embodiments do not constitute any limitation of the invention, and other drawings can be obtained by one of ordinary skill in the art from the following drawings without inventive effort.
FIG. 1 is a flow chart of a method for intelligent game interaction according to an embodiment of the present invention;
FIG. 2 is a block diagram of a framework of an intelligent game interaction system according to one embodiment of the present invention.
Detailed Description
The invention is further described in connection with the following application scenario.
Referring to FIG. 1, there is shown an intelligent game interaction method comprising:
s1, matching and acquiring corresponding cut scene data by the game terminal according to the current game progress of a first player user; displaying the matched cut scene data in a first display area;
preferably, in step S1, matching and acquiring corresponding cutscene data includes:
acquiring corresponding cutscene data from a game server and displaying the acquired cutscene data in real time in the first display area; and/or,
matching and retrieving corresponding cutscene data from the game storage medium, and displaying the obtained cutscene data in the first display area.
In one scenario, the first player is a player currently playing the game through a game terminal. When the player first starts the game and needs to select a faction or character, the player first views a cutscene introducing the game background, characters, and factions, so that the player can select the faction or character of their liking after the cutscene ends.
The first display area may be a preset full-screen area; in this full-screen mode, the barrage can be embedded into the first display area for display. Alternatively, the first display area may be a preset partial-screen area; in this mode, the barrage may be displayed within part of the first display area, or in another area outside the first display area.
For a stand-alone game, the cutscene data is stored in a local storage medium of the game terminal; for an online game such as a web game, the corresponding cutscene data may be downloaded directly from the server and played in real time.
The game terminal is intelligent terminal equipment for players to play games, and comprises intelligent mobile phones, computers, household game machine terminals and the like.
The game server and the chat server can be built based on the same physical server or different servers.
Preferably, the cut scene data is configured to carry a corresponding unique ID as identity information, wherein a corresponding chat server or chat server unit is configured for each cut scene.
In one scenario, during the game development stage, a corresponding identity ID is set for each cutscene, serving both as the basis for calling and playing the cutscene during the game and for assigning the corresponding real-time chat server.
S2, after matching the cut scene data, the game terminal is connected to a chat server corresponding to the cut scene identity information according to the cut scene identity information;
in step S2, a communication connection is established with a corresponding chat server or chat server unit according to the identity information of the cut scene data currently matched by the first player user.
In one scenario, the chat server sets up a corresponding chat room for each cutscene. When a player plays a certain cutscene, the player's game terminal connects to the chat server and enters the corresponding chat room. Player users in the same chat room can send real-time chat messages, and the game terminal generates a real-time barrage from these messages for display.
S3, the game terminal receives chat contents sent by each player user in the chat server, generates a real-time barrage according to the acquired chat contents, and displays the generated real-time barrage in a second display area;
preferably, step S3 further includes that the game terminal sends chat content of the first player user to the chat server, and the chat server forwards the chat content to the connected second player user in real time; the chat content carries identity information of the first player user;
after receiving the chat content of the second player user forwarded by the chat server, the game terminal generates a real-time barrage according to the chat content, and displays the generated real-time barrage in a second display area;
the real-time barrage carries the user identity information of the second player sending the barrage.
In one scenario, the generated real-time barrage includes the player's avatar and the corresponding chat content. A player user can click a real-time barrage to view the corresponding player's information and open an operation window, enabling further operations based on that barrage.
After the player user sends the chat content to the chat server, the chat content is sent to other players in the same chat room by the chat server, and is played and displayed in the form of real-time barrages in game terminals of the other players.
In one scenario, when a player first starts the game and needs to select a faction or character, the player first views a cutscene introducing the game background, characters, and factions, so that the player can select the faction or character of their liking after the cutscene ends. While watching the cutscene, the player can enter the corresponding chat room for real-time communication, publishing comments or chat content about their preferred faction or character based on the cutscene's introduction. This makes it easy for players to find like-minded players for discussion or team play through the real-time barrage. When players agree to select the same faction to play, their corresponding game characters are transmitted to the same map area.
Preferably, the method further comprises:
the game terminal obtains identity information of a first player user.
When a player enters the chat server (for example, for a stand-alone game that can connect to a server, user identity information can be acquired when the player needs to enter a chat room) or logs in to the game, the game terminal first acquires the player user's identity information as the basis for interaction between players; after the identity information is verified, the game terminal provides subsequent game content or barrage interaction services based on the player's identity information.
Preferably, the game terminal obtains identity information of the first player user, including:
the game terminal acquires the user name and password information input by the user of the first player, and sends the acquired user name and password information to the game server, and the game server verifies the user name and password information;
and when the verification passing information fed back by the game server is received, the user name is used as the identity information of the first player user.
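The username-and-password path can be sketched as below. The patent does not specify how credentials are stored or compared, so the salted-hash storage, iteration count, and function names here are assumptions; the essential point from the claim is that the verified username then serves as the player's identity information:

```python
# Hedged sketch of credential verification: the server stores salted password
# hashes; on a successful check the username becomes the identity information.
import hashlib, hmac, os

USERS = {}  # username -> (salt, digest); storage layout is an assumption

def register(username, password):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    USERS[username] = (salt, digest)

def verify(username, password):
    if username not in USERS:
        return None
    salt, digest = USERS[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # constant-time compare; on success the username is the identity info
    return username if hmac.compare_digest(candidate, digest) else None

register("player_one", "s3cret")
print(verify("player_one", "s3cret"))   # player_one
print(verify("player_one", "wrong"))    # None
```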
Preferably, after receiving the verification-passed information fed back by the game server, the game terminal accepts the identity information of the first player user fed back by the server.
Preferably, the game terminal acquires identity information of the first player user, and further includes:
the game terminal collects face image data of a first player user and transmits the collected face image data to the game server; and the game server performs identity recognition verification according to the acquired face image data, and receives user identity information corresponding to the first player user returned by the game server after the identity recognition verification is passed.
The identity information of a player user can be verified via an account number and password, or via face recognition. Identity verification associates the player user's identity information with the game character and game account. Face recognition in particular helps distinguish, identify, and verify player users versus game characters (accounts), so that game content or chat content generated by a player can be supervised, improving management during game interaction.
Preferably, the game server performs authentication according to the acquired face image data, including:
the server receives the face image acquired and uploaded by the game terminal; the acquired face images comprise a first face image and a second face image which are continuously acquired by the game terminal, wherein the acquisition interval of the first face image and the second face image is less than 1s;
preprocessing according to the acquired face image to obtain a preprocessed face image;
extracting a face part according to the preprocessed face image to obtain a face area image; comprising the following steps: performing edge detection according to the preprocessed face image, and extracting a face part in the image according to the obtained edge information to obtain a face area image;
extracting features according to the obtained face region image to obtain face feature data; comprising the following steps: performing LBP feature extraction according to the acquired face region image, and taking the obtained LBP feature data as face feature data;
matching the obtained face feature data against the face feature data of each player user prestored in a database to obtain a corresponding identity recognition result and output the corresponding player user identity information; comprising: performing matching verification between the obtained face feature data and the face feature data prestored in the database during each player's registration, and extracting the corresponding player user identity information when face feature data with similarity greater than a preset standard threshold is matched.
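The recognition steps above (LBP feature extraction over the face region, then similarity matching against stored templates) can be illustrated with a generic 8-neighbour LBP pipeline. This is a standard LBP formulation, not the patent's exact one, and the comparison function and the 0.8 threshold mentioned in the comment are assumptions:

```python
# Illustrative sketch: LBP codes over a grayscale face region, histogrammed,
# then matched by histogram intersection. Thresholds are assumptions.

def lbp_histogram(img):
    """img: 2-D list of grayscale values; returns a normalized 256-bin histogram."""
    hist = [0] * 256
    h, w = len(img), len(img[0])
    offsets = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= center:  # neighbour >= center -> 1
                    code |= 1 << bit
            hist[code] += 1
    total = sum(hist) or 1
    return [v / total for v in hist]

def similarity(h1, h2):
    """Histogram intersection in [0, 1]; 1 means identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

face = [[10, 20, 30], [20, 25, 30], [30, 40, 50]]
h = lbp_histogram(face)
print(similarity(h, h))   # 1.0 against itself

# Matching step: accept the stored template whose similarity exceeds a preset
# standard threshold (e.g. 0.8; the patent does not state the value).
```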
The game server performs intelligent preprocessing, face extraction, feature recognition, and matching verification on the obtained face image data to identify and verify the player user's identity, sparing the player tedious manual input during the game and acquiring the identity information intelligently. Meanwhile, acquiring identity information via face recognition allows verification that distinguishes the current player user from the game character, so that game content or chat content generated by the player can be supervised later, improving management during game interaction.
During play, a player user is easily affected by screen flicker and similar factors, so face images captured by the game terminal may suffer excessive reflections and become unclear, impairing the stability of subsequent player identification from the face image data. Therefore, when capturing the player user's face image data, the game terminal continuously captures two face images in quick succession and uploads them to the game server, which preprocesses the two images together to improve the clarity of the face image data.
Preferably, the game server performs preprocessing according to the obtained face image to obtain a preprocessed face image, and specifically includes:
edge detection is respectively carried out on the acquired first face image and second face image, and image alignment is carried out according to the obtained face edge information, so that the positions of faces in the first face image and the second face image in the images are consistent;
converting the first face image and the second face image from the RGB color space to the Lab color space, respectively, to obtain the luminance component sub-image pL_1 and color component sub-images pa_1 and pb_1 of the first face image, and the luminance component sub-image pL_2 and color component sub-images pa_2 and pb_2 of the second face image;
And respectively calculating characteristic parameters of the first face image and the second face image, wherein the adopted characteristic parameter calculation function is as follows:
wherein the variable x = 1, 2 corresponds to the first and second face image, respectively; charaY_x represents the characteristic parameter; meanL_x represents the average luminance component value of the luminance component sub-image; stanL_x represents the set standard luminance component value; maxL_x and minL_x represent the maximum and minimum luminance component values of the luminance component sub-image, respectively; stanΔL_x represents the set standard brightness-variation value; meanTL_x represents the average luminance gradient value over the pixels of the luminance component sub-image; stanθL_x represents the set standard brightness gradient value; ω_a, ω_b, and ω_c represent the set weight adjustment factors;
The obtained characteristic parameters are compared: if the characteristic parameter charaY_1 of the first face image is greater than the characteristic parameter charaY_2 of the second face image, the second face image is marked as the standard face image p_y and the first face image as the marked face image p_h; otherwise, if charaY_1 is less than or equal to charaY_2, the first face image is marked as the standard face image p_y and the second face image as the marked face image p_h.
Brightness adjustment processing is performed on the luminance component sub-image pL_y of the standard face image p_y, where the adopted brightness adjustment function is as follows:
in the method, in the process of the invention,representing brightness component value pL of pixel point (x, y) in standard face image after brightness adjustment y (x, y) represents the brightness component value of the pixel point (x, y) in the standard face image, stanL x Representing the set standard luminance component value, stanL x ∈[60,65],meanL y θ (x, y) and meanL h θ (x, y) respectively represent luminance component subgraph pL y And luminance component subgraph pL h In the 3 x 3 neighborhood range centering on the pixel (x, y), num [ |θ (x, y) -stanL x |>stanθL x ]Representing luminance component subgraph pL y The brightness component value of each pixel point in the 3X 3 neighborhood range taking the pixel point (x, y) as the center is larger than the standard brightness component value stanL x The difference value is larger than the standard brightness gradient value stanθL x Is the number of standard pixel points, and is 5,7]The method comprises the steps of carrying out a first treatment on the surface of the Δβ represents the set regulatory factor, where Δβ ε [0.05,0.15 ]];
The brightness-adjusted luminance component sub-image is then recombined with the color component sub-images pa_y and pb_y of the standard face image to reconstruct the preprocessed face image.
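The exact characteristic-parameter and brightness-adjustment formulas appear as images in the original patent and are not reproduced in this text, so they cannot be restated here. The sketch below only mirrors the *shape* of the procedure under loudly stated simplifications: each frame is scored by how close its mean luminance is to the standard value (a stand-in for charaY), the better frame is kept as the standard image, and pixels whose luminance deviates strongly from the standard are blended with the co-located luminance of the other frame as a reference. Every constant and formula in this code is an assumption, not the patent's.

```python
# Simplified stand-in for the two-frame preprocessing step. STAN_L stands in
# for stanL in [60, 65]; STAN_DEV stands in for stanθL in [5, 7].

STAN_L = 62
STAN_DEV = 6

def score(frame):
    # simplified stand-in for charaY: closer mean luminance to STAN_L is better
    flat = [v for row in frame for v in row]
    return -abs(sum(flat) / len(flat) - STAN_L)

def preprocess(frame_a, frame_b):
    std, ref = (frame_a, frame_b) if score(frame_a) >= score(frame_b) else (frame_b, frame_a)
    out = []
    for y, row in enumerate(std):
        out_row = []
        for x, v in enumerate(row):
            if abs(v - STAN_L) > STAN_DEV:   # likely flicker pixel
                v = (v + ref[y][x]) / 2      # compensate from the reference frame
            out_row.append(v)
        out.append(out_row)
    return out

standard = [[62, 61], [90, 61]]   # one over-bright (flicker) pixel
marked   = [[40, 41], [42, 40]]   # darker frame, used only as reference
print(preprocess(standard, marked))   # [[62, 61], [66.0, 61]]
```

Note the asymmetry matching the patent's description: one frame is selected as the standard image and only its flicker regions are corrected; the other frame contributes reference luminance at the same positions.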
Face images acquired during play are easily affected by screen flicker and can become unclear; the embodiment of the invention therefore proposes continuously acquiring two face images of the player user through the game terminal and preprocessing them jointly to obtain the preprocessed face image. In this scheme, alignment correction is first performed on the two face images so that the feature information in both images occupies corresponding positions. Characteristic parameters of the two face images are then computed from the luminance component sub-images; the proposed characteristic-parameter function reflects both the global luminance properties of each image (overall luminance level and overall luminance variation) and the local luminance properties (richness of luminance feature information), and the face image with the better luminance information is selected as the standard image for subsequent processing. The luminance of the standard face image is then further adjusted: the proposed brightness adjustment function identifies flicker regions from the local luminance in the neighborhood of each pixel and, taking the luminance at the same position in the marked face image as a reference, applies compensating brightness adjustment to those regions, improving both the adjustment quality in the flicker regions and the stability of the image.
Compared with traditional single-image brightness adjustment schemes, this approach first detects the flicker regions in the image and then compensates their brightness using, as a reference, the luminance at the same positions in the other face image. This restores the true luminance information of the image, improves the representation of detail features, and increases the stability and reliability of the subsequent face recognition performed on the preprocessed image.
And S4, the game terminal receives an operation instruction of the first player user for the real-time barrage and sends the operation instruction to the game server; the game server sends corresponding interaction information to a second player user corresponding to the real-time barrage according to the received operation instruction.
Preferably, in step S4, the game terminal receives an operation instruction of the player user for the real-time barrage, and sends the operation instruction to the game server, where the operation instruction carries player user identity information corresponding to the real-time barrage; and the game server sends interaction information corresponding to the operation instruction to the corresponding player user according to the received operation instruction.
Preferably, the operation instructions include a team invitation, a private chat invitation, and the like; the interaction information corresponding to the operation instruction includes team invitation information, private chat invitation information, and the like.
Based on the chat content and their preferences regarding the game content, a player can select a corresponding manner to further interact with the player who sent the real-time barrage.
Preferably, in step S4, after the second player user who receives the interaction information accepts the invitation, the first player user and the second player user establish a game interaction connection through the game server.
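The invitation flow of step S4 can be sketched as below. The message fields (`type`, `from`, `to`) and all class, function and mapping names are hypothetical, since the patent does not fix a wire format.

```python
import json

# Hypothetical operation -> interaction-information mapping (names assumed).
OP_TO_INFO = {
    "team_invite": "team_invite_info",
    "private_chat_invite": "private_chat_invite_info",
}

class GameServer:
    """Minimal sketch of step S4: the terminal's operation instruction carries
    the identity of the second player (taken from the real-time barrage), and
    the server forwards the corresponding interaction information to that
    player."""

    def __init__(self):
        self.online = {}                      # player_id -> delivery callback

    def register(self, player_id, deliver):
        self.online[player_id] = deliver

    def handle_op(self, op):
        info = {"type": OP_TO_INFO[op["type"]],
                "from": op["from"], "to": op["to"]}
        self.online[op["to"]](json.dumps(info))   # deliver to the second player
```

After the second player accepts the delivered invitation, the two players would then establish a game interaction connection through the game server, as in the preferred embodiment of step S4.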
Preferably, the method further comprises step S5:
after the cutscene finishes playing, the game terminal disconnects the communication connection with the chat server.
In one scenario, to keep the real-time barrage communication for a cutscene timely, the player user automatically leaves the corresponding chat server after the cutscene finishes playing or after a set period of time has elapsed.
According to the embodiment of the invention, when a player user views a cutscene during the game, players are gathered into the same chat server according to the information of the cutscene being watched. Players can thus send real-time barrages to discuss the scenario or content of the same cutscene, carry out subsequent game-content interaction with other players based on the real-time barrages, and thereby enjoy an improved game experience.
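The grouping of players by cutscene described above can be sketched as follows; the class and field names are assumptions for illustration, not taken from the patent.

```python
from collections import defaultdict

class ChatRouter:
    """Sketch of steps S2-S3/S5: players watching the same cutscene (keyed by
    its unique ID, see claim 3) share one chat room, and chat content is
    forwarded to the other connected players as a real-time barrage."""

    def __init__(self):
        self.rooms = defaultdict(dict)        # cutscene_id -> {player_id: cb}

    def join(self, cutscene_id, player_id, deliver):
        self.rooms[cutscene_id][player_id] = deliver

    def leave(self, cutscene_id, player_id):
        # Called when the cutscene finishes playing (step S5).
        self.rooms[cutscene_id].pop(player_id, None)

    def send(self, cutscene_id, sender_id, text):
        barrage = {"from": sender_id, "text": text}   # carries sender identity
        for pid, deliver in self.rooms[cutscene_id].items():
            if pid != sender_id:              # forward to the other players
                deliver(barrage)
```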
Corresponding to the above-mentioned intelligent game interaction method, referring to fig. 2, the present invention also provides an intelligent game interaction system, wherein the system comprises a plurality of game terminals, a chat server and a game server;
the game terminal is used for matching and acquiring corresponding cut scene data according to the current game progress of the first player user; displaying the matched cut scene data in a first display area;
the game terminal is used for connecting to a chat server corresponding to the identity information of the cutscene according to the identity information of the cutscene after matching the cutscene data;
the game terminal is used for receiving chat contents sent by each player user in the chat server, generating a real-time barrage according to the acquired chat contents, and displaying the generated real-time barrage in a second display area;
the game terminal is used for receiving an operation instruction of the first player user for the real-time barrage and sending the operation instruction to the game server, and the game server sends corresponding interaction information to a second player user corresponding to the real-time barrage according to the obtained operation instruction.
It should be noted that the game terminal, the chat server and the game server in the system are further configured to implement the steps and functions corresponding to those of fig. 1 and the corresponding embodiments, which are not repeated here.
It should be noted that, in each embodiment of the present invention, each functional unit/module may be integrated in one processing unit/module, or each unit/module may exist alone physically, or two or more units/modules may be integrated in one unit/module. The integrated units/modules described above may be implemented either in hardware or in software functional units/modules.
From the description of the embodiments above, it will be apparent to those skilled in the art that the embodiments described herein may be implemented in hardware, software, firmware, middleware, code, or any suitable combination thereof. For a hardware implementation, the processor may be implemented in one or more of the following units: an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, other electronic units designed to perform the functions described herein, or a combination thereof. For a software implementation, some or all of the flow of an embodiment may be accomplished by a computer program instructing the associated hardware. When implemented, the above-described programs may be stored in or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. Computer-readable media can include, but are not limited to, RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
Finally, it should be noted that the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the scope of the present invention, and although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions can be made to the technical solution of the present invention without departing from the spirit and scope of the technical solution of the present invention.

Claims (6)

1. An intelligent game interaction method is characterized by comprising the following steps:
s1, matching and acquiring corresponding cut scene data by the game terminal according to the current game progress of a first player user; displaying the matched cut scene data in a first display area;
s2, after matching the cut scene data, the game terminal is connected to a chat server corresponding to the cut scene identity information according to the cut scene identity information;
s3, the game terminal receives chat contents sent by each player user in the chat server, generates a real-time barrage according to the acquired chat contents, and displays the generated real-time barrage in a second display area; the step S3 further comprises the steps that the game terminal sends chat contents of the first player user to the chat server, and the chat server forwards the chat contents to the connected second player user in real time; the chat content carries identity information of the first player user;
after receiving the chat content of the second player user forwarded by the chat server, the game terminal generates a real-time barrage according to the chat content, and displays the generated real-time barrage in a second display area;
the real-time barrage carries user identity information of a second player who sends the barrage;
s4, the game terminal receives an operation instruction of the first player user for the real-time barrage and sends the operation instruction to the game server, and the game server sends corresponding interaction information to a second player user corresponding to the real-time barrage according to the obtained operation instruction;
wherein the method further comprises:
s6, the game terminal acquires the identity information of the first player user, and the method comprises the following steps: the game terminal collects face image data of a first player user and transmits the collected face image data to the game server; the game server performs identity recognition verification according to the acquired face image data, and receives user identity information corresponding to the first player user returned by the game server after the identity recognition verification is passed;
in step S6, the game server performs identity recognition verification according to the acquired face image data, including:
the server receives the face image acquired and uploaded by the game terminal; the acquired face images comprise a first face image and a second face image which are continuously acquired by the game terminal, wherein the acquisition interval of the first face image and the second face image is less than 1s;
preprocessing according to the acquired face image to obtain a preprocessed face image;
extracting a face part according to the preprocessed face image to obtain a face area image; comprising the following steps: performing edge detection according to the preprocessed face image, and extracting a face part in the image according to the obtained edge information to obtain a face area image;
extracting features according to the obtained face region image to obtain face feature data; comprising the following steps: performing LBP feature extraction according to the acquired face region image, and taking the obtained LBP feature data as face feature data;
matching the obtained face feature data with the face feature data corresponding to each player user prestored in a database to obtain a corresponding identity recognition result and output the corresponding player user identity information; comprising: performing matching verification between the obtained face feature data and the face feature data prestored in the database during each player's registration, and, when face feature data with a similarity greater than the preset standard threshold is matched, extracting the corresponding player user identity information; the game server performs preprocessing according to the acquired face image to obtain a preprocessed face image, which specifically comprises the following steps:
edge detection is respectively carried out on the acquired first face image and second face image, and image alignment is carried out according to the obtained face edge information, so that the positions of faces in the first face image and the second face image in the images are consistent;
respectively converting the first face image and the second face image from the RGB color space to the Lab color space to obtain a luminance component subgraph pL1, a color component subgraph pa1 and a color component subgraph pb1 of the first face image, and a luminance component subgraph pL2, a color component subgraph pa2 and a color component subgraph pb2 of the second face image;
And respectively calculating characteristic parameters of the first face image and the second face image, wherein the adopted characteristic parameter calculation function is as follows:
wherein the variable x = 1, 2 corresponds to the first face image and the second face image respectively; charaYx represents the characteristic parameter; meanLx represents the average luminance component value of the luminance component subgraph; stanLx represents the set standard luminance component value; maxLx and minLx represent the maximum and minimum luminance component values of the luminance component subgraph respectively; stanΔLx represents the set standard luminance variation value; meanTLx represents the average luminance gradient value of the pixel points in the luminance component subgraph; stanθLx represents the set standard luminance gradient value; ωa, ωb and ωc represent the set weight adjustment factors respectively;
comparing the obtained characteristic parameters: if the characteristic parameter charaY1 of the first face image is larger than the characteristic parameter charaY2 of the second face image, marking the second face image as the standard face image py and the first face image as the marked face image ph; otherwise, if the characteristic parameter charaY1 of the first face image is smaller than or equal to the characteristic parameter charaY2 of the second face image, marking the first face image as the standard face image py and the second face image as the marked face image ph;
performing luminance adjustment processing on the luminance component subgraph pLy of the standard face image py, wherein the adopted luminance adjustment function is as follows:
wherein pL′y(x, y) represents the luminance component value of pixel point (x, y) in the standard face image after luminance adjustment; pLy(x, y) represents the luminance component value of pixel point (x, y) in the standard face image; stanLx represents the set standard luminance component value, with stanLx ∈ [60, 65]; meanLyθ(x, y) and meanLhθ(x, y) represent the average luminance component values within the 3 × 3 neighborhood centered on pixel point (x, y) in the luminance component subgraphs pLy and pLh respectively; num[|θ(x, y) − stanLx| > stanθLx] represents the number of pixel points in the 3 × 3 neighborhood centered on pixel point (x, y) of the luminance component subgraph pLy whose luminance component value differs from the standard luminance component value stanLx by more than the standard luminance gradient value stanθLx, with stanθLx ∈ [5, 7]; Δβ represents the set adjustment factor, with Δβ ∈ [0.05, 0.15];
and reconstructing the preprocessed face image from the luminance-adjusted luminance component subgraph pL′y together with the color component subgraph pay and the color component subgraph pby corresponding to the standard face image.
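The neighborhood-based flicker compensation of claim 1 can be sketched as below. Because the claim's adjustment function itself is given as a formula not reproduced in this text, the blending rule used here is only an assumed illustration of the described idea: pixels whose 3 × 3 neighborhood deviates strongly from the standard luminance are pulled toward the co-located neighborhood mean of the marked (reference) face image.

```python
import numpy as np

def compensate_flicker(l_std, l_ref, stan_l=62.5, stan_theta=6.0,
                       count_thresh=5, d_beta=0.1):
    """Assumed sketch: detect flicker from local deviation statistics in the
    standard image's luminance subgraph and compensate using the reference.
    stan_l in [60, 65], stan_theta in [5, 7] and d_beta in [0.05, 0.15] follow
    the ranges stated in the claim; count_thresh and the blending rule are
    assumptions."""
    l_std = l_std.astype(float)
    out = l_std.copy()
    pad_s = np.pad(l_std, 1, mode="edge")
    pad_r = np.pad(l_ref.astype(float), 1, mode="edge")
    h, w = l_std.shape
    for y in range(h):
        for x in range(w):
            nb_s = pad_s[y:y + 3, x:x + 3]          # 3x3 neighborhood, standard
            num = np.sum(np.abs(nb_s - stan_l) > stan_theta)
            if num >= count_thresh:                  # likely flicker region
                nb_r = pad_r[y:y + 3, x:x + 3]      # same position, reference
                out[y, x] = (1 - d_beta) * l_std[y, x] + d_beta * nb_r.mean()
    return out
```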
2. The intelligent game interaction method according to claim 1, wherein in step S1, matching and acquiring corresponding cutscene data includes:
acquiring corresponding process animation data from a game server, and displaying the acquired cutscene data in real time in a first display area; and/or the number of the groups of groups,
and matching and retrieving corresponding cut scene data from the game storage medium, and displaying the obtained cut scene data in a first display area.
3. The intelligent game interaction method according to claim 1, wherein the cutscene data is set to carry a corresponding unique ID as identity information, wherein a corresponding chat server or chat server unit is set for each cutscene;
in step S2, a communication connection is established with a corresponding chat server or chat server unit according to the identity information of the cut scene data currently matched by the first player user.
4. The intelligent game interaction method according to claim 1, wherein in step S4, the game terminal receives an operation instruction of a player user for the real-time barrage, and sends the operation instruction to the game server, wherein the operation instruction carries player user identity information corresponding to the real-time barrage; the game server sends interaction information corresponding to the operation instruction to the corresponding player user according to the received operation instruction;
the operation instruction comprises team invitation and private chat invitation; the interaction information corresponding to the operation instruction comprises team invitation information and private chat invitation information.
5. The intelligent game interaction method according to claim 1, further comprising:
and S5, after the scene cut animation is played, the game terminal is disconnected from the communication connection with the chat server.
6. An intelligent game interaction system is characterized by comprising a game terminal, a chat server and a game server;
the game terminal is used for matching and acquiring corresponding cut scene data according to the current game progress of the first player user; displaying the matched cut scene data in a first display area;
the game terminal is used for connecting to a chat server corresponding to the identity information of the cutscene according to the identity information of the cutscene after matching the cutscene data;
the game terminal is used for receiving chat contents sent by each player user in the chat server, generating a real-time barrage according to the acquired chat contents, and displaying the generated real-time barrage in a second display area; the game terminal sends the chat content of the first player user to the chat server, and the chat server forwards the chat content to the connected second player user in real time; the chat content carries identity information of the first player user;
after receiving the chat content of the second player user forwarded by the chat server, the game terminal generates a real-time barrage according to the chat content, and displays the generated real-time barrage in a second display area;
the real-time barrage carries user identity information of a second player who sends the barrage;
the game terminal is used for receiving an operation instruction of the first player user for the real-time barrage and sending the operation instruction to the game server, and the game server sends corresponding interaction information to a second player user corresponding to the real-time barrage according to the obtained operation instruction;
the game terminal is further used for acquiring identity information of the first player user, and comprises: the game terminal collects face image data of a first player user and transmits the collected face image data to the game server; the game server performs identity recognition verification according to the acquired face image data, and receives user identity information corresponding to the first player user returned by the game server after the identity recognition verification is passed;
the game server performs identity identification verification according to the acquired face image data, and comprises the following steps:
the server receives the face image acquired and uploaded by the game terminal; the acquired face images comprise a first face image and a second face image which are continuously acquired by the game terminal, wherein the acquisition interval of the first face image and the second face image is less than 1s;
preprocessing according to the acquired face image to obtain a preprocessed face image;
extracting a face part according to the preprocessed face image to obtain a face area image; comprising the following steps: performing edge detection according to the preprocessed face image, and extracting a face part in the image according to the obtained edge information to obtain a face area image;
extracting features according to the obtained face region image to obtain face feature data; comprising the following steps: performing LBP feature extraction according to the acquired face region image, and taking the obtained LBP feature data as face feature data;
matching the obtained face feature data with the face feature data corresponding to each player user prestored in a database to obtain a corresponding identity recognition result and output the corresponding player user identity information; comprising: performing matching verification between the obtained face feature data and the face feature data prestored in the database during each player's registration, and, when face feature data with a similarity greater than the preset standard threshold is matched, extracting the corresponding player user identity information; the game server performs preprocessing according to the acquired face image to obtain a preprocessed face image, which specifically comprises the following steps:
edge detection is respectively carried out on the acquired first face image and second face image, and image alignment is carried out according to the obtained face edge information, so that the positions of faces in the first face image and the second face image in the images are consistent;
respectively converting the first face image and the second face image from the RGB color space to the Lab color space to obtain a luminance component subgraph pL1, a color component subgraph pa1 and a color component subgraph pb1 of the first face image, and a luminance component subgraph pL2, a color component subgraph pa2 and a color component subgraph pb2 of the second face image;
And respectively calculating characteristic parameters of the first face image and the second face image, wherein the adopted characteristic parameter calculation function is as follows:
wherein the variable x = 1, 2 corresponds to the first face image and the second face image respectively; charaYx represents the characteristic parameter; meanLx represents the average luminance component value of the luminance component subgraph; stanLx represents the set standard luminance component value; maxLx and minLx represent the maximum and minimum luminance component values of the luminance component subgraph respectively; stanΔLx represents the set standard luminance variation value; meanTLx represents the average luminance gradient value of the pixel points in the luminance component subgraph; stanθLx represents the set standard luminance gradient value; ωa, ωb and ωc represent the set weight adjustment factors respectively;
comparing the obtained characteristic parameters: if the characteristic parameter charaY1 of the first face image is larger than the characteristic parameter charaY2 of the second face image, marking the second face image as the standard face image py and the first face image as the marked face image ph; otherwise, if the characteristic parameter charaY1 of the first face image is smaller than or equal to the characteristic parameter charaY2 of the second face image, marking the first face image as the standard face image py and the second face image as the marked face image ph;
performing luminance adjustment processing on the luminance component subgraph pLy of the standard face image py, wherein the adopted luminance adjustment function is as follows:
wherein pL′y(x, y) represents the luminance component value of pixel point (x, y) in the standard face image after luminance adjustment; pLy(x, y) represents the luminance component value of pixel point (x, y) in the standard face image; stanLx represents the set standard luminance component value, with stanLx ∈ [60, 65]; meanLyθ(x, y) and meanLhθ(x, y) represent the average luminance component values within the 3 × 3 neighborhood centered on pixel point (x, y) in the luminance component subgraphs pLy and pLh respectively; num[|θ(x, y) − stanLx| > stanθLx] represents the number of pixel points in the 3 × 3 neighborhood centered on pixel point (x, y) of the luminance component subgraph pLy whose luminance component value differs from the standard luminance component value stanLx by more than the standard luminance gradient value stanθLx, with stanθLx ∈ [5, 7]; Δβ represents the set adjustment factor, with Δβ ∈ [0.05, 0.15];
and reconstructing the preprocessed face image from the luminance-adjusted luminance component subgraph pL′y together with the color component subgraph pay and the color component subgraph pby corresponding to the standard face image.
CN202310540842.5A 2023-05-13 2023-05-13 Intelligent game interaction method and system Active CN116570914B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310540842.5A CN116570914B (en) 2023-05-13 2023-05-13 Intelligent game interaction method and system

Publications (2)

Publication Number Publication Date
CN116570914A CN116570914A (en) 2023-08-11
CN116570914B true CN116570914B (en) 2023-12-19

Family

ID=87535331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310540842.5A Active CN116570914B (en) 2023-05-13 2023-05-13 Intelligent game interaction method and system

Country Status (1)

Country Link
CN (1) CN116570914B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080011518A (en) * 2006-07-31 2008-02-05 주식회사 나루엔터테인먼트 System for providing game able to make interaction between different channels for multiple user
JP2015150270A (en) * 2014-02-17 2015-08-24 Line株式会社 Program and server
CN105389491A (en) * 2014-08-28 2016-03-09 凯文·艾伦·杜西 Facial recognition authentication system including path parameters
CN106331832A (en) * 2016-09-14 2017-01-11 腾讯科技(深圳)有限公司 Information display method and information display device
CN109005417A (en) * 2018-08-06 2018-12-14 广州华多网络科技有限公司 Direct broadcasting room access method, system, terminal and the device of game are carried out based on live streaming
CN114245221A (en) * 2021-12-14 2022-03-25 北京达佳互联信息技术有限公司 Interaction method and device based on live broadcast room, electronic equipment and storage medium
CN114432708A (en) * 2022-01-28 2022-05-06 腾讯科技(深圳)有限公司 Cloud game processing method, terminal device and computer program product
CN115193039A (en) * 2022-07-26 2022-10-18 上海幻电信息科技有限公司 Interactive method, device and system of game scenarios



Similar Documents

Publication Publication Date Title
CN108322788B (en) Advertisement display method and device in live video
CN104066003B (en) Method and device for playing advertisement in video
CN107194817B (en) User social information display method and device and computer equipment
US20160367899A1 (en) Multi-Modal Search
CN105635129B (en) Song chorusing method, device and system
CN107638690B (en) Method, device, server and medium for realizing augmented reality
US20140201632A1 (en) Content player
CN113041611A (en) Virtual item display method and device, electronic equipment and readable storage medium
CN111225287A (en) Bullet screen processing method and device, electronic equipment and storage medium
CN112287848A (en) Live broadcast-based image processing method and device, electronic equipment and storage medium
CN111479119A (en) Method, device and system for collecting feedback information in live broadcast and storage medium
CN113996053A (en) Information synchronization method, device, computer equipment, storage medium and program product
CN110335666A (en) Medical image appraisal procedure, device, computer equipment and storage medium
CN117255211A (en) Live broadcast room display method, server side and live broadcast client side
CN110267056B (en) Live broadcast method, device system and computer storage medium
CN113873280A (en) Live wheat-connecting fighting interaction method, system and device and computer equipment
US20160158648A1 (en) Automated selective scoring of user-generated content
CN110324652A (en) Game interaction method and system, electronic equipment and the device with store function
CN116570914B (en) Intelligent game interaction method and system
CN110324653A (en) Game interaction exchange method and system, electronic equipment and the device with store function
CN114307170A (en) Script killing information intelligent interaction system under different use environments
CN111744197B (en) Data processing method, device and equipment and readable storage medium
CN111181839B (en) Data processing method, device and equipment in application sharing
CN114035683B (en) User capturing method, apparatus, device, storage medium and computer program product
US10887629B1 (en) Basketball video interaction method and device, intelligent basketball stand and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant