CN109078319B - Game interface display method and terminal - Google Patents

Game interface display method and terminal

Info

Publication number
CN109078319B
CN109078319B
Authority
CN
China
Prior art keywords
screen
game
interface
displaying
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810843553.1A
Other languages
Chinese (zh)
Other versions
CN109078319A (en)
Inventor
陈立东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810843553.1A
Publication of CN109078319A
Application granted
Publication of CN109078319B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/85: Providing additional services to players
    • A63F13/87: Communicating with other players during game play, e.g. by e-mail or chat
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features of games characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/30: Features of games characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308: Details of the user interface
    • A63F2300/50: Features of games characterized by details of game servers
    • A63F2300/57: Features of games characterized by details of game services offered to the player
    • A63F2300/572: Communication between players during game play of non game information, e.g. e-mail, chat, file transfer, streaming of audio and streaming of video

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention provides a game interface display method and a terminal. The terminal comprises a first screen and a second screen, and the method comprises the following steps: receiving an operation input by a user while a game application program is running; collecting fingerprint feature information of the user in response to the operation; and, in the case that the collected fingerprint feature information matches pre-stored fingerprint feature information, displaying a game interface corresponding to the game application program on the first screen and displaying a game chat interface on the second screen. In this way, the game interface and the game chat interface occupy separate screens, so the game interface is not occluded by chat elements and its display effect is good.

Description

Game interface display method and terminal
Technical Field
The invention relates to the technical field of communication, in particular to a game interface display method and a terminal.
Background
A user can play a game on the terminal and may need to communicate with other players while doing so. In the prior art, the user can communicate with others by text or by voice.
For example, in the text mode, a text input box and a soft keyboard are displayed within the game interface, and the user enters chat text through the soft keyboard. In the voice mode, a voice input button is displayed in the game interface, and the user long-presses the button while speaking to input voice chat content. In either case, the text input box, soft keyboard, voice input button, or chat content occludes part of the game interface; if the user misses what is happening in the occluded area, the game may well be lost. The display effect of the game interface in the prior art is therefore poor.
Disclosure of Invention
The embodiments of the invention provide a game interface display method and a terminal, aiming to solve the prior-art problem of the poor display effect of the game interface.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a game interface display method, which is applied to a terminal, where the terminal includes a first screen and a second screen, and the method includes:
receiving an operation input by a user while the game application program is in a running state;
collecting fingerprint feature information of the user in response to the operation input by the user;
and, in the case that the collected fingerprint feature information matches pre-stored fingerprint feature information, displaying a game interface corresponding to the game application program on the first screen and displaying a game chat interface on the second screen.
In a second aspect, an embodiment of the present invention further provides a terminal, where the terminal includes a first screen and a second screen, and includes:
a receiving module, configured to receive an operation input by a user while the game application program is in a running state;
a collection module, configured to collect fingerprint feature information of the user in response to the operation input by the user;
and a first display module, configured to display a game interface corresponding to the game application program on the first screen and a game chat interface on the second screen in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information.
In a third aspect, an embodiment of the present invention further provides a terminal, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor, where the computer program, when executed by the processor, implements the steps of the game interface display method described above.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the game interface display method are implemented.
Thus, in the embodiments of the invention, the game interface corresponding to the game application program is displayed on the first screen while the game chat interface is displayed on the second screen. Occlusion of the game interface is thereby avoided, and its display effect is good.
Drawings
FIG. 1 is a flow chart of a method for displaying a game interface according to an embodiment of the present invention;
fig. 2 is a schematic diagram of two screens included in the terminal according to the embodiment of the present invention;
FIG. 3 is a first schematic diagram of an interface after a game application has been launched as provided by an embodiment of the invention;
FIG. 4 is a diagram illustrating a text chat interface displayed on a second screen according to an embodiment of the invention;
FIG. 5 is a diagram illustrating a voice chat interface displayed on a second screen according to an embodiment of the invention;
FIG. 6 is a second schematic diagram of an interface after a game application has been launched as provided by an embodiment of the invention;
FIG. 7 is a first diagram illustrating a video chat interface, according to an embodiment of the invention;
FIG. 8 is a third schematic diagram of an interface after a gaming application has been launched as provided by an embodiment of the invention;
FIG. 9 is a second diagram illustrating a video chat interface provided by embodiments of the invention;
FIG. 10 is a flow chart of another method for displaying a game interface according to an embodiment of the present invention;
FIG. 11 is a diagram illustrating a game interface and a text chat interface after rotation of a terminal according to an embodiment of the invention;
FIG. 12 is a diagram illustrating a game interface and a voice chat interface after rotation of a terminal according to an embodiment of the invention;
FIG. 13 is a flow chart of another method for displaying a game interface according to an embodiment of the present invention;
FIG. 14 is a diagram of displaying game chat content on a first screen according to an embodiment of the invention;
FIG. 15 is a schematic diagram of a game chat interface including game chat content displayed in a second screen according to an embodiment of the invention;
FIG. 16 is a flow chart of another method for displaying a game interface according to an embodiment of the present invention;
FIG. 17 is a diagram illustrating a game chat control displayed on a first screen in accordance with an embodiment of the invention;
FIG. 18 is a schematic diagram of a game chat interface including a game chat control displayed in a second screen according to an embodiment of the invention;
fig. 19 is a block diagram of a terminal according to an embodiment of the present invention;
fig. 20 is a block diagram of another terminal provided in an embodiment of the present invention;
fig. 21 is a block diagram of another terminal provided in an embodiment of the present invention;
fig. 22 is a block diagram of another terminal provided in an embodiment of the present invention;
fig. 23 is a block diagram of another terminal provided in an embodiment of the present invention;
fig. 24 is a block diagram of another terminal provided in an embodiment of the present invention;
fig. 25 is a schematic diagram of a hardware structure of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a game interface display method provided by an embodiment of the present invention, and is applied to a terminal, where the terminal includes a first screen and a second screen. As shown in fig. 1, the method comprises the following steps:
step 101, receiving an operation input by a user when the game application program is in a running state.
In step 101, the terminal in the embodiment of the present invention may include two screens, i.e., a first screen and a second screen, as shown in fig. 2, a schematic diagram of the two screens included in the terminal. The screen in the vertical (portrait) orientation may serve as the first screen, and the screen in the horizontal (landscape) orientation as the second screen.
When the game application program is started, the game interface corresponding to it may be displayed on the first screen, and the "left hand manipulation area" and "right hand manipulation area" corresponding to it may be displayed on the second screen, as shown in fig. 3, a first schematic view of the interface after the game application is launched.
If the user wants to communicate with teammates during game play, the user slides a finger on the first screen or the second screen; the terminal thereby receives the operation input by the user.
Step 102, collecting fingerprint feature information of the user in response to the operation input by the user.
In step 102, fingerprint feature information of the user may be collected in response to the operation input by the user.
Step 103, in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information, displaying a game interface corresponding to the game application program on the first screen and displaying a game chat interface on the second screen.
In step 103, after the fingerprint feature information of the user is collected, it may be compared with the pre-stored fingerprint feature information. Only in the case of a match, i.e., only when a specific finger slides on the first screen or the second screen, is the game interface corresponding to the game application program displayed on the first screen and the game chat interface displayed on the second screen. This effectively prevents misoperation from triggering the chat interface.
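As a concrete illustration, the matching gate of steps 101 to 103 can be sketched in Python. This is a minimal sketch only: the `Screen` and `Terminal` classes and the equality-based fingerprint comparison are simplified stand-ins for the terminal's real sensor and display stack, not part of the patent.

```python
class Screen:
    """Minimal stand-in for one of the terminal's two display panels."""
    def __init__(self):
        self.content = None

    def show(self, content):
        self.content = content


class Terminal:
    """Dual-screen terminal with a toy fingerprint sensor."""
    def __init__(self, game_running=True):
        self.game_running = game_running
        self.first_screen = Screen()
        self.second_screen = Screen()

    def collect_fingerprint(self, operation):
        # Stand-in for the sensor: read the finger's feature data
        # directly from the touch operation.
        return operation.get("fingerprint")


def handle_operation(terminal, operation, stored_fingerprint):
    """Steps 101-103: receive the operation, collect the fingerprint,
    and split the display across both screens only on a match."""
    if not terminal.game_running:              # step 101 precondition
        return False
    collected = terminal.collect_fingerprint(operation)   # step 102
    if collected != stored_fingerprint:        # mismatched finger:
        return False                           # no-op avoids misoperation
    terminal.first_screen.show("game interface")          # step 103
    terminal.second_screen.show("game chat interface")
    return True
```

A mismatched finger leaves both screens untouched, which is exactly the misoperation guard the step describes.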
It should be noted that the user may communicate with teammates in three chat modes: a text chat mode, a voice chat mode, and a video chat mode. Several gesture operations may be predefined, each gesture operation uniquely corresponding to one chat mode.
For example, in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches a preset first operation, the text chat mode may be invoked and a text chat interface displayed on the second screen. The text chat interface comprises at least a text input box and a text chat content display area, as shown in fig. 4, a schematic diagram of the text chat interface displayed on the second screen.
Alternatively, in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches a preset second operation, the voice chat mode may be invoked and a voice chat interface displayed on the second screen. The voice chat interface comprises at least a voice input button and a voice chat content display area, as shown in fig. 5, a schematic diagram of the voice chat interface displayed on the second screen.
Alternatively, the game chat interface may be a video chat interface. In the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches a preset third operation, a face image is detected through the camera of the terminal, and the video chat interface is displayed on the second screen according to the detection result.
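The gesture-to-mode mapping described above amounts to a small dispatch table. In this sketch the gesture labels are hypothetical names for the preset first, second, and third operations; the patent only requires that each predefined gesture uniquely correspond to one chat mode.

```python
# Hypothetical labels for the preset operations (not from the patent).
GESTURE_TO_CHAT_MODE = {
    "first_operation": "text chat interface",
    "second_operation": "voice chat interface",
    "third_operation": "video chat interface",
}


def select_chat_interface(fingerprint_matches, gesture):
    """Return the interface to display on the second screen, or None
    when the fingerprint does not match or the gesture is not one of
    the predefined operations."""
    if not fingerprint_matches:
        return None
    return GESTURE_TO_CHAT_MODE.get(gesture)
```

Because the fingerprint check gates the lookup, an unrecognized finger or an unregistered gesture both leave the display unchanged.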
For example, as shown in FIG. 6, a second schematic view of the interface after the game application is launched. In fig. 6, a screen in a vertical state may be taken as the second screen, and a screen in a horizontal state may be taken as the first screen. The camera of the terminal may be located on the second screen. After the game application is started, a game interface corresponding to the game application can be displayed on the second screen, and a "left hand manipulation area" and a "right hand manipulation area" corresponding to the game application can be displayed on the first screen. If the user slides on the first screen or the second screen by using a finger at the moment, the terminal can collect fingerprint characteristic information of the user. And detecting the face image through a camera of the terminal under the condition that the acquired fingerprint feature information is matched with the pre-stored fingerprint feature information and the operation input by the user is matched with the preset third operation.
Because the camera of the terminal is located on the second screen, i.e., the screen facing the user's face, the camera can acquire the face image. In the case that the camera detects the face image, the video chat interface is displayed on the second screen and the game interface corresponding to the game application program on the first screen. As shown in fig. 7, a first diagram of a video chat interface, the names of the user's teammates, e.g., "user A", "user B", and "user D", are displayed on the second screen. If the user taps the "user D" area, the face picture of user D is displayed full screen on the second screen. The second screen may also comprise a chat content display area, in which text or voice chat content sent by the user's teammates is displayed.
There is another case. For example, as shown in FIG. 8, a third schematic view of the interface after the game application is launched. In fig. 8, a screen in a vertical state may be taken as a first screen, and a screen in a horizontal state may be taken as a second screen. The camera of the terminal may be located on the second screen. After the game application is started, a game interface corresponding to the game application can be displayed on the first screen, and a "left hand manipulation area" and a "right hand manipulation area" corresponding to the game application can be displayed on the second screen. If the user slides on the first screen or the second screen by using a finger at the moment, the terminal can collect fingerprint characteristic information of the user. And detecting the face image through a camera of the terminal under the condition that the acquired fingerprint feature information is matched with the pre-stored fingerprint feature information and the operation input by the user is matched with the preset third operation.
At this time, the camera of the terminal is located on the second screen, which is in the horizontal state, so the camera cannot acquire a face image. In the case that the camera does not detect the face image, the terminal outputs a rotation prompt, and the user can rotate the terminal accordingly. Once the terminal has rotated and the camera detects the face image, the video chat interface is displayed on the second screen and the game interface corresponding to the game application program on the first screen. As shown in fig. 9, a second diagram of a video chat interface, the names of the user's teammates, e.g., "user A", "user B", and "user C", are displayed on the second screen. If the user taps the "user A" area, the face picture of user A is displayed full screen on the second screen. The second screen may also comprise a chat content display area, in which text or voice chat content sent by the user's teammates is displayed.
In an embodiment of the present invention, the terminal may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
The game interface display method provided by the embodiment of the invention is applied to a terminal comprising a first screen and a second screen: an operation input by the user is received while the game application program is in a running state; fingerprint feature information of the user is collected in response to the operation; and, in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information, the game interface corresponding to the game application program is displayed on the first screen and the game chat interface on the second screen. Occlusion of the game interface is thereby avoided, and its display effect is good.
Referring to fig. 10, fig. 10 is a flowchart of another game interface display method provided by an embodiment of the present invention, and the game interface display method is applied to a terminal, where the terminal includes a first screen and a second screen. As shown in fig. 10, the method comprises the following steps:
step 1001, receiving an operation input by a user when the game application is in an operating state.
In step 1001, the terminal in the embodiment of the present invention may include two screens, namely, a first screen and a second screen, and still taking fig. 2 as an example, as shown in fig. 2, a screen in a vertical state may be used as the first screen, and a screen in a horizontal state may be used as the second screen.
The game application program can be started, the game interface corresponding to the game application program can be displayed on the first screen, and the left hand control area and the right hand control area corresponding to the game application program can be displayed on the second screen. Still taking fig. 3 as an example, as shown in fig. 3, it is a first schematic diagram of an interface after the game application is started.
If the user wants to communicate with teammates during game play, the user can slide on the first screen or the second screen with a finger. At this time, the terminal receives the operation input by the user.
Step 1002, collecting fingerprint feature information of the user in response to the operation input by the user.
In step 1002, fingerprint feature information of a user may be collected in response to an operation input by the user.
Step 1003, in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches a preset first operation, displaying a game interface corresponding to the game application program on the first screen and displaying a text chat interface on the second screen, the text chat interface comprising at least a text input box and a text chat content display area.
In step 1003, it should be noted that the user may communicate with teammates in three chat modes: a text chat mode, a voice chat mode, and a video chat mode. Several gesture operations may be predefined, each gesture operation uniquely corresponding to one chat mode.
After the fingerprint feature information of the user is collected, it may be compared with the pre-stored fingerprint feature information. In the case of a match, and in the case that the operation input by the user matches the preset first operation, the text chat mode may be invoked: the game interface corresponding to the game application program is displayed on the first screen, and the text chat interface on the second screen. The text chat interface comprises at least a text input box and a text chat content display area; still taking fig. 4 as an example, fig. 4 shows the text chat interface displayed on the second screen.
Step 1004, in the case that the operation input by the user matches a preset second operation, displaying a voice chat interface on the second screen, the voice chat interface comprising at least a voice input button and a voice chat content display area.
In step 1004, in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches the preset second operation, the voice chat mode may be invoked and a voice chat interface displayed on the second screen. The voice chat interface comprises at least a voice input button and a voice chat content display area; still taking fig. 5 as an example, fig. 5 shows the voice chat interface displayed on the second screen.
Step 1005, the game chat interface being a video chat interface, in the case that the operation input by the user matches a preset third operation, detecting a face image through a camera of the terminal, and displaying the video chat interface on the second screen according to the detection result.
In step 1005, the game chat interface may be a video chat interface. In the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches the preset third operation, a face image is detected through the camera of the terminal, and the video chat interface is displayed on the second screen according to the detection result.
Optionally, displaying the video chat interface on the second screen according to the detection result, including:
displaying the video chat interface on the second screen under the condition that the face image is detected;
under the condition that the face image is not detected, outputting a rotation prompt;
and displaying the video chat interface on the second screen under the condition that the spatial posture of the terminal is detected to be changed and the camera detects the face image.
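The three branches listed above can be sketched as a loop over successive camera checks. The boolean event stream here is an illustrative stand-in for the camera's face-detection results, not a real camera API.

```python
def video_chat_display_flow(face_checks):
    """Walk the detection-result branches: show the video chat
    interface if a face is detected at once; otherwise output a
    rotation prompt and wait until a later check (after the spatial
    posture changes) sees a face."""
    actions = []
    checks = iter(face_checks)
    # Branch 1: face detected on the initial check.
    if next(checks, False):
        actions.append("display video chat on second screen")
        return actions
    # Branch 2: no face detected, so prompt the user to rotate.
    actions.append("output rotation prompt")
    # Branch 3: posture changed and a later check sees the face.
    for face_detected in checks:
        if face_detected:
            actions.append("display video chat on second screen")
            break
    return actions
```

The flow mirrors the fig. 8/fig. 9 scenario: when the camera sits on the horizontally held screen, the first check fails, the prompt is shown, and the interface appears only after the rotation brings the camera to face the user.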
For example, still taking fig. 6 as an example, in fig. 6, the screen in the vertical state may be taken as the second screen, and the screen in the horizontal state may be taken as the first screen. The camera of the terminal may be located on the second screen. After the game application is started, a game interface corresponding to the game application can be displayed on the second screen, and a "left hand manipulation area" and a "right hand manipulation area" corresponding to the game application can be displayed on the first screen. If the user slides on the first screen or the second screen by using a finger at the moment, the terminal can collect fingerprint characteristic information of the user. And detecting the face image through a camera of the terminal under the condition that the acquired fingerprint feature information is matched with the pre-stored fingerprint feature information and the operation input by the user is matched with the preset third operation.
Because the camera of the terminal is located on the second screen, namely the camera of the terminal is located on the screen facing the face, the camera can acquire the face image. In the case that the camera detects the face image, the video chat interface can be displayed on the second screen, and the game interface corresponding to the game application program can be displayed on the first screen. Still taking fig. 7 as an example, as shown in fig. 7, a first schematic diagram of a video chat interface is shown. In fig. 7, the names of the teammates of the users, i.e., "user a", "user b", and "user d", etc., are displayed on the second screen. If the user clicks the 'user D' area, a face picture of the user D can be displayed in full screen on the second screen. The second screen can also comprise a chat content display area, and text chat content or voice chat content sent by teammates of the user can be displayed in the chat content display area.
There is another case. For example, still taking fig. 8 as an example, in fig. 8, the screen in the vertical state may be taken as the first screen, and the screen in the horizontal state may be taken as the second screen. The camera of the terminal may be located on the second screen. After the game application is started, a game interface corresponding to the game application can be displayed on the first screen, and a "left hand manipulation area" and a "right hand manipulation area" corresponding to the game application can be displayed on the second screen. If the user slides on the first screen or the second screen by using a finger at the moment, the terminal can collect fingerprint characteristic information of the user. And detecting the face image through a camera of the terminal under the condition that the acquired fingerprint feature information is matched with the pre-stored fingerprint feature information and the operation input by the user is matched with the preset third operation.
At this time, the camera of the terminal is located on the second screen in the horizontal state, so the camera cannot acquire a face image. In the case that the camera does not detect the face image, the terminal can output a rotation prompt, and the user can then rotate the terminal according to the rotation prompt. When it is detected that the spatial posture of the terminal has changed and the camera detects the face image, that is, after the terminal has been rotated so that the camera can see the face, the video chat interface can be displayed on the second screen, and the game interface corresponding to the game application program can be displayed on the first screen. Fig. 9 shows a second schematic diagram of the video chat interface. In fig. 9, the names of the user's teammates, i.e., "user A", "user B", "user C", etc., are displayed on the second screen. If the user clicks the "user A" area, a face picture of user A may be displayed in full screen on the second screen. The second screen can also comprise a chat content display area, in which text chat content or voice chat content sent by the user's teammates can be displayed.
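The face-detection branch above can be sketched as a small decision function. This is only an illustrative model of the described flow; the names `face_detected`, `posture_changed`, and the returned action strings are assumptions, not the terminal's actual API.

```python
def video_chat_decision(face_detected, posture_changed):
    """Decide the next step after the third operation has been verified."""
    if face_detected:
        # Camera sees the face: show video chat on the second screen,
        # and the game interface on the first screen.
        return {"second_screen": "video_chat_interface",
                "first_screen": "game_interface"}
    if posture_changed:
        # The terminal was rotated; the camera should re-check for a face.
        return {"action": "redetect_face"}
    # No face and no rotation yet: prompt the user to rotate the terminal.
    return {"action": "output_rotation_prompt"}
```

The caller would loop: after outputting the rotation prompt, it waits for a posture change, re-runs face detection, and calls the function again with the new result.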
Optionally, after the step of displaying the text chat interface on the second screen, the method further includes:
under the condition that a change in the spatial posture of the terminal is detected, displaying the text chat interface in the first screen, and displaying the game interface in the second screen;
alternatively, after the step of displaying the voice chat interface on the second screen, the method further includes:
under the condition that a change in the spatial posture of the terminal is detected, displaying the voice chat interface in the first screen, and displaying the game interface in the second screen.
Further, when it is detected that the spatial posture of the terminal has changed, the text chat interface can be displayed in the first screen, and the game interface can be displayed in the second screen. Taking fig. 4 as an example, in fig. 4 a game interface is displayed in the first screen and a text chat interface is displayed in the second screen. Suppose a change in the spatial posture of the terminal is detected; for example, the first screen rotates to the horizontal position and the second screen rotates to the vertical position. If the game interface were still displayed in the first screen and the text chat interface were still displayed in the second screen, it would be inconvenient for the user to control the game or input text in the text chat interface. Therefore, the screens on which the game interface and the text chat interface are displayed can be exchanged, that is, the text chat interface can be displayed in the first screen, and the game interface can be displayed in the second screen. Fig. 11 is a schematic diagram of the game interface and the text chat interface after the terminal has been rotated.
Alternatively, in the case that a change in the spatial posture of the terminal is detected, the voice chat interface may be displayed in the first screen, and the game interface may be displayed in the second screen. Taking fig. 5 as an example, in fig. 5 a game interface is displayed in the first screen and a voice chat interface is displayed in the second screen. Suppose a change in the spatial posture of the terminal is detected; for example, the first screen rotates to the horizontal position and the second screen rotates to the vertical position. If the game interface were still displayed in the first screen and the voice chat interface were still displayed in the second screen, it would be inconvenient for the user to control the game or record voice in the voice chat interface. Therefore, the screens on which the game interface and the voice chat interface are displayed can be exchanged, that is, the voice chat interface can be displayed in the first screen, and the game interface can be displayed in the second screen. Fig. 12 is a schematic diagram of the game interface and the voice chat interface after the terminal has been rotated. In the case that a change in the spatial posture of the terminal is detected, the screens on which the game interface and the text chat interface, or the game interface and the voice chat interface, are displayed can be exchanged. This makes it convenient for the user both to control the game and to input chat content.
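The exchange on rotation amounts to swapping the contents of the two screens. A minimal sketch, with hypothetical dictionary keys standing in for the display subsystem:

```python
def swap_on_rotation(layout):
    """Exchange the interfaces shown on the first and second screens
    when a change in the terminal's spatial posture is detected."""
    return {"first_screen": layout["second_screen"],
            "second_screen": layout["first_screen"]}

# Before rotation: game on the first screen, text chat on the second.
before = {"first_screen": "game_interface",
          "second_screen": "text_chat_interface"}
after = swap_on_rotation(before)
# After rotation, the text chat interface is on the first screen.
```

The same swap applies unchanged when the second screen holds the voice chat interface instead.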
The game interface display method provided by the embodiment of the present invention is applied to a terminal, and the terminal comprises a first screen and a second screen. A game interface corresponding to the game application program can be displayed on the first screen, and a game chat interface can be displayed on the second screen. Occlusion of the game interface can thus be avoided, and the display effect of the game interface is good.
Referring to fig. 13, fig. 13 is a flowchart of another game interface display method provided by an embodiment of the present invention, and the game interface display method is applied to a terminal, where the terminal includes a first screen and a second screen. As shown in fig. 13, the method comprises the following steps:
Step 1301, displaying game chat content on the first screen when the game application program is in a running state.
In step 1301, the terminal in the embodiment of the present invention may include two screens, namely, a first screen and a second screen, and still taking fig. 2 as an example, as shown in fig. 2, a screen in a vertical state may be used as the first screen, and a screen in a horizontal state may be used as the second screen.
The game application program can be started, the game interface corresponding to the game application program can be displayed on the first screen, and the left hand control area and the right hand control area corresponding to the game application program can be displayed on the second screen. Taking fig. 3 as an example, fig. 3 is a first schematic diagram of the interface after the game application is started. If a teammate of the user sends game chat content to the user, the game chat content may be displayed on the first screen. Suppose the game chat content is "How is the situation?". Fig. 14 is a schematic diagram of the game chat content displayed on the first screen.
Because the game chat content "How is the situation?" occludes the game interface, the user can slide on the first screen or the second screen with a finger.
Step 1302, receiving an operation input by a user.
In step 1302, the user slides on the first screen or the second screen with a finger, and the terminal may receive an operation input by the user.
Step 1303, collecting fingerprint feature information of the user in response to the operation input by the user.
In step 1303, in response to the operation input by the user, fingerprint feature information of the user may be collected.
Step 1304, in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches a preset fourth operation, displaying a game interface corresponding to the game application program on the first screen, and identifying the game chat content.
In step 1304, after the fingerprint feature information of the user is collected, the collected fingerprint feature information may be compared with the pre-stored fingerprint feature information. In the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches the preset fourth operation, a game interface corresponding to the game application program can be displayed on the first screen, and the game chat content, namely "How is the situation?", can be identified by using an Artificial Intelligence (AI) recognition technology.
Step 1305, deleting the game chat content in the first screen, and displaying a game chat interface containing the game chat content in the second screen.
In step 1305, after the game chat content "How is the situation?" is identified, the game chat content displayed in the first screen can be deleted by an image processing technique, and a game chat interface containing the game chat content "How is the situation?" can then be displayed in the second screen. Fig. 15 is a schematic diagram of the game chat interface containing the game chat content displayed in the second screen. Therefore, when the game chat content occludes the game interface, the game chat content can be identified by using the artificial intelligence recognition technology, and the game chat content occluding the game interface can then be deleted and displayed on a screen different from the game interface. The influence of the game chat content on the game interface can thus be reduced.
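Steps 1304-1305 can be sketched as a recognize-delete-redisplay pipeline. The recognizer is passed in as a callable because the patent does not name a concrete AI recognition technology; all identifiers and the sample message here are illustrative assumptions.

```python
def relocate_chat_content(first_screen, second_screen, recognize):
    """Identify occluding chat text, delete it from the game screen,
    and display it in a chat interface on the other screen."""
    text = recognize(first_screen["chat_overlay"])        # AI recognition step
    first_screen = {**first_screen, "chat_overlay": None} # delete from first screen
    second_screen = {**second_screen,
                     "game_chat_interface": [text]}       # redisplay on second screen
    return first_screen, second_screen

# A stand-in recognizer that "reads" the overlay as text.
fake_recognizer = lambda overlay: overlay["text"]
first = {"view": "game_interface",
         "chat_overlay": {"text": "How is the situation?"}}
second = {"view": "control_areas"}
first, second = relocate_chat_content(first, second, fake_recognizer)
```

After the call, the first screen shows only the unoccluded game interface, and the chat interface on the second screen carries the recognized message.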
The game interface display method provided by the embodiment of the present invention is applied to a terminal, and the terminal comprises a first screen and a second screen. When the game chat content occludes the game interface, the artificial intelligence recognition technology can be used to identify the game chat content, and the game chat content occluding the game interface can then be deleted and displayed on a screen different from the game interface. The influence of the game chat content on the game interface can be reduced, and the display effect of the game interface is good.
Referring to fig. 16, fig. 16 is a flowchart of another game interface display method provided by an embodiment of the present invention, and the game interface display method is applied to a terminal, where the terminal includes a first screen and a second screen. As shown in fig. 16, the method comprises the following steps:
Step 1601, displaying a game chat control on the first screen when the game application program is in a running state, wherein the game chat control comprises a text input box and a text chat content display area, or the game chat control comprises a voice input button and a voice chat content display area.
In step 1601, the terminal in the embodiment of the present invention may include two screens, i.e., a first screen and a second screen, and still taking fig. 2 as an example, as shown in fig. 2, a screen in a vertical state may be used as the first screen, and a screen in a horizontal state may be used as the second screen.
The game application program can be started, the game interface corresponding to the game application program can be displayed on the first screen, and the left hand control area and the right hand control area corresponding to the game application program can be displayed on the second screen. Taking fig. 3 as an example, fig. 3 is a first schematic diagram of the interface after the game application is started. A game chat control can be displayed on the first screen. The game chat control can comprise a text input box and a text chat content display area, or the game chat control can comprise a voice input button and a voice chat content display area. Fig. 17 is a schematic diagram of the game chat control displayed on the first screen. In fig. 17, the game chat control comprises a text input box and a text chat content display area.
At this time, the game chat control may occlude the game interface, so the game chat control displayed in the first screen can be identified by using the artificial intelligence recognition technology described in the previous embodiment. The user may slide down on the first screen with a finger.
Step 1602, receive an operation input by a user.
In step 1602, the user slides down on the first screen with a finger, and the terminal may receive an operation input by the user.
Step 1603, in response to the operation input by the user, collecting fingerprint feature information of the user.
In step 1603, fingerprint feature information of the user may be collected in response to an operation input by the user.
Step 1604, in the case that the collected fingerprint feature information matches pre-stored fingerprint feature information and the operation input by the user matches a preset fifth operation, displaying a game interface corresponding to the game application program on the first screen, deleting the game chat control displayed in the first screen, and displaying a game chat interface containing the game chat control in the second screen.
In step 1604, after the fingerprint feature information of the user is collected, the collected fingerprint feature information may be compared with the pre-stored fingerprint feature information. In the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information and the operation input by the user matches the preset fifth operation, a game interface corresponding to the game application program can be displayed on the first screen. The game chat control displayed in the first screen can be deleted, and a game chat interface containing the game chat control, that is, a game chat interface containing the text input box and the text chat content display area, can be displayed in the second screen. Fig. 18 is a schematic diagram of the game chat interface containing the game chat control displayed in the second screen. Therefore, when the game chat control occludes the game interface, the game chat control can be identified by using the artificial intelligence recognition technology, the game chat control occluding the game interface can be deleted, and the game chat control can be displayed on a screen different from the game interface. The influence of the game chat control on the game interface can be reduced.
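Step 1604's double check (a fingerprint match plus the preset fifth operation) followed by moving the chat control can be sketched as follows. The gesture string and the screen model are assumptions made for illustration only.

```python
def handle_fifth_operation(fingerprint_ok, gesture, first_screen):
    """If both conditions hold, remove the chat control from the first screen
    and return the layout with the control rebuilt on the second screen."""
    if not (fingerprint_ok and gesture == "slide_down_on_first_screen"):
        return None  # conditions not met: leave the display unchanged
    control = first_screen.pop("game_chat_control")  # delete from first screen
    return {"first_screen": first_screen,
            "second_screen": {"game_chat_interface": control}}

screen = {"view": "game_interface",
          "game_chat_control": ["text_input_box", "text_chat_content_area"]}
layout = handle_fifth_operation(True, "slide_down_on_first_screen", screen)
```

Returning `None` when either condition fails mirrors the embodiment's behavior of changing nothing unless both the fingerprint and the gesture match.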
The game interface display method provided by the embodiment of the present invention is applied to a terminal, and the terminal comprises a first screen and a second screen. When the game chat control occludes the game interface, the artificial intelligence recognition technology can be used to identify the game chat control, the game chat control occluding the game interface can be deleted, and the game chat control can be displayed on a screen different from the game interface. The influence of the game chat control on the game interface can be reduced, and the display effect of the game interface is good.
Referring to fig. 19, fig. 19 is a structural diagram of a terminal provided by an embodiment of the present invention, the terminal including a first screen and a second screen. As shown in fig. 19, the terminal 1900 includes a receiving module 1901, an acquiring module 1902, and a first display module 1903, wherein:
a receiving module 1901, configured to receive an operation input by a user when the game application is in an operating state;
an acquiring module 1902, configured to acquire fingerprint feature information of a user in response to an operation input by the user;
a first display module 1903, configured to display a game interface corresponding to the game application on the first screen and display a game chat interface on the second screen when the acquired fingerprint feature information matches with pre-stored fingerprint feature information.
Optionally, as shown in fig. 20, the first display module 1903 includes:
a first display sub-module 19031, configured to display a text chat interface on the second screen when the operation input by the user matches a preset first operation, where the text chat interface at least includes a text input box and a text chat content display area;
or, the second display sub-module 19032 is configured to display a voice chat interface on the second screen in the case that the operation input by the user matches a preset second operation, where the voice chat interface at least includes a voice input button and a voice chat content display area;
or, the third display sub-module 19033 is configured to detect a face image through a camera of the terminal when the game chat interface is a video chat interface and the operation input by the user is matched with a preset third operation, and display the video chat interface on the second screen according to a detection result.
Optionally, as shown in fig. 21, the third display sub-module 19033 includes:
a first display unit 190331 for displaying the video chat interface on the second screen in a case where a face image is detected;
an output unit 190332, configured to output a rotation prompt in the case that a face image is not detected;
a second display unit 190333, configured to display the video chat interface on the second screen when the spatial posture of the terminal is detected to change and the camera detects a face image.
Optionally, as shown in fig. 22, the terminal further includes:
a second display module 1904, configured to display game chat content on the first screen;
the first display module 1903 further includes:
an identifying submodule 19034, configured to identify the game chat content when the operation input by the user matches a preset fourth operation;
a deleting submodule 19035, configured to delete the game chat content in the first screen, and display a game chat interface including the game chat content in the second screen.
Optionally, as shown in fig. 23, the terminal further includes:
a third display module 1905, configured to display a game chat control on the first screen, where the game chat control includes a text input box and a text chat content display area, or the game chat control includes a voice input button and a voice chat content display area;
the first display module 1903 is specifically configured to delete the game chat control displayed in the first screen and display a game chat interface including the game chat control in the second screen when the operation input by the user is matched with a preset fifth operation.
Optionally, as shown in fig. 24, the terminal further includes:
a fourth display module 1906, configured to display the text chat interface in the first screen and display the game interface in the second screen when detecting that the spatial posture of the terminal changes;
a fifth display module 1907, configured to display the voice chat interface in the first screen and display the game interface in the second screen when detecting that the spatial posture of the terminal changes.
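The module decomposition of figs. 19-24 can be mirrored as a small class, purely as an illustrative sketch; the module numbers appear only in comments, and the method names and data model are assumptions rather than the patent's implementation.

```python
class Terminal1900:
    """Hypothetical model of the receiving / acquiring / display modules."""

    def __init__(self, prestored_fingerprint):
        self.prestored_fingerprint = prestored_fingerprint
        self.operation = None

    def receive(self, operation):
        # Receiving module 1901: accept the user's input operation.
        self.operation = operation

    def acquire_fingerprint(self):
        # Acquiring module 1902: collect the fingerprint feature information
        # carried by the operation.
        return self.operation.get("fingerprint")

    def display(self):
        # First display module 1903: on a fingerprint match, show the game
        # interface on the first screen and the chat interface on the second.
        if self.acquire_fingerprint() == self.prestored_fingerprint:
            return {"first_screen": "game_interface",
                    "second_screen": "game_chat_interface"}
        return None

terminal = Terminal1900("fp-001")
terminal.receive({"fingerprint": "fp-001", "gesture": "slide"})
result = terminal.display()
```

The sub-modules 19031-19035 would then be dispatch branches inside `display`, keyed on which preset operation the gesture matches.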
The terminal 1900 can implement each process implemented by the terminal in the method embodiments of fig. 1, fig. 10, fig. 13, and fig. 16, and is not described herein again to avoid repetition. In addition, the terminal 1900 according to the embodiment of the present invention can display a game interface corresponding to the game application on the first screen, and can display a game chat interface on the second screen. Occlusion of the game interface can thus be avoided, and the display effect of the game interface is good.
Fig. 25 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention.
The terminal 2500 includes, but is not limited to: radio frequency unit 2501, network module 2502, audio output unit 2503, input unit 2504, sensor 2505, display unit 2506, user input unit 2507, interface unit 2508, memory 2509, processor 2510, and power supply 2511. Those skilled in the art will appreciate that the terminal configuration shown in fig. 25 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
A processor 2510 for receiving an operation input by a user when the game application is in a running state;
collecting fingerprint feature information of the user in response to the operation input by the user;
and in the case that the collected fingerprint feature information matches the pre-stored fingerprint feature information, displaying a game interface corresponding to the game application program on the first screen, and displaying a game chat interface on the second screen.
The game interface corresponding to the game application program can be displayed on the first screen, and the game chat interface can be displayed on the second screen. Occlusion of the game interface can thus be avoided, and the display effect of the game interface is good.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 2501 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 2510; in addition, the uplink data is transmitted to the base station. In general, radio frequency unit 2501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 2501 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 2502, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 2503 may convert audio data received by the radio frequency unit 2501 or the network module 2502 or stored in the memory 2509 into an audio signal and output as sound. Also, the audio output unit 2503 may also provide audio output related to a specific function performed by the terminal 2500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 2503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 2504 is used to receive an audio or video signal. The input unit 2504 may include a Graphics Processing Unit (GPU) 25041 and a microphone 25042. The graphics processor 25041 processes image data of a still picture or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 2506. The image frames processed by the graphics processor 25041 may be stored in the memory 2509 (or other storage medium) or transmitted via the radio frequency unit 2501 or the network module 2502. The microphone 25042 can receive sounds and process them into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 2501.
Terminal 2500 also includes at least one sensor 2505, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 25061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 25061 and/or a backlight when the terminal 2500 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 2505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 2506 is used to display information input by the user or information provided to the user. The Display unit 2506 may include a Display panel 25061, and the Display panel 25061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 2507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 2507 includes a touch panel 25071 and other input devices 25072. The touch panel 25071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 25071 (e.g., operations by a user on or near the touch panel 25071 using any suitable object or accessory such as a finger or stylus). The touch panel 25071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 2510, and receives and executes commands sent from the processor 2510. In addition, the touch panel 25071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 2507 may include other input devices 25072 in addition to the touch panel 25071. In particular, the other input devices 25072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described herein.
Further, the touch panel 25071 may be overlaid on the display panel 25061, and when the touch panel 25071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 2510 to determine the type of the touch event, and then the processor 2510 provides a corresponding visual output on the display panel 25061 according to the type of the touch event. Although in fig. 25, the touch panel 25071 and the display panel 25061 are two independent components to implement the input and output functions of the terminal, in some embodiments, the touch panel 25071 and the display panel 25061 may be integrated to implement the input and output functions of the terminal, and is not limited herein.
The interface unit 2508 is an interface for connecting an external device to the terminal 2500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 2508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the terminal 2500 or may be used to transmit data between the terminal 2500 and external devices.
The memory 2509 may be used to store software programs as well as various data. The memory 2509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 2509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 2510 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 2509 and calling data stored in the memory 2509, thereby integrally monitoring the terminal. Processor 2510 may include one or more processing units; preferably, the processor 2510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 2510.
Terminal 2500 may also include a power supply 2511 (e.g., a battery) to provide power to the various components, and preferably, power supply 2511 may be logically coupled to processor 2510 via a power management system to provide management functions such as charging, discharging, and power consumption management via the power management system.
In addition, the terminal 2500 includes some functional modules that are not shown, and thus, are not described in detail herein.
Optionally, processor 2510 is further configured to:
displaying a text chat interface on the second screen under the condition that the operation input by the user is matched with a preset first operation, wherein the text chat interface at least comprises a text input box and a text chat content display area;
or displaying a voice chat interface on the second screen under the condition that the operation input by the user is matched with a preset second operation, wherein the voice chat interface at least comprises a voice input button and a voice chat content display area;
or, in the case that the game chat interface is a video chat interface and the operation input by the user matches a preset third operation, detecting a face image through a camera of the terminal, and displaying the video chat interface on the second screen according to a detection result.
Optionally, processor 2510 is further configured to:
displaying the video chat interface on the second screen under the condition that the face image is detected;
under the condition that the face image is not detected, outputting a rotation prompt;
and displaying the video chat interface on the second screen under the condition that the spatial posture of the terminal is detected to be changed and the camera detects the face image.
Optionally, processor 2510 is further configured to:
displaying game chat content on the first screen;
identifying the game chat content under the condition that the operation input by the user is matched with a preset fourth operation;
deleting the game chat content in the first screen, and displaying a game chat interface containing the game chat content in the second screen.
Optionally, processor 2510 is further configured to:
displaying a game chat control on the first screen, wherein the game chat control comprises a text input box and a text chat content display area, or the game chat control comprises a voice input button and a voice chat content display area;
and in the case that the operation input by the user matches a preset fifth operation, deleting the game chat control displayed in the first screen, and displaying a game chat interface containing the game chat control in the second screen.
Optionally, processor 2510 is further configured to:
displaying the text chat interface on the first screen and the game interface on the second screen when a change in the spatial posture of the terminal is detected;
or displaying the voice chat interface on the first screen and the game interface on the second screen when a change in the spatial posture of the terminal is detected.
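The posture-triggered behavior above is a swap: when the terminal's spatial posture changes (e.g. it is flipped over), the chat interface and the game interface trade screens. A minimal sketch, assuming a simple mapping from screen names to interfaces (invented for illustration):

```python
# Hypothetical model of the posture-change handler; not code from the patent.

def on_posture_change(screens):
    """Swap the interfaces shown on the first and second screens.

    screens: dict mapping screen names to the interface currently displayed.
    """
    screens["first"], screens["second"] = screens["second"], screens["first"]
    return screens
```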
The terminal 2500 can implement the processes implemented by the terminal in the foregoing embodiments; details are not repeated here. The terminal 2500 may display the game interface corresponding to the game application on the first screen and the game chat interface on the second screen, so that the game interface is not occluded and its display effect is preserved.
An embodiment of the present invention further provides a terminal, including a processor 2510, a memory 2509, and a computer program stored in the memory 2509 and executable on the processor 2510. When executed by the processor 2510, the computer program implements each process of the game interface display method embodiments and achieves the same technical effects; details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the game interface display method embodiments and achieves the same technical effects; details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware alone, although in many cases the former is the better implementation. Based on this understanding, the technical solutions of the present invention may be embodied as a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) that includes instructions enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods of the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, it is not limited to those embodiments, which are illustrative rather than restrictive. It will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (11)

1. A game interface display method applied to a terminal, the terminal comprising a first screen and a second screen, the method comprising the following steps:
receiving an operation input by a user while a game application program is in a running state;
collecting fingerprint feature information of the user in response to the operation input by the user;
when the collected fingerprint feature information matches pre-stored fingerprint feature information, displaying a game interface corresponding to the game application program on the first screen and displaying a game chat interface on the second screen;
wherein the displaying a game chat interface on the second screen comprises:
displaying a text chat interface on the second screen when the operation input by the user matches a preset first operation, wherein the text chat interface includes at least a text input box and a text chat content display area;
or displaying a voice chat interface on the second screen when the operation input by the user matches a preset second operation, wherein the voice chat interface includes at least a voice input button and a voice chat content display area;
or, when the game chat interface is a video chat interface and the operation input by the user matches a preset third operation, detecting a face image through a camera of the terminal and displaying the video chat interface on the second screen according to the detection result.
2. The method of claim 1, wherein displaying the video chat interface on the second screen according to the detection result comprises:
displaying the video chat interface on the second screen when the face image is detected;
outputting a rotation prompt when the face image is not detected;
and displaying the video chat interface on the second screen when a change in the spatial posture of the terminal is detected and the camera detects the face image.
3. The method of claim 1, wherein before the step of receiving the operation input by the user, the method further comprises:
displaying game chat content on the first screen;
and the displaying a game chat interface on the second screen comprises:
identifying the game chat content when the operation input by the user matches a preset fourth operation;
deleting the game chat content from the first screen, and displaying a game chat interface containing the game chat content on the second screen.
4. The method of claim 1, wherein before the step of receiving the operation input by the user, the method further comprises:
displaying a game chat control on the first screen, wherein the game chat control includes a text input box and a text chat content display area, or includes a voice input button and a voice chat content display area;
and the displaying a game chat interface on the second screen comprises:
when the operation input by the user matches a preset fifth operation, deleting the game chat control displayed on the first screen and displaying a game chat interface containing the game chat control on the second screen.
5. The method of claim 1, wherein after the step of displaying a text chat interface on the second screen, the method further comprises:
displaying the text chat interface on the first screen and the game interface on the second screen when a change in the spatial posture of the terminal is detected;
or, after the step of displaying a voice chat interface on the second screen, the method further comprises:
displaying the voice chat interface on the first screen and the game interface on the second screen when a change in the spatial posture of the terminal is detected.
6. A terminal comprising a first screen and a second screen, the terminal further comprising:
a receiving module, configured to receive an operation input by a user while a game application program is in a running state;
a collection module, configured to collect fingerprint feature information of the user in response to the operation input by the user;
a first display module, configured to display a game interface corresponding to the game application program on the first screen and display a game chat interface on the second screen when the collected fingerprint feature information matches pre-stored fingerprint feature information;
wherein the first display module comprises:
a first display sub-module, configured to display a text chat interface on the second screen when the operation input by the user matches a preset first operation, wherein the text chat interface includes at least a text input box and a text chat content display area;
or, a second display sub-module, configured to display a voice chat interface on the second screen when the operation input by the user matches a preset second operation, wherein the voice chat interface includes at least a voice input button and a voice chat content display area;
or, a third display sub-module, configured to detect a face image through a camera of the terminal when the game chat interface is a video chat interface and the operation input by the user matches a preset third operation, and to display the video chat interface on the second screen according to the detection result.
7. The terminal of claim 6, wherein the third display sub-module comprises:
a first display unit, configured to display the video chat interface on the second screen when the face image is detected;
an output unit, configured to output a rotation prompt when the face image is not detected;
and a second display unit, configured to display the video chat interface on the second screen when a change in the spatial posture of the terminal is detected and the camera detects the face image.
8. The terminal of claim 6, further comprising:
a second display module, configured to display game chat content on the first screen;
wherein the first display module further comprises:
a recognition sub-module, configured to identify the game chat content when the operation input by the user matches a preset fourth operation;
and a deletion sub-module, configured to delete the game chat content from the first screen and display a game chat interface containing the game chat content on the second screen.
9. The terminal of claim 6, further comprising:
a third display module, configured to display a game chat control on the first screen, wherein the game chat control includes a text input box and a text chat content display area, or includes a voice input button and a voice chat content display area;
wherein the first display module is specifically configured to delete the game chat control displayed on the first screen and display a game chat interface containing the game chat control on the second screen when the operation input by the user matches a preset fifth operation.
10. The terminal of claim 6, further comprising:
a fourth display module, configured to display the text chat interface on the first screen and the game interface on the second screen when a change in the spatial posture of the terminal is detected;
and a fifth display module, configured to display the voice chat interface on the first screen and the game interface on the second screen when a change in the spatial posture of the terminal is detected.
11. A terminal comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the game interface display method of any one of claims 1 to 5.
CN201810843553.1A 2018-07-27 2018-07-27 Game interface display method and terminal Active CN109078319B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810843553.1A CN109078319B (en) 2018-07-27 2018-07-27 Game interface display method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810843553.1A CN109078319B (en) 2018-07-27 2018-07-27 Game interface display method and terminal

Publications (2)

Publication Number Publication Date
CN109078319A CN109078319A (en) 2018-12-25
CN109078319B true CN109078319B (en) 2021-02-02

Family

ID=64831105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810843553.1A Active CN109078319B (en) 2018-07-27 2018-07-27 Game interface display method and terminal

Country Status (1)

Country Link
CN (1) CN109078319B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109815667B (en) * 2018-12-27 2022-01-11 维沃移动通信有限公司 Display method and terminal equipment
CN109714485B (en) * 2019-01-10 2021-07-23 维沃移动通信有限公司 Display method and mobile terminal
CN109889661A (en) * 2019-01-30 2019-06-14 维沃移动通信有限公司 A kind of interface display control method and mobile terminal
CN109939444A (en) * 2019-03-18 2019-06-28 北京智明星通科技股份有限公司 Interface display methods and device
CN110502292B (en) * 2019-07-01 2022-07-15 维沃移动通信有限公司 Display control method and terminal
CN110339570A (en) * 2019-07-17 2019-10-18 网易(杭州)网络有限公司 Exchange method, device, storage medium and the electronic device of information
CN110251941B (en) * 2019-07-22 2023-02-03 网易(杭州)网络有限公司 Game picture self-adaption method and device, electronic equipment and storage medium
CN111338541B (en) * 2020-02-12 2021-10-26 网易(杭州)网络有限公司 Interface display method and device in game, electronic equipment and storage medium
CN111265865A (en) * 2020-02-25 2020-06-12 网易(杭州)网络有限公司 Game interface display method and device, terminal equipment and storage medium
CN111589168B (en) * 2020-05-12 2023-03-21 腾讯科技(深圳)有限公司 Instant messaging method, device, equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2339800A1 (en) * 2006-09-29 2011-06-29 Research In Motion Limited IM contact list entry as a game in progress designate
CN108234743A (en) * 2017-11-24 2018-06-29 北京珠穆朗玛移动通信有限公司 Notice reminding method, mobile terminal and device based on double screen

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100424468B1 (en) * 2001-12-03 2004-03-26 삼성전자주식회사 Display method for folder type mobile terminal equipment
CN102622664A (en) * 2012-01-06 2012-08-01 深圳市华伯通讯设备有限公司 Event processing method and event processing system
CN107300986B (en) * 2017-06-30 2022-01-18 联想(北京)有限公司 Input method switching method and device
CN107679377A (en) * 2017-09-30 2018-02-09 广东欧珀移动通信有限公司 Application interface switching method, device, storage medium and electronic equipment
CN107809504B (en) * 2017-11-07 2019-08-16 Oppo广东移动通信有限公司 Show method, apparatus, terminal and the storage medium of information

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2339800A1 (en) * 2006-09-29 2011-06-29 Research In Motion Limited IM contact list entry as a game in progress designate
CN108234743A (en) * 2017-11-24 2018-06-29 北京珠穆朗玛移动通信有限公司 Notice reminding method, mobile terminal and device based on double screen

Also Published As

Publication number Publication date
CN109078319A (en) 2018-12-25

Similar Documents

Publication Publication Date Title
CN109078319B (en) Game interface display method and terminal
CN110995923B (en) Screen projection control method and electronic equipment
CN108459797B (en) Control method of folding screen and mobile terminal
CN109714485B (en) Display method and mobile terminal
CN110109593B (en) Screen capturing method and terminal equipment
CN108279948B (en) Application program starting method and mobile terminal
CN109241775B (en) Privacy protection method and terminal
CN109412932B (en) Screen capturing method and terminal
CN109710349B (en) Screen capturing method and mobile terminal
CN109343788B (en) Operation control method of mobile terminal and mobile terminal
CN109407948B (en) Interface display method and mobile terminal
CN110012143B (en) Telephone receiver control method and terminal
CN107728923B (en) Operation processing method and mobile terminal
CN109523253B (en) Payment method and device
CN108664818B (en) Unlocking control method and device
CN108132749B (en) Image editing method and mobile terminal
CN109669656B (en) Information display method and terminal equipment
CN111405181A (en) Focusing method and electronic equipment
CN111061446A (en) Display method and electronic equipment
CN108388459B (en) Message display processing method and mobile terminal
CN108093119B (en) Strange incoming call number marking method and mobile terminal
CN110225192B (en) Message display method and mobile terminal
CN110213437B (en) Editing method and mobile terminal
CN109714462B (en) Method for marking telephone number and mobile terminal thereof
CN109491572B (en) Screen capturing method of mobile terminal and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant