CN109529319B - Display method and device of interface control and storage medium

Display method and device of interface control and storage medium

Info

Publication number: CN109529319B
Authority: CN (China)
Prior art keywords: control, interface, user interface, hidden, visual
Legal status: Active
Application number: CN201811433102.7A
Other languages: Chinese (zh)
Other versions: CN109529319A (en)
Inventors: 仇蒙, 潘佳绮, 崔维健, 张书婷
Current Assignee: Tencent Technology Shenzhen Co Ltd
Original Assignee: Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201811433102.7A
Publication of CN109529319A
Application granted
Publication of CN109529319B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/822: Strategy games; Role-playing games
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214: Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145: Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a display method, a device, and a storage medium for an interface control, relating to the field of virtual environments. The method comprises the following steps: displaying a first user interface of an application program, wherein the first user interface comprises a first picture on which n visual interface controls are superimposed; receiving a hidden control triggering operation; and displaying a second user interface of the application program according to the hidden control triggering operation, wherein the second user interface comprises a second picture on which either no visual interface control or m visual interface controls are superimposed, with 0 < m < n. When n visual interface controls are displayed in the user interface, the hidden control triggering operation brings up a second user interface that superimposes none or only part of the visual controls, so the interface controls in the user interface are hidden in an efficient and rapid manner, the field of view is quickly expanded in battle, the efficiency of configuring interface controls is improved, and human-computer interaction efficiency is improved.

Description

Display method and device of interface control and storage medium
Technical Field
The embodiments of the application relate to the field of virtual environments, and in particular to a display method, a device, and a storage medium for an interface control.
Background
On terminals such as smartphones and tablets, there are many applications based on virtual environments, such as virtual reality applications, three-dimensional map programs, military simulation programs, third-person shooter (TPS) games, first-person shooter (FPS) games, and multiplayer online battle arena (MOBA) games. Generally, interface controls such as attack controls, setting controls, prone controls, squat controls, jump controls, weapon controls, backpack controls, attack auxiliary controls (such as a firearm scope-opening control), view-angle rotation controls, a minimap, and a direction display bar are superimposed over the virtual environment displayed in the user interface.
In the related art, because multiple interface controls are superimposed over the virtual environment, an enemy being observed in the virtual environment is easily occluded by the interface controls, which makes observation difficult; to change this, the user must click the setting control to enter a setting interface and adjust the position of each interface control there.
However, in battle-oriented applications, such as games in which combat is conducted with virtual firearms, a firefight is often over within a few seconds. Setting the positions of interface controls through a setting interface is a cumbersome process, human-computer interaction efficiency is too low, and the flow of battle is disrupted.
Disclosure of Invention
The embodiments of the application provide a display method, a device, and a storage medium for an interface control, which can solve the problems that the process is cumbersome and human-computer interaction efficiency is too low when interface control positions are set through a setting interface. The technical scheme is as follows:
in one aspect, a method for displaying an interface control is provided, where the method includes:
displaying a first user interface of an application program, wherein the first user interface comprises a first picture in which the virtual environment is observed from the view angle of a virtual character, n visual interface controls are further superimposed on the first picture, and n is a positive integer;
receiving a hidden control triggering operation;
and displaying a second user interface of the application program according to the hidden control triggering operation, wherein the second user interface comprises a second picture in which the virtual environment is observed from the view angle of the virtual character, either no visual interface control or m visual interface controls are superimposed on the second picture, the m visual interface controls are part of the n visual interface controls, and 0 < m < n.
In another aspect, an apparatus for displaying an interface control is provided, the apparatus including:
a display module, configured to display a first user interface of an application program, wherein the first user interface comprises a first picture in which the virtual environment is observed from the view angle of a virtual character, n visual interface controls are further superimposed on the first picture, and n is a positive integer;
a receiving module, configured to receive a hidden control triggering operation;
the display module is further configured to display a second user interface of the application program according to the hidden control triggering operation, wherein the second user interface comprises a second picture in which the virtual environment is observed from the view angle of the virtual character, either no visual interface control or m visual interface controls are superimposed on the second picture, the m visual interface controls are part of the n visual interface controls, and 0 < m < n.
In another aspect, a terminal is provided, where the terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or a set of instructions, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the display method of the interface control according to any one of the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction, at least one program, a code set, or an instruction set is stored, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the display method of the interface control according to any one of the embodiments of the present application.
In another aspect, a computer program product is provided, which when running on a computer, causes the computer to execute the method for displaying an interface control according to any one of the embodiments of the present application.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
When n visual interface controls are displayed in the first user interface, the hidden control triggering operation causes a second user interface to be displayed whose second picture superimposes either none or only part of the visual interface controls. The hidden control triggering operation thus conveniently and quickly hides all or part of the visual interface controls of the first user interface; that is, the visual interface controls are hidden in an efficient and rapid manner, the field of view is quickly expanded in battle, the visual interface controls are prevented from occluding the view of the virtual environment during battle, and the problems of a cumbersome process and low human-computer interaction efficiency when adjusting control positions through a setting interface are avoided.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a block diagram of an electronic device provided in an exemplary embodiment of the present application;
FIG. 2 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 3 is a flowchart of a method for displaying interface controls provided by an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of an interface control in a user interface provided by an exemplary embodiment of the present application;
FIG. 5 is a flowchart of a method for displaying interface controls provided by another exemplary embodiment of the present application;
FIG. 6 is a user interface diagram of a method for displaying interface controls provided by an exemplary embodiment of the present application;
FIG. 7 is a flow diagram of hiding interface controls through a distance sensor, provided by an embodiment of the present application based on the embodiment shown in FIG. 5;
FIG. 8 is a flow chart of hiding interface controls through a distance sensor, provided by another embodiment of the present application based on the embodiment shown in FIG. 5;
FIG. 9 is a user interface diagram of a method for displaying interface controls provided by another exemplary embodiment of the present application;
FIG. 10 is a user interface diagram of a method for displaying interface controls provided by another exemplary embodiment of the present application;
FIG. 11 is a flowchart of a method for displaying interface controls provided by another exemplary embodiment of the present application;
FIG. 12 is a user interface diagram of a method for displaying interface controls provided by another exemplary embodiment of the present application;
FIG. 13 is a user interface diagram of a method for displaying interface controls provided by another exemplary embodiment of the present application;
FIG. 14 is a block diagram of a display device for interface controls provided in an exemplary embodiment of the present application;
FIG. 15 is a block diagram of a display device for interface controls provided in accordance with another exemplary embodiment of the present application;
fig. 16 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, several terms related to the embodiments of the present application are explained:
Virtual environment: the virtual environment displayed (or provided) when an application runs on the terminal. The virtual environment may be a simulation of the real world, a semi-simulated, semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment; the following embodiments illustrate the virtual environment as a three-dimensional virtual environment, but are not limited thereto. Optionally, the virtual environment is also used for virtual environment engagement between at least two virtual characters. Optionally, the virtual environment is also used for virtual firearm engagement between at least two virtual characters. Optionally, the virtual environment is further used for virtual firearm engagement between at least two virtual characters within a target area that shrinks over time in the virtual environment.
Virtual object: a movable object in the virtual environment. The movable object may be at least one of a virtual character, a virtual animal, and an animated character. Optionally, when the virtual environment is a three-dimensional virtual environment, the virtual object is a three-dimensional model created using a skeletal animation technique. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space therein.
Interface control: a control superimposed on the picture of the virtual environment, comprising at least one of an attack control, a setting control, a prone control, a squat control, a jump control, a weapon control, a backpack control, an attack auxiliary control (such as a firearm scope-opening control), a view-angle rotation control, a minimap, and a direction display bar. Optionally, the interface control is used to show the state of the virtual object in the virtual environment and/or to control the virtual object in the virtual environment. Optionally, the interface controls further include first-level interface controls and second-level interface controls whose display is triggered from a first-level control; illustratively, the backpack control is a first-level interface control, and after the backpack control is selected, the backpack content display control is displayed as its second-level interface control.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. The terminal has installed and runs an application program supporting a virtual environment, such as an application program supporting a three-dimensional virtual environment. The application program may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, and a MOBA game. Alternatively, the application may be a stand-alone application, such as a stand-alone 3D game program, or a network online application.
Fig. 1 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 100 includes: an operating system 120 and application programs 122.
Operating system 120 is the base software that provides applications 122 with secure access to computer hardware.
Application 122 is an application that supports a virtual environment. Optionally, application 122 is an application that supports a three-dimensional virtual environment. The application 122 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a third-person shooter (TPS) game, a first-person shooter (FPS) game, a MOBA game, and a multiplayer gunfight survival game. The application 122 may be a stand-alone application, such as a stand-alone 3D game program.
Fig. 2 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 200 includes: a first device 220, a server 240, and a second device 260.
The first device 220 has installed and runs an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first device 220 is a device used by a first user, who uses it to control a first virtual object located in the virtual environment to perform activities including, but not limited to, at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up items, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated persona or an animated persona.
The first device 220 is connected to the server 240 through a wireless network or a wired network.
The server 240 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 240 is used to provide background services for applications that support a three-dimensional virtual environment. Alternatively, the server 240 undertakes the primary computing work while the first device 220 and the second device 260 undertake secondary computing work; or the server 240 undertakes secondary computing work while the first device 220 and the second device 260 undertake the primary computing work; or the server 240, the first device 220, and the second device 260 perform cooperative computing using a distributed computing architecture.
The second device 260 has installed and runs an application program supporting a virtual environment. The application program can be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The second device 260 is a device used by a second user, who uses it to control a second virtual object located in the virtual environment to perform activities including, but not limited to, at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up items, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated persona or an animated persona.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights. Optionally, the first virtual character and the second virtual character may belong to different teams, different organizations, or two mutually hostile groups.
Alternatively, the applications installed on the first device 220 and the second device 260 are the same, or the applications installed on the two devices are the same type of application for different control system platforms. The first device 220 may generally refer to one of a plurality of devices, and the second device 260 may generally refer to one of a plurality of devices, and this embodiment is only exemplified by the first device 220 and the second device 260. The first device 220 and the second device 260 may be of the same or different device types, including: at least one of a game console, a desktop computer, a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated where the device is a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or fewer. For example, the number of the devices may be only one, or several tens or hundreds, or more. The number and the type of the devices are not limited in the embodiments of the present application.
Fig. 3 is a flowchart of a display method of an interface control according to an exemplary embodiment of the present application. As shown in fig. 3, the method is described as applied to the terminal 100 shown in fig. 1 by way of example, and includes:
step 301, displaying a first user interface of the application program, where the first user interface includes a first screen, and n visual interface controls are displayed on the first screen in an overlapping manner.
Optionally, the first user interface includes a first screen for observing the virtual environment by using the view angle of the virtual character, n visual interface controls are further superimposed on the first screen, and n is a positive integer.
Optionally, the first screen may be a screen for observing the virtual environment using a first person perspective of the virtual character, or may be a screen for observing the virtual environment using a third person perspective of the virtual character.
Optionally, the n visual interface controls include at least one of an attack control, a setting control, a crouching control, a squatting control, a jumping control, a weapon control, a backpack control, an attack-assisting control (e.g., a firearm mirror-opening control), a view rotation control, a minimap, and a direction display bar.
Illustratively, referring to fig. 4, the first user interface 41 displays an attack control 410 on each of its left and right sides, together with a jump control 411, a prone control 412, a squat control 413, a reload control 414, a firearm scope-opening control 415, a view-angle rotation control 416, a backpack control 417, a weapon control 418, a minimap 419, a setting control 420, and a direction display bar 421. The above controls are only illustrative examples; the first user interface 41 may further include other visual interface controls.
Optionally, a visual interface control is used to represent the state of the virtual object in the virtual environment, to control the virtual object in the virtual environment, or to show the state of the virtual object's teammates in the virtual environment. Illustratively, the weapon control 418 indicates that the virtual firearm held by the virtual object is an AUG assault rifle, the direction display bar 421 indicates that the virtual object faces west in the virtual environment, the attack control 410 controls the virtual object to perform a firing operation in the virtual environment, and the jump control 411 controls the virtual object to jump in the virtual environment.
Step 302, receiving a hidden control triggering operation.
Optionally, the hidden control triggering operation is used to trigger hiding all or part of the n visual interface controls in the first user interface.
Optionally, receiving the hidden control triggering operation includes at least one of the following two cases:
first, receiving a hidden control triggering operation acting on the first user interface, that is, a control capable of triggering the operation is displayed on the first user interface;
second, receiving a hidden control triggering operation acting on components of the terminal other than the display screen, where the other components include at least one of a sensor component, a physical key, and a microphone.
Step 303, displaying a second user interface of the application program according to the hidden control triggering operation, where the second user interface includes a second picture on which either no visual interface control or m visual interface controls are superimposed.
Optionally, the second user interface includes a second screen for observing the virtual environment using the view angle of the virtual character, and the second screen does not display the visual interface control in an overlaid manner or displays m visual interface controls in an overlaid manner, where the m visual interface controls are part of the n visual interface controls.
Optionally, when all n visual interface controls are set as concealable interface controls and the hidden control triggering operation is received, no visual interface control is superimposed on the second picture; when only part of the n visual interface controls are set as concealable, that is, the m visual interface controls are not set as concealable, then upon receiving the hidden control triggering operation the m visual interface controls are superimposed on the second picture, the m visual interface controls being the m interface controls not set as concealable.
Optionally, when no visual interface control is superimposed on the second screen, the transparency of all n visual interface controls may be set to fully transparent, the display of all n visual interface controls may be cancelled, or the transparency of one part may be set to fully transparent while the display of the other part is cancelled; this is not limited in the embodiments of the present application. Optionally, the n visual interface controls include clickable controls; when no visual interface control is superimposed on the second screen, the transparency of the clickable controls is set to fully transparent, and the display of the visual interface controls other than the clickable controls is cancelled.
Optionally, when m visual interface controls are superimposed on the second screen, that is, n-m visual interface controls are not displayed, a non-displayed control may be invisible because its transparency is set to fully transparent or because its display is cancelled, and part may be set fully transparent while the display of the rest is cancelled; this is not limited in the embodiments of the present application.
That is, according to the hidden control triggering operation, the transparency of the controls to be hidden among the n visual interface controls is set to fully transparent, or the display of the controls to be hidden among the n visual interface controls is cancelled.
When a visual interface control is a clickable control and its transparency is set to fully transparent, the control can still be clicked, and the virtual object can be controlled according to click events generated on the clickable control.
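To make the two hiding modes concrete, the following Kotlin sketch (assuming an Android View hierarchy; hideControl and restoreControl are hypothetical helpers, not part of the disclosed method) shows hiding by full transparency, which keeps the control clickable, versus cancelling display, which removes it from drawing and hit testing:

```kotlin
import android.view.View

// Hypothetical helpers illustrating the two hiding modes described above.
// alpha = 0f keeps the view in the layout and still dispatches click events,
// so a fully transparent attack control remains operable; View.GONE removes
// the view from drawing, layout, and hit testing entirely.
fun hideControl(control: View, keepClickable: Boolean) {
    if (keepClickable) {
        control.alpha = 0f              // transparency set to fully transparent
    } else {
        control.visibility = View.GONE  // display cancelled
    }
}

fun restoreControl(control: View) {
    control.alpha = 1f
    control.visibility = View.VISIBLE
}
```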
Optionally, whether a visual interface control can be hidden may be set by the user or preset by the developer. Illustratively, if the developer presets all n visual interface controls as concealable interface controls, then after the terminal receives the hidden control triggering operation, no visual interface control is superimposed on the second picture. If the user defines the concealable interface controls as the prone control, the squat control, the jump control, the backpack control, the minimap, and the direction display bar, while the n visual interface controls superimposed on the first picture are the attack control, the setting control, the prone control, the squat control, the jump control, the weapon control, the backpack control, the firearm scope-opening control, the view-angle rotation control, the minimap, and the direction display bar, then after the terminal receives the hidden control triggering operation, the m visual interface controls superimposed on the second picture are the attack control, the setting control, the weapon control, the firearm scope-opening control, and the view-angle rotation control.
To sum up, in the display method for an interface control provided by this embodiment, when n visual interface controls are displayed in the first user interface, the hidden control triggering operation causes a second user interface to be displayed whose second picture superimposes either none or only part of the visual interface controls. The hidden control triggering operation thus conveniently hides all or part of the visual interface controls of the first user interface; that is, the visual interface controls are hidden in an efficient and rapid manner, the field of view is quickly expanded in battle, the visual interface controls are prevented from occluding the view of the virtual environment during battle, and the problems of a cumbersome process and low human-computer interaction efficiency when adjusting control positions through a setting interface are avoided.
In an alternative embodiment, the hidden control triggering operation covers a number of different situations. Fig. 5 is a flowchart of a display method of an interface control according to another exemplary embodiment of the present application, described as applied to the terminal 100 shown in fig. 1 by way of example. As shown in fig. 5, the method includes:
step 501, displaying a first user interface of an application program.
Optionally, the first user interface includes a first screen for observing the virtual environment by using the view angle of the virtual character, n visual interface controls are further superimposed on the first screen, and n is a positive integer.
Optionally, the visual interface control is used for representing the state of the virtual object in the virtual environment, or controlling the state of the virtual object in the virtual environment, or showing the state of the teammate of the virtual object in the virtual environment.
Step 502, receiving an occlusion operation acting on a sensor component of the terminal.
Optionally, the sensor component is at least one of a distance sensor, a light sensor, and a front camera. Optionally, the distance sensor determines whether it is occluded by emitting and receiving energy: when the received energy indicates that the distance between an object and the sensor is smaller than a preset distance, the sensor is judged to be occluded. The light sensor determines whether it is occluded by receiving light: when the light it receives is below a preset value, it is judged to be occluded. The front camera determines whether it is occluded from the captured image: when the captured image is completely black, the camera is judged to be occluded.
Step 503, determining the occlusion operation as the hidden control triggering operation.
Optionally, the hidden control triggering operation is used to trigger hiding all or part of the n visual interface controls in the first user interface; that is, all or part of the n visual interface controls are hidden through the occlusion operation.
Illustratively, taking the sensor component as a distance sensor: as shown in fig. 6, a first user interface 61 is displayed on the terminal 60, and the terminal 60 further includes a distance sensor 62. The first user interface 61 displays an attack control 610, a jump control 611, a prone control 612, a squat control 613, a reload control 614, a firearm scope-opening control 615, a view-angle rotation control 616, a backpack control 617, a minimap 618, and a direction display bar 619, all of which are configured to be hidden. After the user occludes the distance sensor 62, a second user interface 63 is displayed on the terminal 60, and the second user interface 63 does not include the controls of the first user interface 61.
Optionally, when the occlusion operation is determined as the hidden control triggering operation and the visual interface controls are hidden, at least one of the following cases applies (a sketch follows this list):
first, when the occlusion duration of the occlusion operation reaches a preset duration, the visual interface controls are hidden, and when another occlusion operation is received, the display of the n visual interface controls is restored;
second, the occlusion operation is a continuous occlusion: while the continuous occlusion is received, the second user interface of the application program is displayed and the visual interface controls are hidden; when the continuous occlusion stops, the display of the n visual interface controls is restored.
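A minimal sketch of the occlusion trigger in case one, assuming a standard Android proximity (distance) sensor; setControlsHidden is a hypothetical hook into the UI layer, and toggling on each occlusion is a simplification of the preset-duration check:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Hides the visual interface controls on one occlusion of the distance
// sensor and restores them on the next (case one above, simplified).
class OcclusionToggle(
    context: Context,
    private val setControlsHidden: (Boolean) -> Unit // hypothetical UI hook
) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var hidden = false

    fun start() {
        val proximity = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY) ?: return
        sensorManager.registerListener(this, proximity, SensorManager.SENSOR_DELAY_UI)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // A reading below the sensor's maximum range means something is near,
        // i.e. the sensor is occluded; each occlusion toggles the hidden state.
        if (event.values[0] < event.sensor.maximumRange) {
            hidden = !hidden
            setControlsHidden(hidden)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```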
Illustratively, hiding the visual interface controls through the sensor component is completed by a micro controller unit (MCU), a central processing unit (CPU), a memory, a controller, and a display screen in the terminal. Referring to fig. 7, when the distance sensor 71 detects an occlusion event, it sends the occlusion data of the event, comprising the occlusion duration and the occlusion distance, to the MCU 72. The MCU 72 obtains a control request command corresponding to the occlusion data from the memory 73 and sends it to the CPU 74. The CPU 74 obtains a control signal corresponding to the control request command from the memory 73 and sends it to the controller 75, and the controller 75 sets the visual interface controls displayed on the display screen 76 to be invisible (transparency set to fully transparent, or display cancelled). Touch operations received on the display screen 76 are still responded to through the CPU 74, the MCU 72, the memory 73, and the controller 75.
Referring to fig. 8, the process includes:
Step 801, the distance sensor acquires occlusion data and sends it to the MCU. Step 802, the MCU analyzes and compares the occlusion data against the memory, generates a control request instruction according to the comparison result, and sends the instruction to the CPU. Step 803, the CPU generates a control signal according to the control request instruction and sends it to the controller. Step 804, the controller controls the display screen according to the control signal.
Step 504, receiving the slide event reported by the slide monitoring component.
Optionally, the terminal is a slide-cover terminal comprising an upper cover and a lower cover, and the terminal includes a slide monitoring component for detecting the sliding state of the two covers. A slide event is an event generated when the upper cover and the lower cover slide relative to each other.
Step 505, when the slide event is a first slide event sliding along a preset direction, determining the first slide event as the hidden control triggering operation.
Optionally, the upper cover and the lower cover of the slide-cover terminal have two states, an overlapped state and a separated state, and the preset direction may be the direction of sliding from the separated state to the overlapped state, or the direction of sliding from the overlapped state to the separated state.
Referring to fig. 9, schematically, the first user interface 91 displays an attack control 910, a jump control 911, a prone control 912, a squat control 913, a reload control 914, a firearm scope-opening control 915, a view-angle rotation control 916, a backpack control 917, a minimap 918, and a direction display bar 919. When the upper cover 93 and the lower cover 94 of the terminal are slid into the separated state and a first slide event is generated, the second user interface 92 is displayed, and the above controls are not displayed in the second user interface 92.
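Android exposes no standard API for slide-cover state, so any implementation depends on a vendor callback; the following sketch assumes a hypothetical listener interface reporting the relative sliding of the two covers:

```kotlin
// Hypothetical vendor callback for slide-cover state; the direction names
// and the SlideListener interface are assumptions for illustration.
enum class SlideDirection { TO_SEPARATED, TO_OVERLAPPED }

fun interface SlideListener {
    fun onSlide(direction: SlideDirection)
}

// Treats a slide in the preset direction (here: from the overlapped state
// to the separated state) as the hidden control triggering operation.
class SlideHideTrigger(
    private val setControlsHidden: (Boolean) -> Unit // hypothetical UI hook
) : SlideListener {
    override fun onSlide(direction: SlideDirection) {
        if (direction == SlideDirection.TO_SEPARATED) setControlsHidden(true)
    }
}
```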
Step 506, receiving a click operation acting on a microphone on the terminal.
Optionally, the terminal comprises a microphone for receiving sound signals.
Optionally, when the terminal is configured with an earphone, the microphone may be implemented as a microphone on the earphone, and may also be implemented as a microphone on the terminal, which is not limited in this embodiment of the application.
Optionally, when the user taps the microphone of the terminal, the tap vibrates the microphone cavity, generating a sound signal that is collected by the microphone.
Step 507, when the click features of the click operation meet preset features, determining the click operation as the hidden control triggering operation.
Optionally, the user taps the microphone at a certain frequency, and when that frequency matches a preset frequency, the click operation is determined as the hidden control triggering operation; or the user taps the microphone a certain number of times, and when that count matches a preset count, the click operation is determined as the hidden control triggering operation.
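A rough sketch of the tap-count variant, assuming tap energy shows up as amplitude spikes in the microphone signal (e.g. polled from MediaRecorder.getMaxAmplitude()); the threshold, count, and window values are illustrative assumptions:

```kotlin
// Counts amplitude spikes picked up by the microphone and treats a preset
// number of taps within a time window as the hidden control trigger.
class MicTapDetector(
    private val requiredTaps: Int = 2,        // assumed preset count
    private val windowMs: Long = 1_000,       // assumed time window
    private val spikeThreshold: Int = 20_000, // assumed amplitude threshold
    private val onTrigger: () -> Unit
) {
    private val tapTimes = ArrayDeque<Long>()

    // Feed with the current amplitude, polled every ~50 ms.
    fun onAmplitude(amplitude: Int) {
        if (amplitude <= spikeThreshold) return
        val now = System.currentTimeMillis()
        tapTimes.addLast(now)
        while (tapTimes.first() < now - windowMs) tapTimes.removeFirst()
        if (tapTimes.size >= requiredTaps) {
            tapTimes.clear()
            onTrigger() // determined as the hidden control triggering operation
        }
    }
}
```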
Optionally, steps 502 to 507 describe hidden control triggering operations acting on components of the terminal other than the display screen.
Step 508, receiving a sliding operation acting on the sliding scan control.
Optionally, the first user interface includes a sliding scan control, which is configured to sweep across the first user interface along the control by means of a sliding scan bar perpendicular to it.
Optionally, the ratio of the area swept by the sliding scan bar to the area of the first user interface equals the ratio of the distance the user has slid along the sliding scan control to the length of the control.
Optionally, the starting position of the sliding operation may be an endpoint of the sliding scan control or any position along it.
In step 509, the sliding operation is determined to be a hidden control triggering operation.
Optionally, hiding the visual interface control according to the sliding operation includes any one of the following cases:
firstly, hiding a visual interface control covered by a coverage area corresponding to the sliding operation in a first user interface;
secondly, hiding a visual interface control which is covered by a coverage area corresponding to the sliding operation and is set to be hidden in the first user interface;
thirdly, hiding the n visual interface controls in the first user interface according to the sliding operation;
and fourthly, hiding the visual interface control which is set to be hidden in the first user interface according to the sliding operation.
Optionally, when a visual interface control is only partially covered by the coverage area, the control is hidden once the covered portion reaches 50% of the control's total area.
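The 50% rule reduces to a rectangle-intersection test; a sketch assuming axis-aligned control bounds (android.graphics.Rect):

```kotlin
import android.graphics.Rect

// A control counts as covered by the region swept by the sliding scan bar
// when the overlap reaches at least 50% of the control's total area.
fun isCovered(controlBounds: Rect, sweptRegion: Rect): Boolean {
    val overlap = Rect()
    if (!overlap.setIntersect(controlBounds, sweptRegion)) return false
    val overlapArea = overlap.width().toLong() * overlap.height()
    val controlArea = controlBounds.width().toLong() * controlBounds.height()
    return overlapArea * 2 >= controlArea // overlap >= 50% of control area
}
```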
Illustratively, taking the first case as an example, referring to fig. 10: the first user interface 1001 displays an attack control 1010, a jump control 1011, a prone control 1012, a squat control 1013, a reload control 1014, a firearm scope-opening control 1015, a view-angle rotation control 1016, a backpack control 1017, a minimap 1018, and a direction display bar 1019. Optionally, the first user interface 1001 further includes a sliding scan control 1002. When the user slides along the sliding scan control 1002, the first user interface 1001 additionally displays a sliding scan bar 1003 perpendicular to the control; the region swept by the sliding scan bar 1003 is the coverage area 1020 in fig. 10, and the position of the user's finger corresponds to the current position of the sliding scan bar 1003. When the user finishes sliding, the second user interface 1004 is displayed: the backpack control 1017, not covered by the coverage area 1020, is still displayed, whereas the direction display bar 1019 is not displayed because more than 50% of its area is covered by the coverage area.
Step 510, displaying a second user interface of the application program according to the hidden control triggering operation.
Optionally, the second user interface includes a second screen for observing the virtual environment using the view angle of the virtual character, and the second screen does not display the visual interface control in an overlaid manner or displays m visual interface controls in an overlaid manner, where the m visual interface controls are part of the n visual interface controls.
Optionally, when all or part of the n visual interface controls are hidden and the user wants to restore their display, the n visual interface controls may be restored by dragging the sliding scan control any distance in the opposite direction, by occluding the sensor component on the terminal, or by tapping the microphone; alternatively, the terminal may time the hidden duration and restore the display of the n visual interface controls when the timed duration reaches a preset duration.
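The timed variant of restoration is the simplest to sketch; assuming the same hypothetical setControlsHidden hook, a main-thread handler restores the controls after a preset duration:

```kotlin
import android.os.Handler
import android.os.Looper

// Hides the controls and restores them once the timed duration reaches
// the preset duration (the last restoration option above).
class TimedRestore(
    private val setControlsHidden: (Boolean) -> Unit, // hypothetical UI hook
    private val presetDurationMs: Long = 5_000        // assumed preset duration
) {
    private val handler = Handler(Looper.getMainLooper())

    fun hideTemporarily() {
        setControlsHidden(true)
        handler.postDelayed({ setControlsHidden(false) }, presetDurationMs)
    }
}
```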
Optionally, when all n visual interface controls are set as concealable interface controls and the hidden control triggering operation is received, no visual interface control is superimposed on the second picture; when only part of the n visual interface controls are set as concealable, that is, the m visual interface controls are not set as concealable, then upon receiving the hidden control triggering operation the m visual interface controls are superimposed on the second picture, the m visual interface controls being the m interface controls not set as concealable.
Optionally, for the hidden control triggering operation corresponding to different situations, the manner of displaying the second user interface is described in detail in the foregoing step 502 to step 509, and details are not described here again.
It should be noted that steps 502 to 503, steps 504 to 505, steps 506 to 507, and steps 508 to 509 may each be implemented as a separate method or implemented in combination, for example: an occlusion operation acting on the sensor component is received, and when a sliding operation on the sliding scan control is then received, the interface controls to hide are determined according to the coverage area corresponding to the sliding operation.
To sum up, in the display method for an interface control provided by this embodiment, when n visual interface controls are displayed in the first user interface, the hidden control triggering operation causes a second user interface to be displayed whose second picture superimposes either none or only part of the visual interface controls. The hidden control triggering operation thus conveniently hides all or part of the visual interface controls of the first user interface; that is, the visual interface controls are hidden in an efficient and rapid manner, the field of view is quickly expanded in battle, the visual interface controls are prevented from occluding the view of the virtual environment during battle, and the problems of a cumbersome process and low human-computer interaction efficiency when adjusting control positions through a setting interface are avoided.
According to the method provided by this embodiment, the visual interface controls are hidden by occluding the sensor component, so the interface controls in the user interface are hidden in an efficient and rapid manner, the field of view is quickly expanded in battle, the visual interface controls are prevented from occluding the view of the virtual environment during battle, and the problems of a cumbersome process and low human-computer interaction efficiency when adjusting control positions through a setting interface are avoided.
According to the method provided by this embodiment, the visual interface controls are hidden by a slide operation on a slide-cover terminal, so the interface controls in the user interface are hidden in an efficient and rapid manner, the field of view is quickly expanded in battle, the visual interface controls are prevented from occluding the view of the virtual environment during battle, and the problems of a cumbersome process and low human-computer interaction efficiency when adjusting control positions through a setting interface are avoided.
According to the method provided by this embodiment, a sliding operation on the sliding scan control determines the target area of the first user interface in which visual interface controls are to be hidden; when the target area is an area that needs careful observation, the visual interface controls within it are hidden in an efficient and rapid manner, the field of view is quickly expanded in battle, the visual interface controls are prevented from occluding the view of the virtual environment during battle, and the problems of a cumbersome process and low human-computer interaction efficiency when adjusting control positions through a setting interface are avoided.
According to the method provided by this embodiment, the visual interface controls in the first user interface are hidden in an efficient and convenient manner, and after hiding, the n visual interface controls are restored in an equally efficient manner, realizing efficient control over the visual interface controls, improving the smoothness of operation, and improving human-computer interaction efficiency.
In an optional embodiment, the position of a hidden control may also be replaced with target indication content having a smaller display area. Fig. 11 is a flowchart of a display method of an interface control according to another exemplary embodiment of the present application, described as applied to the terminal 100 shown in fig. 1 by way of example. As shown in fig. 11, the method includes:
at step 1101, a first user interface of an application is displayed.
Optionally, the first user interface includes a first screen for observing the virtual environment by using the view angle of the virtual character, n visual interface controls are further superimposed on the first screen, and n is a positive integer.
Optionally, the visual interface control is used for representing the state of the virtual object in the virtual environment, or controlling the state of the virtual object in the virtual environment, or showing the state of the teammate of the virtual object in the virtual environment.
Step 1102, receiving a hidden control triggering operation.
The triggering manner of the hidden control triggering operation is described in detail in the above steps 502 to 509, and is not described herein again.
Step 1103, setting the transparency of the control to be hidden in the n visual interface controls to be fully transparent according to the hidden control triggering operation.
Optionally, the controls to be hidden are the controls among the n visual interface controls that have been set as concealable; the controls whose transparency is set to fully transparent may be all of the concealable controls or only part of them.
Optionally, according to the hidden control triggering operation, the transparency of those controls among the n visual interface controls whose hiding mode is full transparency is set to fully transparent.
Illustratively, the first user interface includes controls A, B, C, D, and E. After the user configures these 5 controls, controls A, B, and E are concealable, where control A is hidden by setting its transparency to fully transparent, and controls B and E are hidden by cancelling their display; that is, the second user interface does not include controls B and E.
At step 1104, the target indication content is displayed at a first location in the second user interface.
Optionally, the hidden controls include a first control located at a first position in the first user interface. Optionally, the target indication content indicates the first position where the first control is located, and its display area is smaller than the display area of the first control before hiding. Optionally, since the first control is a control whose transparency is set to fully transparent, the first control is still present in the second user interface but is not visible.
Illustratively, as shown in fig. 12, the terminal 1200 includes a distance sensor 1220, and the first user interface 1210 displays an attack control 1211 and a backpack control 1212. After the user occludes the distance sensor 1220, the backpack control 1212 and the attack control 1211 are hidden in the second user interface 1230, in which target indication content 1231 is displayed at the position of the attack control 1211. The target indication content 1231 is the outline of the attack control 1211 without fill; that is, only the outline of the attack control 1211 is displayed, and the virtual environment is not occluded.
Optionally, the n visual interface controls include clickable controls, the first control belongs to the clickable controls, and the first control displaying the target indication content is the control with the highest historical use frequency. That is, before the target indication content is displayed at the first position of the second user interface, historical control use data must be obtained, representing the frequency with which each clickable control among the n visual interface controls is clicked; the clickable control with the highest historical use frequency is determined as the first control according to this data, and after the first position of the first control in the user interface is determined, the target indication content is displayed at the first position in the second user interface.
Alternatively, the historical usage frequency may be calculated from the average number of clicks of the clickable control per battle.
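Selecting the first control from historical control use data reduces to taking the maximum over clicks per battle; a sketch with illustrative data types:

```kotlin
// Illustrative record of historical control use data.
data class ControlUsage(val controlId: String, val totalClicks: Long, val battles: Int) {
    // Historical use frequency: average clicks per battle.
    val clicksPerBattle: Double
        get() = if (battles == 0) 0.0 else totalClicks.toDouble() / battles
}

// The clickable control with the highest historical use frequency becomes
// the first control, whose outline is kept as target indication content.
fun selectFirstControl(usage: List<ControlUsage>): String? =
    usage.maxByOrNull { it.clicksPerBattle }?.controlId
```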
In step 1105, a selection operation at a second location of a second user interface is received.
Optionally, the visual interface controls are interface controls of a first level in the user interface, and the second interface control is a second-level interface control whose display is triggered from a visual interface control. The hidden controls include a second control, the second control is located at a second position in the first user interface, and the hiding mode of the second control is transparency set to fully transparent.
Step 1106, displaying a second interface control corresponding to the second control according to the selection operation.
Optionally, the display transparency of the second interface control is not fully transparent.
Illustratively, referring to fig. 13, the terminal 1300 displays a first user interface 1310 containing an attack control 1311 and a backpack control 1313. When the user occludes the distance sensor 1320, a second user interface 1330 is displayed in which the backpack control 1313 and the attack control 1311 are hidden. When the user clicks the position of the backpack control 1313 in the second user interface 1330, a third user interface 1340 is displayed, containing a backpack display area 1341 whose display transparency is not fully transparent.
To sum up, in the display method for an interface control provided by this embodiment, when n visual interface controls are displayed in the first user interface, the hidden control triggering operation causes a second user interface to be displayed whose second picture superimposes either none or only part of the visual interface controls. The hidden control triggering operation thus conveniently hides all or part of the visual interface controls of the first user interface; that is, the visual interface controls are hidden in an efficient and rapid manner, the field of view is quickly expanded in battle, the visual interface controls are prevented from occluding the view of the virtual environment during battle, and the problems of a cumbersome process and low human-computer interaction efficiency when adjusting control positions through a setting interface are avoided.
According to the method provided by this embodiment, the first-level interface control is hidden in the second user interface, and when the hidden first-level control is selected so that the second-level interface control is displayed, the transparency of the second-level interface control is not fully transparent. This avoids the problem that, if the second-level interface control remained fully transparent, it would block other controls that could then no longer be operated, affecting the course of the battle.
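The two-level behaviour can be illustrated with a minimal Kotlin sketch; the classes below are hypothetical stand-ins, the point being that hiding the first-level control only zeroes its transparency, so its position still accepts the tap that opens the second-level control, which is never left fully transparent:

```kotlin
// Axis-aligned bounds of a control on the screen.
data class Bounds(val x: Float, val y: Float, val w: Float, val h: Float) {
    fun contains(px: Float, py: Float) = px in x..(x + w) && py in y..(y + h)
}

class FirstLevelControl(val bounds: Bounds, var alpha: Float = 1f) {
    fun hide() { alpha = 0f } // fully transparent, but still laid out in place
}

class SecondLevelControl(var alpha: Float = 0f) {
    fun show() { alpha = 1f } // displayed with non-zero transparency
}

// The hidden first-level control still occupies its position, so a tap there
// triggers display of the corresponding second-level control.
fun onTap(px: Float, py: Float, first: FirstLevelControl, second: SecondLevelControl) {
    if (first.bounds.contains(px, py)) second.show()
}
```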
Fig. 14 is a block diagram of a display apparatus of an interface control according to an exemplary embodiment of the present application, where the apparatus may be configured in the terminal 100 shown in fig. 1, and the apparatus includes: a display module 1410 and a receiving module 1420;
a display module 1410, configured to display a first user interface of an application program, where the first user interface includes a first picture of the virtual environment observed from the perspective of a virtual role, n visual interface controls are further superimposed on the first picture, and n is a positive integer;
a receiving module 1420, configured to receive a hidden control triggering operation;
the display module 1410 is further configured to display a second user interface of the application program according to the hidden control triggering operation, where the second user interface includes a second picture of the virtual environment observed from the perspective of the virtual role, no visual interface controls or m visual interface controls are superimposed on the second picture, the m visual interface controls are some of the n visual interface controls, and m is greater than 0 and less than n.
In an alternative embodiment, the apparatus is applied to a terminal;
the receiving module 1420 is further configured to receive the hidden control triggering operation acting on components of the terminal other than the display screen.
In an alternative embodiment, as shown in fig. 15, the other components include a sensor component that is at least one of a distance sensor, a light sensor, and a front camera;
the receiving module 1420 is further configured to receive an occlusion operation acting on the sensor component of the terminal;
the device further comprises:
a determining module 1430, configured to determine the occlusion operation as the hidden control triggering operation.
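On Android specifically, this occlusion trigger might be wired up as in the following hedged sketch; hideControls() is a hypothetical hook standing in for the display module, and the patent does not prescribe these APIs:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class OcclusionTrigger(
    context: Context,
    private val hideControls: () -> Unit // hypothetical hook into the display module
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val proximity: Sensor? =
        sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY)

    fun start() {
        proximity?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // Typical proximity sensors report a distance below maximumRange
        // (often 0) while covered; treat that as the occlusion operation.
        val covered = event.values[0] < (proximity?.maximumRange ?: return)
        if (covered) hideControls()
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```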
In an alternative embodiment, the terminal is a sliding-cover terminal including an upper sliding cover and a lower sliding cover, and the other components include a sliding cover monitoring component configured to detect the sliding states of the upper sliding cover and the lower sliding cover;
the receiving module 1420 is further configured to receive a sliding cover event reported by the sliding cover monitoring component, where the sliding cover event is an event generated when the upper sliding cover and the lower sliding cover slide relative to each other;
the determining module 1430 is configured to, when the sliding cover event is a first sliding cover event of sliding along a preset direction, determine the first sliding cover event as the hidden control triggering operation.
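Slide-cover state reporting is vendor-specific, so the following sketch assumes a hypothetical sliding cover monitoring component that delivers relative-slide events; only an event along the preset direction is treated as the triggering operation:

```kotlin
enum class SlideDirection { OPEN_UP, CLOSE_DOWN }

data class SlideCoverEvent(val direction: SlideDirection)

class SlideCoverTrigger(
    private val presetDirection: SlideDirection,
    private val hideControls: () -> Unit // hypothetical hook into the display module
) {
    // Invoked by the (hypothetical) sliding cover monitoring component whenever
    // the upper and lower covers slide relative to each other.
    fun onSlideCoverEvent(event: SlideCoverEvent) {
        // Only a first sliding cover event along the preset direction counts
        // as the hidden control triggering operation.
        if (event.direction == presetDirection) hideControls()
    }
}
```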
In an alternative embodiment, the other component comprises a microphone;
the receiving module 1420 is further configured to receive a click operation acting on the microphone of the terminal;
the determining module 1430 is configured to determine the click operation as the hidden control triggering operation when the click feature of the click operation meets a preset feature.
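One plausible reading of the preset feature is the short, loud transient produced by a finger tap on the microphone opening. The sketch below checks PCM frames against an amplitude threshold; the threshold, sample rate, and hideControls() hook are assumptions, the loop should run off the main thread, and the RECORD_AUDIO permission is required:

```kotlin
import android.media.AudioFormat
import android.media.AudioRecord
import android.media.MediaRecorder
import kotlin.math.abs

class MicTapTrigger(private val hideControls: () -> Unit) {
    private val sampleRate = 44100
    private val bufferSize = AudioRecord.getMinBufferSize(
        sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT
    )

    // Blocks while listening; call from a background thread.
    fun listen() {
        val recorder = AudioRecord(
            MediaRecorder.AudioSource.MIC, sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize
        )
        val frame = ShortArray(bufferSize)
        recorder.startRecording()
        while (true) {
            val n = recorder.read(frame, 0, frame.size)
            val peak = (0 until n).maxOfOrNull { abs(frame[it].toInt()) } ?: 0
            if (peak > 20_000) { // assumed "preset feature": a loud transient
                hideControls()
                break
            }
        }
        recorder.stop()
        recorder.release()
    }
}
```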
In an optional embodiment, the first user interface includes a sliding scan control, where the sliding scan control is configured to scan the first user interface through a sliding scan bar perpendicular to the sliding scan control, and the scan width is the width corresponding to the first user interface;
the receiving module 1420 is further configured to receive a sliding operation applied to the sliding scan control, where the sliding operation is used to control the sliding scan bar to perform sliding scan on the first user interface along the sliding scan control;
a determining module 1430, configured to determine the sliding operation as the hidden control triggering operation.
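A minimal sketch of this trigger, with hypothetical names throughout: drag progress along the scan control moves the perpendicular scan bar across the full interface width, and completing the sweep is treated as the triggering operation (the completion condition is an assumption):

```kotlin
class SlidingScanTrigger(
    private val interfaceWidth: Float,
    private val hideControls: () -> Unit // hypothetical hook into the display module
) {
    // progress is the normalized drag position in [0, 1] along the scan control.
    fun onSlideProgress(progress: Float) {
        // The scan bar, perpendicular to the scan control, sweeps the full width.
        val scanBarX = progress.coerceIn(0f, 1f) * interfaceWidth
        if (scanBarX >= interfaceWidth) hideControls() // full sweep completed
    }
}
```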
In an optional embodiment, the display module 1410 is further configured to set the transparency of the controls that need to be hidden among the n visual interface controls to fully transparent according to the hidden control triggering operation, or to cancel the display of the controls that need to be hidden among the n visual interface controls according to the hidden control triggering operation.
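With standard Android View APIs, the two hiding strategies could look as follows; note that a fully transparent view is still laid out and still receives touches, which is what allows a hidden control's position to be tapped, while a view whose display is cancelled is removed from layout entirely:

```kotlin
import android.view.View

// Strategy 1: full transparency. The control stays in the view hierarchy and
// keeps receiving touch events at its position.
fun hideByTransparency(control: View) {
    control.alpha = 0f
}

// Strategy 2: cancel display. The control is neither drawn nor laid out and
// can no longer be touched.
fun hideByCancellingDisplay(control: View) {
    control.visibility = View.GONE
}
```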
In an alternative embodiment, the hidden control includes a first control located at a first position in the first user interface;
the display module 1410 is further configured to display target indication content at the first position of the second user interface, where the target indication content is used to indicate the first position where the first control is located, and a display area of the target indication content is smaller than a display area of the first control before being hidden.
In an alternative embodiment, the n visual interface controls include a clickable control, and the first control belongs to the clickable control;
the device, still include:
an obtaining module 1440 configured to obtain historical control usage data, where the historical control usage data is used to indicate a frequency with which the clickable control in the n visual interface controls is clicked;
a determining module 1430, configured to determine, according to the historical control usage data, that a control with the highest historical usage frequency in the clickable controls is the first control;
the determining module 1430 is further configured to determine the first position of the first control in the user interface.
In an alternative embodiment, the visual interface control is an interface control of a first level in the user interface, the second interface control is an interface control of a second level that is triggered and displayed on the visual interface control, the hidden control includes a second control, and the second control is located at a second position in the first user interface;
the receiving module 1420 is further configured to receive a selection operation at the second position of the second user interface;
the display module 1410 is further configured to display a second interface control corresponding to the second control according to the selection operation, where a display transparency of the second interface control is not fully transparent.
It should be noted that, in the above embodiments, the receiving module 1420, the determining module 1430, and the obtaining module 1440 may be implemented by a processor, or may be implemented by a processor and a memory in cooperation; the display module 1410 in the above embodiments may be implemented by a display screen, or may be implemented by a processor and the display screen cooperatively.
Fig. 16 shows a block diagram of a terminal 1600 according to an exemplary embodiment of the present application. The terminal 1600 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1600 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
Generally, terminal 1600 includes: a processor 1601, and a memory 1602.
Processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1601 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1601 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1602 may include one or more computer-readable storage media, which may be non-transitory. The memory 1602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1602 is used to store at least one instruction for execution by processor 1601 to implement a method of displaying an interface control provided by method embodiments of the present application.
In some embodiments, the terminal 1600 may also optionally include: peripheral interface 1603 and at least one peripheral. Processor 1601, memory 1602 and peripheral interface 1603 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1603 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1604, a touch screen display 1605, a camera 1606, audio circuitry 1607, a positioning component 1608, and a power supply 1609.
Peripheral interface 1603 can be used to connect at least one I/O (Input/Output) related peripheral to processor 1601 and memory 1602. In some embodiments, processor 1601, memory 1602, and peripheral interface 1603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1601, the memory 1602 and the peripheral device interface 1603 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The radio frequency circuit 1604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1604 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1604 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1604 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1604 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1605 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1605 is a touch display screen, the display screen 1605 also has the ability to capture touch signals on or over the surface of the display screen 1605. The touch signal may be input to the processor 1601 as a control signal for processing. At this point, the display screen 1605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1605, disposed on the front panel of the terminal 1600; in other embodiments, there may be at least two display screens 1605, respectively disposed on different surfaces of the terminal 1600 or in a folded design; in still other embodiments, the display screen 1605 may be a flexible display disposed on a curved surface or a folded surface of the terminal 1600. The display screen 1605 may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly-shaped screen. The display screen 1605 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1606 is used to capture images or video. Optionally, camera assembly 1606 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1606 can also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1607 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1601 for processing or inputting the electric signals to the radio frequency circuit 1604 to achieve voice communication. For stereo sound acquisition or noise reduction purposes, the microphones may be multiple and disposed at different locations of terminal 1600. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1601 or the radio frequency circuit 1604 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1607 may also include a headphone jack.
The positioning component 1608 is configured to locate the current geographic location of the terminal 1600 for navigation or LBS (Location Based Service). The positioning component 1608 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 1609 is used to supply power to the various components of the terminal 1600. The power supply 1609 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 1609 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1600 also includes one or more sensors 1610. The one or more sensors 1610 include, but are not limited to: acceleration sensor 1611, gyro sensor 1612, pressure sensor 1613, fingerprint sensor 1614, optical sensor 1615, and proximity sensor 1616.
Acceleration sensor 1611 may detect acceleration in three coordinate axes of a coordinate system established with terminal 1600. For example, the acceleration sensor 1611 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1601 may control the touch display screen 1605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1611. The acceleration sensor 1611 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 1612 can detect the body orientation and rotation angle of the terminal 1600, and can cooperate with the acceleration sensor 1611 to capture the user's 3D motion on the terminal 1600. Based on the data collected by the gyroscope sensor 1612, the processor 1601 may implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1613 may be disposed on a side frame of the terminal 1600 and/or on a lower layer of the touch display 1605. When the pressure sensor 1613 is disposed on the side frame of the terminal 1600, it can detect the user's holding signal on the terminal 1600, and the processor 1601 performs left-right hand recognition or shortcut operations according to the holding signal collected by the pressure sensor 1613. When the pressure sensor 1613 is disposed on the lower layer of the touch display 1605, the processor 1601 controls the operability controls on the UI according to the user's pressure operation on the touch display 1605. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1614 is configured to collect the user's fingerprint, and the processor 1601 identifies the user based on the fingerprint collected by the fingerprint sensor 1614, or the fingerprint sensor 1614 identifies the user based on the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1601 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and so on. The fingerprint sensor 1614 may be disposed on the front, back, or side of the terminal 1600. When a physical button or vendor Logo is provided on the terminal 1600, the fingerprint sensor 1614 may be integrated with the physical button or vendor Logo.
The optical sensor 1615 is used to collect ambient light intensity. In one embodiment, the processor 1601 may control the display brightness of the touch display screen 1605 based on the ambient light intensity collected by the optical sensor 1615. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1605 is increased; when the ambient light intensity is low, the display brightness of the touch display 1605 is turned down. In another embodiment, the processor 1601 may also dynamically adjust the shooting parameters of the camera assembly 1606 based on the ambient light intensity collected by the optical sensor 1615.
The proximity sensor 1616, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 1600. The proximity sensor 1616 is used to collect the distance between the user and the front surface of the terminal 1600. In one embodiment, when the proximity sensor 1616 detects that the distance between the user and the front surface of the terminal 1600 gradually decreases, the processor 1601 controls the touch display 1605 to switch from the screen-on state to the screen-off state; when the proximity sensor 1616 detects that the distance between the user and the front surface of the terminal 1600 gradually increases, the processor 1601 controls the touch display 1605 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 16 is not intended to be limiting of terminal 1600, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium. The computer-readable storage medium may be the computer-readable storage medium contained in the memory of the above embodiments, or it may be a separate computer-readable storage medium that is not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for displaying an interface control as described in any of fig. 3, 5, and 11.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (11)

1. A display method of an interface control is characterized by comprising the following steps:
displaying a first user interface of an application program, wherein the first user interface comprises a first picture of a virtual environment observed from the perspective of a virtual role, n visual interface controls are further superimposed on the first picture, and n is a positive integer;
receiving an occlusion operation acting on a sensor component of a terminal, wherein the sensor component comprises at least one of a distance sensor, a light sensor, and a front camera;
determining the occlusion operation as a hidden control triggering operation;
and displaying a second user interface of the application program according to the hidden control triggering operation, wherein the second user interface comprises a second picture of the virtual environment observed from the perspective of the virtual role, no visual interface controls or m visual interface controls are superimposed on the second picture, the m visual interface controls are some of the n visual interface controls, and m is greater than 0 and less than n.
2. The method according to claim 1, wherein the terminal is a sliding-cover terminal comprising an upper sliding cover and a lower sliding cover, and other components of the terminal comprise a sliding cover monitoring component configured to detect the sliding states of the upper sliding cover and the lower sliding cover;
receiving the hidden control triggering operation acting on components of the terminal other than the display screen comprises:
receiving a sliding cover event reported by the sliding cover monitoring component, wherein the sliding cover event is an event generated when the upper sliding cover and the lower sliding cover slide relative to each other;
and when the sliding cover event is a first sliding cover event of sliding along a preset direction, determining the first sliding cover event as the hidden control triggering operation.
3. The method of claim 1, wherein the other components comprise a microphone;
receiving the hidden control triggering operation acting on components of the terminal other than the display screen comprises:
receiving a click operation acting on the microphone of the terminal;
and when the click feature of the click operation meets a preset feature, determining the click operation as the hidden control triggering operation.
4. The method according to claim 1, wherein the first user interface comprises a sliding scan control, the sliding scan control is configured to scan the first user interface through a sliding scan bar perpendicular to the sliding scan control, and the scan width is the width corresponding to the first user interface;
the receiving of the hidden control triggering operation comprises:
receiving a sliding operation acting on the sliding scan control, wherein the sliding operation is used to control the sliding scan bar to perform a sliding scan of the first user interface along the sliding scan control;
and determining the sliding operation as the hidden control triggering operation.
5. The method according to any one of claims 1 to 4, wherein the displaying of the second user interface of the application program according to the hidden control triggering operation comprises:
and setting the transparency of the controls that need to be hidden among the n visual interface controls to fully transparent according to the hidden control triggering operation, or cancelling the display of the controls that need to be hidden among the n visual interface controls according to the hidden control triggering operation.
6. The method of claim 5, wherein the hidden control comprises a first control located at a first location in the first user interface;
after the transparency of the controls that need to be hidden among the n visual interface controls is set to fully transparent according to the hidden control triggering operation, the method further comprises:
and displaying target indication content at the first position of the second user interface, wherein the target indication content is used for indicating the first position where the first control is located, and the display area of the target indication content is smaller than that of the first control before being hidden.
7. The method of claim 6, wherein the n visual interface controls include a clickable control, and wherein the first control belongs to the clickable control;
the method further comprises, prior to displaying the target indication content at the first location of the second user interface:
obtaining historical control usage data, wherein the historical control usage data represents the frequency with which the clickable controls among the n visual interface controls are clicked;
determining, according to the historical control usage data, the control with the highest historical usage frequency among the clickable controls as the first control;
determining the first position in the user interface at which the first control is located.
8. The method of claim 7, wherein the visual interface control is an interface control of a first level in the user interface, the second interface control is an interface control of a second level that is triggered and displayed on the visual interface control, the hidden control comprises a second control, and the second control is located at a second position in the first user interface;
after the transparency of the controls that need to be hidden among the n visual interface controls is set to fully transparent according to the hidden control triggering operation, the method further comprises:
receiving a selection operation at the second position of the second user interface;
and displaying a second interface control corresponding to the second control according to the selection operation, wherein the display transparency of the second interface control is not fully transparent.
9. An apparatus for displaying interface controls, the apparatus comprising:
the display module is used for displaying a first user interface of an application program, wherein the first user interface comprises a first picture of a virtual environment observed from the perspective of a virtual role, n visual interface controls are further superimposed on the first picture, and n is a positive integer;
the receiving module is used for receiving an occlusion operation acting on a sensor component of a terminal, wherein the sensor component comprises at least one of a distance sensor, a light sensor, and a front camera;
the determining module is used for determining the occlusion operation as a hidden control triggering operation;
the display module is further configured to display a second user interface of the application program according to the hidden control triggering operation, wherein the second user interface comprises a second picture of the virtual environment observed from the perspective of the virtual role, no visual interface controls or m visual interface controls are superimposed on the second picture, the m visual interface controls are some of the n visual interface controls, and m is greater than 0 and less than n.
10. A terminal, characterized in that the terminal comprises a processor and a memory, wherein at least one instruction, at least one program, a set of codes or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes or the set of instructions is loaded and executed by the processor to realize the display method of the interface control according to any one of claims 1 to 8.
11. A computer readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method of displaying an interface control according to any one of claims 1 to 8.
CN201811433102.7A 2018-11-28 2018-11-28 Display method and device of interface control and storage medium Active CN109529319B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811433102.7A CN109529319B (en) 2018-11-28 2018-11-28 Display method and device of interface control and storage medium

Publications (2)

Publication Number Publication Date
CN109529319A CN109529319A (en) 2019-03-29
CN109529319B true CN109529319B (en) 2020-06-02

Family

ID=65850898

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811433102.7A Active CN109529319B (en) 2018-11-28 2018-11-28 Display method and device of interface control and storage medium

Country Status (1)

Country Link
CN (1) CN109529319B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6818092B2 (en) * 2019-06-25 2021-01-20 株式会社コロプラ Game programs, game methods, and information terminals
CN112286782B (en) * 2019-07-23 2023-10-10 腾讯科技(深圳)有限公司 Control shielding detection method, software detection method, device and medium
CN110465073A (en) * 2019-08-08 2019-11-19 腾讯科技(深圳)有限公司 Method, apparatus, equipment and the readable storage medium storing program for executing that visual angle adjusts in virtual environment
CN110711386B (en) * 2019-10-23 2023-01-10 网易(杭州)网络有限公司 Method and device for processing information in game, electronic equipment and storage medium
CN110882537B (en) * 2019-11-12 2023-07-25 北京字节跳动网络技术有限公司 Interaction method, device, medium and electronic equipment
CN111309416B (en) * 2020-01-19 2022-06-14 北京字节跳动网络技术有限公司 Information display method, device and equipment of application interface and readable medium
CN111256704B (en) * 2020-01-21 2022-06-10 华为技术有限公司 Navigation method and related device of folding screen
CN111294637A (en) * 2020-02-11 2020-06-16 北京字节跳动网络技术有限公司 Video playing method and device, electronic equipment and computer readable medium
CN111672121A (en) * 2020-06-11 2020-09-18 腾讯科技(深圳)有限公司 Virtual object display method and device, computer equipment and storage medium
CN111885256B (en) * 2020-07-16 2021-12-14 深圳传音控股股份有限公司 Terminal control method, terminal and computer storage medium
CN112330823B (en) * 2020-11-05 2023-06-16 腾讯科技(深圳)有限公司 Virtual prop display method, device, equipment and readable storage medium
CN115113785A (en) * 2021-03-17 2022-09-27 深圳市万普拉斯科技有限公司 Application program operation method and device, computer equipment and storage medium
CN113262492B (en) * 2021-04-28 2024-02-02 网易(杭州)网络有限公司 Game data processing method and device and electronic terminal
CN113134233B (en) * 2021-05-14 2023-06-20 腾讯科技(深圳)有限公司 Control display method and device, computer equipment and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004094741A (en) * 2002-09-02 2004-03-25 Matsushita Electric Ind Co Ltd Information transmitter/receiver and method for transmitting and receiving information
CN105224171A (en) * 2015-09-22 2016-01-06 小米科技有限责任公司 icon display method, device and terminal
CN105373379A (en) * 2015-10-10 2016-03-02 网易(杭州)网络有限公司 Game interface switching method and device
CN107678647A (en) * 2017-09-26 2018-02-09 网易(杭州)网络有限公司 Virtual shooting main body control method, apparatus, electronic equipment and storage medium
CN108717733A (en) * 2018-06-07 2018-10-30 腾讯科技(深圳)有限公司 View angle switch method, equipment and the storage medium of virtual environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant