CN109529319A - Display methods, equipment and the storage medium of interface control - Google Patents
- Publication number
- CN109529319A CN109529319A CN201811433102.7A CN201811433102A CN109529319A CN 109529319 A CN109529319 A CN 109529319A CN 201811433102 A CN201811433102 A CN 201811433102A CN 109529319 A CN109529319 A CN 109529319A
- Authority
- CN
- China
- Prior art keywords
- control
- user interface
- interface
- visual
- trigger action
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/822—Strategy games; Role-playing games
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
This application discloses a display method, device, and storage medium for interface controls, relating to the field of virtual environments. The method comprises: displaying a first user interface of an application, the first user interface containing a first picture on which n visible interface controls are superimposed; receiving a control-hiding trigger operation; and displaying, according to the control-hiding trigger operation, a second user interface of the application, the second user interface containing a second picture on which either no visible interface control or m visible interface controls are superimposed, where 0 < m < n. When n interface controls are shown in the user interface, the control-hiding trigger operation switches to a second user interface that superimposes no visible controls, or only some of them, so that the interface controls of the user interface are hidden in a quick and efficient manner, the field of view is enlarged rapidly during a battle, the efficiency of configuring interface controls is improved, and human-computer interaction efficiency is improved.
Description
Technical field
The embodiments of the present application relate to the field of virtual environments, and in particular to a display method, device, and storage medium for interface controls.
Background technique
Terminals such as smartphones and tablet computers run many applications based on a virtual environment, for example virtual-reality applications, three-dimensional map programs, military simulation programs, third-person shooting games (Third-Person Shooting Game, TPS), first-person shooting games (First-Person Shooting Game, FPS), and multiplayer online battle arena games (Multiplayer Online Battle Arena Games, MOBA). Typically, interface controls such as an attack control, a settings control, a prone control, a crouch control, a jump control, a weapon control, a backpack control, an attack-assist control (for example, a fire control), a viewpoint-rotation control, a minimap, and a direction indicator are superimposed over the virtual environment displayed in the user interface.
In the related art, because many interface controls are superimposed over the virtual environment, an enemy being observed in the virtual environment is easily occluded by the interface controls, and observation conditions are poor; the user must tap the settings control to enter a settings interface and adjust the position of each interface control there.
However, in battle-type applications, for example games fought with virtual firearms, a firefight is often over within seconds. Configuring the positions of the interface controls through a settings interface is a comparatively cumbersome process, human-computer interaction efficiency is too low, and the course of the battle is disrupted.
Summary of the invention
Embodiments of the present application provide a display method, device, and storage medium for interface controls, which can solve the problem that configuring the positions of interface controls through a settings interface is comparatively cumbersome and human-computer interaction efficiency is too low. The technical solution is as follows:
In one aspect, a display method for interface controls is provided, the method comprising:
displaying a first user interface of an application, the first user interface containing a first picture in which the virtual environment is observed from the viewpoint of a virtual role, n visible interface controls also being superimposed on the first picture, n being a positive integer;
receiving a control-hiding trigger operation; and
displaying, according to the control-hiding trigger operation, a second user interface of the application, the second user interface containing a second picture in which the virtual environment is observed from the viewpoint of the virtual role, either no visible interface control or m visible interface controls being superimposed on the second picture, the m visible interface controls being a subset of the n visible interface controls, 0 < m < n.
In another aspect, a display device for interface controls is provided, the device comprising:
a display module configured to display a first user interface of an application, the first user interface containing a first picture in which the virtual environment is observed from the viewpoint of a virtual role, n visible interface controls also being superimposed on the first picture, n being a positive integer; and
a receiving module configured to receive a control-hiding trigger operation;
the display module being further configured to display, according to the control-hiding trigger operation, a second user interface of the application, the second user interface containing a second picture in which the virtual environment is observed from the viewpoint of the virtual role, either no visible interface control or m visible interface controls being superimposed on the second picture, the m visible interface controls being a subset of the n visible interface controls, 0 < m < n.
In another aspect, a terminal is provided, the terminal comprising a processor and a memory, the memory storing at least one instruction, at least one program segment, a code set, or an instruction set, which is loaded and executed by the processor to implement the display method for interface controls described in any of the above embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, the storage medium storing at least one instruction, at least one program segment, a code set, or an instruction set, which is loaded and executed by a processor to implement the display method for interface controls described in any of the above embodiments of the present application.
In another aspect, a computer program product is provided which, when run on a computer, causes the computer to execute the display method for interface controls described in any of the above embodiments of the present application.
The technical solutions provided by the embodiments of the present application bring at least the following beneficial effects:
When n visible interface controls are displayed in the first user interface, a control-hiding trigger operation causes a second user interface to be displayed whose second picture superimposes either no visible controls or only some of them. The control-hiding trigger operation thus conveniently hides all or some of the visible interface controls in the first user interface, that is, it hides them in a quick and efficient manner, enlarging the field of view rapidly during a battle, preventing the visible interface controls from occluding the view of the virtual environment during the battle, and avoiding the cumbersome, inefficient process of adjusting the position of each interface control through a settings interface.
Brief description of the drawings
To explain the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a structural block diagram of an electronic device provided by an exemplary embodiment of the present application;
Fig. 2 is a structural block diagram of a computer system provided by an exemplary embodiment of the present application;
Fig. 3 is a flowchart of a display method for interface controls provided by an exemplary embodiment of the present application;
Fig. 4 is a schematic diagram of the interface controls in a user interface provided by an exemplary embodiment of the present application;
Fig. 5 is a flowchart of a display method for interface controls provided by another exemplary embodiment of the present application;
Fig. 6 is a schematic diagram of a user interface of the display method for interface controls provided by an exemplary embodiment of the present application;
Fig. 7 is a flowchart of hiding interface controls by means of a distance sensor, provided on the basis of the embodiment shown in Fig. 5;
Fig. 8 is another flowchart of hiding interface controls by means of a distance sensor, provided on the basis of the embodiment shown in Fig. 5;
Fig. 9 is a schematic diagram of a user interface of the display method for interface controls provided by another exemplary embodiment of the present application;
Fig. 10 is a schematic diagram of a user interface of the display method for interface controls provided by another exemplary embodiment of the present application;
Fig. 11 is a flowchart of a display method for interface controls provided by another exemplary embodiment of the present application;
Fig. 12 is a schematic diagram of a user interface of the display method for interface controls provided by another exemplary embodiment of the present application;
Fig. 13 is a schematic diagram of a user interface of the display method for interface controls provided by another exemplary embodiment of the present application;
Fig. 14 is a structural block diagram of a display device for interface controls provided by an exemplary embodiment of the present application;
Fig. 15 is a structural block diagram of a display device for interface controls provided by another exemplary embodiment of the present application;
Fig. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
Specific embodiment
To make the purposes, technical solutions, and advantages of the present application clearer, the embodiments of the present application are described in further detail below with reference to the drawings.
First, several terms involved in the embodiments of the present application are explained:
Virtual environment: the virtual environment displayed (or provided) when an application runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated, semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual environment may be any of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment; the following embodiments use a three-dimensional virtual environment as an example, but are not limited to it. Optionally, the virtual environment is also used for a battle between at least two virtual roles. Optionally, the virtual environment is also used for a battle between at least two virtual roles using virtual firearms. Optionally, the virtual environment is also used for a battle between at least two virtual roles using virtual firearms within a target area, the target area shrinking continuously as time passes in the virtual environment.
Virtual object: a movable object in the virtual environment. The movable object may be at least one of a virtual character, a virtual animal, and a cartoon character. Optionally, when the virtual environment is a three-dimensional virtual environment, the virtual object is a three-dimensional model created using skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space of the three-dimensional virtual environment.
Interface control: a control superimposed on the picture in which the virtual environment is observed. Optionally, the interface controls include at least one of an attack control, a settings control, a prone control, a crouch control, a jump control, a weapon control, a backpack control, an attack-assist control (for example, a fire control), a viewpoint-rotation control, a minimap, and a direction indicator. Optionally, an interface control is used to show the state of a virtual object in the virtual environment and/or to control the virtual object in the virtual environment. Optionally, the interface controls further include a first-layer interface control and a second-layer interface control whose display is triggered from the first-layer interface control. Schematically, the backpack control is a first-layer interface control; after the backpack control is selected, a backpack-contents panel is displayed as the second-layer interface control of the backpack control.
The terminal in the present application may be a desktop computer, a laptop portable computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. An application supporting a virtual environment, such as an application supporting a three-dimensional virtual environment, is installed and runs on the terminal. The application may be any of a virtual-reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, and a MOBA game. Optionally, the application may be a standalone application, such as a standalone 3D game, or a networked online application.
Fig. 1 shows a structural block diagram of an electronic device provided by an exemplary embodiment of the present application. The electronic device 100 includes an operating system 120 and an application 122.
The operating system 120 is the basic software that provides the application 122 with secure access to the computer hardware.
The application 122 is an application supporting a virtual environment. Optionally, the application 122 is an application supporting a three-dimensional virtual environment. The application 122 may be any of a virtual-reality application, a three-dimensional map program, a military simulation program, a third-person shooting game (Third-Person Shooting Game, TPS), a first-person shooting game (First-Person Shooting Game, FPS), a MOBA game, and a multiplayer gunfight survival game. The application 122 may be a standalone application, such as a standalone 3D game.
Fig. 2 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application. The computer system 200 includes a first device 220, a server 240, and a second device 260.
The first device 220 installs and runs an application supporting a virtual environment. The application may be any of a virtual-reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first device 220 is the device used by a first user, who uses the first device 220 to control a first virtual object to act in the virtual environment, the actions including but not limited to at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Schematically, the first virtual object is a first virtual character, such as a simulated human character or a cartoon character.
The first device 220 is connected to the server 240 through a wireless or wired network.
The server 240 includes at least one of a single server, multiple servers, a cloud computing platform, and a virtualization center. The server 240 provides background services for the application supporting the three-dimensional virtual environment. Optionally, the server 240 undertakes the primary computing work while the first device 220 and the second device 260 undertake the secondary computing work; alternatively, the server 240 undertakes the secondary computing work while the first device 220 and the second device 260 undertake the primary computing work; alternatively, the server 240, the first device 220, and the second device 260 perform cooperative computing using a distributed computing architecture.
The second device 260 installs and runs an application supporting a virtual environment. The application may be any of a virtual-reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The second device 260 is the device used by a second user, who uses the second device 260 to control a second virtual object to act in the virtual environment, the actions including but not limited to at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Schematically, the second virtual object is a second virtual character, such as a simulated human character or a cartoon character.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication permission. Optionally, the first virtual character and the second virtual character may also belong to different teams, different organizations, or two mutually hostile groups.
Optionally, the applications installed on the first device 220 and the second device 260 are identical, or the applications installed on the two devices are the same type of application on different operating-system platforms. The first device 220 may refer generally to one of multiple devices, and the second device 260 may refer generally to one of multiple devices; the present embodiment is illustrated with only the first device 220 and the second device 260. The device types of the first device 220 and the second device 260 are identical or different, the device types including at least one of a game console, a desktop computer, a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated with the device being a desktop computer.
Those skilled in the art will appreciate that the number of the above devices may be larger or smaller. For example, there may be only one such device, or tens, hundreds, or more. The embodiments of the present application do not limit the number or type of the devices.
Fig. 3 is a flowchart of a display method for interface controls provided by an exemplary embodiment of the present application. As shown in Fig. 3, the method is illustrated as applied to the terminal 100 shown in Fig. 1, and comprises:
Step 301: display a first user interface of the application, the first user interface containing a first picture on which n visible interface controls are superimposed.
Optionally, the first user interface contains a first picture in which the virtual environment is observed from the viewpoint of a virtual role; n visible interface controls are also superimposed on the first picture, n being a positive integer.
Optionally, the first picture may be a picture in which the virtual environment is observed from the first-person perspective of the virtual role, or a picture in which the virtual environment is observed from the third-person perspective of the virtual role.
Optionally, the n visible interface controls include at least one of an attack control, a settings control, a prone control, a crouch control, a jump control, a weapon control, a backpack control, an attack-assist control (for example, a fire control), a viewpoint-rotation control, a minimap, and a direction indicator.
Schematically, referring to Fig. 4, an attack control 410 is displayed in the first user interface 41, one on each side of the first user interface 41. The first user interface 41 also displays a jump control 411, a prone control 412, a crouch control 413, a reload control 414, a fire control 415, a viewpoint-rotation control 416, a backpack control 417, a weapon control 418, a minimap 419, a settings control 420, and a direction indicator 421. The above controls are only schematic examples; the first user interface 41 may also include other visible interface controls.
Optionally, a visible interface control is used to indicate the state of the virtual object in the virtual environment, to control the virtual object in the virtual environment, or to show the state of the teammates of the virtual object in the virtual environment. Schematically, the weapon control 418 indicates that the virtual firearm held by the virtual object is an AUG assault rifle; the direction indicator 421 indicates that the direction the virtual object faces in the virtual environment is west; the attack control 410 is used to control the virtual object to shoot in the virtual environment; and the jump control 411 is used to control the virtual object to jump in the virtual environment.
Step 302: receive a control-hiding trigger operation.
Optionally, the control-hiding trigger operation is used to trigger hiding of all or some of the n visible interface controls in the first user interface.
Optionally, the control-hiding trigger operation is received in at least one of the following two ways:
first, a control-hiding trigger operation acting in the first user interface is received, i.e., a control capable of triggering the control-hiding trigger operation is displayed in the first user interface;
second, a control-hiding operation acting on a component of the terminal other than the display screen is received, the other component including at least one of a sensor component, a physical button, and a microphone.
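The two trigger sources above, an on-screen control versus a non-screen component such as a sensor, physical button, or microphone, can be sketched as a single event classifier. The source names and function shape below are illustrative assumptions, not the patent's implementation:

```python
# Sketch: both on-screen and off-screen inputs map to the same
# control-hiding trigger operation. Source names are hypothetical.
HIDE_TRIGGER_SOURCES = {
    "ui_hide_button",      # first case: a control shown in the user interface
    "distance_sensor",     # second case: non-screen components
    "light_sensor",
    "front_camera",
    "physical_button",
    "microphone",
}

def classify_event(source: str) -> str:
    """Map a raw input source to a hide trigger or a normal UI event."""
    if source in HIDE_TRIGGER_SOURCES:
        return "hide_control_trigger"
    return "normal_input"

assert classify_event("ui_hide_button") == "hide_control_trigger"
assert classify_event("joystick_move") == "normal_input"
```

Either path yields the same trigger, so step 303 below runs identically regardless of which component the user touched.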
Step 303: display, according to the control-hiding trigger operation, a second user interface of the application, the second user interface containing a second picture on which either no visible interface control or m visible interface controls are superimposed.
Optionally, the second user interface contains a second picture in which the virtual environment is observed from the viewpoint of the virtual role; either no visible interface control is superimposed on the second picture, or m visible interface controls are superimposed on it, the m visible interface controls being a subset of the n visible interface controls.
Optionally, when all n visible interface controls are set as hideable interface controls and the control-hiding trigger operation is received, no visible interface control is superimposed on the second picture. When only some of the n visible interface controls are set as hideable, i.e., the m visible interface controls are not set as hideable, and the control-hiding trigger operation is received, the m visible interface controls are superimposed on the second picture, these being the m controls that were not set as hideable.
Optionally, when no visible interface control is superimposed on the second picture, the transparency of all n visible interface controls may have been set to fully transparent, or all n visible interface controls may have been removed from display, or the transparency of some of the n visible interface controls may have been set to fully transparent while the display of the others was cancelled; the embodiments of the present application do not limit this. Optionally, the n visible interface controls include clickable controls; when no visible interface control is superimposed on the second picture, the transparency of the clickable controls among the n visible interface controls is set to fully transparent, and the other visible interface controls are removed from display.
Optionally, when m visible interface controls are superimposed on the second picture, i.e., n−m visible interface controls are not displayed, the undisplayed visible interface controls may be invisible because their transparency was set to fully transparent, or because their display was cancelled, or some may have been made fully transparent while the display of the others was cancelled; the embodiments of the present application do not limit this.
That is, according to the control-hiding trigger operation, the transparency of the controls to be hidden among the n visible interface controls is set to fully transparent, or the display of the controls to be hidden among the n visible interface controls is cancelled.
When a visible interface control is a clickable control whose transparency has been set to fully transparent, the control can still be clicked, and the virtual object can be controlled according to the click events generated on the clickable control.
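The distinction just described, a fully transparent control that still receives clicks versus a display-cancelled control that does not, can be modeled in a minimal sketch; the class and attribute names are assumptions for illustration only:

```python
class InterfaceControl:
    """Hypothetical model of one superimposed interface control."""

    def __init__(self, name: str, clickable: bool = False):
        self.name = name
        self.clickable = clickable
        self.alpha = 1.0        # 1.0 = opaque, 0.0 = fully transparent
        self.displayed = True   # False once display is cancelled

    def hide(self):
        # The patent's rule: clickable controls stay in the layout at full
        # transparency; non-clickable controls have their display cancelled.
        if self.clickable:
            self.alpha = 0.0
        else:
            self.displayed = False

    def is_visible(self) -> bool:
        return self.displayed and self.alpha > 0.0

    def handle_click(self) -> bool:
        # A fully transparent clickable control still generates click events.
        return self.displayed and self.clickable

attack = InterfaceControl("attack", clickable=True)
minimap = InterfaceControl("minimap", clickable=False)
attack.hide()
minimap.hide()
assert not attack.is_visible() and attack.handle_click()    # invisible yet clickable
assert not minimap.is_visible() and not minimap.handle_click()
```

This matches the behavior where, for example, a hidden attack control can still fire the weapon even though it no longer occludes the virtual environment.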
Optionally, whether a visible interface control can be hidden may be set by the user or preset by the developer. Schematically, if the developer presets all n visible interface controls as hideable, then after the terminal receives the control-hiding trigger operation, no visible interface control is superimposed on the second picture. If the user defines the hideable interface controls to include the prone control, crouch control, jump control, backpack control, minimap, and direction indicator, while the n visible interface controls superimposed on the first picture include the attack control, settings control, prone control, crouch control, jump control, weapon control, backpack control, fire control, viewpoint-rotation control, minimap, and direction indicator, then after the terminal receives the control-hiding trigger operation, the m visible interface controls superimposed on the second picture include the attack control, settings control, weapon control, fire control, and viewpoint-rotation control.
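The user-defined example above reduces to a set difference: the m controls of the second picture are the controls of the first picture not marked hideable. A sketch under that assumption (the control names follow the example; the function itself is illustrative):

```python
def second_picture_controls(all_controls, hideable):
    """Return the m controls still superimposed after the hide trigger:
    every control of the first picture that is not marked hideable."""
    return [c for c in all_controls if c not in hideable]

# The n = 11 controls of the first picture, per the example.
first_picture = ["attack", "settings", "prone", "crouch", "jump", "weapon",
                 "backpack", "fire", "viewpoint_rotation", "minimap",
                 "direction_indicator"]
# The user-defined hideable set.
user_hideable = {"prone", "crouch", "jump", "backpack", "minimap",
                 "direction_indicator"}

remaining = second_picture_controls(first_picture, user_hideable)
assert remaining == ["attack", "settings", "weapon", "fire",
                     "viewpoint_rotation"]
assert 0 < len(remaining) < len(first_picture)   # 0 < m < n
```

When the developer presets every control as hideable, the same function returns an empty list, which corresponds to the second picture with no superimposed visible controls.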
In conclusion the display methods of interface control provided in this embodiment, n are shown in the first user interface can
When visual interface control, by hiding control trigger action, second user interface is shown, on second picture at the second user interface
Non- Overlapping display visible controls or Overlapping display partial visual control, by hiding control trigger action easily to the first user
Being hidden in the visual interface control on interface in whole or in part, namely to the first user circle in a manner of efficient quick
Visual interface control in face is hidden, and expands the visual field rapidly in battle, avoids visual interface control in battle to sight
The visual field for examining virtual environment is blocked, and it is numerous to also avoid through set interface process when to interface control progress position adjustment
It is trivial, the problem of human-computer interaction inefficiency.
In an optional embodiment, the control-hiding trigger operation covers a variety of different cases. Fig. 5 is a flowchart of a display method for interface controls provided by another exemplary embodiment of the present application. The method is illustrated as applied to the terminal 100 shown in Fig. 1 and, as shown in Fig. 5, comprises:
Step 501: display a first user interface of the application.
Optionally, the first user interface contains a first picture in which the virtual environment is observed from the viewpoint of a virtual role; n visible interface controls are also superimposed on the first picture, n being a positive integer.
Optionally, a visible interface control is used to indicate the state of the virtual object in the virtual environment, to control the virtual object in the virtual environment, or to show the state of the teammates of the virtual object in the virtual environment.
Step 502: receive an occlusion operation acting on a sensor element in the terminal.

Optionally, the sensor element is at least one of a range sensor, a light sensor, and a front camera. The range sensor determines whether it is occluded by emitting and receiving energy: when the received energy indicates that an object is closer to the range sensor than a preset distance, the range sensor is judged to be occluded. The light sensor determines whether it is occluded from the light it receives: when the received light falls below a preset light value, the light sensor is judged to be occluded. The front camera determines whether it is occluded from the image it receives: when the image captured by the front camera is entirely black, the front camera is judged to be occluded.
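The three occlusion checks above reduce to simple threshold tests. The following sketch is illustrative only; the sensor readings, preset values, and function names are assumptions, not the implementation of this application.

```python
# Hypothetical sketch of the three occlusion checks; thresholds are assumed.

PRESET_DISTANCE_CM = 3.0   # range sensor: "closer than a preset distance"
PRESET_LIGHT_LUX = 5.0     # light sensor: "below a preset light value"

def range_sensor_occluded(measured_distance_cm: float) -> bool:
    # Occluded when the distance derived from emitted/received energy
    # falls below the preset distance.
    return measured_distance_cm < PRESET_DISTANCE_CM

def light_sensor_occluded(measured_lux: float) -> bool:
    # Occluded when the received light falls below the preset light value.
    return measured_lux < PRESET_LIGHT_LUX

def front_camera_occluded(frame: list) -> bool:
    # Occluded when every pixel of the captured frame is black (0).
    return all(pixel == 0 for pixel in frame)
```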
Step 503: determine the occlusion operation to be a hiding-control trigger operation.

Optionally, the hiding-control trigger operation is used to trigger hiding of all or part of the n visual interface controls in the first user interface. That is, the occlusion operation hides all or part of the n visual interface controls in the first user interface.
Schematically, taking the sensor element being a range sensor as an example, as shown in Fig. 6, a first user interface 61 is displayed in a terminal 60 that also includes a range sensor 62. The first user interface 61 displays an attack control 610, a jump control 611, a prone control 612, a crouch control 613, a weapon-switch control 614, a gun-scope control 615, a view-rotation control 616, a backpack control 617, a mini-map 618, and an orientation display field 619, all of which are set as hideable controls. After the user occludes the range sensor 62, a second user interface 63 that does not include the controls of the first user interface 61 is displayed in the terminal 60.
Optionally, determining the occlusion operation to be a hiding-control trigger operation and hiding the visual interface controls includes at least one of the following cases:

First, when the occlusion duration of the occlusion operation reaches a preset duration, the visual interface controls are hidden, and when the occlusion operation is received again, display of the n visual interface controls is restored;

Second, the occlusion operation is a sustained occlusion: when the sustained occlusion is received, the second user interface of the application program is displayed, i.e. the visual interface controls are hidden, and when the sustained occlusion stops, display of the n visual interface controls is restored.
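The two cases above amount to a toggle and a momentary switch. A minimal sketch, with the class names and the 1-second preset duration assumed for illustration:

```python
# Case 1: a sufficiently long occlusion toggles control visibility;
# Case 2: controls are hidden only while the occlusion persists.

class TimedToggle:
    """Occluding for at least the preset duration toggles hiding."""
    def __init__(self, preset_duration: float = 1.0):
        self.preset = preset_duration
        self.hidden = False

    def on_occlusion(self, duration: float) -> bool:
        if duration >= self.preset:
            self.hidden = not self.hidden  # a second occlusion restores display
        return self.hidden

class SustainedHide:
    """Controls stay hidden only for as long as the occlusion lasts."""
    def __init__(self):
        self.hidden = False

    def on_occlusion_start(self):
        self.hidden = True

    def on_occlusion_end(self):
        self.hidden = False
```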
Schematically, hiding the visual interface controls via the sensor element is completed jointly by the microcontroller unit (MCU), the central processing unit (CPU), the memory, the controller, and the display screen in the terminal. Referring to Fig. 7, when the range sensor 71 detects an occlusion event, it sends the occlusion data of the event, including the occlusion duration and occlusion distance, to the MCU 72. The MCU 72 obtains from the memory 73 a control request instruction corresponding to the occlusion data and sends it to the CPU 74. The CPU 74 obtains from the memory 73 a control signal corresponding to the control request instruction and sends it to the controller 75, which makes the visual interface controls shown on the display screen 76 invisible (by setting their transparency to fully transparent or by cancelling their display). Touch events received on the display screen 76 are responded to by the CPU 74, the MCU 72, the memory 73, and the controller 75.
Schematically, referring to Fig. 8, the process includes:

Step 801: the range sensor collects occlusion data and sends the occlusion data to the MCU.

Step 802: the MCU analyzes and compares the occlusion data against the memory, generates a control request instruction according to the comparison result, and sends the control request instruction to the CPU.

Step 803: the CPU generates a control signal according to the control request instruction and sends the control signal to the controller.

Step 804: the controller controls the display screen according to the control signal.
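Steps 801 to 804 can be sketched as a small data pipeline. The lookup tables, message shapes, and the 1-second threshold below are invented for illustration and merely stand in for the memory lookups described above:

```python
# Toy rendition of steps 801-804: occlusion data flow from the range
# sensor through the MCU and CPU to the controller driving the screen.

CONTROL_REQUESTS = {"long_block": "HIDE_CONTROLS"}   # MCU's memory lookup
CONTROL_SIGNALS = {"HIDE_CONTROLS": "SET_ALPHA_0"}   # CPU's memory lookup

def mcu(block_data: dict) -> str:
    # Step 802: compare occlusion data against memory, emit a request.
    kind = "long_block" if block_data["duration"] >= 1.0 else "short_block"
    return CONTROL_REQUESTS.get(kind, "NO_OP")

def cpu(request: str) -> str:
    # Step 803: map the request to a control signal for the controller.
    return CONTROL_SIGNALS.get(request, "NO_OP")

def controller(signal: str, screen: dict) -> dict:
    # Step 804: drive the display screen according to the signal.
    if signal == "SET_ALPHA_0":
        screen = {name: "invisible" for name in screen}
    return screen

# Step 801: the sensor reports the duration and distance of the occlusion.
screen = controller(cpu(mcu({"duration": 1.5, "distance": 0.5})),
                    {"attack": "visible", "minimap": "visible"})
```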
Step 504: receive a slide-cover event reported by a slide-cover monitoring component.

Optionally, the terminal is a slide-cover terminal that includes an upper cover and a lower cover, and the terminal includes a slide-cover monitoring component for detecting the slide state of the upper cover and the lower cover. The slide-cover event is an event generated when the upper cover and the lower cover slide relative to each other.
Step 505: when the slide-cover event is a first slide-cover event of sliding in a preset direction, determine the first slide-cover event to be the hiding-control trigger operation.

Optionally, the upper cover and the lower cover of the slide-cover terminal have two states, a coincident state and a separated state. The preset direction may be the direction of sliding from the separated state toward the coincident state, or the direction of sliding from the coincident state toward the separated state.
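The direction check in step 505 can be sketched as follows; the state names and the particular choice of preset direction are illustrative assumptions:

```python
# Only a slide-cover event in the preset direction counts as the trigger.

PRESET_DIRECTION = "to_separated"   # could equally be "to_coincident"

def slide_event_direction(old_state: str, new_state: str) -> str:
    # The covers move between "coincident" and "separated" states.
    return "to_separated" if new_state == "separated" else "to_coincident"

def is_first_slide_event(old_state: str, new_state: str) -> bool:
    # True when the slide matches the preset direction (the trigger).
    return slide_event_direction(old_state, new_state) == PRESET_DIRECTION
```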
Schematically, referring to Fig. 9, the first user interface 91 displays an attack control 910, a jump control 911, a prone control 912, a crouch control 913, a weapon-switch control 914, a gun-scope control 915, a view-rotation control 916, a backpack control 917, a mini-map 918, and an orientation display field 919. After the upper cover 93 and the lower cover 94 of the terminal slide and generate the first slide-cover event, leaving the covers in the separated state, the second user interface 92 is displayed, and the above controls are not displayed in the second user interface 92.
Step 506: receive a click operation acting on a microphone in the terminal.

Optionally, the terminal includes a microphone for receiving sound signals.

Optionally, when the terminal is fitted with an earphone, the microphone may be implemented as the microphone on the earphone or as the microphone in the terminal; the embodiments of the application do not limit this.

Optionally, when the user taps the terminal's microphone, the tap produces a vibration in the microphone cavity, and the resulting sound signal is captured by the microphone.
Step 507: when the click feature of the click operation matches a preset feature, determine the click operation to be a hiding-control trigger operation.

Optionally, the user taps the microphone at a certain frequency, and when the tapping frequency matches a preset frequency, the click operation is determined to be the hiding-control trigger operation; or, the user taps the microphone a certain number of times, and when the number of taps matches a preset number, the click operation is determined to be the hiding-control trigger operation.
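Both click features reduce to comparing a tap sequence against presets. A sketch, with the preset count, frequency, and tolerance all assumed:

```python
# Matching a tap pattern against a preset count or a preset frequency.

PRESET_COUNT = 3
PRESET_FREQ_HZ = 2.0
FREQ_TOLERANCE = 0.5

def matches_count(tap_times: list) -> bool:
    # Preset number of taps received.
    return len(tap_times) == PRESET_COUNT

def matches_frequency(tap_times: list) -> bool:
    # Average tapping frequency close enough to the preset frequency.
    if len(tap_times) < 2:
        return False
    span = tap_times[-1] - tap_times[0]
    freq = (len(tap_times) - 1) / span   # taps per second
    return abs(freq - PRESET_FREQ_HZ) <= FREQ_TOLERANCE

def is_hide_trigger(tap_times: list) -> bool:
    return matches_count(tap_times) or matches_frequency(tap_times)
```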
Optionally, steps 502 to 507 above all describe hiding-control trigger operations acting on terminal components other than the display screen.
Step 508: receive a slide operation acting on a slide-scan control.

Optionally, the first user interface includes a slide-scan control, which scans the first user interface along the slide-scan control by means of a slide-scan bar perpendicular to the slide-scan control.

Optionally, the ratio of the area scanned by the slide-scan bar to the area of the first user interface equals the ratio of the distance the user has slid on the slide-scan control to the length of the slide-scan control.

Optionally, the starting position of the slide operation may be an endpoint of the slide-scan control, or any position along the slide-scan control.
Step 509: determine the slide operation to be a hiding-control trigger operation.

Optionally, hiding visual interface controls according to the slide operation includes any one of the following cases:

First, the visual interface controls in the first user interface covered by the coverage area corresponding to the slide operation are hidden;

Second, the visual interface controls in the first user interface that are covered by the coverage area corresponding to the slide operation and are set as hideable are hidden;

Third, the n visual interface controls in the first user interface are hidden according to the slide operation;

Fourth, the visual interface controls in the first user interface that are set as hideable are hidden according to the slide operation.
Optionally, when a visual interface control is only partly covered by the coverage area, the visual interface control is hidden when its covered area reaches 50% of its total area.
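The 50% rule can be sketched as a rectangle-overlap test. The control geometry, names, and the rectangle representation below are illustrative assumptions:

```python
# A control is hidden once the slide's coverage area overlaps at least
# half of the control's total area. Rectangles are (left, top, right, bottom).

def overlap_area(a, b):
    w = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    h = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    return w * h

def area(r):
    return (r[2] - r[0]) * (r[3] - r[1])

def hidden_controls(controls: dict, coverage_rect, threshold=0.5):
    # Hide each control whose covered fraction reaches the threshold.
    return [name for name, rect in controls.items()
            if overlap_area(rect, coverage_rect) >= threshold * area(rect)]
```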
Schematically, taking the first case above as an example, referring to Fig. 10, the first user interface 1001 displays an attack control 1010, a jump control 1011, a prone control 1012, a crouch control 1013, a weapon-switch control 1014, a gun-scope control 1015, a view-rotation control 1016, a backpack control 1017, a mini-map 1018, and an orientation display field 1019. Optionally, the first user interface 1001 further includes a slide-scan control 1002. When the user slides on the slide-scan control 1002, a slide-scan bar 1003 perpendicular to the slide-scan control is also displayed in the first user interface 1001. The region the slide-scan bar 1003 has swept is the region covered by the coverage area, such as the coverage area 1020 in Fig. 10, and the position of the user's finger is the current position of the slide-scan bar 1003. When the user ends the slide, a second user interface 1004 is displayed. The backpack control 1017, which is not covered by the coverage area 1020, remains displayed in the second user interface 1004, while the orientation display field 1019, more than 50% of whose area is covered by the coverage area, is not displayed in the second user interface 1004.
Step 510: display the second user interface of the application program according to the hiding-control trigger operation.

Optionally, the second user interface includes a second picture obtained by observing the virtual environment from the viewing angle of the virtual role. On the second picture, either no visual interface controls are superimposed, or m visual interface controls are superimposed, the m visual interface controls being a subset of the n visual interface controls.
Optionally, when all or part of the n visual interface controls are hidden and the user wants to restore their display, display of the n visual interface controls may be restored by dragging the slide-scan control any distance in the opposite direction, by occluding the sensor element in the terminal, or by tapping the microphone; alternatively, the terminal may time the hidden duration and restore display of the n visual interface controls when the timed duration reaches a preset duration. The embodiments of the application do not limit the way in which display of the n visual interface controls is restored.
Optionally, when all n visual interface controls are set as hideable interface controls and the hiding-control trigger operation is received, no visual interface controls are superimposed on the second picture. When only part of the n visual interface controls are set as hideable interface controls, i.e. the m visual interface controls are not set as hideable, and the hiding-control trigger operation is received, the m visual interface controls, namely those not set as hideable, remain superimposed on the second picture.
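Composing the second picture thus reduces to filtering by a hideable flag. A minimal sketch with invented control names:

```python
# Only controls flagged as hideable disappear; the m remaining
# (non-hideable) controls stay overlaid on the second picture.

def second_picture_controls(controls: dict) -> list:
    """controls maps name -> hideable flag; returns the m controls kept."""
    return [name for name, hideable in controls.items() if not hideable]
```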
Optionally, the way the second user interface is displayed for the hiding-control trigger operations of the different cases has been described in detail in steps 502 to 509 above and is not repeated here.
It is worth noting that steps 502 to 503, steps 504 to 505, steps 506 to 507, and steps 508 to 509 may each be implemented as a separate method, or implemented in combination. For example, when an occlusion operation on the sensor element is received together with a slide operation on the slide-scan control, the interface controls to be hidden are determined according to the coverage area corresponding to the slide-scan operation.
In conclusion the display methods of interface control provided in this embodiment, n are shown in the first user interface can
When visual interface control, by hiding control trigger action, second user interface is shown, on second picture at the second user interface
Non- Overlapping display visible controls or Overlapping display partial visual control, by hiding control trigger action easily to the first user
Being hidden in the visual interface control on interface in whole or in part, namely to the first user circle in a manner of efficient quick
Visual interface control in face is hidden, and expands the visual field rapidly in battle, avoids visual interface control in battle to sight
The visual field for examining virtual environment is blocked, and it is numerous to also avoid through set interface process when to interface control progress position adjustment
It is trivial, the problem of human-computer interaction inefficiency.
In the method provided in this embodiment, the visual interface controls are hidden through an occlusion operation on the sensor element, so that the interface controls in the user interface are hidden in an efficient and quick manner, rapidly widening the field of view during battle. This prevents the visual interface controls from blocking the view of the virtual environment in battle, and avoids the cumbersome workflow and low human-computer interaction efficiency of adjusting control positions through a settings interface.

In the method provided in this embodiment, the visual interface controls are hidden through a slide-cover operation on the slide-cover terminal, so that the interface controls in the user interface are hidden in an efficient and quick manner, rapidly widening the field of view during battle. This prevents the visual interface controls from blocking the view of the virtual environment in battle, and avoids the cumbersome workflow and low human-computer interaction efficiency of adjusting control positions through a settings interface.

In the method provided in this embodiment, a slide operation on the slide-scan control determines the visual interface controls to be hidden within a target area of the first user interface. When the target area is the area that needs to be observed, the visual interface controls in the target area are hidden in an efficient and quick manner, rapidly widening the field of view during battle. This prevents the visual interface controls from blocking the view of the virtual environment in battle, and avoids the cumbersome workflow and low human-computer interaction efficiency of adjusting control positions through a settings interface.

In the method provided in this embodiment, the visual interface controls in the first user interface are hidden in an efficient and convenient manner, and after hiding, display of the n visual interface controls is restored in an equally efficient and convenient manner. This achieves efficient control over the visual interface controls, improves the fluency of operation, and improves human-computer interaction efficiency.
In an alternative embodiment, the position of a hidden control may also be substituted with target indication content of smaller display area. Fig. 11 is a flowchart of a method for displaying interface controls provided by another exemplary embodiment of the application, described as applied in the terminal 100 shown in Fig. 1. As shown in Fig. 11, the method comprises:
Step 1101: display the first user interface of the application program.

Optionally, the first user interface includes the first picture obtained by observing the virtual environment from the viewing angle of the virtual role, and n visual interface controls are superimposed on the first picture, n being a positive integer.

Optionally, a visual interface control is used to indicate the state of the virtual object in the virtual environment, to control the state of the virtual object in the virtual environment, or to display the state of the virtual object's teammates in the virtual environment.
Step 1102: receive a hiding-control trigger operation.

The ways of triggering the hiding-control trigger operation have been described in detail in steps 502 to 509 above and are not repeated here.
Step 1103: according to the hiding-control trigger operation, set the transparency of the controls to be hidden among the n visual interface controls to fully transparent.

Optionally, the controls to be hidden are those among the n visual interface controls that have been set as hideable controls. The controls whose transparency is set to fully transparent may be all of the hideable controls, or only part of the hideable controls.

Optionally, according to the hiding-control trigger operation, among the n visual interface controls, the transparency of those controls that need to be hidden and whose hiding manner is configured as full transparency is set to fully transparent.
Schematically, the first user interface includes control A, control B, control C, control D, and control E. After the user configures these 5 controls, control A, control B, and control E are hideable controls, where control A is hidden by setting its transparency to fully transparent, and control B and control E are hidden by cancelling their display; that is, the second user interface does not include control B and control E.
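The control A/B/E example above distinguishes two hiding modes. A sketch, with the mode names and alpha values assumed for illustration:

```python
# Hideable controls are hidden either by setting transparency to full
# (the control remains in the interface, invisible) or by cancelling
# display (the control is removed from the second user interface).

def apply_hiding(controls: dict) -> dict:
    """controls maps name -> hiding mode: None (not hideable),
    'transparent', or 'cancel'. Returns the second-interface state
    as name -> alpha (0.0 = fully transparent)."""
    second_ui = {}
    for name, mode in controls.items():
        if mode == "cancel":
            continue                     # controls B and E: removed outright
        alpha = 0.0 if mode == "transparent" else 1.0
        second_ui[name] = alpha          # control A stays, fully transparent
    return second_ui

ui = apply_hiding({"A": "transparent", "B": "cancel", "C": None,
                   "D": None, "E": "cancel"})
```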
Step 1104: display target indication content at a first position in the second user interface.

Optionally, the hidden controls include a first control, which is located at the first position in the first user interface. Optionally, the target indication content indicates the first position occupied by the first control, and the display area of the target indication content is smaller than the display area of the first control before it was hidden. Optionally, because the first control is a control whose transparency was set to fully transparent, the first control is still present in the second user interface but is invisible.
Schematically, as shown in Fig. 12, a terminal 1200 includes a range sensor 1220, and a first user interface 1210 displays an attack control 1211 and a backpack control 1212. After the user occludes the range sensor 1220, the backpack control 1212 and the attack control 1211 are hidden in the second user interface 1230, and target indication content 1231 is displayed at the position of the attack control 1211. The target indication content 1231 is the outline of the attack control 1211 without fill content; that is, only the outline of the attack control 1211 is displayed, so the virtual environment is not occluded.
Optionally, the n visual interface controls include clickable controls, and the first control belongs to the clickable controls. The first control, for which target indication content is displayed, is the control with the highest historical frequency of use. That is, before the target indication content is displayed at the first position of the second user interface, historical control usage data, which indicate the frequency with which the clickable controls among the n visual interface controls are clicked, must first be obtained. According to the historical control usage data, the clickable control with the highest historical frequency of use is determined to be the first control, the first position occupied by the first control in the user interface is determined, and the target indication content is then displayed at the first position in the second user interface.
Optionally, the historical frequency of use may be calculated as the average number of times a clickable control is clicked per battle.
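Selecting the first control from the historical usage data can be sketched as follows; the control names and click counts are invented for illustration:

```python
# The clickable control with the highest average clicks per battle
# becomes the "first control" whose position carries the target
# indication content.

def first_control(history: dict, battles: int) -> str:
    """history maps clickable-control name -> total click count."""
    return max(history, key=lambda name: history[name] / battles)

assumed_history = {"attack": 420, "backpack": 35, "jump": 180}
```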
Step 1105: receive a selection operation at a second position of the second user interface.

Optionally, the visual interface controls are first-level interface controls in the user interface, and a second interface control is a second-level interface control whose display is triggered from a visual interface control. The hidden controls include a second control located at the second position in the first user interface, and the second control is hidden by setting its transparency to fully transparent.
Step 1106: display the second interface control corresponding to the second control according to the selection operation.

Optionally, the display transparency of the second interface control is not fully transparent.
Schematically, referring to Fig. 13, a first user interface 1310 is displayed in a terminal 1300, and an attack control 1311 and a backpack control 1313 are displayed in the first user interface 1310. After the user occludes the range sensor 1320, a second user interface 1330 is displayed in which the backpack control 1313 and the attack control 1311 are hidden. After the user clicks at the position of the backpack control 1313 in the second user interface 1330, a third user interface 1340 is displayed, in which a backpack display area 1341 is shown; the display transparency of the backpack display area 1341 is not fully transparent.
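Steps 1105 to 1106 can be sketched as a position lookup that opens the second-level control with a visible alpha. All names and the 0.8 alpha are illustrative assumptions:

```python
# A tap at a hidden first-level control's position opens its second-level
# control, drawn with a non-fully-transparent alpha so it is visible.

def on_select(position, hidden_controls: dict):
    """hidden_controls maps name -> (position, second_level_name)."""
    for name, (pos, second_level) in hidden_controls.items():
        if pos == position:
            # second-level control is shown with non-zero transparency
            return {"control": second_level, "alpha": 0.8}
    return None
```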
In conclusion the display methods of interface control provided in this embodiment, n are shown in the first user interface can
When visual interface control, by hiding control trigger action, second user interface is shown, on second picture at the second user interface
Non- Overlapping display visible controls or Overlapping display partial visual control, by hiding control trigger action easily to the first user
Being hidden in the visual interface control on interface in whole or in part, namely to the first user circle in a manner of efficient quick
Visual interface control in face is hidden, and expands the visual field rapidly in battle, avoids visual interface control in battle to sight
The visual field for examining virtual environment is blocked, and it is numerous to also avoid through set interface process when to interface control progress position adjustment
It is trivial, the problem of human-computer interaction inefficiency.
In the method provided in this embodiment, the first-level interface controls are hidden in the second user interface, and when a first-level interface control is selected so that a second-level interface control is displayed, the transparency of the second-level interface control is not fully transparent. This avoids the situation in which a second-level interface control that remained fully transparent would block other controls, making them impossible to operate and affecting the progress of the battle.
Fig. 14 is a structural block diagram of an apparatus for displaying interface controls provided by an exemplary embodiment of the application. The apparatus may be configured in the terminal 100 shown in Fig. 1 and includes a display module 1410 and a receiving module 1420.

The display module 1410 is configured to display the first user interface of the application program; the first user interface includes the first picture obtained by observing the virtual environment from the viewing angle of the virtual role, and n visual interface controls are also superimposed on the first picture, n being a positive integer.

The receiving module 1420 is configured to receive the hiding-control trigger operation.

The display module 1410 is further configured to display the second user interface of the application program according to the hiding-control trigger operation; the second user interface includes the second picture obtained by observing the virtual environment from the viewing angle of the virtual role, and either no visual interface controls are superimposed on the second picture, or m of the visual interface controls are superimposed, the m visual interface controls being a subset of the n visual interface controls, with 0 < m < n.
In an alternative embodiment, the apparatus is applied in a terminal, and the receiving module 1420 is further configured to receive the hiding-control trigger operation acting on terminal components other than the display screen.

In an alternative embodiment, as shown in Fig. 15, the other components include a sensor element, the sensor element being at least one of a range sensor, a light sensor, and a front camera; the receiving module 1420 is further configured to receive the occlusion operation acting on the sensor element in the terminal; and the apparatus further includes a determining module 1430 configured to determine the occlusion operation to be the hiding-control trigger operation.
In an alternative embodiment, the terminal is a slide-cover terminal including an upper cover and a lower cover, and the other components include a slide-cover monitoring component for detecting the slide state of the upper cover and the lower cover; the receiving module 1420 is further configured to receive the slide-cover event reported by the slide-cover monitoring component, the slide-cover event being an event generated when the upper cover and the lower cover slide relative to each other; and the determining module 1430 is configured to determine the first slide-cover event to be the hiding-control trigger operation when the slide-cover event is a first slide-cover event of sliding in a preset direction.

In an alternative embodiment, the other components include a microphone; the receiving module 1420 is further configured to receive the click operation acting on the microphone in the terminal; and the determining module 1430 is configured to determine the click operation to be the hiding-control trigger operation when the click feature of the click operation matches the preset feature.
In an alternative embodiment, the first user interface includes the slide-scan control, which scans the first user interface by means of the slide-scan bar perpendicular to the slide-scan control, the scan width being the corresponding width of the first user interface; the receiving module 1420 is further configured to receive the slide operation acting on the slide-scan control, the slide operation being used to control the slide-scan bar to scan the first user interface along the slide-scan control; and the determining module 1430 is configured to determine the slide operation to be the hiding-control trigger operation.

In an alternative embodiment, the display module 1410 is further configured to set the transparency of the controls to be hidden among the n visual interface controls to fully transparent according to the hiding-control trigger operation; or the display module 1410 is further configured to cancel display of the controls to be hidden among the n visual interface controls according to the hiding-control trigger operation.
In an alternative embodiment, the hidden controls include the first control, the first control being located at the first position in the first user interface; the display module 1410 is further configured to display the target indication content at the first position of the second user interface, the target indication content indicating the first position occupied by the first control, and the display area of the target indication content being smaller than the display area of the first control before it was hidden.

In an alternative embodiment, the n visual interface controls include clickable controls, and the first control belongs to the clickable controls; the apparatus further includes an obtaining module 1440 configured to obtain the historical control usage data, which indicate the frequency with which the clickable controls among the n visual interface controls are clicked; the determining module 1430 is configured to determine, according to the historical control usage data, the clickable control with the highest historical frequency of use to be the first control; and the determining module 1430 is further configured to determine the first position occupied by the first control in the user interface.
In an alternative embodiment, the visual interface controls are the first-level interface controls in the user interface, the second interface control is the second-level interface control whose display is triggered from a visual interface control, and the hidden controls include the second control located at the second position in the first user interface; the receiving module 1420 is further configured to receive the selection operation at the second position of the second user interface; and the display module 1410 is further configured to display, according to the selection operation, the second interface control corresponding to the second control, the display transparency of the second interface control being not fully transparent.

It should be noted that the receiving module 1420, the determining module 1430, and the obtaining module 1440 in the above embodiments may be implemented by a processor, or jointly by a processor and a memory; the display module 1410 in the above embodiments may be implemented by a display screen, or jointly by a processor and a display screen.
Fig. 16 shows a structural block diagram of a terminal 1600 provided by an illustrative embodiment of the invention. The terminal 1600 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, or a desktop computer. The terminal 1600 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.

Generally, the terminal 1600 includes a processor 1601 and a memory 1602.
The processor 1601 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 1601 may be implemented in at least one of the hardware forms of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 1601 may also include a main processor and a coprocessor: the main processor, also called the CPU (Central Processing Unit), handles data in the awake state, while the coprocessor is a low-power processor that handles data in the standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit) responsible for rendering and drawing the content to be shown on the display screen. In some embodiments, the processor 1601 may also include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
The memory 1602 may include one or more computer-readable storage media, which may be non-transient. The memory 1602 may also include high-speed random-access memory and non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transient computer-readable storage medium in the memory 1602 is used to store at least one instruction to be executed by the processor 1601 to implement the method for displaying interface controls provided by the method embodiments of this application.
In some embodiments, the terminal 1600 optionally further includes a peripheral device interface 1603 and at least one peripheral device. The processor 1601, the memory 1602, and the peripheral device interface 1603 may be connected by buses or signal wires, and each peripheral device may be connected to the peripheral device interface 1603 by a bus, a signal wire, or a circuit board. Specifically, the peripheral devices include at least one of a radio-frequency circuit 1604, a touch display screen 1605, a camera 1606, an audio circuit 1607, a positioning component 1608, and a power supply 1609.

The peripheral device interface 1603 may be used to connect at least one I/O (Input/Output)-related peripheral device to the processor 1601 and the memory 1602. In some embodiments, the processor 1601, the memory 1602, and the peripheral device interface 1603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1601, the memory 1602, and the peripheral device interface 1603 may be implemented on a separate chip or circuit board, which this embodiment does not limit.
Radio frequency circuit 1604 is configured to receive and transmit RF (Radio Frequency) signals, also referred to as electromagnetic signals. Radio frequency circuit 1604 communicates with a communication network and other communication devices through electromagnetic signals. Radio frequency circuit 1604 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, radio frequency circuit 1604 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like. Radio frequency circuit 1604 may communicate with other terminals through at least one wireless communication protocol. The wireless communication protocol includes but is not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of each generation (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, radio frequency circuit 1604 may also include circuitry related to NFC (Near Field Communication), which is not limited in this application.
Display screen 1605 is configured to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When display screen 1605 is a touch display screen, display screen 1605 also has the ability to collect touch signals on or above its surface. A touch signal may be input to processor 1601 as a control signal for processing. In this case, display screen 1605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1605, arranged on the front panel of terminal 1600; in other embodiments, there may be at least two display screens 1605, respectively arranged on different surfaces of terminal 1600 or in a folded design; in still other embodiments, display screen 1605 may be a flexible display screen, arranged on a curved surface or a folded surface of terminal 1600. Display screen 1605 may even be set to a non-rectangular irregular shape, that is, a special-shaped screen. Display screen 1605 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
Camera assembly 1606 is configured to collect images or video. Optionally, camera assembly 1606 includes a front camera and a rear camera. Generally, the front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to implement background blurring by fusing the main camera and the depth-of-field camera, panoramic shooting and VR (Virtual Reality) shooting by fusing the main camera and the wide-angle camera, or other fused shooting functions. In some embodiments, camera assembly 1606 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and may be used for light compensation under different color temperatures.
Audio circuit 1607 may include a microphone and a speaker. The microphone is configured to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input them to processor 1601 for processing, or input them to radio frequency circuit 1604 to implement voice communication. For stereo collection or noise reduction, there may be multiple microphones, respectively arranged at different parts of terminal 1600. The microphone may also be an array microphone or an omnidirectional collection microphone. The speaker is configured to convert electrical signals from processor 1601 or radio frequency circuit 1604 into sound waves. The speaker may be a traditional diaphragm speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as ranging. In some embodiments, audio circuit 1607 may also include a headphone jack.
Positioning component 1608 is configured to locate the current geographic position of terminal 1600 to implement navigation or LBS (Location Based Service). Positioning component 1608 may be a positioning component based on the GPS (Global Positioning System) of the United States, China's BeiDou system, or Russia's GLONASS system.
Power supply 1609 is configured to supply power to the various components in terminal 1600. Power supply 1609 may be an alternating current source, a direct current source, a disposable battery, or a rechargeable battery. When power supply 1609 includes a rechargeable battery, the rechargeable battery may be a wired charging battery or a wireless charging battery. A wired charging battery is charged through a wired line, and a wireless charging battery is charged through a wireless coil. The rechargeable battery may also support fast charging technology.
In some embodiments, terminal 1600 further includes one or more sensors 1610. The one or more sensors 1610 include but are not limited to: an acceleration sensor 1611, a gyroscope sensor 1612, a pressure sensor 1613, a fingerprint sensor 1614, an optical sensor 1615, and a proximity sensor 1616.
Acceleration sensor 1611 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with terminal 1600. For example, acceleration sensor 1611 may be used to detect the components of gravitational acceleration on the three coordinate axes. Processor 1601 may, according to the gravitational acceleration signal collected by acceleration sensor 1611, control touch display screen 1605 to display the user interface in a landscape view or a portrait view. Acceleration sensor 1611 may also be used to collect game or user motion data.
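The landscape/portrait decision just described can be sketched as follows. The axis convention (y along the device's long edge) and the simple magnitude comparison are illustrative assumptions, not details taken from the patent:

```python
def choose_orientation(gx: float, gy: float) -> str:
    """Pick landscape or portrait from the gravity components (m/s^2)
    on the device's x (short-edge) and y (long-edge) axes.

    Assumed convention: an upright device projects gravity mainly
    onto the y axis; a device on its side projects it onto x.
    """
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

# Device held upright: gravity mostly along the long edge.
assert choose_orientation(0.5, 9.7) == "portrait"
# Device turned on its side: gravity mostly along the short edge.
assert choose_orientation(9.7, 0.5) == "landscape"
```

A production implementation would also debounce near the 45° boundary so the view does not flip back and forth.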
Gyroscope sensor 1612 can detect the body direction and rotation angle of terminal 1600, and may cooperate with acceleration sensor 1611 to collect the user's 3D actions on terminal 1600. Based on the data collected by gyroscope sensor 1612, processor 1601 can implement the following functions: motion sensing (for example, changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
Pressure sensor 1613 may be arranged on the side frame of terminal 1600 and/or the lower layer of touch display screen 1605. When pressure sensor 1613 is arranged on the side frame of terminal 1600, the user's grip signal on terminal 1600 can be detected, and processor 1601 performs left-hand/right-hand recognition or a shortcut operation according to the grip signal collected by pressure sensor 1613. When pressure sensor 1613 is arranged on the lower layer of touch display screen 1605, processor 1601 controls operable controls on the UI according to the user's pressure operation on touch display screen 1605. The operable controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
Fingerprint sensor 1614 is configured to collect the user's fingerprint. Processor 1601 identifies the user's identity according to the fingerprint collected by fingerprint sensor 1614, or fingerprint sensor 1614 identifies the user's identity according to the collected fingerprint. When the user's identity is identified as a trusted identity, processor 1601 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. Fingerprint sensor 1614 may be arranged on the front, back, or side of terminal 1600. When a physical button or a manufacturer logo is provided on terminal 1600, fingerprint sensor 1614 may be integrated with the physical button or the manufacturer logo.
Optical sensor 1615 is configured to collect ambient light intensity. In one embodiment, processor 1601 may control the display brightness of touch display screen 1605 according to the ambient light intensity collected by optical sensor 1615. Specifically, when the ambient light intensity is high, the display brightness of touch display screen 1605 is increased; when the ambient light intensity is low, the display brightness of touch display screen 1605 is decreased. In another embodiment, processor 1601 may also dynamically adjust the shooting parameters of camera assembly 1606 according to the ambient light intensity collected by optical sensor 1615.
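As a sketch of the brightness control just described, ambient light intensity can be mapped to a display brightness. The linear mapping, the brightness range, and the 1000-lux saturation point are illustrative assumptions, not values from the patent:

```python
def adjust_brightness(ambient_lux: float,
                      min_brightness: float = 0.1,
                      max_brightness: float = 1.0,
                      full_bright_lux: float = 1000.0) -> float:
    """Map ambient light intensity (lux) to a brightness fraction in
    [min_brightness, max_brightness]; brighter ambient light yields a
    brighter screen, saturating at full_bright_lux."""
    fraction = min(ambient_lux / full_bright_lux, 1.0)
    return min_brightness + fraction * (max_brightness - min_brightness)

assert abs(adjust_brightness(0.0) - 0.1) < 1e-9      # dark room: dimmest
assert abs(adjust_brightness(2000.0) - 1.0) < 1e-9   # daylight: brightest
assert abs(adjust_brightness(500.0) - 0.55) < 1e-9   # mid ambient light
```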
Proximity sensor 1616, also referred to as a distance sensor, is generally arranged on the front panel of terminal 1600. Proximity sensor 1616 is configured to collect the distance between the user and the front of terminal 1600. In one embodiment, when proximity sensor 1616 detects that the distance between the user and the front of terminal 1600 is gradually decreasing, processor 1601 controls touch display screen 1605 to switch from the screen-on state to the screen-off state; when proximity sensor 1616 detects that the distance between the user and the front of terminal 1600 is gradually increasing, processor 1601 controls touch display screen 1605 to switch from the screen-off state to the screen-on state.
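The screen-state switching just described can be sketched as a small trend test over recent distance samples; the strictly monotonic trend check is an illustrative assumption:

```python
def next_screen_state(current: str, distances: list[float]) -> str:
    """Decide the screen state from the trend of recent proximity
    samples (cm): a strictly decreasing series (user approaching)
    turns the screen off, a strictly increasing series turns it on,
    and anything else keeps the current state."""
    decreasing = all(b < a for a, b in zip(distances, distances[1:]))
    increasing = all(b > a for a, b in zip(distances, distances[1:]))
    if decreasing:
        return "off"
    if increasing:
        return "on"
    return current

assert next_screen_state("on", [10.0, 6.0, 3.0]) == "off"   # approaching
assert next_screen_state("off", [3.0, 6.0, 10.0]) == "on"   # moving away
assert next_screen_state("on", [5.0, 5.0, 5.0]) == "on"     # no trend
```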
Those skilled in the art can understand that the structure shown in FIG. 16 does not constitute a limitation on terminal 1600, and terminal 1600 may include more or fewer components than illustrated, combine certain components, or adopt a different component arrangement.
Those of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments may be completed by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, which may be the computer-readable storage medium included in the memory of the above embodiments, or may exist separately without being assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program segment, a code set, or an instruction set, which is loaded and executed by the processor to implement the display method of interface controls described in any of FIG. 3, FIG. 5, and FIG. 11.
Optionally, the computer-readable storage medium may include a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a solid state drive (SSD, Solid State Drive), an optical disc, or the like. The random access memory may include a resistive random access memory (ReRAM, Resistance Random Access Memory) and a dynamic random access memory (DRAM, Dynamic Random Access Memory). The sequence numbers of the above embodiments of this application are for description only and do not represent the relative merits of the embodiments.
Those of ordinary skill in the art can understand that all or part of the steps of the above embodiments may be completed by hardware, or by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing is merely preferred embodiments of this application and is not intended to limit this application. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of this application shall be included within the protection scope of this application.
Claims (15)
1. A display method of an interface control, wherein the method comprises:
displaying a first user interface of an application program, the first user interface comprising a first picture obtained by observing a virtual environment from the perspective of a virtual role, the first picture being superimposed with n visual interface controls, n being a positive integer;
receiving a hide-control trigger operation;
displaying, according to the hide-control trigger operation, a second user interface of the application program, the second user interface comprising a second picture obtained by observing the virtual environment from the perspective of the virtual role, the second picture being superimposed with none of the visual interface controls, or being superimposed with m visual interface controls, the m visual interface controls being a part of the n visual interface controls, 0 < m < n.
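A minimal sketch of the hiding step of claim 1, assuming controls are tracked by name: after the hide-control trigger operation, the set of overlaid controls becomes empty or a proper subset (0 < m < n). The keep-set policy and the control names are illustrative assumptions:

```python
def apply_hide_trigger(controls: list[str], keep: set[str]) -> list[str]:
    """From the n visual interface controls overlaid on the first
    picture, return the controls still overlaid on the second picture
    after the hide-control trigger operation. The claim only requires
    the result to be empty or a proper subset of the original n."""
    kept = [c for c in controls if c in keep]
    assert len(kept) < len(controls)  # 0 <= m < n
    return kept

first_ui = ["joystick", "fire", "map", "backpack", "chat"]
# Hide everything for an unobstructed view of the virtual environment…
assert apply_hide_trigger(first_ui, set()) == []
# …or keep only m essential controls (here m = 2, so 0 < m < n).
assert apply_hide_trigger(first_ui, {"joystick", "fire"}) == ["joystick", "fire"]
```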
2. The method according to claim 1, wherein the method is applied in a terminal;
the receiving a hide-control trigger operation comprises:
receiving the hide-control trigger operation acting on a component of the terminal other than the display screen.
3. The method according to claim 2, wherein the other component comprises a sensor component, the sensor component being at least one of a distance sensor, a light sensor, and a front camera;
the receiving the hide-control trigger operation acting on a component of the terminal other than the display screen comprises:
receiving a covering operation acting on the sensor component of the terminal;
determining the covering operation as the hide-control trigger operation.
4. The method according to claim 2, wherein the terminal is a slider terminal comprising an upper slide cover and a lower slide cover, and the other component comprises a slide-cover monitoring component for detecting the slide state of the upper slide cover and the lower slide cover;
the receiving the hide-control trigger operation acting on a component of the terminal other than the display screen comprises:
receiving a slide-cover event reported by the slide-cover monitoring component, the slide-cover event being an event generated when the upper slide cover and the lower slide cover slide relative to each other;
when the slide-cover event is a first slide-cover event of sliding along a preset direction, determining the first slide-cover event as the hide-control trigger operation.
5. The method according to claim 2, wherein the other component comprises a microphone;
the receiving the hide-control trigger operation acting on a component of the terminal other than the display screen comprises:
receiving a tap operation acting on the microphone of the terminal;
when the tap feature of the tap operation meets a preset feature, determining the tap operation as the hide-control trigger operation.
6. The method according to claim 1, wherein the first user interface comprises a slide-scan control, the slide-scan control being used to scan the first user interface through a slide-scan bar perpendicular to the slide-scan control, the scan width being the width corresponding to the first user interface;
the receiving a hide-control trigger operation comprises:
receiving a slide operation acting on the slide-scan control, the slide operation being used to control the slide-scan bar to scan the first user interface along the slide-scan control;
determining the slide operation as the hide-control trigger operation.
7. The method according to any one of claims 1 to 6, wherein the displaying, according to the hide-control trigger operation, a second user interface of the application program comprises:
setting, according to the hide-control trigger operation, the transparency of the controls to be hidden among the n visual interface controls to fully transparent; or, canceling, according to the hide-control trigger operation, the display of the controls to be hidden among the n visual interface controls.
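The two alternatives of claim 7 — setting full transparency versus canceling display — can be sketched as two operations on a hypothetical control object (the `Control` class and its fields are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    alpha: float = 1.0   # 1.0 = opaque, 0.0 = fully transparent
    visible: bool = True

def hide_by_transparency(ctrl: Control) -> None:
    """Alternative 1 of claim 7: keep the control in the layout but
    set its transparency to fully transparent."""
    ctrl.alpha = 0.0

def hide_by_removal(ctrl: Control) -> None:
    """Alternative 2 of claim 7: cancel the display of the control."""
    ctrl.visible = False

fire = Control("fire")
hide_by_transparency(fire)
assert fire.alpha == 0.0 and fire.visible   # still laid out, just invisible

chat = Control("chat")
hide_by_removal(chat)
assert not chat.visible                     # no longer displayed at all
```

The transparency variant matters for claims 8 to 10: an invisible-but-present control keeps its position, so a small indicator can be shown there and the control can still be selected.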
8. The method according to claim 7, wherein the controls to be hidden comprise a first control, the first control being located at a first position in the first user interface;
after the setting, according to the hide-control trigger operation, the transparency of the controls to be hidden among the n visual interface controls to fully transparent, the method further comprises:
displaying target indication content at the first position of the second user interface, the target indication content being used to indicate the first position of the first control, and the display area of the target indication content being smaller than the display area of the first control before it was hidden.
9. The method according to claim 8, wherein the n visual interface controls comprise clickable controls, and the first control belongs to the clickable controls;
before the displaying target indication content at the first position of the second user interface, the method further comprises:
obtaining historical control usage data, the historical control usage data being used to indicate the frequency with which the clickable controls among the n visual interface controls are clicked;
determining, according to the historical control usage data, the clickable control with the highest historical usage frequency as the first control;
determining the first position of the first control in the user interface.
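The selection step of claim 9 can be sketched as picking the maximum over the historical click counts; the data shape (a name-to-count mapping) and the tie-breaking behavior are illustrative assumptions:

```python
def pick_first_control(click_counts: dict[str, int]) -> str:
    """Claim 9's selection step: from historical usage data mapping
    each clickable control to its click count, pick the most
    frequently clicked control as the 'first control' whose position
    keeps a small indicator after hiding."""
    return max(click_counts, key=click_counts.get)

history = {"fire": 412, "jump": 180, "map": 37, "chat": 9}
assert pick_first_control(history) == "fire"
```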
10. The method according to claim 9, wherein the visual interface controls are first-level interface controls in the user interface, second-level interface controls are interface controls whose display is triggered on the visual interface controls, the controls to be hidden comprise a second control, and the second control is located at a second position in the first user interface;
after the setting, according to the hide-control trigger operation, the transparency of the controls to be hidden among the n visual interface controls to fully transparent, the method further comprises:
receiving a selection operation at the second position of the second user interface;
displaying, according to the selection operation, the second-level interface control corresponding to the second control, the display transparency of the second-level interface control being non-fully-transparent.
11. A display device of an interface control, wherein the device comprises:
a display module, configured to display a first user interface of an application program, the first user interface comprising a first picture obtained by observing the virtual environment from the perspective of a virtual role, the first picture being superimposed with n visual interface controls, n being a positive integer;
a receiving module, configured to receive a hide-control trigger operation;
the display module being further configured to display, according to the hide-control trigger operation, a second user interface of the application program, the second user interface comprising a second picture obtained by observing the virtual environment from the perspective of the virtual role, the second picture being superimposed with none of the visual interface controls, or being superimposed with m visual interface controls, the m visual interface controls being a part of the n visual interface controls, 0 < m < n.
12. The device according to claim 11, wherein the device is applied in a terminal;
the receiving module is further configured to receive the hide-control trigger operation acting on a component of the terminal other than the display screen.
13. The device according to claim 12, wherein the other component comprises a sensor component, the sensor component being at least one of a distance sensor, a light sensor, and a front camera;
the receiving module is further configured to receive a covering operation acting on the sensor component of the terminal;
the device further comprises:
a determining module, configured to determine the covering operation as the hide-control trigger operation.
14. A terminal, wherein the terminal comprises a processor and a memory, the memory storing at least one instruction, at least one program segment, a code set, or an instruction set, which is loaded and executed by the processor to implement the display method of an interface control according to any one of claims 1 to 10.
15. A computer-readable storage medium, wherein the storage medium stores at least one instruction, at least one program segment, a code set, or an instruction set, which is loaded and executed by a processor to implement the display method of an interface control according to any one of claims 1 to 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811433102.7A CN109529319B (en) | 2018-11-28 | 2018-11-28 | Display method and device of interface control and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109529319A true CN109529319A (en) | 2019-03-29 |
CN109529319B CN109529319B (en) | 2020-06-02 |
Family
ID=65850898
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811433102.7A Active CN109529319B (en) | 2018-11-28 | 2018-11-28 | Display method and device of interface control and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109529319B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004094741A (en) * | 2002-09-02 | 2004-03-25 | Matsushita Electric Ind Co Ltd | Information transmitter/receiver and method for transmitting and receiving information |
CN105224171A (en) * | 2015-09-22 | 2016-01-06 | 小米科技有限责任公司 | icon display method, device and terminal |
CN105373379A (en) * | 2015-10-10 | 2016-03-02 | 网易(杭州)网络有限公司 | Game interface switching method and device |
CN107678647A (en) * | 2017-09-26 | 2018-02-09 | 网易(杭州)网络有限公司 | Virtual shooting main body control method, apparatus, electronic equipment and storage medium |
CN108717733A (en) * | 2018-06-07 | 2018-10-30 | 腾讯科技(深圳)有限公司 | View angle switch method, equipment and the storage medium of virtual environment |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3991816A4 (en) * | 2019-06-25 | 2023-07-19 | Colopl, Inc. | Game program, game method, and information terminal device |
CN112286782A (en) * | 2019-07-23 | 2021-01-29 | 腾讯科技(深圳)有限公司 | Control shielding detection method, software detection method, device and medium |
CN112286782B (en) * | 2019-07-23 | 2023-10-10 | 腾讯科技(深圳)有限公司 | Control shielding detection method, software detection method, device and medium |
CN110465073A (en) * | 2019-08-08 | 2019-11-19 | 腾讯科技(深圳)有限公司 | Method, apparatus, equipment and the readable storage medium storing program for executing that visual angle adjusts in virtual environment |
CN110711386A (en) * | 2019-10-23 | 2020-01-21 | 网易(杭州)网络有限公司 | Method and device for processing information in game, electronic equipment and storage medium |
CN110882537A (en) * | 2019-11-12 | 2020-03-17 | 北京字节跳动网络技术有限公司 | Interaction method, device, medium and electronic equipment |
CN111309416A (en) * | 2020-01-19 | 2020-06-19 | 北京字节跳动网络技术有限公司 | Information display method, device and equipment of application interface and readable medium |
WO2021147698A1 (en) * | 2020-01-21 | 2021-07-29 | 华为技术有限公司 | Navigation method for foldable screen, and related apparatuses |
WO2021160142A1 (en) * | 2020-02-11 | 2021-08-19 | 北京字节跳动网络技术有限公司 | Video playback method and apparatus, electronic device, and computer readable medium |
CN111294637A (en) * | 2020-02-11 | 2020-06-16 | 北京字节跳动网络技术有限公司 | Video playing method and device, electronic equipment and computer readable medium |
US12022163B2 (en) | 2020-02-11 | 2024-06-25 | Beijing Bytedance Network Technology Co., Ltd. | Video playing method and apparatus, electronic device and computer readable medium |
CN111672121A (en) * | 2020-06-11 | 2020-09-18 | 腾讯科技(深圳)有限公司 | Virtual object display method and device, computer equipment and storage medium |
CN111885256A (en) * | 2020-07-16 | 2020-11-03 | 深圳传音控股股份有限公司 | Terminal control method, terminal and computer storage medium |
CN112330823B (en) * | 2020-11-05 | 2023-06-16 | 腾讯科技(深圳)有限公司 | Virtual prop display method, device, equipment and readable storage medium |
CN112330823A (en) * | 2020-11-05 | 2021-02-05 | 腾讯科技(深圳)有限公司 | Virtual item display method, device, equipment and readable storage medium |
CN115113785A (en) * | 2021-03-17 | 2022-09-27 | 深圳市万普拉斯科技有限公司 | Application program operation method and device, computer equipment and storage medium |
CN113262492A (en) * | 2021-04-28 | 2021-08-17 | 网易(杭州)网络有限公司 | Game data processing method and device and electronic terminal |
CN113262492B (en) * | 2021-04-28 | 2024-02-02 | 网易(杭州)网络有限公司 | Game data processing method and device and electronic terminal |
WO2022237072A1 (en) * | 2021-05-14 | 2022-11-17 | 腾讯科技(深圳)有限公司 | Control display method and apparatus, device, medium, and program product |
US12019837B2 (en) | 2021-05-14 | 2024-06-25 | Tencent Technology (Shenzhen) Company Limited | Control display method and apparatus, device, medium, and program product |
Also Published As
Publication number | Publication date |
---|---|
CN109529319B (en) | 2020-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109529319A (en) | Display methods, equipment and the storage medium of interface control | |
JP7264993B2 (en) | Method and apparatus for controlling interaction between virtual object and projectile, and computer program | |
CN108717733B (en) | View angle switch method, equipment and the storage medium of virtual environment | |
CN109045695A (en) | Accessory selection method, equipment and storage medium in virtual environment | |
AU2020294677B2 (en) | Method and apparatus for controlling virtual object to mark virtual item, and medium | |
WO2019201047A1 (en) | Method for adjusting viewing angle in virtual environment, device, and readable storage medium | |
CN110465073A (en) | Method, apparatus, equipment and the readable storage medium storing program for executing that visual angle adjusts in virtual environment | |
CN109126129A (en) | The method, apparatus and terminal that virtual objects are picked up in virtual environment | |
JP7145331B2 (en) | Observation method, device and computer program for virtual items in virtual environment | |
CN110115838A (en) | Method, apparatus, equipment and the storage medium of mark information are generated in virtual environment | |
CN110427111A (en) | The operating method of virtual item, device, equipment and storage medium in virtual environment | |
WO2019205881A1 (en) | Method and apparatus for displaying information in virtual environment, device, and storage medium | |
CN109350964A (en) | Control method, apparatus, equipment and the storage medium of virtual role | |
CN108710525A (en) | Map methods of exhibiting, device, equipment and storage medium in virtual scene | |
CN109634413B (en) | Method, device and storage medium for observing virtual environment | |
TWI802978B (en) | Method and apparatus for adjusting position of widget in application, device, and storage medium | |
CN109821237A (en) | Method, apparatus, equipment and the storage medium of visual angle rotation | |
CN111273780B (en) | Animation playing method, device and equipment based on virtual environment and storage medium | |
CN110448908A (en) | The application method of gun sight, device, equipment and storage medium in virtual environment | |
CN108786110A (en) | Gun sight display methods, equipment and storage medium in virtual environment | |
CN109917910A (en) | Display methods, device, equipment and the storage medium of line style technical ability | |
CN110393916A (en) | Method, apparatus, equipment and the storage medium of visual angle rotation | |
CN109407959A (en) | Virtual object control method, equipment and storage medium in virtual scene | |
CN110533756A (en) | Setting method, device, equipment and the storage medium of attaching type ornament | |
CN108744511A (en) | Gun sight display methods, equipment and storage medium in virtual environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||