CN109806583A - Method for displaying user interface, device, equipment and system - Google Patents
- Publication number
- CN109806583A CN109806583A CN201910068959.1A CN201910068959A CN109806583A CN 109806583 A CN109806583 A CN 109806583A CN 201910068959 A CN201910068959 A CN 201910068959A CN 109806583 A CN109806583 A CN 109806583A
- Authority
- CN
- China
- Prior art keywords
- terminal
- information
- environment
- related image
- picture
- Prior art date
- Legal status: Granted
Landscapes
- Information Transfer Between Computers (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
This application discloses a method, apparatus, device and system for displaying a user interface, belonging to the field of computer technology. The method includes: displaying a first user interface that includes a first environment picture of a virtual environment and a collaborative display control; sending a request signal when a signal triggered on the collaborative display control is received; receiving a feedback signal, carrying unique identifying information, sent by a second terminal; and sending the unique identifying information and information about a target virtual object to a server, triggering the server to send, to the second terminal identified by the unique identifying information, information corresponding to an associated picture obtained according to the information about the virtual object, whereupon the second terminal generates and displays the associated picture. By displaying the first environment picture on the first terminal and the associated picture on the second terminal, the application enables users to obtain more information about the virtual environment through the associated picture, improving the usability of the application program.
Description
Technical field
This application relates to the field of computer technology, and in particular to a method, apparatus, device and system for displaying a user interface.
Background art
Terminals such as smartphones and tablet computers run many applications that provide virtual environments, for example military simulation applications, third-person shooting games (Third-Person Shooting Game, TPS), first-person shooting games (First-Person Shooting game, FPS), and multiplayer online battle arena games (Multiplayer Online Battle Arena Games, MOBA).
When a terminal runs such an application, the application's user interface is displayed on the terminal's display screen, and an environment picture of the virtual environment is usually shown in that interface. For example, the user interface of a MOBA game shows an environment picture of the virtual environment observed from an observation point diagonally above the virtual character.
Because the content that can be shown in the environment picture displayed on a terminal is limited, the user obtains relatively little information from it, which makes the application less usable.
Summary of the invention
Embodiments of the present application provide a method, apparatus, device and system for displaying a user interface, to solve the problem in the related art that the limited amount of information in the environment picture displayed by a terminal makes the application less usable. The technical solutions are as follows:
In one aspect, an embodiment of the present application provides a method for displaying a user interface, the method being executed by a first terminal and including:
displaying a first user interface of an application, the first user interface showing a first environment picture and a collaborative display control, the first environment picture including a picture of a first local region determined by the target position of a target virtual object in a virtual environment;
when a signal triggered on the collaborative display control is received, sending a request signal through a first short-range wireless communication component, the request signal being used to request the application in a second terminal to display an associated picture, the associated picture being a picture that shows content associated with the target virtual object;
receiving, through the first short-range wireless communication component, a feedback signal sent by the second terminal through a second short-range wireless communication component, the feedback signal including unique identifying information corresponding to the application in the second terminal;
sending the unique identifying information and information about the target virtual object to a server, the unique identifying information being used to trigger the server to send information corresponding to the associated picture to the second terminal, and the information about the target virtual object being used to assist the server in obtaining the information corresponding to the associated picture.
In one aspect, an embodiment of the present application provides a method for displaying a user interface, the method being executed by a second terminal and including:
receiving, through a second short-range wireless communication component, a request signal sent by a first terminal through a first short-range wireless communication component;
sending a feedback signal to the first terminal through the second short-range wireless communication component, the feedback signal carrying unique identifying information of an application in the second terminal, the unique identifying information being used to trigger a server to send information corresponding to an associated picture;
receiving the information corresponding to the associated picture sent by the server;
generating the associated picture according to the information corresponding to the associated picture;
displaying a second user interface of the application, the associated picture being shown in the second user interface.
In one aspect, an embodiment of the present application provides a method for displaying a user interface, the method being executed by a server and including:
receiving information about a target virtual object and unique identifying information sent by a first terminal, the target virtual object being a virtual object in an application running on the first terminal, and the unique identifying information being identification information corresponding to the application in a second terminal;
obtaining information corresponding to an associated picture according to the information about the target virtual object, the associated picture being a picture that shows content associated with the target virtual object;
sending the information corresponding to the associated picture to the application in the second terminal according to the unique identifying information, the information corresponding to the associated picture being used to assist the second terminal in generating the associated picture.
In one aspect, an embodiment of the present application provides a method for displaying a user interface, the method being executed by a second terminal and including:
displaying a user interface of an application;
displaying a request window in the user interface, the request window being a window through which the application in a first terminal requests that an associated picture be displayed, an accept control being shown in the request window;
when a trigger signal triggered on the accept control is received, displaying a second user interface in which the associated picture is shown, the associated picture showing content associated with a target virtual object, the target virtual object being a virtual object that is active in a virtual environment and controlled by the first terminal.
In one aspect, an embodiment of the present application provides an apparatus for displaying a user interface, the apparatus being applied in a first terminal and including:
a display module, configured to display a first user interface of an application, the first user interface showing a first environment picture and a collaborative display control, the first environment picture including a picture of a first local region determined by the target position of a target virtual object in a virtual environment;
a first short-range wireless communication module, configured to send a request signal through a first short-range wireless communication component when a signal triggered on the collaborative display control is received, the request signal being used to request the application in a second terminal to display an associated picture, the associated picture being a picture that shows content associated with the target virtual object; and to receive, through the first short-range wireless communication component, a feedback signal sent by the second terminal through a second short-range wireless communication component, the feedback signal including unique identifying information corresponding to the application in the second terminal;
a sending module, configured to send the unique identifying information and information about the target virtual object to a server, the unique identifying information being used to trigger the server to send information corresponding to the associated picture to the second terminal, and the information about the target virtual object being used to assist the server in obtaining the information corresponding to the associated picture.
In one aspect, an embodiment of the present application provides an apparatus for displaying a user interface, the apparatus being applied in a second terminal and including:
a second short-range wireless communication module, configured to receive, through a second short-range wireless communication component, a request signal sent by a first terminal through a first short-range wireless communication component, and to send a feedback signal to the first terminal through the second short-range wireless communication component, the feedback signal carrying unique identifying information of an application in the second terminal, the unique identifying information being used to trigger a server to send information corresponding to an associated picture;
a receiving module, configured to receive the information corresponding to the associated picture sent by the server;
a processing module, configured to generate the associated picture according to the information corresponding to the associated picture;
a display module, configured to display a second user interface of the application, the associated picture being shown in the second user interface.
In one aspect, an embodiment of the present application provides an apparatus for displaying a user interface, the apparatus being applied in a server and including:
a receiving module, configured to receive information about a target virtual object and unique identifying information sent by a first terminal, the target virtual object being a virtual object in an application running on the first terminal, and the unique identifying information being identification information corresponding to the application in a second terminal;
a processing module, configured to obtain information corresponding to an associated picture according to the information about the target virtual object, the associated picture being a picture that shows content associated with the target virtual object;
a sending module, configured to send the information corresponding to the associated picture to the application in the second terminal according to the unique identifying information, the information corresponding to the associated picture being used to assist the second terminal in generating the associated picture.
In one aspect, an embodiment of the present application provides a terminal including a processor, a memory and a first short-range wireless communication component, the memory storing at least one instruction that is loaded and executed by the processor to implement the method for displaying a user interface executed by the first terminal as described above.
In one aspect, an embodiment of the present application provides a terminal including a processor, a memory and a second short-range wireless communication component, the memory storing at least one instruction that is loaded and executed by the processor to implement the method for displaying a user interface executed by the second terminal as described above.
In one aspect, an embodiment of the present application provides a computer device including a processor and a memory, the memory storing at least one instruction that is loaded and executed by the processor to implement the method for displaying a user interface executed by the server as described above.
In one aspect, an embodiment of the present application provides a computer system including the apparatus for displaying a user interface applied in a first terminal as described above, the apparatus for displaying a user interface applied in a second terminal as described above, and the apparatus for displaying a user interface applied in a server as described above;
or, the computer system includes the first terminal as described above, the second terminal as described above, and the computer device as described above.
In one aspect, an embodiment of the present application provides a computer-readable storage medium storing at least one instruction that is loaded and executed by a processor to implement the method for displaying a user interface as described above.
The beneficial effects of the technical solutions provided by the embodiments of the present application include at least the following:
after the first terminal and the second terminal determine, through their respective short-range wireless communication components, that the associated picture is to be displayed on the second terminal, the server sends the information corresponding to the associated picture to the second terminal, and the second terminal generates and displays the associated picture from that information. Because a picture associated with the target virtual object is displayed on a second terminal near the first terminal, more information associated with the target virtual object can be shown, enabling the user to obtain more information about the virtual environment and improving the usability of the application.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a structural block diagram of a computer system provided by an exemplary embodiment of the present application;
Fig. 2 is a flowchart of a method for displaying a user interface provided by an exemplary embodiment of the present application;
Fig. 3 is a schematic diagram of a user interface provided by an exemplary embodiment of the present application;
Fig. 4 is a schematic diagram of a user interface provided by an exemplary embodiment of the present application;
Fig. 5 is a flowchart of a method for displaying a user interface provided by an exemplary embodiment of the present application;
Fig. 6 is a schematic diagram of visible-area selection provided by an exemplary embodiment of the present application;
Fig. 7 is a flowchart of a method for displaying a user interface provided by an exemplary embodiment of the present application;
Fig. 8 is a schematic diagram of an observation point in a three-dimensional virtual environment provided by an exemplary embodiment of the present application;
Fig. 9 is a schematic diagram of a user interface provided by an exemplary embodiment of the present application;
Fig. 10 is a block diagram of an apparatus for displaying a user interface provided by an exemplary embodiment of the present application;
Fig. 11 is a block diagram of an apparatus for displaying a user interface provided by an exemplary embodiment of the present application;
Fig. 12 is a block diagram of an apparatus for displaying a user interface provided by an exemplary embodiment of the present application;
Fig. 13 is a block diagram of a terminal provided by an exemplary embodiment of the present application;
Fig. 14 is a block diagram of a computer device provided by an exemplary embodiment of the present application.
Specific embodiment
To make the purposes, technical solutions and advantages of the present application clearer, the embodiments of the present application are described in further detail below with reference to the drawings.
First, several terms involved in this application are briefly introduced:
Short-range wireless communication: communication realized within a relatively short distance. In the embodiments of the present application, short-range wireless communication technologies include Wireless Fidelity point-to-point (Wireless-Fidelity Direct, Wi-Fi Direct) technology, Bluetooth technology, Bluetooth Wi-Fi, and NFC (Near Field Communication) technology.
Wi-Fi Direct: a technology that uses radio frequency (Radio Frequency, RF) to realize point-to-point data transmission within a relatively short range.
Bluetooth: an open global specification for wireless data and voice communication. Its essence is to establish a general radio air interface for the communication environment between fixed or mobile devices, combining communication technology and computer technology so that various electronic devices can interconnect without wires or cables and communicate or operate with each other within a short range.
Bluetooth Wi-Fi: a technology that scans through Bluetooth to obtain the device identifiers of nearby terminals, then transmits data through Wi-Fi Direct.
NFC: a short-range, high-frequency radio technology providing two-way data transmission within a relatively short range; many terminals are currently equipped with near-field communication components. When two terminals with NFC enter the working range of each other's near-field communication components, two-way data transmission can be realized.
Virtual environment: the virtual environment displayed (or provided) when an application runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual environment may be any of a two-dimensional virtual environment, a 2.5-dimensional virtual environment and a three-dimensional virtual environment. Optionally, the virtual environment is also used for a battle between at least two target virtual objects.
Virtual object: a movable object in the virtual environment. The movable object may be at least one of a virtual character, a virtual animal and a cartoon character. Optionally, when the virtual environment is a three-dimensional virtual environment, the target virtual object is a three-dimensional model created based on skeletal animation technology. Each target virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of its space.
View-finder: the visual-field area of a virtual object in a two-dimensional virtual environment. Illustratively, the two-dimensional virtual environment includes a background map of 10000 × 10000 pixels and movable virtual objects on that map; the view-finder is a box of 160 × 90 pixels whose reference point (for example, its center) is the position of the virtual object. The electronic device frames the virtual environment through the view-finder to obtain the visual-field area corresponding to the virtual object, which includes the local map of the virtual environment obtained by framing, the movable virtual objects on that local map, and the virtual items present on it.
Observation point: also known as a virtual camera; the observation position corresponding to a virtual object in a three-dimensional virtual environment. The observation point has a relative position and a viewing angle with respect to the virtual object. For example, if the coordinate difference between the observation point and the virtual object is (0, -Δy1, Δz1) and the viewing angle is α, the picture of the virtual environment displayed on the terminal is the environment picture of the virtual environment observed from that observation point.
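The relative-position relation above can be written as a one-line helper. The concrete offset values stand in for Δy1 and Δz1 and are purely hypothetical:

```python
def observation_point(obj_pos, dy1=2.0, dz1=3.0):
    """Place the virtual camera at the (0, -dy1, +dz1) coordinate
    difference from the target virtual object, per the example above."""
    x, y, z = obj_pos
    return (x, y - dy1, z + dz1)
```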
Fig. 1 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application. The computer system 100 includes a first terminal 110, a second terminal 120 and a server 130.
The first terminal 110 has installed and runs an application supporting a virtual environment. The application may be any of a military simulation application, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first terminal 110 is used by a first user, who uses it to control the target virtual object in the virtual environment to perform activities including, but not limited to, at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking and throwing. Schematically, the target virtual object is a virtual character, such as a simulated human character or a cartoon character.
The first terminal 110 and the second terminal 120 are connected to the server 130 through a wireless or wired network.
The server 130 includes at least one of a single server, multiple servers, a cloud computing platform and a virtualization center. The server 130 provides background services for the application supporting the virtual environment. Optionally, the server 130 undertakes the main computing work while the first terminal 110 and the second terminal 120 undertake secondary computing work; or the server 130 undertakes secondary computing work while the first terminal 110 and the second terminal 120 undertake the main computing work; or the server 130, the first terminal 110 and the second terminal 120 perform collaborative computing using a distributed computing architecture.
The first terminal 110 is provided with a first short-range wireless communication component, and the second terminal 120 with a second short-range wireless communication component; through these components the two terminals can exchange data over short-range wireless communication.
When running the application, the first terminal 110 displays a first user interface on its display screen. The first user interface shows a first environment picture of the virtual environment and a collaborative display control. When a signal triggered on the collaborative display control is received, the first terminal sends a request signal through the first short-range wireless communication component; the request signal requests that an associated picture, a picture associated with the target virtual object, be displayed on the terminal that receives the request signal. When a feedback signal sent by the second terminal 120 through the second short-range wireless communication component is received, the first terminal sends, through a wired or wireless network, the unique identifying information of the application in the second terminal 120 carried in the feedback signal, together with the information about the target virtual object, to the server 130.
After receiving the unique identifying information and the information about the target virtual object sent by the first terminal 110, the server 130 obtains the information corresponding to the associated picture according to the information about the target virtual object, and, according to the unique identifying information, sends the information corresponding to the associated picture to the second terminal 120 through a wired or wireless network.
The second terminal 120 has installed the same application as the first terminal 110, or the applications installed on the two terminals are the same type of application for different operating-system platforms. After starting the application, and after receiving through the second short-range wireless communication component the request signal sent by the first terminal 110 through the first short-range wireless communication component, the second terminal 120 sends to the first terminal 110, through the second short-range wireless communication component, a feedback signal carrying the unique identifying information corresponding to the application. When the information corresponding to the associated picture sent by the server 130 is received, the second terminal generates the associated picture from that information and displays a second user interface including the associated picture.
The first terminal 110 may be one of multiple terminals, and the second terminal 120 may be one of multiple terminals; this embodiment is illustrated only with the first terminal 110 and the second terminal 120. The device types of the first terminal 110 and the second terminal 120 are the same or different, and include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player and an MP4 (Moving Picture Experts Group Audio Layer IV) player.
Those skilled in the art will appreciate that the number of terminals may be larger or smaller; for example, there may be only one terminal, or dozens, hundreds or more. The embodiments of the present application do not limit the number or device type of the terminals.
Fig. 2 shows a flowchart of a method for displaying a user interface provided by an exemplary embodiment of the present application. The method can be applied in the computer system 100 shown in Fig. 1 and includes:
Step 201: the first terminal displays a first user interface of an application, the first user interface showing a first environment picture and a collaborative display control.
An application providing a virtual environment is installed in the first terminal. When the first terminal runs the application, it displays the application's first user interface, which shows the first environment picture and the collaborative display control. The first environment picture includes a picture of a first local region determined by the target position of the target virtual object in the virtual environment; the target virtual object is a movable virtual object in the virtual environment controlled by the user through the first terminal. The collaborative display control prompts the user that an associated picture, a picture associated with the target virtual object, can be displayed on the display screen of another terminal.
Illustratively, as shown in Fig. 3, a first user interface 112 is displayed on the display screen 111 of the first terminal 110. The first user interface 112 shows the first environment picture and a collaborative display control 113; the first environment picture shows the target virtual object 114 controlled by the user through the first terminal 110, and the first local region determined by the target position of the target virtual object 114.
Step 202: when a signal triggered on the collaborative display control is received, the first terminal sends a request signal through the first short-range wireless communication component.
Illustratively, as shown in Fig. 3, a signal is generated when the user touches the collaborative display control. After receiving the signal triggered on the control, the first terminal 110 sends a request signal to the second terminal 120 through the first short-range wireless communication component; the request signal requests that an associated picture be displayed on the second terminal 120.
Optionally, the application in the first terminal supports at least one of the Wi-Fi Direct, Bluetooth, Bluetooth Wi-Fi and NFC short-range wireless transmission functions.
Illustratively, when the first short-range communication component is a first Wi-Fi component, before the first terminal sends the request signal through the first Wi-Fi component it must ensure that the Wi-Fi Direct function is enabled. The Wi-Fi Direct function may be enabled automatically when the application starts, in the settings window of the application, in the settings window of the first terminal's operating system, or when the signal triggered on the collaborative display control is received. When the Wi-Fi Direct function is enabled, the first terminal sends the request signal through the first Wi-Fi component.
Illustratively, when the first short-range communication component is a first Bluetooth component, before the first terminal sends the request signal through the first Bluetooth component it must ensure that the Bluetooth function is enabled; the Bluetooth function may be enabled automatically when the application starts, in the settings window of the application, in the settings window of the first terminal's operating system, or when the signal triggered on the collaborative display control is received. The first terminal searches through the first Bluetooth component to obtain the device identifier of the second terminal, sends a Bluetooth pairing request to the second terminal through the first Bluetooth component according to that identifier, and, after receiving the pairing-accepted signal sent by the second terminal through the second Bluetooth component, sends the request signal to the second terminal through the first Bluetooth component.
Illustratively, when the first short-range communication component is a first Bluetooth component and a first Wi-Fi component, before the first terminal sends the request signal through the first Wi-Fi component it must ensure that both the Bluetooth function and the Wi-Fi Direct function are enabled; the two functions may be enabled automatically when the application starts, in the settings window of the application, in the settings window of the first terminal's operating system, or when the signal triggered on the collaborative display control is received. When the Bluetooth function and the Wi-Fi Direct function are enabled, the first terminal searches through the first Bluetooth component to obtain the device identifier of the second terminal, and sends the request signal to the second terminal through the first Wi-Fi component according to that identifier.
Illustratively, when the first short-range wireless communication component is a first NFC component, before the first terminal sends the request signal through the first NFC component, it needs to be ensured that the NFC function is enabled. The NFC function may be enabled automatically when the application starts, enabled in the settings window of the application, enabled in the settings window of the operating system of the first terminal, or enabled when the signal triggered on the collaboration display control is received. When the NFC function is enabled, the first terminal sends the request signal through the first NFC component.
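The enabling-then-send pattern shared by the Wi-Fi Direct, Bluetooth, and NFC variants above can be sketched as follows. This is a minimal illustration only; the class, method names, and signal format are assumptions, not part of the patent:

```python
# Sketch of the precondition check described above: the first terminal makes
# sure the relevant short-range function is enabled -- at application start,
# via a settings window, or when the collaboration display control is
# triggered -- before it sends the request signal. All names are hypothetical.

class FirstTerminal:
    def __init__(self, component):
        self.component = component  # "wifi_direct", "bluetooth", or "nfc"
        self.enabled = {"wifi_direct": False, "bluetooth": False, "nfc": False}

    def enable(self, function):
        # Any of the entry points above (app start, settings window, trigger)
        # ends up here.
        self.enabled[function] = True

    def on_collaboration_control_triggered(self):
        # Enabling on trigger is one of the allowed entry points.
        if not self.enabled[self.component]:
            self.enable(self.component)
        return self.send_request_signal()

    def send_request_signal(self):
        assert self.enabled[self.component], "function must be enabled first"
        return {"type": "request", "via": self.component}

terminal = FirstTerminal("wifi_direct")
signal = terminal.on_collaboration_control_triggered()
```

The same skeleton applies to the Bluetooth variant, with the pairing exchange inserted before `send_request_signal`.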
Step 203: when the second terminal receives the request signal through the second short-range wireless communication component, it sends a feedback signal to the first terminal through the second short-range wireless communication component, the feedback signal including unique identification information.
The unique identification information is the identification information corresponding to the application in the second terminal. The second terminal has installed, and runs, the same application as the one installed in the first terminal, or the application in the second terminal is the same type of application as the one in the first terminal but for a different operating-system platform. Optionally, the unique identification information corresponding to the application in the second terminal includes at least one of a user account, a device identifier, and an IP address of the second terminal.
Optionally, when the second terminal receives the request signal through the second short-range wireless communication component and sends the feedback signal to the first terminal through the second short-range wireless communication component, the process includes but is not limited to the following steps:
Step 203a: when the second terminal receives, through the second short-range wireless communication component, the request signal sent by the first terminal through the first short-range wireless communication component, it displays a request window.
Illustratively, as shown in Fig. 3, when the second terminal 120 receives, through the second short-range wireless communication component, the request signal sent by the first terminal 110 through the first short-range wireless communication component, it displays a request window 122 on the display screen 121. If the request signal carries the user A corresponding to the first terminal 110, the request window 122 displays the prompt "User A has initiated a multi-screen collaboration request to you" together with an accept control 123 and a refuse control 124.
Step 203b: when the second terminal receives the signal triggered on the request window, it sends the feedback signal to the first terminal through the second short-range wireless communication component.
Illustratively, as shown in Fig. 3, the user touches the accept control 123 to generate a trigger signal; after receiving the trigger signal, the second terminal 120 sends the feedback signal to the first terminal 110 through the second short-range wireless communication component, the feedback signal carrying the unique identification information of the second terminal 120.
The application run in the second terminal supports at least one of the following short-range wireless transmission functions: the Wi-Fi Direct function, the Bluetooth function, the combined Bluetooth and Wi-Fi function, and the NFC function.
Illustratively, when the second short-range wireless communication component is a second Wi-Fi component, before the second terminal receives the request signal through the second Wi-Fi component, it needs to be ensured that the Wi-Fi Direct function is enabled. The Wi-Fi Direct function may be enabled automatically when the application starts, enabled in the settings window of the application, or enabled in the settings window of the operating system of the second terminal. When the Wi-Fi Direct function is enabled, after the second terminal receives the request signal sent by the first terminal through the second Wi-Fi component, it sends the feedback signal to the first terminal through the second Wi-Fi component.
Illustratively, when the second short-range wireless communication component is a second Bluetooth component, before the second terminal receives the request signal through the second Bluetooth component, it needs to be ensured that the Bluetooth function is enabled. The Bluetooth function may be enabled automatically when the application starts, enabled in the settings window of the application, or enabled in the settings window of the operating system of the second terminal. When the Bluetooth function is enabled, the second terminal receives, through the second Bluetooth component, the Bluetooth pairing request sent by the first terminal through the first Bluetooth component, sends a pairing acceptance signal to the first terminal through the second Bluetooth component, and, after receiving through the second Bluetooth component the request signal sent by the first terminal through the first Bluetooth component, sends the feedback signal to the first terminal through the second Bluetooth component.
It should be noted that, in the embodiments of the present application, the first terminal may search through the first Bluetooth component to obtain the device identifier of the second terminal and then send the Bluetooth pairing request to the second terminal through the first Bluetooth component; alternatively, the second terminal may search through the second Bluetooth component to obtain the device identifier of the first terminal and then send the Bluetooth pairing request to the first terminal through the second Bluetooth component. This is not limited here.
Illustratively, when the second short-range wireless communication component is a second Bluetooth component together with a second Wi-Fi component, before the second terminal receives the request signal through the second Wi-Fi component, it needs to be ensured that both the Bluetooth function and the Wi-Fi Direct function are enabled. The two functions may be enabled automatically when the application starts, enabled in the settings window of the application, or enabled in the settings window of the operating system of the second terminal. When the Bluetooth function and the Wi-Fi Direct function are enabled, after the first terminal searches out the device identifier of the second terminal through the first Bluetooth component, the second terminal receives through the second Wi-Fi component the request signal sent by the first terminal through the first Wi-Fi component, searches out the device identifier of the first terminal through the second Bluetooth component, and sends the feedback signal to the first terminal through the second Wi-Fi component according to the device identifier of the first terminal.
Illustratively, when the second short-range wireless communication component is a second NFC component, before the second terminal receives the request signal through the second NFC component, it needs to be ensured that the NFC function is enabled. The NFC function may be enabled automatically when the application starts, enabled in the settings window of the application, or enabled in the settings window of the operating system of the second terminal. When the NFC function is enabled, after the second terminal receives through the second NFC component the request signal sent by the first terminal through the first NFC component, it sends the feedback signal to the first terminal through the second NFC component.
Step 204: the first terminal sends the unique identification information and the information of the target virtual object to the server.
After the first terminal receives the feedback signal through the first short-range wireless communication component, it obtains the unique identification information included in the feedback signal and sends the unique identification information and the information of the target virtual object to the server through a wired or wireless network.
The first terminal sending the unique identification information and the information of the target virtual object to the server includes, but is not limited to, the following combinations: steps 204a and 204b; steps 204a and 204c; steps 204a and 204d.
Step 204a: after receiving the feedback signal, the first terminal sends the unique identification information to the server.
After receiving the feedback signal, the first terminal determines that the related image is to be displayed on the second terminal and sends the unique identification information to the server; the unique identification information is used to trigger the server to send the information corresponding to the related image to the second terminal.
Step 204b: the first terminal sends the information of the target virtual object to the server at every first time interval.
The first terminal sends the information of the target virtual object to the server at every first time interval, so that the server generates the information corresponding to the related image according to the information of the target virtual object.
The first terminal may send the information of the target virtual object at every first time interval after receiving the feedback signal; alternatively, this periodic sending step does not depend on the other steps in the embodiments of the present application.
Step 204c: the first terminal sends the information of the target virtual object to the server based on event triggering.
Optionally, the trigger event includes an event triggered when the target position of the target virtual object changes. When the first terminal receives the trigger event, it sends the information of the target virtual object to the server.
The first terminal may send the information of the target virtual object based on event triggering after receiving the feedback signal; alternatively, this step does not depend on the other steps in this embodiment.
Step 204d: the first terminal sends the information of the target virtual object to the server based on event triggering, and also sends the information of the target virtual object to the server at every first time interval.
That is, the first terminal sends the information of the target virtual object at every first time interval and, additionally, sends the information of the target virtual object whenever a trigger event is received. Optionally, the trigger event includes an event triggered when the target position of the target virtual object changes.
The first terminal may execute step 204d after executing step 204a; alternatively, step 204d does not depend on the other steps in this embodiment.
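The sending policies of steps 204b through 204d can be sketched as a single predicate. This is a hypothetical helper; the interval value and argument names are assumptions:

```python
# Sketch of the sending policies: periodic sending every `interval` (step
# 204b), event-triggered sending on a position change (step 204c), or both
# combined (step 204d). All names and values are illustrative.

def should_send(now, last_sent, interval, position_changed):
    """Return True when the target virtual object's info should be sent."""
    return position_changed or (now - last_sent) >= interval

periodic = should_send(now=10.0, last_sent=4.0, interval=5.0, position_changed=False)
event = should_send(now=10.0, last_sent=9.0, interval=5.0, position_changed=True)
idle = should_send(now=10.0, last_sent=9.0, interval=5.0, position_changed=False)
```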
Step 205: the server generates the information corresponding to the related image according to the information of the target virtual object.
The related image includes at least one of a second environment picture, a third environment picture, and a situation picture.
The second environment picture includes a picture of a second local region determined by the target position, the area of the second local region being greater than the area of the first local region. The third environment picture includes a picture of a first local region determined by the associated position, in the virtual environment, of an associated virtual object, the associated virtual object being a virtual object in the same camp as the target virtual object. The situation picture is a picture showing associated information on a map, the associated information being information included within the field of view of the target virtual object and/or the associated virtual object, and the map being the map of the virtual environment.
The process in which the server obtains the information corresponding to the related image according to the information of the target virtual object is described below by taking a two-dimensional environment as an example. The virtual environment involved in the embodiments of the present application may be a two-dimensional environment, a 2.5-dimensional environment, or a three-dimensional environment; the description below is exemplary and does not limit the virtual environment.
Illustratively, when the related image includes the second environment picture, the information of the target virtual object includes the target position. The server determines a second view-finder with the target position as a reference point, obtains the second local region through the second view-finder in the virtual environment, and obtains the information of the virtual objects included in the second local region as the information of the related image.
Illustratively, when the related image includes the third environment picture, the information of the target virtual object includes the account corresponding to the associated virtual object. The server obtains the associated position of the associated virtual object according to that account, determines an associated view-finder with the associated position as a reference point, obtains the first local region through the associated view-finder in the virtual environment, and obtains the information of the virtual objects included in the first local region as the information of the related image.
Illustratively, when the related image includes the situation picture, the information of the target virtual object includes the target position and the account corresponding to the associated virtual object. The server obtains the associated position of the associated virtual object according to that account, determines a second view-finder and an associated view-finder with the target position and the associated position as reference points respectively, obtains the second local region through the second view-finder and the first local region through the associated view-finder in the virtual environment, and obtains the information of the virtual objects included in the second local region and the first local region as the information of the related image.
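The framing described above can be sketched in two dimensions as follows. The region sizes, object names, and the rectangle representation are assumptions for illustration:

```python
# Sketch of the server-side framing: the view-finder is a rectangle whose
# reference point is the target position (or the associated position), and
# the related-image info is the set of virtual objects inside that rectangle.
# Sizes and object identifiers are assumed for illustration.

def frame_region(center, width, height):
    """Return (left, bottom, right, top) bounds of a view-finder centered
    on the given reference point."""
    cx, cy = center
    return (cx - width / 2, cy - height / 2, cx + width / 2, cy + height / 2)

def objects_in_region(objects, region):
    """Collect the (identifier, position) pairs inside the framed region."""
    left, bottom, right, top = region
    return [(oid, (x, y)) for oid, (x, y) in objects
            if left <= x <= right and bottom <= y <= top]

objects = [("target", (50, 50)), ("friend_a", (70, 60)), ("enemy_a", (120, 50))]
second_region = frame_region((50, 50), 100, 80)  # larger than the first view-finder
info = objects_in_region(objects, second_region)
```

The associated view-finder of the third environment picture works the same way, with the associated position as the center.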
Step 206: the server sends the information corresponding to the related image to the second terminal according to the unique identification information.
After the server receives the unique identification information sent by the first terminal, it determines according to the unique identification information that the terminal to which the information corresponding to the related image needs to be sent is the second terminal, and sends that information to the second terminal.
Step 207: the second terminal generates the related image according to the information corresponding to the related image.
Illustratively, when the related image is the second environment picture, the second terminal generates the second environment picture according to the information, sent by the server, of the virtual objects included in the second local region. For example, the information of the virtual objects included in the second local region includes the positions and identifiers of those virtual objects; after framing the virtual environment through the second view-finder to obtain the second local region, the second terminal generates the second environment picture according to the above information.
Illustratively, when the related image is the third environment picture, the second terminal generates the third environment picture according to the information, sent by the server, of the virtual objects included in the first local region. For example, the information of the virtual objects included in the first local region includes the positions and identifiers of those virtual objects; after framing the virtual environment through the associated view-finder to obtain the first local region, the second terminal generates the third environment picture according to the above information.
Illustratively, when the related image is the situation picture, the second terminal generates the situation picture according to the information, sent by the server, of the virtual objects included in the second local region and the first local region. For example, the information of the virtual objects included in the second local region includes the positions and identifiers of those virtual objects, and the information of the virtual objects included in the first local region includes the positions and identifiers of those virtual objects; the second terminal generates the situation picture on the map according to the above information, the map being the map of the virtual environment.
Step 208: the second terminal displays the second user interface of the application, the related image being displayed in the second user interface.
The second terminal displays the second user interface on its display screen, and the related image is displayed in the second user interface. Illustratively, taking the related image being the second environment picture as an example, as shown in Fig. 4, a second user interface 125 is displayed on the display screen 121 of the second terminal 120, and the related image, here the second environment picture, is displayed in the second user interface 125. Since the area of the second local region shown in the second environment picture is greater than that of the first local region shown in the first environment picture, the second environment picture can show content not displayed in the first environment picture: "friend A", who belongs to the same camp as the target virtual object, and "enemy A" and "enemy B", who do not belong to the same camp as the target virtual object.
In conclusion first terminal and second terminal pass through respective wireless near field communication group in the embodiment of the present application
Part is determined show related image in second terminal after, server is to the corresponding information of second terminal transmission related image, second
Terminal generates according to the corresponding information of related image and shows related image;Due to being located in the second terminal near first terminal
Show related image associated with destination virtual object, so as to by second terminal show more with destination virtual object
Associated information allows users to the information for obtaining more virtual environments, improves the easy-to-use degree of application program.
In the embodiments of the present application, the virtual environment may be a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment. The following Fig. 5 embodiment describes the method for displaying a user interface proposed by the present application by taking a two-dimensional virtual environment as an example; the Fig. 7 embodiment describes the method by taking a three-dimensional virtual environment as an example.
Fig. 5 shows a flowchart of the method for displaying a user interface provided by an exemplary embodiment of the present application. The method can be applied in the computer system 100 shown in Fig. 1 and comprises the following steps:
Step 501: the first terminal obtains the target position of the target virtual object in the virtual environment.
An application of the virtual environment is installed in the first terminal, and the target virtual object is the object in the virtual environment that the user controls through the first terminal.
Step 502: the first terminal frames the virtual environment through a first view-finder determined with the target position as its center, obtaining the first local region.
The first terminal takes the target position as the center point of the first view-finder, determines the first view-finder according to its size, and then frames the virtual environment to obtain the first local region.
Illustratively, as shown in Fig. 6, the first terminal frames the virtual environment 600 centered on the target position 611 and according to the size of the first view-finder 610 (the length and width of the first view-finder 610), obtaining the first local region 620.
Step 503: the first terminal generates the first environment picture according to the first local region.
The first terminal generates the first environment picture according to the obtained first local region.
Step 504: the first terminal displays the first user interface of the application, the first environment picture and the collaboration display control being displayed in the first user interface.
An application of the virtual environment is installed in the first terminal; when the first terminal runs the application, it displays the first user interface of the application.
The step in which the first terminal displays the first user interface can refer to step 201 in the Fig. 2 embodiment, and is not repeated here.
Step 505: when the signal triggered on the collaboration display control is received, the first terminal sends the request signal through the first short-range wireless communication component.
The step in which the first terminal sends the request signal to the second terminal through the first short-range wireless communication component when receiving the signal triggered on the collaboration display control can refer to step 202 in the Fig. 2 embodiment, and is not repeated here.
Step 506: when the second terminal receives the request signal through the second short-range wireless communication component, it sends the feedback signal to the first terminal through the second short-range wireless communication component.
The feedback signal includes the unique identification information corresponding to the application in the second terminal. The second terminal has installed, and runs, the same application as the one installed in the first terminal, or the application in the second terminal is the same type of application as the one in the first terminal but for a different operating-system platform.
The step in which the second terminal sends the feedback signal to the first terminal through the second short-range wireless communication component when receiving the request signal can refer to step 203 in the Fig. 2 embodiment, and is not repeated here.
Step 507: the first terminal sends the unique identification information and the information of the target virtual object to the server, the information of the target virtual object including the target position.
The step in which the first terminal sends the unique identification information and the information of the target virtual object to the server can refer to step 204 in the Fig. 2 embodiment, and is not repeated here.
Step 508: the server frames the virtual environment through a second view-finder determined with the target position as its center, obtaining the second local region.
The server takes the target position as the center point of the second view-finder, determines the second view-finder according to its size, and then frames the virtual environment to obtain the second local region. The area of the second view-finder is greater than that of the first view-finder.
Step 509: the server obtains the information of the virtual objects included in the second local region.
The information of the virtual objects included in the second local region includes the positions and identifiers of the virtual objects.
Step 510: the server sends the information of the virtual objects included in the second local region to the second terminal according to the unique identification information.
After the server receives the unique identification information sent by the first terminal, it determines according to the unique identification information that the terminal to which the information needs to be sent is the second terminal, and thereby sends the information of the virtual objects included in the second local region to the second terminal.
Step 511: the second terminal frames the map through a second view-finder determined with the target position as its center, obtaining a second local map.
The map here is the map of the virtual environment. The second terminal takes the target position as the center point of the second view-finder, determines the second view-finder according to its size, and then frames the map to obtain the second local map. Since the target virtual object is located at the center of the second local region, the information of the virtual objects included in the second local region includes the target position.
Step 512: the second terminal generates virtual objects on the second local map according to the information of the virtual objects included in the second local region.
Illustratively, the second terminal stores the map, the image of each type of target virtual object, and the correspondence between the identifier and the image of each kind of virtual object.
According to the identifiers of the virtual objects included in the second local region, the second terminal queries the correspondence to determine the images of the virtual objects to be displayed; according to the positions of the virtual objects included in the second local region, it determines the positions of the virtual objects to be displayed, thereby generating the virtual objects on the second local map.
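Step 512 can be sketched as a lookup against a stored identifier-to-image correspondence. The table contents and data layout below are assumptions for illustration:

```python
# Sketch of step 512: the second terminal queries the stored correspondence
# between object identifiers and images, then places each virtual object at
# its reported position on the second local map. Table contents are assumed.

IMAGE_TABLE = {"soldier": "soldier.png", "tower": "tower.png"}  # assumed assets

def place_objects(local_map, region_objects):
    """Attach (image, position) entries for the region's objects to the map."""
    placed = []
    for identifier, position in region_objects:
        image = IMAGE_TABLE[identifier]  # correspondence lookup by identifier
        placed.append((image, position))
    local_map["objects"] = placed
    return local_map

local_map = {"bounds": (0, 0, 100, 80)}
result = place_objects(local_map, [("soldier", (10, 20)), ("tower", (40, 5))])
```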
Step 513: the second terminal generates the second environment picture according to the second local map and the virtual objects on it.
The second terminal generates the second environment picture according to the obtained second local map and the virtual objects generated on the second local map.
As shown in Fig. 6, the first user interface of the first terminal 110 displays the first local region selected by framing the virtual environment 600 through the first view-finder 610; the second user interface of the second terminal 120 displays the second local region selected in the virtual environment 600 through the second view-finder 620 and the target position of the target virtual object 630. Since the area of the first view-finder 610 is less than that of the second view-finder 620, the target virtual object 630 is displayed on the second terminal 120 at a smaller ratio than on the first terminal 110.
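The ratio claim above can be checked with a small calculation; the display and view-finder widths below are assumed values:

```python
# Sketch of the ratio relation: with the same display size, a larger
# view-finder means each object is drawn at a smaller scale, so the target
# virtual object appears smaller on the second terminal. Widths are assumed.

def display_scale(display_width, viewfinder_width):
    """World-to-screen scale factor for a view-finder of the given width."""
    return display_width / viewfinder_width

first_scale = display_scale(1080, 400)   # first view-finder, assumed 400 units wide
second_scale = display_scale(1080, 800)  # second view-finder, assumed 800 units wide
```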
Step 514: the second terminal displays the second user interface of the application, the second environment picture being displayed in the second user interface.
The second terminal displays the second user interface on its display screen, and the generated second environment picture is displayed in the second user interface. Illustratively, as shown in Fig. 4, a second user interface 125 is displayed on the display screen 121 of the second terminal 120, and the second environment picture is displayed in the second user interface 125.
In conclusion first terminal and second terminal pass through respective wireless near field communication group in the embodiment of the present application
Part is determined show related image in second terminal after, server is to the corresponding information of second terminal transmission related image, second
Terminal generates according to the corresponding information of related image and shows related image;Due to being located in the second terminal near first terminal
Show related image associated with destination virtual object, so as to by second terminal show more with destination virtual object
Associated information allows users to the information for obtaining more virtual environments, improves the easy-to-use degree of application program.
Fig. 7 shows a flowchart of the method for displaying a user interface provided by an exemplary embodiment of the present application. The method can be applied in the computer system 100 shown in Fig. 1 and comprises the following steps:
Step 701: the first terminal obtains the target position of the target virtual object in the virtual environment.
An application of the virtual environment is installed in the first terminal; the target virtual object is the object in the virtual environment that the user controls through the first terminal, and the target position may be the target coordinates of the target virtual object in the virtual environment.
Step 702: the first terminal determines the first observation position of a first observation point in the virtual environment according to the target position.
The first observation position of the first observation point in the virtual environment is related to the target position, in the virtual environment, of the target virtual object corresponding to the first terminal. Illustratively, the target coordinates of the target virtual object in the virtual environment are (x1, y1, z1), the first observation point is a point set obliquely behind the target virtual object, and the coordinate difference between the first observation point and the target virtual object is (0, -Δy1, Δz1). The first terminal then calculates the coordinates of the first observation point from the coordinates of the target virtual object as (x1, y1-Δy1, z1+Δz1), thereby determining the first observation position of the first observation point. The target coordinates of a virtual object in the virtual environment may be the coordinates of a reference point of the virtual object; the reference point may be a pixel at a preset position on the virtual object, such as the eyes, the top of the head, or the left shoulder of the virtual object.
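The coordinate arithmetic above can be written out directly; the offset values are assumed for illustration:

```python
# Check of the observation-point calculation: with target coordinates
# (x1, y1, z1) and coordinate difference (0, -dy, dz), the first observation
# point sits at (x1, y1 - dy, z1 + dz). Offset values are assumed.

def observation_position(target, dy, dz):
    """Return the observation point obliquely behind the target object."""
    x1, y1, z1 = target
    return (x1, y1 - dy, z1 + dz)

pos = observation_position((10.0, 4.0, 2.0), dy=3.0, dz=5.0)
```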
Step 703: the first terminal observes the virtual environment from the first observation position at a first viewing angle, obtaining the first local region.
The first observation point corresponds to the first viewing angle, which has a viewing direction and a viewing-angle size. Illustratively, as shown in Fig. 8, the first observation point P1 is located obliquely behind the target virtual object 800; by simulation in the virtual environment, the first terminal observes the virtual environment at the first viewing angle α1 from the first observation position at the first observation point P1, obtaining the first local region.
Step 704: the first terminal generates the first environment picture according to the first local region.
The first terminal generates the first environment picture according to the obtained first local region.
Step 705: the first terminal displays the first user interface of the application, the first environment picture of the virtual environment and the collaboration display control being displayed in the first user interface.
An application of the virtual environment is installed in the first terminal; when the first terminal runs the application, it displays the first user interface, in which the first environment picture and the collaboration display control are displayed. The collaboration display control is used to prompt the user that the second environment picture can be displayed on the display screen of another terminal.
Illustratively, as shown in Fig. 9, a first user interface 112 is displayed on the display screen 111 of the first terminal 110, and the first environment picture and a collaboration display control 113 are displayed in the first user interface 112. The first environment picture shows the target virtual object 114 controlled by the user through the first terminal 110 and the picture of the first local region of the virtual environment observed from the first observation point corresponding to the target virtual object 114.
Step 706: when a signal triggered on the collaboration display control is received, the first terminal sends a request signal through the first short-range wireless communication component.
The steps by which the first terminal, on receiving the signal triggered on the collaboration display control, sends the request signal to the second terminal through the first short-range wireless communication component may refer to step 202 in the embodiment of Fig. 2 and are not repeated here.
Step 707: when the second terminal receives the request signal through the second short-range wireless communication component, it sends a feedback signal to the first terminal through the second short-range wireless communication component.
The feedback signal carries the unique identification information corresponding to the application on the second terminal. The steps by which the second terminal, on receiving the request signal through the second short-range wireless communication component, sends the feedback signal to the first terminal may refer to step 203 in the embodiment of Fig. 2 and are not repeated here.
Step 708: the first terminal sends the unique identification information and the information of the target virtual object to the server; the information of the target virtual object includes the target position.
The step of sending the unique identification information and the information of the target virtual object to the server may refer to step 204 in the embodiment of Fig. 2 and is not repeated here.
Step 709: the server determines, according to the target position, the second observation position of the second observation point in the virtual environment.
The second observation position of the second observation point in the virtual environment is related to the target position of the target virtual object. Illustratively, the target coordinates of the target virtual object in the virtual environment are (x1, y1, z1), the second observation point is set diagonally behind the target virtual object, and the coordinate difference between the second observation point and the target virtual object is (0, -Δy2, Δz2). The server then computes the second coordinates of the second observation point from the coordinates of the target virtual object as (x1, y1-Δy2, z1+Δz2), thereby determining the second observation position. The target coordinates of a virtual object in the virtual environment may be the coordinates of a reference point on the virtual object; the reference point may be a pixel at a preset position on the virtual object, such as the eyes, the top of the head, or the left shoulder.
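The coordinate arithmetic in step 709 can be sketched as adding a fixed offset to the target coordinates; the concrete Δ values below are illustrative, not from the patent:

```python
# Coordinate difference between the second observation point and the
# target virtual object: (0, -Δy2, Δz2). Values are illustrative.
DELTA = (0.0, -2.0, 3.0)

def second_observation_position(target):
    """Add the fixed offset to the target coordinates (x1, y1, z1) to
    obtain the second coordinates (x1, y1-Δy2, z1+Δz2) of step 709."""
    x1, y1, z1 = target
    return (x1 + DELTA[0], y1 + DELTA[1], z1 + DELTA[2])

pos = second_observation_position((10.0, 5.0, 0.0))  # → (10.0, 3.0, 3.0)
```

Because the offset is fixed, the second observation point tracks the target virtual object as it moves, always staying diagonally behind and above it.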
Step 710: the server observes the virtual environment from the second observation position at the second viewing angle to obtain a second local region.
The second observation point corresponds to a second viewing angle, which has a viewing direction and an angular size. Illustratively, as shown in Fig. 8, the second observation point P2 is located diagonally behind the target virtual object 800. The server simulates, in the virtual environment, observing the virtual environment at the second viewing angle α2 from the second observation position where the second observation point P2 is located, thereby obtaining the second local region.
Step 711: the server obtains the information of the virtual objects included in the second local region.
The information of the virtual objects included in the second local region includes the positions and identifiers of the virtual objects.
Step 712: the server sends the information of the virtual objects included in the second local region to the second terminal according to the unique identification information.
After receiving the unique identification information sent by the first terminal, the server determines from that information that the terminal to which the information is to be sent is the second terminal, and accordingly sends the information of the virtual objects included in the second local region to the second terminal.
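The routing decision in step 712 amounts to looking up the destination terminal under the unique identification information. A minimal sketch, with a hypothetical session table (none of these names appear in the patent):

```python
# Hypothetical server-side session table: unique identification
# information -> the connection registered for that application instance.
sessions = {"app-id-2nd-terminal": "second_terminal_connection"}

def route_region_info(unique_id, region_info):
    """Look up the terminal registered under unique_id and return the
    (destination, payload) pair that would be sent to it (step 712)."""
    destination = sessions[unique_id]
    return destination, {"virtual_objects": region_info}

dest, payload = route_region_info(
    "app-id-2nd-terminal",
    [{"id": "hero_7", "position": (12, 4)}],
)
```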
Step 713: the second terminal determines, according to the target position, the second observation position of the second observation point corresponding to the target virtual object in the spatial background of the virtual environment.
Since the target virtual object is located at the center of the second local region, the information of the virtual objects included in the second local region includes the target position. The spatial background of the virtual environment is the virtual environment without the virtual objects. For example, if the virtual environment includes an island, the ocean, and virtual objects moving on the island, the spatial background is the island and the ocean.
The second terminal determines the second observation position according to the target position. Illustratively, the second terminal computes the second coordinates of the second observation position where the second observation point is located from the target coordinates and the coordinate difference between the second observation point and the target virtual object; at the second observation position given by the second coordinates, it observes the spatial background of the virtual environment at the second viewing angle α2 to determine the local background space.
Step 714: the second terminal observes the spatial background from the second observation position at the second viewing angle of the second observation point to obtain the local background space.
The local background space is the second local region without the virtual objects.
Step 715: the second terminal generates the virtual objects on the local background space according to the information of the virtual objects included in the second local region.
Illustratively, the second terminal stores the spatial background of the virtual environment, the model of each type of virtual object, and the correspondence between the identifier and the model of each type of virtual object.
According to the identifiers of the virtual objects included in the second local region, the second terminal queries the correspondence to determine the models of the virtual objects to be displayed; according to the coordinates of the virtual objects included in the second local region, it determines the positions of the virtual objects to be displayed, and thereby generates the virtual objects on the local background space.
Step 716: the second terminal generates the second environment picture according to the local background space and the virtual objects in the local background space.
That is, the second terminal generates the second environment picture from the acquired local background space and the virtual objects generated on it.
Step 717: the second terminal displays the second user interface of the application, in which the second environment picture is shown.
Illustratively, as shown in Fig. 9, the first user interface 112 is displayed on the display screen 111 of the first terminal 110. The first environment picture shown in the first user interface 112 shows the target virtual object 114 and the first local region of the virtual environment observed from the first observation point corresponding to the target virtual object 114. The second user interface 122 is displayed on the display screen 121 of the second terminal 120, the second environment picture is shown in the second user interface 122, and the second environment picture shows the target virtual object 114 and the second local region of the virtual environment observed from the second observation point corresponding to the target virtual object 114.
In conclusion first terminal and second terminal pass through respective wireless near field communication group in the embodiment of the present application
Part is determined show related image in second terminal after, server is to the corresponding information of second terminal transmission related image, second
Terminal generates according to the corresponding information of related image and shows related image;Due to being located in the second terminal near first terminal
Show related image associated with destination virtual object, so as to by second terminal show more with destination virtual object
Associated information allows users to the information for obtaining more virtual environments, improves the easy-to-use degree of application program.
Figure 10 shows a block diagram of a user interface display apparatus provided by an exemplary embodiment of the present application. The apparatus may be implemented as the first terminal 110 in the embodiment of Fig. 1 by software, hardware, or a combination of both. The apparatus includes a display module 1010, a first short-range wireless communication module 1020, and a sending module 1030.
The display module 1010 is configured to display the first user interface of the application, in which the first environment picture and the collaboration display control are shown; the first environment picture includes the picture of the first local region determined by the target position of the target virtual object in the virtual environment.
The first short-range wireless communication module 1020 is configured to send a request signal through the first short-range wireless communication component when a signal triggered on the collaboration display control is received, the request signal requesting the application on the second terminal to display an associated picture, the associated picture being a picture associated with the target virtual object; and to receive, through the first short-range wireless communication component, the feedback signal sent by the second terminal through the second short-range wireless communication component, the feedback signal carrying the unique identification information corresponding to the application on the second terminal.
The sending module 1030 is configured to send the unique identification information and the information of the target virtual object to the server; the unique identification information triggers the server to send the information corresponding to the associated picture to the second terminal, and the information of the target virtual object assists the server in obtaining that information according to the information of the target virtual object.
In an alternative embodiment, the sending module 1030 is further configured to send the unique identification information to the server after the feedback signal is received, and to send the information of the target virtual object to the server at predetermined intervals and/or in an event-triggered manner.
In an alternative embodiment, the first short-range wireless communication component includes a first Bluetooth component and the second short-range wireless communication component includes a second Bluetooth component.
The first short-range wireless communication module 1020 is further configured to search through the first Bluetooth component to obtain the device identifier of the second terminal; to send a Bluetooth pairing request to the second terminal through the first Bluetooth component according to the device identifier; and, after receiving the pairing-acceptance signal sent by the second terminal through the second Bluetooth component, to send the request signal to the second terminal through the first Bluetooth component.
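The discovery-pair-request sequence above can be sketched as a three-step handshake. This is an abstract illustration only; real Bluetooth APIs differ, and every name below is hypothetical:

```python
def pairing_flow(discovered_devices, accepts_pairing):
    """Search -> pairing request -> (on acceptance) request signal,
    mirroring the Bluetooth variant described above."""
    log = []
    device_id = discovered_devices[0]        # search yields a device id
    log.append(("pair_request", device_id))  # send the pairing request
    if accepts_pairing(device_id):           # wait for acceptance
        log.append(("request_signal", device_id))
    return log

# The second terminal accepts pairing, so the request signal is sent.
log = pairing_flow(["terminal-120"], lambda _id: True)
```

The key design point is that the request signal is gated on the pairing acceptance, so an unpaired nearby device never receives it.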
In an alternative embodiment, the first short-range wireless communication component includes a first Wi-Fi component and a first Bluetooth component, and the second short-range wireless communication component includes a second Wi-Fi component and a second Bluetooth component.
The first short-range wireless communication module 1020 is further configured to search through the first Bluetooth component to obtain the device identifier of the second terminal; after the device identifier of the second terminal is found, to send the request signal to the second terminal through the first Wi-Fi component, the request signal being received by the second terminal through the second Wi-Fi component.
In an alternative embodiment, the associated picture includes at least one of a second environment picture, a third environment picture, and a situation picture. The second environment picture includes the picture of a second local region determined by the target position, the area of the second local region being larger than that of the first local region. The third environment picture includes the picture of a first local region determined by the relative position of an associated virtual object in the virtual environment, the associated virtual object being an object in the same camp as the target virtual object. The situation picture is a picture showing a map of the associated information, the associated information being the information included within the field of view of the target virtual object and/or the associated virtual object, and the map being a map of the virtual environment.
In an alternative embodiment, the virtual environment is a two-dimensional environment and the associated picture includes the second environment picture. The first local region is the region framed from the virtual environment by a first view-finder frame centered on the target position; the second local region is the region framed from the virtual environment by a second view-finder frame centered on the target position; the area of the second view-finder frame is larger than the area of the first view-finder frame.
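For the two-dimensional case, the two view-finder frames can be sketched as axis-aligned rectangles centered on the target position, with the second strictly larger; the concrete sizes are illustrative:

```python
def view_finder(center, width, height):
    """Axis-aligned rectangle of the given size centered on the target
    position, returned as a (left, top, right, bottom) tuple."""
    cx, cy = center
    return (cx - width / 2, cy - height / 2,
            cx + width / 2, cy + height / 2)

target = (100.0, 50.0)
first = view_finder(target, 40, 30)    # frames the first local region
second = view_finder(target, 80, 60)   # larger second local region
```

Cropping the environment map with `second` rather than `first` is what gives the second terminal its wider view around the same target position.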
In an alternative embodiment, the information of the target virtual object includes the target position, and the information corresponding to the associated picture includes the information of the virtual objects included in the second local region, which the server obtains after framing the virtual environment with the second view-finder frame to obtain the second local region.
In an alternative embodiment, the virtual environment is a 2.5-dimensional or three-dimensional environment, and the associated picture includes the second environment picture. The first local region is the region obtained by observing the virtual environment from the first observation point corresponding to the target virtual object; the second local region is the region obtained by observing the virtual environment from the second observation point corresponding to the target virtual object; the field of view of the second observation point is larger than the field of view of the first observation point.
In an alternative embodiment, the information of the target virtual object includes the target position, and the information corresponding to the associated picture includes the information obtained by the server after determining the second observation position of the second observation point according to the target position and observing the virtual environment from the second observation position at the second viewing angle to obtain the second local region; the height of the second observation position is higher than the height of the first observation position, and/or the second viewing angle is larger than the first viewing angle.
Figure 11 shows a block diagram of a user interface display apparatus provided by an exemplary embodiment of the present application. The apparatus may be implemented as the second terminal 120 in the embodiment of Fig. 1 by software, hardware, or a combination of both. The apparatus includes a second short-range wireless communication module 1110, a receiving module 1120, a processing module 1130, and a display module 1140.
The second short-range wireless communication module 1110 is configured to receive, through the second short-range wireless communication component, the request signal sent by the first terminal through the first short-range wireless communication component; and to send a feedback signal to the first terminal through the second short-range wireless communication component, the feedback signal carrying the unique identification information of the application on the second terminal, the unique identification information being used to trigger the server to send the information corresponding to the associated picture.
The receiving module 1120 is configured to receive the information corresponding to the associated picture sent by the server.
The processing module 1130 is configured to generate the associated picture according to the information corresponding to the associated picture.
The display module 1140 is configured to display the second user interface of the application, in which the associated picture is shown.
In an alternative embodiment, the display module 1140 is further configured to display a request window according to the request signal, the request window requesting permission to display the associated picture.
The second short-range wireless communication module 1110 is further configured to send the feedback signal to the first terminal through the second short-range wireless communication component when the permission signal triggered on the request window is received.
In an alternative embodiment, the associated picture includes the second environment picture, the second environment picture includes the picture of the second local region determined by the target position of the target virtual object in the virtual environment, the information corresponding to the associated picture includes the information of the virtual objects included in the second local region, and the virtual environment is a two-dimensional environment.
The processing module 1130 is further configured to frame the map with the second view-finder frame centered on the target position to obtain a second local map, the map being the map of the virtual environment and the second local map being the map of the second local region; to generate the virtual objects in the second local map according to the information of the virtual objects included in the second local region; and to generate the associated picture according to the second local map and the virtual objects in it.
In an alternative embodiment, the associated picture includes the second environment picture, the second environment picture includes the picture of the second local region determined by the target position of the target virtual object in the virtual environment, the information corresponding to the associated picture includes the information of the virtual objects included in the second local region, and the virtual environment is a 2.5-dimensional or three-dimensional environment.
The processing module 1130 is further configured to determine, according to the target position, the second observation position of the second observation point corresponding to the target virtual object in the spatial background of the virtual environment; to obtain the local background space according to the second observation position and the second viewing angle of the second observation point, the local background space being the background of the second local region; to generate the virtual objects on the local background space according to the information of the virtual objects included in the second local region; and to generate the associated picture according to the local background space and the virtual objects in it.
Figure 12 shows a block diagram of a user interface display apparatus provided by an exemplary embodiment of the present application. The apparatus may be implemented as the server 130 in the embodiment of Fig. 1 by software, hardware, or a combination of both. The apparatus includes a receiving module 1210, a processing module 1220, and a sending module 1230.
The receiving module 1210 is configured to receive the information of the target virtual object and the unique identification information sent by the first terminal; the target virtual object is a virtual object in the application running on the first terminal, and the unique identification information is the identification information corresponding to the application on the second terminal.
The processing module 1220 is configured to obtain the information corresponding to the associated picture according to the information of the target virtual object, the associated picture being a picture associated with the target virtual object.
The sending module 1230 is configured to send the information corresponding to the associated picture to the application on the second terminal according to the unique identification information; that information assists the second terminal in generating the associated picture.
In an alternative embodiment, the associated picture includes the second environment picture, the second environment picture includes the picture of the second local region determined by the target position of the target virtual object in the virtual environment, the information corresponding to the associated picture includes the information of the virtual objects included in the second local region, the information of the target virtual object includes the target position of the target virtual object in the virtual environment, and the virtual environment is a two-dimensional environment.
The processing module 1220 is further configured to frame the virtual environment with the second view-finder frame centered on the target position to obtain the second local region, and to obtain the information of the virtual objects included in the second local region.
In an alternative embodiment, the associated picture includes the second environment picture, the second environment picture includes the picture of the second local region determined by the target position of the target virtual object in the virtual environment, the information corresponding to the associated picture includes the information of the virtual objects included in the second local region, the information of the target virtual object includes the target position of the target virtual object in the virtual environment, and the virtual environment is a 2.5-dimensional or three-dimensional environment.
The processing module 1220 is further configured to determine, according to the target position, the second observation position of the second observation point corresponding to the target virtual object; to observe the virtual environment from the second observation position at the second viewing angle to obtain the second local region; and to obtain the information of the virtual objects included in the second local region.
Figure 13 shows a structural block diagram of a terminal 1300 provided by an exemplary embodiment of the present application. The terminal 1300 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player. The terminal 1300 may also be referred to by other names such as user equipment or portable terminal.
Generally, the terminal 1300 includes a processor 1301 and a memory 1302.
The processor 1301 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 1301 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor: the main processor, also called the CPU (Central Processing Unit), handles data in the awake state; the coprocessor is a low-power processor that handles data in the standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1301 may also include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
The memory 1302 may include one or more computer-readable storage media, which may be tangible and non-transient. The memory 1302 may also include high-speed random access memory and non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transient computer-readable storage medium in the memory 1302 stores at least one instruction, which is executed by the processor 1301 to implement the user interface display method provided herein as executed by the first terminal or the second terminal.
In some embodiments, the terminal 1300 optionally further includes a peripheral interface 1303 and at least one peripheral. Specifically, the peripherals include at least one of a radio-frequency circuit 1304, a touch display screen 1305, a camera assembly 1306, an audio circuit 1307, a positioning component 1308, and a power supply 1309.
The peripheral interface 1303 may be used to connect at least one I/O (Input/Output) related peripheral to the processor 1301 and the memory 1302. In some embodiments, the processor 1301, the memory 1302, and the peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio-frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio-frequency circuit 1304 communicates with a communication network and other communication devices through electromagnetic signals, converting electric signals into electromagnetic signals for transmission, or converting received electromagnetic signals into electric signals. Optionally, the radio-frequency circuit 1304 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like. The radio-frequency circuit 1304 may communicate with other terminals through at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, the generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or Wi-Fi (Wireless Fidelity) networks. In the embodiments of the present application, the radio-frequency circuit 1304 may also include at least one of a Wi-Fi component, a Bluetooth component, and an NFC component.
The touch display screen 1305 is used to display the UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. The touch display screen 1305 can also acquire touch signals on or above its surface; a touch signal may be input to the processor 1301 as a control signal for processing. The touch display screen 1305 provides virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments, there is one touch display screen 1305, arranged on the front panel of the terminal 1300; in other embodiments, there are at least two touch display screens 1305, arranged on different surfaces of the terminal 1300 or in a folding design; in still other embodiments, the touch display screen 1305 is a flexible display screen arranged on a curved surface or folding surface of the terminal 1300. The touch display screen 1305 may even be set to a non-rectangular irregular shape, that is, a shaped screen. The touch display screen 1305 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 1306 is used to capture images or video. Optionally, the camera assembly 1306 includes a front camera and a rear camera. Generally, the front camera is used for video calls or selfies, and the rear camera is used for taking photos or videos. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, and a wide-angle camera, so that the main camera and the depth-of-field camera can be fused to realize background blurring, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting. In some embodiments, the camera assembly 1306 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash; a dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 1307 is used to provide an audio interface between the user and the terminal 1300 and may include a microphone and a speaker. The microphone acquires the sound waves of the user and the environment, converts them into electric signals, and inputs them to the processor 1301 for processing, or to the radio-frequency circuit 1304 to implement voice communication. For stereo acquisition or noise reduction, there may be multiple microphones arranged at different parts of the terminal 1300. The microphone may also be an array microphone or an omnidirectional acquisition microphone. The speaker converts electric signals from the processor 1301 or the radio-frequency circuit 1304 into sound waves. The speaker may be a traditional film speaker or a piezoelectric ceramic speaker; a piezoelectric ceramic speaker can convert electric signals not only into sound waves audible to humans but also into sound waves inaudible to humans, for purposes such as ranging. In some embodiments, the audio circuit 1307 may also include a headphone jack.
The positioning component 1308 is used to locate the current geographic position of the terminal 1300 to implement navigation or LBS (Location Based Service). The positioning component 1308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the GLONASS system of Russia.
The power supply 1309 is used to supply power to the components in the terminal 1300. The power supply 1309 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1309 includes a rechargeable battery, the battery may be a wired rechargeable battery or a wireless rechargeable battery: a wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, the terminal 1300 further includes one or more sensors 1310, including but not limited to: an acceleration sensor 1311, a gyroscope sensor 1312, a pressure sensor 1313, a fingerprint sensor 1314, an optical sensor 1315, and a proximity sensor 1316.
The acceleration sensor 1311 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the terminal 1300. For example, the acceleration sensor 1311 can detect the components of gravitational acceleration on the three coordinate axes. The processor 1301 can control the touch display screen 1305 to display the user interface in landscape or portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 1311. The acceleration sensor 1311 can also be used to acquire game or user motion data.
The gyroscope sensor 1312 can detect the body orientation and rotation angle of the terminal 1300 and can cooperate with the acceleration sensor 1311 to acquire the user's 3D actions on the terminal 1300. According to the data acquired by the gyroscope sensor 1312, the processor 1301 can implement functions such as motion sensing (for example, changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1313 may be disposed on a side frame of the terminal 1300 and/or at a lower layer of the touch display screen 1305. When the pressure sensor 1313 is disposed on the side frame of the terminal 1300, a gripping signal of the user on the terminal 1300 can be detected, and left-hand/right-hand identification or a shortcut operation can be performed according to the gripping signal. When the pressure sensor 1313 is disposed at the lower layer of the touch display screen 1305, operability controls on the UI can be controlled according to pressure operations of the user on the touch display screen 1305. The operability controls include at least one of a button control, a scroll-bar control, an icon control, and a menu control.
The fingerprint sensor 1314 is configured to collect the user's fingerprint and identify the user according to the collected fingerprint. When the identified identity is a trusted identity, the processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 1314 may be disposed on the front, the back, or a side of the terminal 1300. When a physical button or a manufacturer logo is provided on the terminal 1300, the fingerprint sensor 1314 may be integrated with the physical button or the manufacturer logo.
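The authorization rule above reduces to a gate on a set of sensitive operations. The following is a minimal sketch under assumed names; the operation identifiers and the set itself are invented for the example, not taken from the patent.

```python
# Operations that the patent text lists as sensitive (names are illustrative).
SENSITIVE_OPERATIONS = {
    "unlock_screen", "view_encrypted_info",
    "download_software", "pay", "change_settings",
}

def authorize(operation: str, identity_is_trusted: bool) -> bool:
    """Allow a sensitive operation only for a fingerprint-verified identity."""
    if operation in SENSITIVE_OPERATIONS:
        return identity_is_trusted
    return True  # non-sensitive operations need no fingerprint check
```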
The optical sensor 1315 is configured to collect ambient light intensity. In one embodiment, the processor 1301 may control the display brightness of the touch display screen 1305 according to the ambient light intensity collected by the optical sensor 1315. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1305 is turned up; when the ambient light intensity is low, the display brightness of the touch display screen 1305 is turned down. In another embodiment, the processor 1301 may also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315.
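One way to realize the brightness rule above is a clamped linear mapping from measured lux to a brightness level. This is a sketch under assumed parameters (the lux range and brightness bounds are invented), not the patent's actual implementation; production devices typically use a tuned curve rather than a straight line.

```python
def display_brightness(ambient_lux: float,
                       min_brightness: float = 0.1,
                       max_brightness: float = 1.0,
                       max_lux: float = 10000.0) -> float:
    """Map ambient light intensity to screen brightness, clamped to bounds."""
    lux = min(max(ambient_lux, 0.0), max_lux)  # clamp to the sensor's range
    ratio = lux / max_lux
    return min_brightness + (max_brightness - min_brightness) * ratio
```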
The proximity sensor 1316, also called a distance sensor, is generally disposed on the front of the terminal 1300. The proximity sensor 1316 is configured to collect the distance between the user and the front of the terminal 1300. In one embodiment, when the proximity sensor 1316 detects that the distance between the user and the front of the terminal 1300 gradually decreases, the processor 1301 controls the touch display screen 1305 to switch from a screen-on state to a screen-off state; when the proximity sensor 1316 detects that the distance between the user and the front of the terminal 1300 gradually increases, the processor 1301 controls the touch display screen 1305 to switch from the screen-off state to the screen-on state.
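The screen on/off switching above can be modeled as a small state machine. The two distinct thresholds below (hysteresis) are an assumption added for the example, not stated in the patent; they keep the screen from flickering when the measured distance hovers around a single cut-off value.

```python
def next_screen_state(current: str, distance_cm: float,
                      near_cm: float = 3.0, far_cm: float = 6.0) -> str:
    """Switch the screen off as the user comes close, back on as they move away."""
    if current == "on" and distance_cm <= near_cm:
        return "off"   # user's face is near the front of the terminal
    if current == "off" and distance_cm >= far_cm:
        return "on"    # user has moved away again
    return current     # in the hysteresis band: keep the previous state
```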
Those skilled in the art will understand that the structure shown in FIG. 13 does not constitute a limitation on the terminal 1300, which may include more or fewer components than illustrated, combine certain components, or adopt a different component arrangement.
FIG. 14 illustrates a schematic structural diagram of a computer equipment provided by an exemplary embodiment of the present application. The computer equipment may be the server 140 in the embodiment of FIG. 1. Specifically, the computer equipment 1400 includes a central processing unit (CPU) 1401, a system memory 1404 including a random access memory (RAM) 1402 and a read-only memory (ROM) 1403, and a system bus 1405 connecting the system memory 1404 and the central processing unit 1401. The computer equipment 1400 further includes a basic input/output system (I/O system) 1406 that helps transmit information between devices within the computer, and a mass storage device 1407 for storing an operating system 1413, an application program 1412, and other program modules 1415.
The basic input/output system 1406 includes a display 1408 for displaying information and an input device 1409, such as a mouse or a keyboard, for the user to input information. The display 1408 and the input device 1409 are both connected to the central processing unit 1401 through an input/output controller 1410 connected to the system bus 1405. The basic input/output system 1406 may further include the input/output controller 1410 for receiving and processing input from a plurality of other devices such as a keyboard, a mouse, or an electronic stylus. Similarly, the input/output controller 1410 also provides output to a display screen, a printer, or other types of output devices.
The mass storage device 1407 is connected to the central processing unit 1401 through a mass storage controller (not shown) connected to the system bus 1405. The mass storage device 1407 and its associated computer-readable storage medium provide non-volatile storage for the computer equipment 1400. That is, the mass storage device 1407 may include a computer-readable storage medium (not shown) such as a hard disk or a CD-ROM drive.
Without loss of generality, the computer-readable storage medium may include a computer storage medium and a communication medium. The computer storage medium includes volatile and non-volatile, removable and non-removable media implemented by any method or technology for storing information such as computer-readable instructions, data structures, program modules, or other data. The computer storage medium includes RAM, ROM, EPROM, EEPROM, flash memory or other solid-state storage technologies; CD-ROM, DVD, or other optical storage; and tape cassettes, magnetic tape, disk storage, or other magnetic storage devices. Certainly, those skilled in the art will know that the computer storage medium is not limited to the above. The system memory 1404 and the mass storage device 1407 may be collectively referred to as memory.
The memory stores one or more programs configured to be executed by one or more central processing units 1401. The one or more programs include instructions for implementing the user interface display method provided by each of the foregoing method embodiments, and the central processing unit 1401 executes the one or more programs to implement that method.
According to various embodiments of the present application, the computer equipment 1400 may also run by means of a remote computer connected through a network such as the Internet. That is, the computer equipment 1400 may be connected to a network 1412 through a network interface unit 1411 connected to the system bus 1405; in other words, the network interface unit 1411 may also be used to connect to other types of networks or remote computer systems (not shown).
The memory further includes one or more programs stored in the memory, and the one or more programs include instructions for performing the steps executed by the server in the user interface display method provided in the embodiments of the present invention.
An embodiment of the present application further provides a computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the user interface display method described in any of the foregoing embodiments.
The present application further provides a computer program product which, when run on a computer, causes the computer to execute the user interface display method provided by each of the foregoing method embodiments.
It should be understood that "a plurality of" mentioned herein refers to two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate three cases: A exists alone, both A and B exist, and B exists alone. The character "/" generally indicates an "or" relationship between the associated objects before and after it.
The serial numbers of the foregoing embodiments of the present application are for description only and do not indicate the preference of the embodiments.
Those of ordinary skill in the art will understand that all or part of the steps of the foregoing embodiments may be implemented by hardware, or by a program instructing related hardware. The program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing descriptions are merely preferred embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.
Claims (26)
1. A user interface display method, wherein the method is executed by a first terminal and comprises:
displaying a first user interface of an application program, the first user interface displaying a first environment picture and a collaboration display control, and the first environment picture comprising a picture of a first local region determined by a target position of a target virtual object in a virtual environment;
when a signal triggered on the collaboration display control is received, sending a request signal through a first wireless near field communication component, the request signal being used to request the application program in a second terminal to display a related image, and the related image being a picture displayed in association with the target virtual object;
receiving, through the first wireless near field communication component, a feedback signal sent by the second terminal through a second wireless near field communication component, the feedback signal comprising unique identification information corresponding to the application program in the second terminal; and
sending the unique identification information and information of the target virtual object to a server, the unique identification information being used to trigger the server to send information corresponding to the related image to the second terminal, and the information of the target virtual object being used to assist the server in obtaining the information corresponding to the related image according to the information of the target virtual object.
2. The method according to claim 1, wherein the sending the unique identification information and the information of the target virtual object to the server comprises:
after receiving the feedback signal, sending the unique identification information to the server; and
sending the information of the target virtual object to the server at predetermined time intervals, and/or sending the information of the target virtual object to the server in an event-triggered manner.
3. The method according to claim 1, wherein the first wireless near field communication component comprises a first Bluetooth module and the second wireless near field communication component comprises a second Bluetooth module; and
the sending the request signal through the first wireless near field communication component comprises:
searching, through the first Bluetooth module, for a device identifier of the second terminal;
sending a Bluetooth pairing request to the second terminal through the first Bluetooth module according to the device identifier; and
when a pairing signal sent by the second terminal through the second Bluetooth module is received, sending the request signal to the second terminal through the first Bluetooth module.
4. The method according to claim 2, wherein the first wireless near field communication component comprises a first Wi-Fi component and a first Bluetooth module, and the second wireless near field communication component comprises a second Wi-Fi component and a second Bluetooth module; and
the sending the request signal through the first wireless near field communication component comprises:
searching, through the first Bluetooth module, for the device identifier of the second terminal; and
after the device identifier of the second terminal is found, sending the request signal to the second terminal through the first Wi-Fi component, the request signal being used to assist the second terminal in receiving the request signal through the second Wi-Fi component.
5. The method according to any one of claims 1 to 4, wherein the related image comprises at least one of a second environment picture, a third environment picture, and a situation picture;
the second environment picture comprises a picture of a second local region determined by the target position, an area of the second local region being greater than an area of the first local region;
the third environment picture comprises a picture of a local region determined by the relative position of an associated virtual object in the virtual environment, the associated virtual object being an object in the same camp as the target virtual object; and
the situation picture is a picture displaying a map with associated information, the associated information being information contained within the field of view of the target virtual object and/or the associated virtual object, and the map being a map of the virtual environment.
6. The method according to claim 5, wherein the virtual environment is a two-dimensional environment and the related image comprises the second environment picture;
the first local region is a region obtained by framing the virtual environment with a first viewfinder determined with the target position as a center;
the second local region is a region obtained by framing the virtual environment with a second viewfinder determined with the target position as a center; and
the area of the second viewfinder is greater than the area of the first viewfinder.
7. The method according to claim 6, wherein the information of the target virtual object comprises the target position; and
the information corresponding to the related image comprises information of the virtual objects contained in the second local region, obtained by the server after framing the virtual environment with the second viewfinder to obtain the second local region.
8. The method according to claim 5, wherein the virtual environment is a 2.5-dimensional environment or a three-dimensional environment and the related image comprises the second environment picture;
the first local region is a region obtained by observing the virtual environment from a first observation point corresponding to the target virtual object;
the second local region is a region obtained by observing the virtual environment from a second observation point corresponding to the target virtual object; and
the field of view of the second observation point is greater than the field of view of the first observation point.
9. The method according to claim 8, wherein the information of the target virtual object comprises the target position;
the information corresponding to the related image comprises information collected in the second local region, the second local region being obtained by the server by determining a second observation position of the second observation point according to the target position and observing the virtual environment according to the second observation position and a second viewing angle; and
the height of the second observation position is higher than the height of the first observation position, and/or the second viewing angle is greater than the first viewing angle.
10. A user interface display method, wherein the method is executed by a second terminal and comprises:
receiving, through a second wireless near field communication component, a request signal sent by a first terminal through a first wireless near field communication component;
sending a feedback signal to the first terminal through the second wireless near field communication component, the feedback signal carrying unique identification information of an application program in the second terminal, and the unique identification information being used to trigger a server to send information corresponding to a related image;
receiving the information corresponding to the related image sent by the server;
generating the related image according to the information corresponding to the related image; and
displaying a second user interface of the application program, the related image being displayed in the second user interface.
11. The method according to claim 10, wherein the sending the feedback signal to the first terminal through the second wireless near field communication component comprises:
displaying a request window according to the request signal, the request window being used to request display of the related image; and
when a permission signal triggered on the request window is received, sending the feedback signal to the first terminal through the second wireless near field communication component.
12. The method according to claim 10 or 11, wherein the related image comprises a second environment picture, the second environment picture comprises a picture of a second local region determined by a target position of a target virtual object in a virtual environment, and the information corresponding to the related image comprises information of the virtual objects contained in the second local region;
the virtual environment is a two-dimensional environment; and
the generating the related image according to the information corresponding to the related image comprises:
framing a map with a second viewfinder determined with the target position as a center to obtain a second local map, the map being a map of the virtual environment and the second local map being a map of the second local region;
generating the virtual objects on the second local map according to the information of the virtual objects contained in the second local region; and
generating the related image according to the second local map and the virtual objects on the second local map.
13. The method according to claim 10 or 11, wherein the related image comprises a second environment picture, the second environment picture comprises a picture of a second local region determined by a target position of a target virtual object in a virtual environment, and the information corresponding to the related image comprises information of the virtual objects contained in the second local region;
the virtual environment is a 2.5-dimensional environment or a three-dimensional environment; and
the generating the related image according to the information corresponding to the related image comprises:
determining, according to the target position, a second observation position, in the background space of the virtual environment, of a second observation point corresponding to the target virtual object;
obtaining a local background space according to the second observation position and a second viewing angle of the second observation point, the local background space being the background of the second local region;
generating the virtual objects on the local background space according to the information of the virtual objects contained in the second local region; and
generating the related image according to the local background space and the virtual objects on the local background space.
14. A user interface display method, wherein the method is executed by a server and comprises:
receiving information of a target virtual object and unique identification information sent by a first terminal, the target virtual object being a virtual object in an application program running in the first terminal, and the unique identification information being identification information corresponding to the application program in a second terminal;
obtaining information corresponding to a related image according to the information of the target virtual object, the related image being a picture displayed in association with the target virtual object; and
sending the information corresponding to the related image to the application program in the second terminal according to the unique identification information, the information corresponding to the related image being used to assist the second terminal in generating the related image.
15. The method according to claim 14, wherein the related image comprises a second environment picture, the second environment picture comprises a picture of a second local region determined by a target position of the target virtual object in a virtual environment, the information corresponding to the related image comprises information of the virtual objects contained in the second local region, and the information of the target virtual object comprises the target position of the target virtual object in the virtual environment;
the virtual environment is a two-dimensional environment; and
the obtaining the information corresponding to the related image according to the information of the target virtual object comprises:
framing the virtual environment with a second viewfinder determined with the target position as a center to obtain the second local region; and
obtaining the information of the virtual objects contained in the second local region.
16. The method according to claim 14, wherein the related image comprises a second environment picture, the second environment picture comprises a picture of a second local region determined by a target position of the target virtual object in a virtual environment, the information corresponding to the related image comprises information of the virtual objects contained in the second local region, and the information of the target virtual object comprises the target position of the target virtual object in the virtual environment;
the virtual environment is a 2.5-dimensional environment or a three-dimensional environment; and
the obtaining the information corresponding to the related image according to the information of the target virtual object comprises:
determining, according to the target position, a second observation position of a second observation point corresponding to the target virtual object;
obtaining the second local region by observing the virtual environment according to the second observation position and a second viewing angle; and
obtaining the information of the virtual objects contained in the second local region.
17. A user interface display method, wherein the method is executed by a second terminal and comprises:
displaying a user interface of an application program;
displaying a request window in the user interface, the request window being a window through which the application program in a first terminal requests display of a related image, and an accept control being displayed in the request window; and
when a trigger signal triggered on the accept control is received, displaying a second user interface in which the related image is displayed, the related image being a picture displayed in association with a target virtual object, and the target virtual object being a virtual object that is movable in a virtual environment and controlled by the first terminal.
18. The method according to claim 17, wherein the first terminal displays a first user interface comprising a first environment picture, the first environment picture comprising a picture of a first local region determined by a target position of the target virtual object in the virtual environment;
the related image comprises at least one of a second environment picture, a third environment picture, and a situation picture;
the second environment picture comprises a picture of a second local region determined by the target position, an area of the second local region being greater than an area of the first local region;
the third environment picture comprises a picture of a local region determined by the relative position of an associated virtual object in the virtual environment, the associated virtual object being an object in the same camp as the target virtual object; and
the situation picture is a picture displaying a map with associated information, the associated information being information contained within the field of view of the target virtual object and/or the associated virtual object, and the map being a map of the virtual environment.
19. A user interface display apparatus, wherein the apparatus is applied in a first terminal and comprises:
a display module, configured to display a first user interface of an application program, the first user interface displaying a first environment picture and a collaboration display control, and the first environment picture comprising a picture of a first local region determined by a target position of a target virtual object in a virtual environment;
a first short-range wireless communication module, configured to: when a signal triggered on the collaboration display control is received, send a request signal through a first wireless near field communication component, the request signal being used to request the application program in a second terminal to display a related image, and the related image being a picture displayed in association with the target virtual object; and receive, through the first wireless near field communication component, a feedback signal sent by the second terminal through a second wireless near field communication component, the feedback signal comprising unique identification information corresponding to the application program in the second terminal; and
a sending module, configured to send the unique identification information and information of the target virtual object to a server, the unique identification information being used to trigger the server to send information corresponding to the related image to the second terminal, and the information of the target virtual object being used to assist the server in obtaining the information corresponding to the related image according to the information of the target virtual object.
20. A user interface display apparatus, wherein the apparatus is applied in a second terminal and comprises:
a second short-range wireless communication module, configured to receive, through a second wireless near field communication component, a request signal sent by a first terminal through a first wireless near field communication component, and to send a feedback signal to the first terminal through the second wireless near field communication component, the feedback signal carrying unique identification information of an application program in the second terminal, and the unique identification information being used to trigger a server to send information corresponding to a related image;
a receiving module, configured to receive the information corresponding to the related image sent by the server;
a processing module, configured to generate the related image according to the information corresponding to the related image; and
a display module, configured to display a second user interface of the application program, the related image being displayed in the second user interface.
21. A user interface display apparatus, wherein the apparatus is applied in a server and comprises:
a receiving module, configured to receive information of a target virtual object and unique identification information sent by a first terminal, the target virtual object being a virtual object in an application program running in the first terminal, and the unique identification information being identification information corresponding to the application program in a second terminal;
a processing module, configured to obtain information corresponding to a related image according to the information of the target virtual object, the related image being a picture displayed in association with the target virtual object; and
a sending module, configured to send the information corresponding to the related image to the application program in the second terminal according to the unique identification information, the information corresponding to the related image being used to assist the second terminal in generating the related image.
22. A terminal, comprising a processor, a memory, and a first wireless near field communication component, the memory storing at least one instruction that is loaded and executed by the processor to implement the user interface display method according to any one of claims 1 to 9.
23. A terminal, comprising a processor, a memory, and a second wireless near field communication component, the memory storing at least one instruction that is loaded and executed by the processor to implement the user interface display method according to any one of claims 10 to 13.
24. A computer equipment, comprising a processor and a memory, the memory storing at least one instruction that is loaded and executed by the processor to implement the user interface display method according to any one of claims 14 to 18.
25. A computer system, comprising the user interface display apparatus according to claim 19, the user interface display apparatus according to claim 20, and the user interface display apparatus according to claim 21; or
comprising the terminal according to claim 22, the terminal according to claim 23, and the computer equipment according to claim 24.
26. A computer-readable storage medium, storing at least one instruction that is loaded and executed by a processor to implement the user interface display method according to any one of claims 1 to 18.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910068959.1A CN109806583B (en) | 2019-01-24 | 2019-01-24 | User interface display method, device, equipment and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910068959.1A CN109806583B (en) | 2019-01-24 | 2019-01-24 | User interface display method, device, equipment and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109806583A true CN109806583A (en) | 2019-05-28 |
CN109806583B CN109806583B (en) | 2021-11-23 |
Family
ID=66603710
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910068959.1A Active CN109806583B (en) | 2019-01-24 | 2019-01-24 | User interface display method, device, equipment and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109806583B (en) |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101594403A * | 2008-05-29 | 2009-12-02 | LG Electronics Inc. | Transparent display and method of operation thereof |
CN101916186A * | 2010-07-30 | 2010-12-15 | Shenzhen Skyworth-RGB Electronic Co., Ltd. | Method, device and terminal for expanded display of mobile terminal view |
CN102077161A * | 2008-06-30 | 2011-05-25 | NEC Corporation | Information processing device, display control method, and recording medium |
US20130072298A1 * | 2003-12-10 | 2013-03-21 | Nintendo Co., Ltd. | Hand-held game apparatus and game program |
CN103041592A * | 2011-06-30 | 2013-04-17 | Z124 | Dual screen game module |
US20140282731A1 * | 2009-11-17 | 2014-09-18 | Sony Corporation | Display control system, display control device, and display control method |
CN104838353A * | 2012-12-07 | 2015-08-12 | 优特设备有限公司 | Coordinating contextual display data on a display screen |
CN105247469A * | 2013-04-30 | 2016-01-13 | Microsoft Technology Licensing, LLC | Automatically manipulating visualized data based on interactivity |
JP5952644B2 * | 2012-05-31 | 2016-07-13 | Nintendo Co., Ltd. | Program, information processing method, information processing apparatus, and display system |
US20170087476A1 * | 2015-09-30 | 2017-03-30 | Sony Interactive Entertainment America LLC | Systems and methods for providing augmented data-feed for game play re-creation and dynamic replay entry points |
CN107402633A * | 2017-07-25 | 2017-11-28 | Shenzhen Yingshuo Technology Co., Ltd. | Safety education system based on image simulation technology |
CN107783741A * | 2016-08-25 | 2018-03-09 | ZTE Corporation | Method and device for controlling tiled display across multiple mobile terminal screens, and mobile terminal |
CN108126344A * | 2018-01-24 | 2018-06-08 | NetEase (Hangzhou) Network Co., Ltd. | Method for sharing position in game, and storage medium |
JP2018097614A * | 2016-12-13 | 2018-06-21 | COLOPL, Inc. | Game method and game program |
CN109101208A * | 2018-08-15 | 2018-12-28 | NetEase (Hangzhou) Network Co., Ltd. | Interface display method, display device and display system |
2019-01-24: CN application CN201910068959.1A, granted as CN109806583B (Active)
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111672106A * | 2020-06-05 | 2020-09-18 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual scene display method and device, computer equipment and storage medium |
CN111672106B * | 2020-06-05 | 2022-05-24 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual scene display method and device, computer equipment and storage medium |
CN112604302A * | 2020-12-17 | 2021-04-06 | Tencent Technology (Shenzhen) Co., Ltd. | Interaction method, device, equipment and storage medium of virtual object in virtual environment |
CN112704883A * | 2020-12-30 | 2021-04-27 | Tencent Technology (Shenzhen) Co., Ltd. | Method, device, terminal and storage medium for grouping virtual objects in virtual environment |
CN112704883B * | 2020-12-30 | 2022-08-05 | Tencent Technology (Shenzhen) Co., Ltd. | Method, device, terminal and storage medium for grouping virtual objects in virtual environment |
Also Published As
Publication number | Publication date |
---|---|
CN109806583B (en) | 2021-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11703993B2 | | Method, apparatus and device for view switching of virtual environment, and storage medium |
JP7143007B2 | | Virtual item transfer method, apparatus, electronic device and computer program |
CN108671543 | | Method for displaying marked elements in virtual scene, computer device, and storage medium |
CN109529319 | | Method for displaying interface controls, device, and storage medium |
CN108710525 | | Method, device, equipment and storage medium for displaying map in virtual scene |
CN111035918 | | Reconnaissance interface display method and device based on virtual environment, and readable storage medium |
CN110465073 | | Method, device, equipment and readable storage medium for adjusting viewing angle in virtual environment |
CN110496392 | | Virtual object control method, device, terminal and storage medium |
CN109634413 | | Method, device and storage medium for observing virtual environment |
CN109917910 | | Method, device and equipment for displaying linear skills, and storage medium |
CN112704883 | | Method, device, terminal and storage medium for grouping virtual objects in virtual environment |
CN108295465 | | Method, device, equipment and storage medium for sharing field of view in three-dimensional virtual environment |
CN111273780 | | Animation playing method, device and equipment based on virtual environment, and storage medium |
CN111325822 | | Method, device and equipment for displaying hot spot map, and readable storage medium |
CN108536295 | | Object control method and device in virtual scene, and computer device |
CN109806583 | | User interface display method, device, equipment and system |
CN110393916 | | Viewing angle rotation method, device, equipment and storage medium |
US20220291791 | | Method and apparatus for determining selected target, device, and storage medium |
CN109821237 | | Viewing angle rotation method, device, equipment and storage medium |
CN110833695 | | Service processing method, device, equipment and storage medium based on virtual scene |
CN108744511 | | Method for displaying gun sight in virtual environment, device, and storage medium |
WO2022237076 | | Method and apparatus for controlling avatar, and device and computer-readable storage medium |
CN112604302 | | Interaction method, device, equipment and storage medium of virtual object in virtual environment |
CN111035929 | | Elimination information feedback method, device, equipment and medium based on virtual environment |
CN113289336 | | Method, apparatus, device and medium for tagging items in a virtual environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |
GR01 | Patent grant | | |