CN111035918A - Reconnaissance interface display method and device based on virtual environment and readable storage medium - Google Patents


Info

Publication number
CN111035918A
CN111035918A (application CN201911143019.0A)
Authority
CN
China
Prior art keywords
scout
virtual environment
environment
virtual
virtual object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911143019.0A
Other languages
Chinese (zh)
Other versions
CN111035918B (en)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201911143019.0A
Publication of CN111035918A
Application granted
Publication of CN111035918B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/211: Input arrangements for video game devices characterised by their sensors, purposes or types, using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/5372: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, using indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A63F13/837: Shooting of targets

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a reconnaissance interface display method and apparatus based on a virtual environment, and a readable storage medium, relating to the field of virtual environments. The method includes: displaying a virtual environment interface in which a small map display area is superimposed on the environment picture; receiving a scout release signal; displaying a scout picture in the virtual environment interface according to the scout release signal; and marking, in the map zoom picture of the small map display area, a second virtual object located within the reconnaissance range. By releasing the scout prop into the virtual environment and having the scout prop mark the second virtual object, the first virtual object can determine the positions of virtual objects outside its line of sight, which increases the amount of information the first virtual object receives when observing the virtual environment and improves the efficiency with which the first virtual object observes the virtual environment.

Description

Reconnaissance interface display method and device based on virtual environment and readable storage medium
Technical Field
The embodiments of the present application relate to the field of virtual environments, and in particular to a reconnaissance interface display method and apparatus based on a virtual environment, and a readable storage medium.
Background
On terminals such as smartphones and tablets, there are many applications that provide a three-dimensional virtual environment in which display elements such as virtual objects and the ground are rendered with three-dimensional stereoscopic models. The user can control a virtual object to adopt postures such as running, walking, standing, lying, or crawling in the three-dimensional virtual environment, and can adjust the virtual object's viewing angle through a drag operation.
In general, in the user interface of such an application, the user adjusts the viewing angle of the virtual object by sliding on the screen; the sliding operation rotates the direction that the virtual object faces in the three-dimensional virtual environment (i.e., the viewing-angle direction), and the rotated direction is then taken as the direction the virtual object faces.
However, this way of observing the virtual environment provides no assistance to the fighting process of a virtual battle, and its observation result is relatively single: the environment information it can convey is limited, some information cannot be reflected at all, and the efficiency of observing the virtual environment is therefore relatively low.
Disclosure of Invention
The embodiments of the present application provide a reconnaissance interface display method and apparatus based on a virtual environment, and a readable storage medium, which can solve the problems that the existing way of observing the virtual environment provides no assistance to the fighting process of a virtual battle, yields a single kind of observation result, and observes the virtual environment inefficiently. The technical scheme is as follows:
in one aspect, a scout interface display method based on a virtual environment is provided, and the method includes:
displaying a virtual environment interface, where the virtual environment interface includes an environment picture in which a first virtual object observes the virtual environment, a small map display area is superimposed on the environment picture, and the small map display area includes a map zoom picture of the virtual environment;
receiving a scout release signal, the scout release signal being used to scout the virtual environment with a scout prop;
displaying a scout picture in the virtual environment interface according to the scout release signal, where the scout picture includes a controller held by the first virtual object, the controller being used to control the scout prop to move in the virtual environment;
and marking, in the small map display area, a second virtual object located within a reconnaissance range in the map zoom picture according to the position of the scout prop in the virtual environment.
In another aspect, a scout interface display apparatus based on a virtual environment is provided, the apparatus including:
a display module, configured to display a virtual environment interface, where the virtual environment interface includes an environment picture in which a first virtual object observes the virtual environment, a small map display area is superimposed on the environment picture, and the small map display area includes a map zoom picture of the virtual environment;
a receiving module, configured to receive a scout release signal, the scout release signal being used to scout the virtual environment with a scout prop;
the display module being further configured to display a scout picture in the virtual environment interface according to the scout release signal, where the scout picture includes a controller held by the first virtual object, the controller being used to control the scout prop to move in the virtual environment;
and a marking module, configured to mark, in the small map display area, a second virtual object located within a reconnaissance range in the map zoom picture according to the position of the scout prop in the virtual environment.
In another aspect, a computer device is provided, which includes a processor and a memory, wherein the memory stores at least one instruction, at least one program, code set, or instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for displaying a virtual environment-based scout interface according to any one of the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction, at least one program, a code set, or an instruction set is stored, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the virtual environment-based scout interface display method according to any one of the embodiments of the present application.
In another aspect, a computer program product is provided, which when run on a computer causes the computer to perform the method of displaying a virtual environment-based scout interface as described in any one of the embodiments of the present application.
The beneficial effects of the technical solutions provided in the embodiments of the present application include at least the following:
The scout prop is released into the virtual environment and marks the second virtual object in the virtual environment, so that the first virtual object can determine the positions of virtual objects outside its line of sight. This increases the amount of information the first virtual object receives when observing the virtual environment and improves the efficiency with which the first virtual object observes the virtual environment.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a prop equipment interface provided in an exemplary embodiment of the present application;
FIG. 2 is a block diagram of a terminal provided in an exemplary embodiment of the present application;
FIG. 3 is a schematic illustration of an implementation environment provided in an exemplary embodiment of the present application;
FIG. 4 is a flowchart of a virtual environment-based scout interface display method provided in an exemplary embodiment of the present application;
FIG. 5 is a schematic interface diagram of a scout release process provided based on the embodiment shown in FIG. 4;
FIG. 6 is a schematic interface diagram of a released scout prop provided based on the embodiment shown in FIG. 4;
FIG. 7 is a schematic diagram of switching the model held by the first virtual object provided based on the embodiment shown in FIG. 4;
FIG. 8 is a logic diagram of state machine switching provided based on the embodiment shown in FIG. 4;
FIG. 9 is a flowchart of a virtual environment-based scout interface display method provided in another exemplary embodiment of the present application;
FIG. 10 is a schematic diagram of a process for determining a reconnaissance range according to an angle range provided based on the embodiment shown in FIG. 9;
FIG. 11 is a schematic illustration of a laser scanning process in the small map display area provided based on the embodiment shown in FIG. 9;
FIG. 12 is a flowchart of a virtual environment-based scout interface display method provided in another exemplary embodiment of the present application;
FIG. 13 is an overall flowchart of a scout prop release process provided in an exemplary embodiment of the present application;
FIG. 14 is a block diagram of a virtual environment-based scout interface display apparatus provided in an exemplary embodiment of the present application;
FIG. 15 is a block diagram of a virtual environment-based scout interface display apparatus provided in another exemplary embodiment of the present application;
FIG. 16 is a block diagram of a terminal provided in an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the embodiments of the present application are described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application are briefly described:
virtual environment: is a virtual environment that is displayed (or provided) when an application is run on the terminal. The virtual environment may be a simulation environment of a real world, a semi-simulation semi-fictional environment, or a pure fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with the virtual environment being a three-dimensional virtual environment.
Virtual object: a movable object in the virtual environment. The movable object may be a virtual character, a virtual animal, a cartoon character, and so on, such as a character, animal, plant, oil drum, wall, or stone displayed in the three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional stereoscopic model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in it.
Unmanned scout: a virtual unmanned aerial vehicle used to reconnoitre the positions of virtual objects in the virtual environment. Optionally, the unmanned scout may be used to reconnoitre the position of an enemy virtual object, the position of a teammate virtual object, or the position of a designated virtual item in the virtual environment, such as the position of a designated gun prop or medicine prop. Optionally, taking reconnaissance of enemy virtual objects as an example, after the unmanned scout scans the virtual environment, the positions of enemy virtual objects in the virtual environment are determined and mapped onto the small map for display. Optionally, after being released into the virtual environment, the unmanned scout may be visible or invisible, or may be visible to some virtual objects and invisible to others; for example, after virtual object A releases the unmanned scout into the virtual environment, the scout is invisible to virtual object A and its teammates but visible to A's enemy virtual objects.
Optionally, the unmanned scout corresponds to a single-use duration within a match; that is, once released, the unmanned scout can reconnoitre the virtual environment for the single-use duration, and the reconnaissance process ends when that duration is reached. Optionally, the unmanned scout may be released once per match, or may be released multiple times as long as it is not destroyed; the embodiments of the present application do not limit the release manner of the unmanned scout.
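The single-use duration described above can be sketched as a simple timer; this is an illustrative sketch, and the class and attribute names are assumptions, not terms from the patent:

```python
class ScoutSession:
    """Tracks one release of the unmanned scout (illustrative names)."""

    def __init__(self, single_use_duration: float):
        self.single_use_duration = single_use_duration  # max scan time per release
        self.elapsed = 0.0
        self.active = False

    def release(self) -> None:
        """Release the scout into the virtual environment and start scanning."""
        self.active = True
        self.elapsed = 0.0

    def tick(self, dt: float) -> None:
        """Advance the scan timer; end the reconnaissance when the limit is hit."""
        if not self.active:
            return
        self.elapsed += dt
        if self.elapsed >= self.single_use_duration:
            self.active = False  # reconnaissance process ends

session = ScoutSession(single_use_duration=30.0)
session.release()
session.tick(29.0)
assert session.active          # still within the single-use duration
session.tick(1.5)
assert not session.active      # duration reached, scan ends
```

Because the class does not destroy itself, calling `release()` again models the multiple-release case the paragraph mentions.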
Optionally, the unmanned scout needs to be equipped as a prop before the match starts; that is, when the player equips the unmanned scout before the match starts, the player can control the virtual object to use the unmanned scout during the match.
Referring to fig. 1, which shows a schematic diagram of a prop equipment interface provided in an exemplary embodiment of the present application. As shown in fig. 1, a prop selection area 110 is displayed in the prop equipment interface 100. The prop selection area 110 is used for displaying candidate props and includes an unmanned scout 111; when the unmanned scout 111 is checked, it is equipped before the match starts and can be used during the match.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. The terminal runs an application program supporting a virtual environment, such as an application program supporting a three-dimensional virtual environment. The application program may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a third-person shooter (TPS) game, a first-person shooter (FPS) game, and a multiplayer online battle arena (MOBA) game. Alternatively, the application program may be a stand-alone application, such as a stand-alone three-dimensional game program, or a network online application.
Fig. 2 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 200 includes: an operating system 220 and application programs 222.
Operating system 220 is the base software that provides applications 222 with secure access to computer hardware.
Application 222 is an application that supports a virtual environment. Optionally, application 222 is an application that supports a three-dimensional virtual environment. The application 222 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gun-battle survival game. The application 222 may be a stand-alone application, such as a stand-alone three-dimensional game program, or a network online application.
Fig. 3 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 300 includes: a first device 320, a server 340, and a second device 360.
The first device 320 runs an application program supporting a virtual environment. The application program can be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gun-battle survival game. The first device 320 is a device used by a first user, who uses it to control a first virtual object in the virtual environment to perform activities including, but not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated persona or an animated persona.
The first device 320 is connected to the server 340 through a wireless network or a wired network.
The server 340 includes at least one of one server, multiple servers, a cloud computing platform, and a virtualization center. The server 340 provides background services for applications supporting a three-dimensional virtual environment. Optionally, the server 340 undertakes the primary computing work while the first device 320 and the second device 360 undertake secondary computing work; or the server 340 undertakes secondary computing work while the first device 320 and the second device 360 undertake the primary computing work; or the server 340, the first device 320, and the second device 360 cooperate using a distributed computing architecture.
The second device 360 runs an application program supporting a virtual environment. The application program can be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS game, a MOBA game, and a multiplayer gun-battle survival game. The second device 360 is a device used by a second user, who uses it to control a second virtual object in the virtual environment to perform activities including, but not limited to: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated persona or an animated persona.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. The first virtual character and the second virtual character may belong to the same team or organization, have a friend relationship, or have temporary communication rights; alternatively, they may belong to different teams, different organizations, or two mutually hostile groups.
Alternatively, the applications installed on the first device 320 and the second device 360 are the same, or the applications installed on the two devices are the same type of application for different control system platforms. The first device 320 may generally refer to one of a plurality of devices, and the second device 360 may generally refer to one of a plurality of devices, and this embodiment is illustrated by the first device 320 and the second device 360 only. The device types of the first device 320 and the second device 360 are the same or different, and include: at least one of a game console, a desktop computer, a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, and a laptop portable computer. The following embodiments are illustrated where the device is a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or fewer. For example, the number of the devices may be only one, or several tens or hundreds, or more. The number and the type of the devices are not limited in the embodiments of the present application.
Referring to fig. 4, which shows a flowchart of a virtual environment-based scout interface display method provided in an exemplary embodiment of the present application. The method is described here as applied to a terminal by way of example. As shown in fig. 4, the method includes:
step 401, displaying a virtual environment interface, where the virtual environment interface includes an environment picture in which the first virtual object observes the virtual environment, and a minimap display area is superimposed and displayed on the environment picture.
Optionally, the small map display area includes a map zoom screen in the virtual environment.
Optionally, the small map display area includes a zoom screen obtained by mapping the three-dimensional models in the virtual environment onto a two-dimensional plane and zooming. Optionally, when the zoom screen is displayed in the small map display area, the three-dimensional models of the virtual environment other than the virtual characters are mapped into two dimensions, and each virtual character is represented by a designated mark, such as an icon with a pointing arrow, where the arrow indicates the direction the virtual character faces in the virtual environment.
Optionally, the environment picture is a picture of the virtual environment observed from the first-person perspective of the first virtual object, or from the third-person perspective of the first virtual object.
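The two-dimensional mapping behind the small map described in step 401 can be sketched as a simple coordinate projection; the conventions below (dropping the height axis, a square minimap, these function and parameter names) are assumptions for illustration, not details from the patent:

```python
def world_to_minimap(pos, world_min, world_max, minimap_size):
    """Project a 3D world position onto the 2D small map.

    The y (height) component is dropped; x and z are scaled into
    minimap pixels. A virtual character projected this way would be
    drawn as an arrow icon rotated by the character's facing yaw.
    """
    x, _, z = pos                    # discard height
    wx0, wz0 = world_min
    wx1, wz1 = world_max
    u = (x - wx0) / (wx1 - wx0) * minimap_size
    v = (z - wz0) / (wz1 - wz0) * minimap_size
    return (u, v)

# A character at the centre of a 1000 x 1000 world maps to the centre
# of a 200-pixel minimap.
assert world_to_minimap((500.0, 12.0, 500.0), (0, 0), (1000, 1000), 200) == (100.0, 100.0)
```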
Step 402, receiving a scout release signal, the scout release signal being used for scouting the virtual environment using the scout prop.
Optionally, the scout release signal is triggered in at least one of the following ways:
First, the virtual environment interface further includes a scout release control, and the scout release signal is generated when a selection operation on the scout release control is received.
Optionally, the scout release control may be displayed continuously; it may be displayed only after the fighting data meets a condition; or it may be displayed continuously while its display mode switches according to whether the fighting data meets the condition. For example, the fighting data of the first virtual object is matched against a data requirement: while the fighting data does not meet the requirement, the scout release control is displayed in a non-clickable mode, and when the fighting data meets the requirement, the control switches to a clickable mode. The scout release control is used for receiving the scout release signal.
Referring to fig. 5, which shows a schematic interface diagram of a scout release process provided in an exemplary embodiment of the present application. As shown in fig. 5, a scout release control 510 is displayed in the environment interface 500 and is currently in the non-clickable mode, the fighting data 520 of the first virtual object in the match being 320; when the fighting data 520 rises to 410, the scout release control 510 switches to the clickable mode.
Second, the fighting data of the first virtual object is matched against the data requirement, and the scout release signal is triggered automatically when the fighting data meets the requirement.
Third, the virtual environment includes a scout parking area; when the first virtual object moves into the scout parking area, the scout release control is displayed in the environment interface, and the scout release signal is generated when a click operation on the scout release control is received.
Fourth, the virtual environment includes a scout parking area, and the scout release signal is triggered automatically when the first virtual object moves into the scout parking area.
Fifth, the terminal receives a voice control signal and generates the scout release signal according to the voice control signal.
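The trigger conditions above (the clickable-mode switch in the first way, automatic triggering in the second and fourth ways) can be sketched as simple predicates. This is an illustrative sketch; the threshold of 400 and the function names are assumptions, not values from the patent:

```python
def release_control_state(fighting_data: int, required_data: int) -> str:
    """Display mode of the scout release control (first trigger way)."""
    return "clickable" if fighting_data >= required_data else "non-clickable"

def should_auto_release(fighting_data: int, required_data: int,
                        in_parking_area: bool) -> bool:
    """Automatic triggering: data requirement met (second way)
    or first virtual object inside the scout parking area (fourth way)."""
    return fighting_data >= required_data or in_parking_area

# Mirrors fig. 5: fighting data 320 leaves the control non-clickable;
# at 410 (>= an assumed requirement of 400) it becomes clickable.
assert release_control_state(320, 400) == "non-clickable"
assert release_control_state(410, 400) == "clickable"
assert should_auto_release(100, 400, in_parking_area=True)
```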
Optionally, the scout prop can be implemented as a prop that flies in the virtual environment in the form of an unmanned aerial vehicle, or as a prop that moves on the ground of the virtual environment in the form of a scout vehicle; the embodiments of the present application do not limit the form of the scout prop.
Step 403: displaying a scout picture in the virtual environment interface according to the scout release signal, where the scout picture includes a controller held by the first virtual object.
Optionally, the controller is used to control the movement of the scout prop in the virtual environment.
Optionally, the controller includes control buttons, and the first virtual object performs the corresponding operations on the controller according to the user's control operations on the scout prop. Illustratively, the scout picture includes a rocker control for controlling the scout prop, and the controller includes direction buttons indicating control over the scout prop; when a leftward slide on the rocker control is received, indicating that the scout prop is to be moved leftward in the virtual environment, the scout picture shows the first virtual object pressing the leftward-movement button on the controller.
Referring to fig. 6, which shows a schematic interface diagram after the scout prop is released, provided in an exemplary embodiment of the present application. As shown in fig. 6, the scout picture 600 includes the hand 610 of the first virtual object holding a controller 620; according to the control operations on the scout prop received by the terminal, the hand 610 simulates the corresponding operations performed on the controller 620. As shown in fig. 6, the scout picture 600 also includes a rocker control 630, which receives control operations and controls the scout prop to move in the virtual environment accordingly. When a leftward-movement operation triggered by the user is received on the rocker control 630, the scout prop is controlled to move leftward in the virtual environment, and the hand 610 of the first virtual object in the scout picture 600 presses the leftward-movement button. Optionally, as shown in fig. 6, a small map display area 640 is also displayed in the scout picture 600, and an identification point 641 displayed in it indicates the position of the second virtual object in the virtual environment.
Referring to fig. 7, when a scout release signal is received, the first virtual object switches the held virtual weapon model to a controller model 710, the controller model 710 is used for simulating the control process of the scout prop, and as shown in fig. 7, the controller model 710 simulates the movement of the scout prop to a target position 720.
Schematically, referring to fig. 8, after the virtual weapon model held by the first virtual object is switched to the controller model, the animation state machine switches to playing an animation 810 of calling the drone. The animation 810 is played continuously during the scouting process of the drone, its playing duration corresponds to the release duration of the drone, and the scouting process of the drone ends when the animation 810 finishes playing. While the state machine of the animation 810 is running, the other state machines are in a stopped state.
Step 404, marking, in the minimap display area, a second virtual object located within the scout range in the map zoom screen according to the position of the scout prop in the virtual environment.
Optionally, when the second virtual object is marked in the scout screen according to the scout release signal, the scout prop is first released to a first position corresponding to the first virtual object according to the scout release signal, a first environment range is determined in the virtual environment according to the first position and the scout range of the scout prop, and the second virtual object located in the first environment range is marked in the minimap display area.
Optionally, the second virtual object is a hostile virtual object of the first virtual object, and the position of the second virtual object is marked in the map zoom screen in the form of a dot.
Alternatively, when a second virtual object is marked in the map zoom screen of the first virtual object, a prompt message is sent to the second virtual object and displayed in a terminal that controls the second virtual object, such as "you have been identified on the map by an unmanned scout" or "you have been identified on the map by a scout that has been released by the first virtual object".
In summary, according to the reconnaissance interface display method based on the virtual environment provided in this embodiment, the scout prop is released in the virtual environment, and the scout prop marks the second virtual object located in the virtual environment, so that the first virtual object can determine the position of a virtual object outside its line of sight in the virtual environment, the amount of information that the first virtual object can receive when observing the virtual environment is increased, and the efficiency with which the first virtual object observes the virtual environment is improved.
In an optional embodiment, after being released, the reconnaissance aircraft prop moves to a first position corresponding to the first virtual object, and determines a second virtual object to be identified according to a reconnaissance range of the reconnaissance aircraft prop, where fig. 9 is a flowchart of a reconnaissance interface display method based on a virtual environment according to another exemplary embodiment of the present application, and taking application of the method to a terminal as an example for explanation, as shown in fig. 9, the method includes:
step 901, displaying a virtual environment interface, where the virtual environment interface includes an environment picture in which the first virtual object observes the virtual environment, and a minimap display area is displayed on the environment picture in an overlapping manner.
Optionally, the small map display area includes a map zoom screen in the virtual environment.
Optionally, the small map display area includes a zoom screen obtained by mapping the three-dimensional model in the virtual environment onto a two-dimensional plane and zooming.
Step 902, receiving a scout release signal, the scout release signal being used for scouting the virtual environment using the scout prop.
Optionally, the scout prop may be implemented as a prop that flies in the virtual environment in the form of an unmanned aerial vehicle, or as a prop that moves on the ground of the virtual environment in the form of a scout vehicle; the embodiment of the present application does not limit the form of the scout prop.
Step 903, releasing the scout aircraft prop to a first position corresponding to the first virtual object according to the scout aircraft release signal.
Optionally, the first position may be a preset position corresponding to the first virtual object, such as: releasing the scout prop to a preset distance above the first virtual object according to the scout release signal; alternatively, the first position may be a randomly determined position in the virtual environment; alternatively, the first position may be determined according to the distribution of the second virtual objects in the virtual environment, such as: determining the first position of the scout prop according to the area in the virtual environment where the second virtual objects are most concentrated; alternatively, the first position may be a position specified by the user when the scout prop is released.
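The four candidate ways of choosing the first position can be sketched as a single selection routine. This is an illustrative sketch only: the function name, the `strategy` labels, and the coordinate convention (x, height, z triples with y up) are assumptions, not part of the patent.

```python
import random

def choose_release_position(first_pos, enemies, strategy="above",
                            height=5.0, cluster_radius=20.0,
                            world_bounds=((0.0, 0.0), (100.0, 100.0)),
                            user_pos=None):
    """Pick the scout prop's first position using one of the four options
    named in the text: preset offset above the player, random position,
    densest enemy area, or a user-specified position."""
    if strategy == "above":
        # Release at a preset height above the first virtual object.
        x, y, z = first_pos
        return (x, y + height, z)
    if strategy == "random":
        (x0, z0), (x1, z1) = world_bounds
        return (random.uniform(x0, x1), height, random.uniform(z0, z1))
    if strategy == "densest":
        # Centre on the enemy with the most neighbours within cluster_radius,
        # a simple proxy for "the area where enemies are most concentrated".
        def neighbours(e):
            return sum(1 for o in enemies
                       if (o[0] - e[0]) ** 2 + (o[2] - e[2]) ** 2
                       <= cluster_radius ** 2)
        x, _, z = max(enemies, key=neighbours)
        return (x, height, z)
    if strategy == "user":
        return user_pos
    raise ValueError(f"unknown strategy: {strategy}")
```

A real implementation would tie `height` to the flight-height limit discussed below rather than hard-coding it.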
Optionally, when the scout prop is implemented as a prop that flies in the form of an unmanned aerial vehicle in the virtual environment, the scout prop may correspond to a fixed flight height, or the flight height may be adjusted through control operations. Optionally, the flight height of the scout prop corresponds to a height limit, such as: the flight height of the scout prop in the virtual environment ranges from 0 m to 10 m.
And step 904, determining a first environment range in the virtual environment according to the first position and the reconnaissance range of the reconnaissance aircraft prop.
Optionally, when the scout prop is implemented as a flying prop in the form of an unmanned aerial vehicle and corresponds to a fixed flight height, the scout range may be implemented as a circular range centered on the scout prop with a preset radius; or, the scout range may be a rectangular range centered on the scout prop with a preset side length. Alternatively, when the flight height of the scout prop is adjustable, the scout range may be the area within a preset angle range with the scout prop as the scout starting point.
Schematically, taking the scout range as the area within a preset angle range with the scout prop as the scout starting point as an example, please refer to fig. 10: a scout prop 1011 is displayed in a scout screen 1010, the scout prop 1011 serves as the scout starting point, and the area within a first angle 1012 serves as the scout range. The higher the flight height of the scout prop 1011, the larger the corresponding mapping range 1013 of the scout range in the virtual environment; the lower the flight height, the smaller the mapping range 1013.
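The height-dependent mapping range in fig. 10 follows directly from the cone geometry: a downward-facing cone of opening angle θ with its apex at height h covers a ground circle of radius h·tan(θ/2). A minimal sketch (function name and degree-based parameter are illustrative choices):

```python
import math

def ground_mapping_radius(flight_height, cone_angle_deg):
    """Radius of the circular mapping range on the ground for a scout prop
    whose scout range is a downward cone of cone_angle_deg opening,
    with the prop at the apex."""
    return flight_height * math.tan(math.radians(cone_angle_deg / 2))
```

For a 90-degree cone at 10 m the ground radius equals the height (tan 45° = 1), and the radius grows monotonically with flight height, matching the behavior described for fig. 10.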
Optionally, a first environment range corresponding to the reconnaissance range in the virtual environment is determined according to the first position where the reconnaissance aircraft prop is located and the reconnaissance range of the reconnaissance aircraft prop. Alternatively, the reconnaissance area may cover the entire virtual environment.
Step 905, marking a second virtual object located in the first environment range in the small map display area.
Optionally, a first coordinate range of the first environment range in the virtual environment is determined. That is, the virtual environment corresponds to a three-dimensional coordinate system, and the first environment range is mapped into the three-dimensional coordinate system to obtain the first coordinate range. Coordinate data corresponding to the second virtual objects in the virtual environment is obtained, and each second virtual object whose coordinate data falls within the first coordinate range is marked at the corresponding position in the minimap display area.
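For the circular scout range case, the containment test plus world-to-minimap projection above can be sketched as follows. The linear uniform scaling from world coordinates to minimap pixels, and all names, are assumptions for illustration:

```python
def mark_enemies(scout_pos, scout_radius, enemies, world_size, map_size):
    """Return minimap pixel positions for the second virtual objects whose
    coordinates fall inside the circular first environment range centered
    on the scout prop's ground position."""
    sx, _, sz = scout_pos
    scale = map_size / world_size  # uniform world -> minimap scaling
    marks = []
    for ex, _, ez in enemies:
        # Containment test against the first coordinate range (a circle
        # on the ground plane, ignoring height).
        if (ex - sx) ** 2 + (ez - sz) ** 2 <= scout_radius ** 2:
            marks.append((round(ex * scale), round(ez * scale)))
    return marks
```

An engine implementation would typically query a spatial index rather than scan every object, but the containment test is the same.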
Optionally, after the scout screen is displayed according to the scout release signal, a laser scanning animation is also displayed in the minimap display area, that is, the laser scanning animation is played in the minimap display area in an overlapping manner, and the laser scanning animation is used for simulating the process of the scout prop scanning the second virtual object in the virtual environment. Optionally, the laser scanning animation may be implemented as an animation of a laser scan bar moving in a preset direction in the minimap display area, such as: the laser scan bar moves from top to bottom in the minimap display area. Referring to fig. 11, a laser scan bar 1111 is displayed in the minimap display area 1110, and the laser scan bar 1111 simulates the process of scanning the virtual environment by laser by moving from top to bottom in the minimap display area 1110.
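The top-to-bottom sweep can be driven by a simple looping linear interpolation over the scout duration. This is a sketch of one plausible way to position the bar each frame, not the patent's actual animation system:

```python
def scan_bar_y(elapsed, sweep_duration, map_height):
    """Vertical pixel position of a top-to-bottom laser scan bar that
    repeats every sweep_duration seconds (linear interpolation)."""
    phase = (elapsed % sweep_duration) / sweep_duration  # 0.0 .. 1.0
    return round(phase * map_height)
```

Each frame the UI would draw the bar at `scan_bar_y(now - release_time, sweep_duration, minimap_height)`.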
In summary, according to the reconnaissance interface display method based on the virtual environment provided in this embodiment, the scout prop is released in the virtual environment, and the scout prop marks the second virtual object located in the virtual environment, so that the first virtual object can determine the position of a virtual object outside its line of sight in the virtual environment, the amount of information that the first virtual object can receive when observing the virtual environment is increased, and the efficiency with which the first virtual object observes the virtual environment is improved.
According to the method provided by the embodiment, after the reconnaissance range of the reconnaissance aircraft prop is determined, the first environment range corresponding to the reconnaissance range in the virtual environment is determined according to the reconnaissance range and the first position of the reconnaissance aircraft prop, and the second virtual object located in the first environment range is determined, so that the second virtual object is marked, and the observation efficiency of the first virtual object for observing the virtual environment is improved.
In an alternative embodiment, the reconnaissance aircraft may further be controlled to move from the first position to the second position, fig. 12 is a flowchart of a reconnaissance interface display method based on a virtual environment according to another exemplary embodiment of the present application, which is described by taking as an example that the method is applied to a terminal, as shown in fig. 12, the method includes:
step 1201, displaying a virtual environment interface, where the virtual environment interface includes an environment screen in which the first virtual object observes the virtual environment, and a minimap display area is displayed on the environment screen in an overlapping manner.
Optionally, the small map display area includes a map zoom screen in the virtual environment.
Step 1202, receiving a scout release signal, wherein the scout release signal is used for scouting the virtual environment by using a scout prop.
Step 1203, displaying a scout picture in the virtual environment interface according to the scout release signal, wherein the scout picture includes a controller held by the first virtual object.
Optionally, the controller is for controlling the movement of the scout aircraft prop in the virtual environment.
And 1204, marking a second virtual object positioned in the detection range in the map zooming picture according to the position of the detection aircraft prop in the virtual environment in the small map display area.
Optionally, the reconnaissance range corresponds to a first environmental range in the virtual environment, i.e. a second virtual object located within the first environmental range is marked.
Step 1205, receiving a control operation within the control area of the scout, the control operation for controlling the movement of the scout prop to a second location in the virtual environment.
Optionally, a scout control area is also displayed on the scout picture in an overlapping manner.
Optionally, the control operation in the control area includes any one of the following operation modes:
firstly, a rocker control is included in the control area, and the movement of the reconnaissance aircraft prop is correspondingly controlled through dragging operation on the rocker control;
secondly, the control area includes a coordinate input box; the scout prop is moved to the position corresponding to the input coordinate by inputting the corresponding coordinate in the coordinate input box. That is, a coordinate input operation is received in the scout control area, the coordinate input operation is used for inputting the coordinate corresponding to the second position in the virtual environment, and the scout prop is moved to the second position according to the coordinate input operation.
Optionally, the coordinate corresponding to the second position in the virtual environment may be determined by combining the coordinate range displayed in the minimap display area with the correspondence between each position in the minimap display area and the virtual environment; after the coordinate of the second position is input in the coordinate input box, the scout prop is moved to the second position.
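The minimap-to-world correspondence is the inverse of the marking projection. A minimal sketch, assuming the same uniform linear scaling and a `scout` record holding a `(x, height, z)` position (names are illustrative):

```python
def minimap_to_world(map_x, map_y, map_size, world_size):
    """Convert a coordinate entered on the minimap into a world-space
    ground coordinate, inverting the uniform world -> minimap scaling."""
    scale = world_size / map_size
    return (map_x * scale, map_y * scale)

def move_scout(scout, map_x, map_y, map_size=200, world_size=100):
    """Move the scout prop to the second position named by the input
    coordinate, keeping its current flight height."""
    wx, wz = minimap_to_world(map_x, map_y, map_size, world_size)
    scout["pos"] = (wx, scout["pos"][1], wz)
    return scout["pos"]
```

After the move, the marking step would be re-run around the new position to update the minimap, as step 1206 describes.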
And step 1206, updating the mark of the second virtual object in the small map display area according to the second position and the reconnaissance range.
Optionally, a second environmental range is determined in the virtual environment according to the second position and the reconnaissance range of the reconnaissance aircraft prop, in a small map display area of the reconnaissance screen, a map zoom screen in the first environmental range is switched to a map zoom screen in the second environmental range, and a second virtual object located in the second environmental range is marked.
In summary, according to the reconnaissance interface display method based on the virtual environment provided in this embodiment, the scout prop is released in the virtual environment, and the scout prop marks the second virtual object located in the virtual environment, so that the first virtual object can determine the position of a virtual object outside its line of sight in the virtual environment, the amount of information that the first virtual object can receive when observing the virtual environment is increased, and the efficiency with which the first virtual object observes the virtual environment is improved.
According to the method provided by the embodiment, the control operation is received in the control area of the scout, and the first position where the scout aircraft prop is located is moved to the second position according to the control operation, so that the second virtual object in the area ranges corresponding to different positions is scanned, and the observation efficiency of the first virtual object for observing the virtual environment is improved.
Referring to fig. 13, which is a schematic view illustrating an overall flowchart of a process for releasing a reconnaissance aircraft prop according to an exemplary embodiment of the present application, as shown in fig. 13, the method includes:
step 1301, equipping unmanned aerial vehicle skills.
Optionally, the player needs to equip the drone skill before the game match begins, so that the drone skill can be used in the game match, that is, the drone can be released in the game match.
Step 1302, determine whether the condition for activating the skill is satisfied.
Optionally, in the game match, whether the condition for activating the unmanned aerial vehicle skill is met is judged according to the game data of the player. Optionally, the game data may be implemented as a point value, and when the point value reaches a required data value, it is determined that a condition for activating the unmanned aerial vehicle skill is satisfied.
Alternatively, the point value may be increased by eliminating opponents in the game match, such as: eliminating a single enemy virtual object in the game match adds 50 points, and when the point value reaches 400, it is determined that the condition for activating the drone skill is satisfied.
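The point-gated activation in steps 1302-1303 reduces to a counter with a threshold. A sketch using the example values from the text (50 points per elimination, activation at 400); the class and member names are illustrative:

```python
class DroneSkill:
    """Point-gated drone skill: each eliminated enemy adds points, and the
    skill becomes releasable (highlighted) once the threshold is reached."""
    POINTS_PER_ELIMINATION = 50   # example value from the text
    ACTIVATION_THRESHOLD = 400    # example value from the text

    def __init__(self):
        self.points = 0

    def on_elimination(self):
        """Called when the player eliminates an enemy virtual object."""
        self.points += self.POINTS_PER_ELIMINATION

    @property
    def activated(self):
        """True when the condition for activating the skill is satisfied."""
        return self.points >= self.ACTIVATION_THRESHOLD
```

With these values, eight eliminations activate the skill; the UI would highlight the drone release control when `activated` becomes true.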
And step 1303, highlighting the skill of the unmanned aerial vehicle when the condition of activating the skill is met.
Optionally, the drone skill highlight is used to indicate that the drone skill is releasable. That is, the user can release the unmanned scout in the virtual environment by selecting the unmanned aerial vehicle skill.
Step 1304, determine whether the player clicks on the skill of using the drone.
Optionally, an unmanned aerial vehicle release control is displayed in the environment interface, and whether the player clicks on the unmanned aerial vehicle release control is determined.
Step 1305, when the player clicks on the skill to use the drone, the drone is called.
Optionally, when the player clicks on the skill of using the drone, the drone is released to an initial position in the virtual environment corresponding to the virtual object.
Step 1306, determine whether an enemy is scanned.
Optionally, whether an enemy virtual object exists in the reconnaissance range is determined according to the position of the unmanned reconnaissance plane in the virtual environment and the reconnaissance range of the unmanned reconnaissance plane.
In step 1307, when no enemy is scanned, the scanning is continued.
Step 1308, when an enemy is scanned, red dots are displayed on the minimap.
Optionally, when an enemy virtual object exists in the reconnaissance range, the position of the enemy virtual object is identified on the minimap in the form of a red dot.
Step 1309, determining whether the survival time of the unmanned aerial vehicle is over.
Optionally, the unmanned aerial vehicle corresponds to a reconnaissance time length, and when the survival time reaches the reconnaissance time length, the survival time of the unmanned aerial vehicle is determined to be ended.
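The survival check in steps 1309-1310 is a comparison of elapsed time against the scout duration. A minimal sketch (names are illustrative; a game loop would poll this each tick and clear the red-dot marks when it returns False):

```python
def scout_alive(release_time, now, scout_duration):
    """True while the drone's survival time has not yet reached the scout
    duration; once False, the red dots on the minimap are removed."""
    return (now - release_time) < scout_duration
```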
At step 1310, when the survival time of the drone is over, the red dots disappear.
In summary, according to the reconnaissance interface display method based on the virtual environment provided in this embodiment, the scout prop is released in the virtual environment, and the scout prop marks the second virtual object located in the virtual environment, so that the first virtual object can determine the position of a virtual object outside its line of sight in the virtual environment, and the amount of information that the first virtual object can receive when observing the virtual environment is increased.
Fig. 14 is a schematic structural diagram of a virtual environment-based scout interface display apparatus according to an exemplary embodiment of the present application, as shown in fig. 14, the apparatus includes: a display module 1410, a receiving module 1420, and a marking module 1430;
a display module 1410, configured to display a virtual environment interface, where the virtual environment interface includes an environment picture in which a first virtual object observes the virtual environment, a small map display area is displayed on the environment picture in an overlapping manner, and the small map display area includes a map scaling picture in the virtual environment;
a receiving module 1420, configured to receive a scout release signal, where the scout release signal is used to scout the virtual environment using a scout prop;
the display module 1410 is further configured to display a scout picture in the virtual environment interface according to the scout aircraft release signal, where the scout picture includes a controller held by the first virtual object, and the controller is configured to control the scout aircraft prop to move in the virtual environment;
a marking module 1430, configured to mark, in the small map display area, a second virtual object located within a reconnaissance range in the map zoom screen according to the position of the reconnaissance aircraft prop in the virtual environment.
In an alternative embodiment, as shown in FIG. 15, the marking module 1430 includes:
a determining unit 1431, configured to release the scout aircraft prop to a first location corresponding to the first virtual object according to the scout aircraft release signal; determining a first environment range in the virtual environment according to the first position and the reconnaissance range of the reconnaissance aircraft prop;
a marking unit 1432, configured to mark the second virtual object located in the first environment range in the small map display area, where the small map display area is used to display the map zoom screen in the first environment range.
In an optional embodiment, the determining unit is further configured to determine a corresponding first coordinate range of the first environment range in the virtual environment;
the marking module 1430 further includes:
an obtaining unit 1433, configured to obtain coordinate data corresponding to the second virtual object in the virtual environment;
the marking unit 1432 is further configured to mark the second virtual object whose coordinate data is located in the first coordinate range, at a corresponding position in the small map display area.
In an optional embodiment, a scout control area is also displayed on the scout picture in an overlapping mode;
the receiving module 1420 is further configured to receive a control operation within the scout control area, the control operation being configured to control the scout aircraft prop to move to a second location in the virtual environment;
the marking module 1430 is further configured to update the mark of the second virtual object in the small map display area of the scout image according to the second position and the scout range.
In an optional embodiment, the receiving module 1420 is further configured to receive a coordinate input operation in the scout control area, where the coordinate input operation is used to input a corresponding coordinate of the second location in the virtual environment.
In an alternative embodiment, the marking module 1430 further includes:
a determining unit 1431, configured to determine a second environment range in the virtual environment according to the second position and the reconnaissance range of the reconnaissance aircraft prop;
a marking unit 1432, configured to switch the map zoom screen within the first environment range to the map zoom screen within the second environment range in the small map display area, and mark the second virtual object located within the second environment range.
In an optional embodiment, the display module 1410 is further configured to superimpose and play a laser scanning animation in the small map display area, where the laser scanning animation is used to simulate a process of the scout aircraft prop scanning the second virtual object in the virtual environment.
In an optional embodiment, the display module 1410 is further configured to match the engagement data of the first virtual object with data requirements; and when the fighting data meet the data requirements, switching a scout release control into a clickable mode, wherein the scout release control is used for receiving the scout release signal.
In summary, the reconnaissance interface display apparatus based on a virtual environment provided in this embodiment releases the scout prop in the virtual environment and marks, through the scout prop, the second virtual object located in the virtual environment, so that the first virtual object can determine the position of a virtual object outside its line of sight in the virtual environment, the amount of information that the first virtual object can receive when observing the virtual environment is increased, and the efficiency with which the first virtual object observes the virtual environment is improved.
It should be noted that: the reconnaissance interface display device based on the virtual environment provided in the above embodiment is only illustrated by the division of the above functional modules, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the reconnaissance interface display device based on the virtual environment provided by the above embodiment and the reconnaissance interface display method based on the virtual environment belong to the same concept, and the specific implementation process thereof is detailed in the method embodiment and is not described herein again.
Fig. 16 shows a block diagram of a terminal 1600 according to an exemplary embodiment of the present invention. The terminal 1600 may be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Terminal 1600 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
Generally, terminal 1600 includes: a processor 1601, and a memory 1602.
Processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1601 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1601 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1601 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1602 may include one or more computer-readable storage media, which may be non-transitory. The memory 1602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1602 is configured to store at least one instruction for execution by processor 1601 to implement a virtual environment based scout interface display method provided by method embodiments herein.
In some embodiments, the terminal 1600 may also optionally include: peripheral interface 1603 and at least one peripheral. Processor 1601, memory 1602 and peripheral interface 1603 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1603 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1604, a touch screen display 1605, a camera 1606, audio circuitry 1607, a positioning component 1608, and a power supply 1609.
Peripheral interface 1603 can be used to connect at least one I/O (Input/Output) related peripheral to processor 1601 and memory 1602. In some embodiments, processor 1601, memory 1602, and peripheral interface 1603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1601, the memory 1602 and the peripheral device interface 1603 may be implemented on a separate chip or circuit board, which is not limited by this embodiment.
The Radio Frequency circuit 1604 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1604 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 1604 converts the electrical signal into an electromagnetic signal to be transmitted, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1604 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1604 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1604 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1605 is used to display a UI (user interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1605 is a touch display screen, the display screen 1605 also has the ability to capture touch signals on or over the surface of the display screen 1605. The touch signal may be input to the processor 1601 as a control signal for processing. At this point, the display 1605 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1605 may be one, providing the front panel of the terminal 1600; in other embodiments, the display screens 1605 can be at least two, respectively disposed on different surfaces of the terminal 1600 or in a folded design; in still other embodiments, display 1605 can be a flexible display disposed on a curved surface or a folded surface of terminal 1600. Even further, the display 1605 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 1605 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1606 is used to capture images or video. Optionally, camera assembly 1606 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1606 can also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1607 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input the electrical signals to the processor 1601 for processing or to the radio frequency circuit 1604 for voice communication. For stereo collection or noise reduction purposes, there may be multiple microphones disposed at different locations of the terminal 1600. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1601 or the radio frequency circuit 1604 into sound waves. The speaker may be a traditional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuitry 1607 may also include a headphone jack.
The positioning component 1608 is configured to determine the current geographic location of the terminal 1600 to implement navigation or LBS (Location Based Service). The positioning component 1608 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 1609 is used to supply power to the various components of the terminal 1600. The power supply 1609 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 1609 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, terminal 1600 also includes one or more sensors 1610. The one or more sensors 1610 include, but are not limited to: acceleration sensor 1611, gyro sensor 1612, pressure sensor 1613, fingerprint sensor 1614, optical sensor 1615, and proximity sensor 1616.
The acceleration sensor 1611 may detect the magnitudes of acceleration on the three coordinate axes of a coordinate system established with the terminal 1600. For example, the acceleration sensor 1611 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1601 may control the touch display screen 1605 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1611. The acceleration sensor 1611 may also be used to collect motion data of a game or of the user.
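The landscape/portrait decision described above can be sketched as follows. This is an editorial illustration only, not part of the patent; all names and the sample gravity values are hypothetical.

```python
def choose_orientation(gx: float, gy: float) -> str:
    """Pick a UI orientation from the gravity components (in m/s^2)
    that an accelerometer reports on the terminal's x and y axes."""
    # Gravity lying mostly along the y axis means the device is held upright.
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

# Device held upright: gravity acts mainly along the y axis.
assert choose_orientation(0.3, 9.7) == "portrait"
# Device turned on its side: gravity acts mainly along the x axis.
assert choose_orientation(9.7, 0.3) == "landscape"
```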
The gyroscope sensor 1612 may detect the body orientation and rotation angle of the terminal 1600, and may cooperate with the acceleration sensor 1611 to collect the user's 3D motions on the terminal 1600. Based on the data collected by the gyroscope sensor 1612, the processor 1601 may implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1613 may be disposed on a side frame of the terminal 1600 and/or in a lower layer of the touch display screen 1605. When the pressure sensor 1613 is disposed on the side frame of the terminal 1600, it can detect the user's grip signal on the terminal 1600, and the processor 1601 performs left-hand/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1613. When the pressure sensor 1613 is disposed in the lower layer of the touch display screen 1605, the processor 1601 controls operable controls on the UI according to the user's pressure operation on the touch display screen 1605. The operable controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1614 is used to collect the user's fingerprint, and the processor 1601 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1614, or the fingerprint sensor 1614 itself identifies the user's identity from the collected fingerprint. Upon recognizing the user's identity as trusted, the processor 1601 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1614 may be disposed on the front, back, or side of the terminal 1600. When a physical button or a vendor logo is provided on the terminal 1600, the fingerprint sensor 1614 may be integrated with the physical button or the vendor logo.
The optical sensor 1615 is used to collect ambient light intensity. In one embodiment, the processor 1601 may control the display brightness of the touch display screen 1605 according to the ambient light intensity collected by the optical sensor 1615. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1605 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1605 is decreased. In another embodiment, the processor 1601 may also dynamically adjust the shooting parameters of the camera assembly 1606 according to the ambient light intensity collected by the optical sensor 1615.
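One simple way to realize the brightness adjustment described above is a clamped linear mapping from ambient lux to a brightness level. This sketch is an editorial illustration under assumed parameters (the 0.1 floor and 1000-lux full-brightness point are hypothetical, not from the patent):

```python
def display_brightness(ambient_lux: float,
                       min_level: float = 0.1,
                       full_lux: float = 1000.0) -> float:
    """Map ambient light intensity (lux) to a display brightness in
    [min_level, 1.0]: brighter surroundings give a brighter screen."""
    level = ambient_lux / full_lux
    return max(min_level, min(1.0, level))

assert display_brightness(500.0) == 0.5
assert display_brightness(5000.0) == 1.0   # clamped in very bright light
assert display_brightness(0.0) == 0.1      # floor in darkness
```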
The proximity sensor 1616, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 1600. The proximity sensor 1616 is used to collect the distance between the user and the front surface of the terminal 1600. In one embodiment, when the proximity sensor 1616 detects that the distance between the user and the front surface of the terminal 1600 gradually decreases, the processor 1601 controls the touch display screen 1605 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1616 detects that the distance between the user and the front surface of the terminal 1600 gradually increases, the processor 1601 controls the touch display screen 1605 to switch from the off-screen state to the bright-screen state.
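The proximity rule above reduces to a trend test on successive distance readings. The following sketch is an editorial illustration only; the function name and state strings are hypothetical:

```python
def next_screen_state(prev_distance_cm: float, distance_cm: float,
                      state: str) -> str:
    """Switch between bright-screen and off-screen states based on the
    trend of the user-to-front-panel distance."""
    if distance_cm < prev_distance_cm:
        return "off"      # user approaching the front panel
    if distance_cm > prev_distance_cm:
        return "bright"   # user moving away from the front panel
    return state          # distance unchanged: keep the current state

assert next_screen_state(10.0, 3.0, "bright") == "off"
assert next_screen_state(3.0, 10.0, "off") == "bright"
assert next_screen_state(5.0, 5.0, "bright") == "bright"
```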
Those skilled in the art will appreciate that the structure shown in FIG. 16 does not constitute a limitation of the terminal 1600, which may include more or fewer components than shown, combine some components, or employ a different arrangement of components.
Optionally, the computer-readable storage medium may include: a Read-Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), an optical disc, or the like. The Random Access Memory may include a Resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The sequence numbers of the above embodiments of the present application are for description only and do not indicate the relative merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be completed by hardware, or by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and the storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description covers only exemplary embodiments of the present application and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (15)

1. A spy interface display method based on a virtual environment, the method comprising:
displaying a virtual environment interface, wherein the virtual environment interface comprises an environment picture in which a first virtual object observes the virtual environment, a small map display area is superimposed and displayed on the environment picture, and the small map display area comprises a map zoom picture of the virtual environment;
receiving a scout release signal, the scout release signal being used to scout the virtual environment using a scout prop;
displaying a scout picture in the virtual environment interface according to the scout release signal, wherein the scout picture comprises a controller held by the first virtual object, and the controller is used for controlling the scout prop to move in the virtual environment;
and marking, in the small map display area, a second virtual object located within a scout range in the map zoom picture according to the position of the scout prop in the virtual environment.
2. The method of claim 1, wherein marking, in the small map display area, the second virtual object located within the scout range in the map zoom picture according to the position of the scout prop in the virtual environment comprises:
releasing the scout prop to a first position corresponding to the first virtual object according to the scout release signal;
determining a first environment range in the virtual environment according to the first position and the scout range of the scout prop;
and marking the second virtual object located within the first environment range in the small map display area, wherein the small map display area is used for displaying the map zoom picture of the first environment range.
3. The method of claim 2, wherein marking the second virtual object located within the first environment range in the small map display area comprises:
determining a first coordinate range corresponding to the first environment range in the virtual environment;
acquiring coordinate data corresponding to the second virtual object in the virtual environment;
and marking, at the corresponding position in the small map display area, the second virtual object whose coordinate data is located within the first coordinate range.
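The range check and minimap marking of claims 2 and 3 can be sketched as follows. This is an editorial illustration, not the claimed implementation; it assumes a circular scout range and a square minimap, and all names and numbers are hypothetical.

```python
import math

def objects_in_scout_range(scout_pos, scout_radius, objects):
    """Keep only the second virtual objects whose world coordinates
    fall inside the circular scout range centred on the scout prop."""
    sx, sy = scout_pos
    return [(x, y) for (x, y) in objects
            if math.hypot(x - sx, y - sy) <= scout_radius]

def world_to_minimap(pos, world_min, world_max, minimap_px):
    """Scale a world coordinate into minimap pixel coordinates."""
    (x, y), (x0, y0), (x1, y1) = pos, world_min, world_max
    return ((x - x0) / (x1 - x0) * minimap_px,
            (y - y0) / (y1 - y0) * minimap_px)

enemies = [(3.0, 4.0), (60.0, 80.0)]
visible = objects_in_scout_range((0.0, 0.0), 10.0, enemies)
assert visible == [(3.0, 4.0)]   # (60, 80) is outside the scout range
assert world_to_minimap((50.0, 50.0), (0.0, 0.0), (100.0, 100.0), 200) == (100.0, 100.0)
```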
4. The method according to claim 2 or 3, wherein a scout control area is further superimposed and displayed on the scout picture;
after the marking the second virtual object located in the first environment range, the method further includes:
receiving a control operation in the scout control area, the control operation being used for controlling the scout prop to move to a second position in the virtual environment;
and updating the mark of the second virtual object in a small map display area of the scout picture according to the second position and the scout range.
5. The method of claim 4, wherein receiving control operations within the scout control area comprises:
receiving a coordinate input operation in the scout control area, the coordinate input operation being used for inputting the corresponding coordinates of the second position in the virtual environment.
6. The method according to claim 4, wherein updating the mark of the second virtual object in the small map display area of the scout picture according to the second position and the scout range comprises:
determining a second environment range in the virtual environment according to the second position and the scout range of the scout prop;
and, in the small map display area, switching the map zoom picture of the first environment range to the map zoom picture of the second environment range, and marking the second virtual object located within the second environment range.
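The move-and-re-mark behaviour of claims 4 and 6 amounts to recomputing the marked set whenever the scout prop's position changes. The following stateful sketch is an editorial illustration under the same assumed circular-range model; all names are hypothetical.

```python
import math

class ScoutMinimap:
    """Re-mark enemies each time the scout prop moves, switching the
    displayed range from the old position to the new one."""
    def __init__(self, scout_range, enemies):
        self.scout_range = scout_range
        self.enemies = enemies          # list of (x, y) world coordinates
        self.marked = []
    def move_to(self, position):
        """Move the scout prop and update the marks for the new range."""
        px, py = position
        self.marked = [e for e in self.enemies
                       if math.hypot(e[0] - px, e[1] - py) <= self.scout_range]
        return self.marked

m = ScoutMinimap(5.0, [(0.0, 0.0), (10.0, 10.0)])
assert m.move_to((1.0, 0.0)) == [(0.0, 0.0)]     # first position marks one enemy
assert m.move_to((9.0, 10.0)) == [(10.0, 10.0)]  # marks updated after the move
```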
7. The method according to any one of claims 1 to 3, wherein after displaying the scout picture in the virtual environment interface according to the scout release signal, the method further comprises:
and playing a laser scanning animation superimposed on the small map display area, wherein the laser scanning animation is used for simulating the process of the scout prop scanning the second virtual object in the virtual environment.
8. The method according to any one of claims 1 to 3, wherein before receiving the scout release signal, the method further comprises:
matching the engagement data of the first virtual object against a data requirement;
and when the engagement data meets the data requirement, switching a scout release control to a clickable state, wherein the scout release control is used for receiving the scout release signal.
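The gating condition of claim 8 is a simple threshold check over battle statistics. This sketch is an editorial illustration only; the statistic names and thresholds are hypothetical, not from the patent.

```python
def scout_control_clickable(engagement_data, data_requirement):
    """The scout release control becomes clickable only when every
    tracked battle statistic meets its required threshold."""
    return all(engagement_data.get(stat, 0) >= needed
               for stat, needed in data_requirement.items())

requirement = {"kills": 2, "score": 100}
assert scout_control_clickable({"kills": 3, "score": 120}, requirement) is True
assert scout_control_clickable({"kills": 1, "score": 120}, requirement) is False
```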
9. A scout interface display apparatus based on a virtual environment, the apparatus comprising:
the display module is used for displaying a virtual environment interface, wherein the virtual environment interface comprises an environment picture in which a first virtual object observes the virtual environment, a small map display area is superimposed and displayed on the environment picture, and the small map display area comprises a map zoom picture of the virtual environment;
the receiving module is used for receiving a scout release signal, wherein the scout release signal is used for scouting the virtual environment using a scout prop;
the display module is further configured to display a scout picture in the virtual environment interface according to the scout release signal, wherein the scout picture comprises a controller held by the first virtual object, and the controller is configured to control the scout prop to move in the virtual environment;
and the marking module is used for marking, in the small map display area, a second virtual object located within a scout range in the map zoom picture according to the position of the scout prop in the virtual environment.
10. The apparatus of claim 9, wherein the marking module comprises:
the determining unit is used for releasing the scout prop to a first position corresponding to the first virtual object according to the scout release signal, and determining a first environment range in the virtual environment according to the first position and the scout range of the scout prop;
a marking unit, configured to mark the second virtual object located within the first environment range in the small map display area, wherein the small map display area is used to display the map zoom picture of the first environment range.
11. The apparatus according to claim 10, wherein the determining unit is further configured to determine a corresponding first coordinate range of the first environment range in the virtual environment;
the marking module further comprises:
an obtaining unit, configured to obtain coordinate data corresponding to the second virtual object in the virtual environment;
the marking unit is further configured to mark the second virtual object whose coordinate data is located in the first coordinate range at a corresponding position in the small map display area.
12. The apparatus according to claim 10 or 11, wherein a scout control area is further superimposed and displayed on the scout picture;
the receiving module is further configured to receive a control operation in the scout control area, wherein the control operation is used for controlling the scout prop to move to a second position in the virtual environment;
the marking module is further configured to update the mark of the second virtual object in the small map display area of the scout picture according to the second position and the scout range.
13. The apparatus of claim 12, wherein the receiving module is further configured to receive a coordinate input operation in the scout control area, the coordinate input operation being used for inputting the corresponding coordinates of the second position in the virtual environment.
14. A computer device comprising a processor and a memory, the memory having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, the at least one instruction, the at least one program, the set of codes, or the set of instructions being loaded and executed by the processor to implement the virtual environment based scout interface display method of any one of claims 1 to 8.
15. A computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the virtual environment based scout interface display method of any one of claims 1 to 8.
CN201911143019.0A 2019-11-20 2019-11-20 Reconnaissance interface display method and device based on virtual environment and readable storage medium Active CN111035918B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911143019.0A CN111035918B (en) 2019-11-20 2019-11-20 Reconnaissance interface display method and device based on virtual environment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911143019.0A CN111035918B (en) 2019-11-20 2019-11-20 Reconnaissance interface display method and device based on virtual environment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111035918A true CN111035918A (en) 2020-04-21
CN111035918B CN111035918B (en) 2023-04-07

Family

ID=70231837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911143019.0A Active CN111035918B (en) 2019-11-20 2019-11-20 Reconnaissance interface display method and device based on virtual environment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111035918B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111589146A (en) * 2020-04-27 2020-08-28 腾讯科技(深圳)有限公司 Prop operation method, device, equipment and storage medium based on virtual environment
CN111603770A (en) * 2020-05-21 2020-09-01 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, equipment and medium
CN111821691A (en) * 2020-07-24 2020-10-27 腾讯科技(深圳)有限公司 Interface display method, device, terminal and storage medium
CN112007360A (en) * 2020-08-28 2020-12-01 腾讯科技(深圳)有限公司 Processing method and device for monitoring functional prop and electronic equipment
CN112057861A (en) * 2020-09-11 2020-12-11 腾讯科技(深圳)有限公司 Virtual object control method and device, computer equipment and storage medium
CN112107857A (en) * 2020-09-17 2020-12-22 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment
CN112121414A (en) * 2020-09-29 2020-12-25 腾讯科技(深圳)有限公司 Tracking method and device in virtual scene, electronic equipment and storage medium
CN112402965A (en) * 2020-11-20 2021-02-26 腾讯科技(深圳)有限公司 Position monitoring and anti-monitoring method, device, terminal and storage medium
WO2021218516A1 (en) * 2020-04-28 2021-11-04 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, device and storage medium
WO2022227915A1 (en) * 2021-04-30 2022-11-03 腾讯科技(深圳)有限公司 Method and apparatus for displaying position marks, and device and storage medium
WO2022267528A1 (en) * 2021-06-23 2022-12-29 网易(杭州)网络有限公司 Multi-layer map display method and apparatus
WO2023019976A1 (en) * 2021-08-19 2023-02-23 网易(杭州)网络有限公司 Method for controlling virtual object, and electronic device and readable medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
炮艇船长: "《bilibili》", 7 January 2019 *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111589146A (en) * 2020-04-27 2020-08-28 腾讯科技(深圳)有限公司 Prop operation method, device, equipment and storage medium based on virtual environment
WO2021218516A1 (en) * 2020-04-28 2021-11-04 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, device and storage medium
CN111603770A (en) * 2020-05-21 2020-09-01 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, equipment and medium
CN111603770B (en) * 2020-05-21 2023-05-05 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, equipment and medium
KR20220013486A (en) * 2020-07-24 2022-02-04 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Interface display method, apparatus, terminal and storage medium
KR102631813B1 (en) * 2020-07-24 2024-01-30 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Interface display method, device, terminal and storage medium
JP7387758B2 (en) 2020-07-24 2023-11-28 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 Interface display method, device, terminal, storage medium and computer program
CN111821691A (en) * 2020-07-24 2020-10-27 腾讯科技(深圳)有限公司 Interface display method, device, terminal and storage medium
JP2022544888A (en) * 2020-07-24 2022-10-24 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 Interface display method, device, terminal, storage medium and computer program
CN112007360A (en) * 2020-08-28 2020-12-01 腾讯科技(深圳)有限公司 Processing method and device for monitoring functional prop and electronic equipment
CN112057861A (en) * 2020-09-11 2020-12-11 腾讯科技(深圳)有限公司 Virtual object control method and device, computer equipment and storage medium
CN112057861B (en) * 2020-09-11 2022-04-26 腾讯科技(深圳)有限公司 Virtual object control method and device, computer equipment and storage medium
CN112107857A (en) * 2020-09-17 2020-12-22 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment
CN112107857B (en) * 2020-09-17 2022-06-03 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic equipment
CN112121414B (en) * 2020-09-29 2022-04-08 腾讯科技(深圳)有限公司 Tracking method and device in virtual scene, electronic equipment and storage medium
CN112121414A (en) * 2020-09-29 2020-12-25 腾讯科技(深圳)有限公司 Tracking method and device in virtual scene, electronic equipment and storage medium
CN112402965A (en) * 2020-11-20 2021-02-26 腾讯科技(深圳)有限公司 Position monitoring and anti-monitoring method, device, terminal and storage medium
WO2022227915A1 (en) * 2021-04-30 2022-11-03 腾讯科技(深圳)有限公司 Method and apparatus for displaying position marks, and device and storage medium
WO2022267528A1 (en) * 2021-06-23 2022-12-29 网易(杭州)网络有限公司 Multi-layer map display method and apparatus
WO2023019976A1 (en) * 2021-08-19 2023-02-23 网易(杭州)网络有限公司 Method for controlling virtual object, and electronic device and readable medium

Also Published As

Publication number Publication date
CN111035918B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
CN111035918B (en) Reconnaissance interface display method and device based on virtual environment and readable storage medium
CN110755841B (en) Method, device and equipment for switching props in virtual environment and readable storage medium
WO2019153750A1 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
CN109529319B (en) Display method and device of interface control and storage medium
CN110613938B (en) Method, terminal and storage medium for controlling virtual object to use virtual prop
CN110917616B (en) Orientation prompting method, device, equipment and storage medium in virtual scene
CN111589142A (en) Virtual object control method, device, equipment and medium
CN110694273A (en) Method, device, terminal and storage medium for controlling virtual object to use prop
CN108786110B (en) Method, device and storage medium for displaying sighting telescope in virtual environment
CN111481934B (en) Virtual environment picture display method, device, equipment and storage medium
CN111399639B (en) Method, device and equipment for controlling motion state in virtual environment and readable medium
CN111672106B (en) Virtual scene display method and device, computer equipment and storage medium
CN112083848B (en) Method, device and equipment for adjusting position of control in application program and storage medium
CN111389005B (en) Virtual object control method, device, equipment and storage medium
CN110448908B (en) Method, device and equipment for applying sighting telescope in virtual environment and storage medium
CN113577765B (en) User interface display method, device, equipment and storage medium
JP2021535806A (en) Virtual environment observation methods, devices and storage media
CN111589127A (en) Control method, device and equipment of virtual role and storage medium
CN111603770A (en) Virtual environment picture display method, device, equipment and medium
CN111273780B (en) Animation playing method, device and equipment based on virtual environment and storage medium
CN108744511B (en) Method, device and storage medium for displaying sighting telescope in virtual environment
CN111026318A (en) Animation playing method, device and equipment based on virtual environment and storage medium
CN111330278B (en) Animation playing method, device, equipment and medium based on virtual environment
CN111013137B (en) Movement control method, device, equipment and storage medium in virtual scene
CN112330823A (en) Virtual item display method, device, equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40022014

Country of ref document: HK

GR01 Patent grant