WO2019205838A1 - Distance information display method in virtual scene, terminal, and computer device - Google Patents

Distance information display method in virtual scene, terminal, and computer device

Info

Publication number
WO2019205838A1
Authority
WO
WIPO (PCT)
Prior art keywords
icon
location point
scene
virtual scene
target location
Prior art date
Application number
PCT/CN2019/078742
Other languages
English (en)
French (fr)
Inventor
杨槿
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to JP2020568588A priority Critical patent/JP7098001B2/ja
Publication of WO2019205838A1 publication Critical patent/WO2019205838A1/zh
Priority to US16/910,469 priority patent/US11224810B2/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5372 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A63F13/5378 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators for displaying an additional top view, e.g. radar screens or maps
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303 Features of games using an electronically generated display characterized by output arrangements for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/6045 Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A63F2300/80 Features of games using an electronically generated display specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting

Definitions

  • the present application relates to the field of computer application technologies, and in particular, to a distance information display method, a terminal, and a computer device in a virtual scene.
  • an application that constructs a virtual scene can indicate a distance between two user-controlled virtual objects through a map.
  • a map interface can be opened, and the map interface displays the distance between the virtual object controlled by a teammate and the user's current control object.
  • the user needs to open the map interface to view the distance between the virtual object controlled by the teammate and the current control object of the user.
  • the operation of opening the map requires a certain operation time and inevitably affects the user's other operations in the virtual scene, resulting in a poor display effect of the distance information.
  • the embodiments of the present application provide a distance information display method, a terminal, and a computer device in a virtual scene, which can solve the problem in the related art that a user must open a map interface to view the distance between a virtual object controlled by a teammate and the user's current control object, an operation that requires a certain operation time and inevitably affects the user's other operations in the virtual scene, causing a poor display effect of the distance information; the provided solution thereby improves the display effect of the distance information.
  • the technical solution is as follows:
  • a method for displaying distance information in a virtual scene comprising:
  • obtaining location information of a target location point in the virtual scene, where the target location point is a location point for which an indication icon exists in a scene screen of the virtual scene, and the scene screen is the screen observed when viewing the virtual scene from the perspective of the current control object;
  • acquiring distance information according to the location information of the target location point, the distance information being used to indicate the distance between the target location point and the current control object; and
  • displaying the distance information corresponding to the indication icon of the target location point in the scene screen of the virtual scene.
  • a terminal for displaying distance information in a virtual scene, the terminal including:
  • a location information obtaining module configured to acquire location information of a target location point in the virtual scene, where the target location point is a location point for which an indication icon exists in a scene screen of the virtual scene, and the scene screen is the screen observed when viewing the virtual scene from the perspective of the current control object;
  • a distance information acquisition module configured to acquire distance information according to the location information of the target location point, where the distance information is used to indicate a distance between the target location point and the current control object;
  • the first information display module is configured to display the distance information corresponding to the indication icon of the target location point in the scene screen of the virtual scene.
  • the terminal further includes:
  • a size determining module configured to acquire, according to the distance information, a target size of the indication icon of the target location point
  • a resizing module configured to adjust a size of the indication icon of the target location point to the target size.
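As one hedged illustration of how such a size-determining module might map distance to a target icon size (the pixel constants and the inverse-scaling rule are assumptions for illustration, not specified by this application):

```python
def icon_target_size(distance_m: float, base_px: int = 48, min_px: int = 16,
                     ref_distance_m: float = 100.0) -> int:
    """Shrink the indication icon as the target location point gets farther away,
    clamped to a minimum size so the icon stays visible.
    All constants here are illustrative assumptions."""
    # Full size up to the reference distance, then scale down proportionally.
    scale = ref_distance_m / max(distance_m, ref_distance_m)
    return max(min_px, round(base_px * scale))
```

With these assumed constants, a target within 100 m keeps the full 48 px icon, and a target 200 m away gets a 24 px icon.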
  • the first information display module is configured to perform, when a distance display condition is met, the step of displaying the distance information corresponding to the indication icon of the target location point in the scene screen of the virtual scene;
  • the distance display condition comprises at least one of the following conditions:
  • the current control object is in a second designated area in the virtual scene
  • the distance between the target location point and the current control object is greater than a distance threshold
  • the target location point is within a range of viewing angles in front of the current control object
  • the target location point is outside the scene picture of the virtual scene.
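The conditions above are alternatives ("at least one of"), so a terminal-side check might be sketched as follows. The parameter names and the 10 m default threshold are illustrative assumptions, not part of this application:

```python
def should_display_distance(in_designated_area: bool, distance_m: float,
                            in_front_view: bool, on_screen: bool,
                            distance_threshold_m: float = 10.0) -> bool:
    """Return True if at least one of the distance display conditions holds."""
    return (
        in_designated_area                    # control object in the designated area
        or distance_m > distance_threshold_m  # farther than the distance threshold
        or in_front_view                      # target within the forward viewing angle
        or not on_screen                      # target outside the current scene screen
    )
```

For example, a nearby on-screen target outside the designated area would yield no distance display, while crossing the threshold would enable it.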
  • a computer device comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, where the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the distance information display method in the virtual scene described above.
  • a computer readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, where the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the distance information display method in the virtual scene described above.
  • the terminal may display the distance information between the current control object and the target location point in the scene screen of the virtual scene.
  • the user does not need to open the virtual scene map, so that the display of the distance information is more direct, and does not affect other operations of the user in the virtual scene, thereby improving the display effect of the distance information.
  • FIG. 1 is a schematic structural diagram of a terminal provided by an exemplary embodiment of the present application.
  • FIG. 2 is a schematic diagram of a scene picture of a virtual scene provided by an exemplary embodiment of the present application
  • FIG. 3 is a schematic flowchart of a distance information display in a virtual scenario according to an exemplary embodiment of the present application
  • FIG. 4 is a schematic diagram showing distance information display according to the embodiment shown in FIG. 3;
  • FIG. 5 is a schematic diagram showing another distance information display according to the embodiment shown in FIG. 3;
  • FIG. 6 is a flowchart of a method for displaying distance information in a virtual scene according to an exemplary embodiment of the present application
  • FIG. 7 is a schematic diagram showing distance information display according to the embodiment shown in FIG. 6;
  • FIG. 8 is a schematic diagram showing another distance information display according to the embodiment shown in FIG. 6;
  • FIG. 9 is a schematic diagram showing the display of an indication icon according to the embodiment shown in FIG. 6;
  • FIG. 10 is a flow chart showing distance display between teammate control objects provided by an exemplary embodiment of the present application.
  • FIG. 11 is a schematic diagram showing a teammate distance display according to the embodiment shown in FIG. 10;
  • FIG. 12 is a flowchart showing a safe zone distance display provided by an exemplary embodiment of the present application.
  • FIG. 13 is a schematic diagram showing the distance display of a safe area according to the embodiment shown in FIG. 12;
  • FIG. 14 is a flowchart showing a marker position point distance display provided by an exemplary embodiment of the present application.
  • FIG. 15 is a schematic diagram showing the distance display of a mark position point according to the embodiment shown in FIG. 14;
  • FIG. 16 is a flowchart showing a marker position point distance display provided by an exemplary embodiment of the present application.
  • FIG. 17 is a schematic diagram showing the distance display of a mark position point according to the embodiment shown in FIG. 16;
  • FIG. 18 is a structural block diagram of a terminal according to an exemplary embodiment of the present application.
  • FIG. 19 is a structural block diagram of a computer device according to an exemplary embodiment of the present application.
  • Virtual scene: a scene that is displayed (or provided) when an application runs on the terminal.
  • the virtual scene may be a real-world simulated environment scene, a semi-simulated semi-fictional three-dimensional environment scene, or a purely fictitious three-dimensional environment scene.
  • the virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene.
  • the virtual scene may also be used for a virtual scene battle between at least two virtual characters.
  • the virtual scene may further be used for a battle between at least two virtual characters using virtual firearms.
  • the virtual scene may further be used for a battle between at least two virtual characters using virtual firearms within a target area, where the target area shrinks continuously as time passes in the virtual scene.
  • Virtual object: a movable object in a virtual scene.
  • the movable object may be at least one of a virtual character, a virtual creature, and an anime character.
  • the virtual object is a three-dimensional model created based on skeletal animation technology.
  • Each virtual object has its own shape and volume in a three-dimensional virtual scene, occupying a part of the space in the three-dimensional virtual scene.
  • a virtual scene is typically displayed by an application in a computer device such as a terminal based on hardware (such as a screen) in the terminal.
  • the terminal may be a mobile terminal such as a smart phone, a tablet computer, or an e-book reader; alternatively, the terminal may be a personal computer device such as a notebook computer or a desktop computer.
  • FIG. 1 is a schematic structural diagram of a terminal provided by an exemplary embodiment of the present application.
  • the terminal includes a main board 110, an external output/input device 120, a memory 130, an external interface 140, a capacitive touch system 150, and a power supply 160.
  • processing elements such as a processor and a controller are integrated in the main board 110.
  • the external output/input device 120 may include a display component such as a display screen, a sound playback component such as a speaker, a sound collection component such as a microphone, and various types of keys and the like.
  • Program code and data are stored in the memory 130.
  • the external interface 140 can include a headphone interface, a charging interface, a data interface, and the like.
  • the capacitive touch system 150 can be integrated in a display component or button of the external output/input device 120 for detecting a touch operation performed by the user on the display component or the button.
  • Power source 160 is used to power other various components in the terminal.
  • the processor in the main board 110 may generate a virtual scene by executing or calling program code and data stored in the memory, and display the generated virtual scene through the external output/input device 120.
  • the capacitive touch system 150 can detect the touch operation performed when the user interacts with the virtual scene.
  • the virtual scene may be a three-dimensional virtual scene, or the virtual scene may also be a two-dimensional virtual scene.
  • a virtual scene is a three-dimensional virtual scene.
  • FIG. 2 is a schematic diagram of a scene of a virtual scene provided by an exemplary embodiment of the present application.
  • the scene screen 200 of the virtual scene includes a virtual object 210, an environment screen 220 of a three-dimensional virtual scene, at least one set of virtual control buttons 230, and a virtual object 240.
  • the virtual object 210 may be the current control object of the user corresponding to the terminal, and the virtual control button 230 is an optional control element: the user may manipulate the virtual object 210 through the virtual control button 230. The virtual object 240 may be a non-user-controlled object, that is, controlled by the application itself; alternatively, the virtual object 240 may be a virtual object controlled by the user of another terminal. The user may interact with the virtual object 240 by controlling the virtual object 210, for example, by controlling the virtual object 210 to attack the virtual object 240.
  • the virtual object 210 and the virtual object 240 are three-dimensional models in the three-dimensional virtual scene, and the environment picture of the three-dimensional virtual scene displayed in the scene screen 200 consists of objects observed from the perspective of the virtual object 210. For example, the displayed environment picture 220 of the three-dimensional virtual scene includes the ground 224, the sky 225, the horizon 223, the hill 221, and the factory 222.
  • the virtual object 210 can move in real time under the control of the user.
  • the virtual control button 230 shown in FIG. 2 is a virtual button for controlling the movement of the virtual object 210. When the user touches the virtual control button 230, the virtual object 210 moves in the virtual scene in the direction of the touch point relative to the center of the virtual control button 230.
  • FIG. 3 is a schematic diagram showing a flow of distance information display in a virtual scene provided by an exemplary embodiment of the present application.
  • the terminal that runs the application corresponding to the virtual scenario can display the distance information between the current control object and the target location point by performing the following steps.
  • Step 31: Obtain location information of a target location point in the virtual scene, where the target location point is a location point for which an indication icon exists in the scene screen of the virtual scene, and the scene screen is the screen observed when viewing the virtual scene from the perspective of the current control object.
  • the current control object refers to a virtual object controlled by a terminal that generates the virtual scene in the virtual scenario.
  • the virtual scene is a shooting game scene
  • the current control object may be a virtual soldier who is in the game scene and is controlled by the user corresponding to the current terminal through the terminal.
  • Step 32: Acquire distance information according to the location information of the target location point, where the distance information is used to indicate the distance between the target location point and the current control object.
  • the distance between the target location point and the current control object may refer to the target location point and the virtual distance of the current control object in the virtual scenario.
  • Step 33: In the scene screen of the virtual scene, display the distance information corresponding to the indication icon of the target location point.
  • through the above steps, the terminal may display, in the scene screen of the virtual scene and corresponding to the indication icon, the distance information between the current control object and the target location point, without requiring the user to open the virtual scene map. The display of the distance information is thus more direct and does not affect the user's other operations in the virtual scene, thereby improving the display effect of the distance information.
  • when the terminal displays the distance information corresponding to the indication icon of the target location point, the distance information may be displayed in text form at a specified position around the target location point in the scene screen of the virtual scene.
  • the designated position around the target position point may be a left side, a right side, an upper side or a lower side of the target position point, and the like.
  • FIG. 4 illustrates a schematic diagram of distance information display according to an embodiment of the present application.
  • the virtual scene 40 displayed by the terminal includes a current control object 41 and an indication icon 42 of the target location point (the inverted triangle icon in FIG. 4). A display frame 43 is also displayed on the left side of the indication icon 42, containing numerical text indicating the distance between the target location point corresponding to the indication icon 42 and the current control object 41 (shown as 568 m in FIG. 4).
  • alternatively, when the terminal displays the distance information corresponding to the indication icon of the target location point, the distance information may be displayed in graphic form at a specified position around the target location point in the scene screen of the virtual scene.
  • FIG. 5 shows another schematic diagram of distance information display according to an embodiment of the present application.
  • the virtual scene 50 displayed by the terminal includes a current control object 51 and an indication icon 52 of the target location point (the inverted triangle icon in FIG. 5). A distance indication graphic 53, composed of one or more horizontal bars, is also displayed on the left side of the indication icon 52. The number of horizontal bars in the distance indication graphic 53 indicates the length of the distance between the corresponding target location point and the current control object 51; for example, the greater the number of horizontal bars in the distance indication graphic 53, the longer the distance between the target location point and the current control object 51.
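A minimal sketch of how distance could be mapped to a bar count for such a graphic, assuming an illustrative 100 m-per-bar granularity and a five-bar cap (neither value is specified by the application):

```python
import math

def distance_bars(distance_m: float, meters_per_bar: float = 100,
                  max_bars: int = 5) -> int:
    """Map a distance to a number of horizontal bars: more bars means farther.
    The granularity and cap are illustrative assumptions."""
    bars = max(1, math.ceil(distance_m / meters_per_bar))  # always show at least one bar
    return min(bars, max_bars)                             # cap the graphic's width
```

Under these assumptions, a 250 m target would render three bars, and anything beyond 500 m would render the capped five.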
  • a certain condition may be set for the display of the distance information, and the distance information is displayed in the scene picture of the virtual scene only when the set condition is satisfied.
  • FIG. 6 is a flowchart of a distance information display method in a virtual scene provided by an exemplary embodiment of the present application, taking as an example that a certain condition is set for displaying the distance information. As shown in FIG. 6, the distance information display method in the virtual scene may include the following steps:
  • Step 601: Acquire location information of a target location point in the virtual scene, where the target location point is a location point for which an indication icon exists in the scene screen of the virtual scene, and the scene screen is the screen observed when viewing the virtual scene from the perspective of the current control object.
  • the terminal may display the indication icons of location points in the scene screen of the virtual scene.
  • the location point corresponding to the indication icon in the virtual scene can be used as the target location point.
  • the developer of the application corresponding to the virtual scene may preset which types of location points corresponding to indication icons in the virtual scene are target location points. For example, the application developer may preset that the location point where a specified virtual object is located is a target location point, while other location points corresponding to indication icons are not used as target location points.
  • alternatively, the user may also set which types of location points corresponding to indication icons in the virtual scene are target location points. For example, the terminal may display a target location point setting interface to the user, where the interface includes at least one selectable option, each option corresponding to a type of location point associated with an indication icon in the virtual scene. When the user selects, in the target location point setting interface, the option corresponding to the location point where a specified virtual object is located, the terminal sets that location point as a target location point.
  • the terminal may obtain the location information of the target location point in the virtual scene, where the location information may be coordinates in the coordinate system corresponding to the virtual scene.
  • the coordinate system corresponding to the virtual scene may be a horizontal coordinate system, or the coordinate system corresponding to the virtual scene may also be a three-dimensional spatial coordinate system.
  • the terminal may acquire the location information of the target location point in the virtual scene in real time; alternatively, when the location of the target location point in the virtual scene is immutable (for example, when the target location point is the location point of a marked building), the terminal may acquire the location information of the target location point and cache it in advance, and then directly read the cached location information when the distance information needs to be displayed.
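A small sketch of the real-time-versus-cached lookup described above; the class and callback names are illustrative assumptions, not part of this application:

```python
class TargetLocationSource:
    """Serve a target location point's coordinates: query in real time for
    movable targets, or fetch once and cache for immutable ones
    (e.g. a marked building). Names are illustrative assumptions."""

    def __init__(self, query_fn, immutable: bool = False):
        self._query = query_fn        # callable returning current coordinates
        self._immutable = immutable
        self._cache = None

    def location(self):
        if self._immutable:
            if self._cache is None:
                self._cache = self._query()  # fetch once, then reuse
            return self._cache
        return self._query()                 # movable target: query every time
```

The same interface then serves both kinds of target location point when the distance needs to be computed.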
  • Step 602: Acquire distance information according to the location information of the target location point, where the distance information is used to indicate the distance between the target location point and the current control object.
  • the terminal may also obtain the location information of the current control object in the virtual scene in real time, and calculate the current control object and the target according to the location information of the target location point and the location information of the current control object. The distance between the position points to obtain the distance information corresponding to the target position point.
  • for example, when the coordinate system is a horizontal coordinate system, the terminal may obtain, by simple geometric calculation from the coordinates of the current control object and the coordinates of the target location point, the straight-line horizontal distance between the current control object and the target location point as the distance information corresponding to the target location point.
  • alternatively, when the coordinate system is a three-dimensional space coordinate system, the terminal may obtain, by geometric calculation from the coordinates of the current control object and the coordinates of the target location point, the straight-line three-dimensional space distance between the current control object and the target location point as the distance information corresponding to the target location point.
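The two calculations can be sketched as follows, assuming positions are stored as coordinate tuples (an assumption about the engine's data layout, not something the application specifies):

```python
import math

def horizontal_distance(a, b):
    """Straight-line distance between two points in a horizontal (x, y)
    coordinate system."""
    return math.hypot(b[0] - a[0], b[1] - a[1])

def space_distance(a, b):
    """Straight-line distance between two points in a three-dimensional
    (x, y, z) space coordinate system."""
    return math.sqrt(sum((q - p) ** 2 for p, q in zip(a, b)))
```

Either result can then be rounded and rendered as the distance text or graphic described above.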
  • Step 603: When the distance display condition is satisfied, display, in the scene screen of the virtual scene, the distance information corresponding to the indication icon of the target location point.
  • before displaying the distance information, the terminal first determines whether a preset distance display condition is satisfied. If the distance display condition is satisfied, the terminal performs the subsequent step of displaying the distance information; if the distance display condition is not satisfied, the terminal does not display the distance information.
  • the specific display conditions may include the following:
  • a virtual scene may be divided into multiple areas, and different areas may have different functions. Accordingly, distance information needs to be displayed in some areas, while it may not be displayed in others. Therefore, in the embodiment of the present application, before displaying the distance information, the terminal may first determine whether the current control object is in the second designated area where the distance information needs to be displayed; if yes, the terminal performs the subsequent step of displaying the distance information; otherwise, the terminal may not display the distance information.
  • for example, the virtual scene of a competitive game includes a birth area and a competitive area. The player's character is born in the birth area and is then uniformly airdropped into the competitive area for competitive activities. The distance information does not need to be displayed in the birth area, so the terminal performs the distance display step only when it determines that the player's character is in the competitive area.
  • The terminal may determine, before displaying the distance information, whether the distance between the current control object and the target position point is greater than the distance threshold. If so, the step of displaying the distance information is performed; otherwise, the terminal may not display the distance information.
  • The distance threshold may be preset by the developer. For example, if the developer sets the distance threshold to 10 m based on experience, the terminal displays the distance information corresponding to the indication icon of the target location point when the distance between the current control object corresponding to the terminal and the target location point is greater than 10 m; otherwise, the terminal does not display the distance information.
  • Alternatively, the distance threshold may be set to 1 m; that is, when the distance between the current control object corresponding to the terminal and the target position point is greater than 1 m, the terminal displays the distance information corresponding to the indication icon of the target location point; otherwise, the terminal does not display the distance information.
  • the distance threshold may be set by the user.
  • For example, the terminal may display a distance threshold setting interface to the user; after the user selects or fills in a value in the distance threshold setting interface, the terminal sets the distance corresponding to that value as the distance threshold.
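The threshold condition above might be organized as follows. The class and method names are illustrative assumptions; only the behaviour (a developer-preset default, optionally overridden by a user-chosen value, with the label shown only beyond the threshold) comes from the text.

```python
class DistanceDisplaySettings:
    """Holds the distance threshold used by display condition (2)."""

    def __init__(self, threshold_m=1):
        # Default preset by the developer (1 m in several examples here).
        self.threshold_m = threshold_m

    def set_threshold(self, value_m):
        # Called when the user selects or fills in a value in the
        # threshold setting interface.
        self.threshold_m = value_m

    def should_display(self, distance_m):
        # Show the distance label only when strictly beyond the threshold.
        return distance_m > self.threshold_m
```

With the default threshold, a target 2 m away gets a label while one exactly 1 m away does not.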
  • The target position point is within the viewing angle range in front of the current control object. Before displaying the distance information between the current control object and the target position point, the terminal may first determine whether the target position point is within the visible angle range in front of the current control object. If so, the distance information is displayed corresponding to the indication icon of the target position point; otherwise, the terminal may not display the distance information of the target position point.
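A simple way to test whether a target lies within the visible angle range in front of the control object is to compare the bearing to the target against the facing direction. This is a 2D sketch under assumptions: the patent does not specify a field-of-view half-angle, so `half_fov_deg` here is a hypothetical parameter.

```python
import math

def in_front_view(control_pos, facing_deg, target_pos, half_fov_deg=45):
    """True when the target is within +/- half_fov_deg of the control
    object's facing direction (top-down 2D check).

    control_pos, target_pos: (x, y) tuples; facing_deg: 0 = +x axis,
    measured counter-clockwise.
    """
    bearing = math.degrees(math.atan2(target_pos[1] - control_pos[1],
                                      target_pos[0] - control_pos[0]))
    # Signed angular difference normalized to (-180, 180].
    diff = (bearing - facing_deg + 180) % 360 - 180
    return abs(diff) <= half_fov_deg
```

A target dead ahead passes the check; one at 90 degrees to the side fails with the assumed 45-degree half-angle.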
  • FIG. 7 illustrates a schematic diagram of distance information display according to an embodiment of the present application.
  • As shown in FIG. 7, the virtual scene 70 displayed by the terminal includes a current control object 71 and an indication icon 72 (the inverted triangle icon in FIG. 7) of target position point A. Target position point A is within the viewing angle range directly in front of the current control object 71, so the terminal displays a display frame 73 close to the left side of the indication icon 72, and the display frame 73 shows, in numerical text, the distance between target position point A and the current control object 71. There is also a target position point B in the virtual scene 70, corresponding to indication icon 74, and target position point B is outside the viewing angle range directly in front of the current control object 71 (for example, target position point B may be to the side of or behind the current control object 71). At this time, the indication icon 74 is displayed at the edge of the scene picture of the virtual scene 70, with an arrow indicating the orientation of target position point B relative to the current control object 71, and the terminal does not display, corresponding to the indication icon 74, the distance information between the current control object 71 and target position point B.
  • The target position point is outside the scene picture of the virtual scene. In this case, the target position point cannot be directly displayed in the scene picture of the virtual scene, for example: when the target position point is outside the viewing angle range in front of the current control object; when the target position point is within the viewing angle range in front of the current control object but beyond the horizon; or when the target position point is within the viewing angle range in front of the current control object and within the horizon, but is occluded by other virtual objects such as houses, stones, trees, or hillsides. In all of these cases, the target position point can be considered to be outside the scene picture of the virtual scene.
  • FIG. 8 is a schematic diagram showing another distance information display according to an embodiment of the present application.
  • As shown in FIG. 8, the virtual scene 80 displayed by the terminal includes a current control object 81 and an indication icon 82 (the inverted triangle icon in FIG. 8) of the target location point, which is the location of a humanoid virtual object controlled by a computer or another user. At time a, the location point corresponding to the humanoid virtual object is within the scene picture of the virtual scene, that is, it can be directly observed by the user, and the terminal does not display the distance information of the location point where the humanoid virtual object is located. At time b, the humanoid virtual object moves behind the house; the location point corresponding to the humanoid virtual object is then outside the scene picture of the virtual scene and cannot be directly observed by the user. At this time, the terminal displays a display frame 83 corresponding to the indication icon 82, and the distance information of the location point where the humanoid virtual object is located is displayed in text form in the display frame 83.
  • The above four distance display conditions may be used separately; that is, when the terminal determines that any one of the four conditions is satisfied, the step of displaying the distance information may be performed. Alternatively, the four conditions may be partially combined; that is, when the terminal determines that two or three specified conditions among the four are satisfied, the step of displaying the distance information may be performed. Alternatively, the four conditions may be fully combined; that is, when the terminal determines that all four conditions are satisfied, the step of displaying the distance information may be performed.
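The three combination modes above (any one condition, a specified subset, or all four) can be expressed as a single dispatcher. The condition names and the `mode` strings below are illustrative assumptions.

```python
def distance_display_satisfied(checks, mode="any", required=None):
    """Combine the four distance display conditions.

    checks: dict mapping condition name -> bool result, e.g.
        {"in_area": ..., "beyond_threshold": ..., "in_view": ..., "off_screen": ...}
    mode: "any"    - separate use: any satisfied condition triggers display
          "all"    - full combination: all four must hold
          "subset" - partial combination: the conditions named in
                     `required` (two or three of the four) must hold
    """
    if mode == "any":
        return any(checks.values())
    if mode == "all":
        return all(checks.values())
    if mode == "subset":
        return all(checks[name] for name in required)
    raise ValueError("unknown mode: %r" % mode)
```

The choice of mode is a developer decision; the patent allows all three groupings.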
  • Optionally, the terminal may further adjust the indication icon corresponding to the target location point according to the distance information, to further indicate how far the target location point is from the current control object. For example, the terminal may obtain the target size of the indication icon of the target location point according to the distance information of the target location point, and adjust the size of the indication icon of the target location point to the target size.
  • The terminal may preset a correspondence between indication icon sizes and distance information. When adjusting the indication icon, the terminal queries this correspondence with the distance information of the target location point to obtain the target size of the indication icon, and displays the indication icon in the scene picture of the virtual scene at the target size obtained by the query.
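Such a preset correspondence could be a simple distance-bucket table. The bucket boundaries and pixel sizes below are purely illustrative; the patent only says that the developer presets the mapping.

```python
# Hypothetical distance-bucket -> icon-size table (illustrative numbers).
SIZE_TABLE = [
    (50, 48),            # closer than  50 m -> 48 px icon
    (150, 36),           # closer than 150 m -> 36 px icon
    (300, 28),           # closer than 300 m -> 28 px icon
    (float("inf"), 20),  # anything farther  -> 20 px icon
]

def icon_size_for(distance_m, table=SIZE_TABLE):
    """Query the preset correspondence to get the icon's target size."""
    for upper_bound, size_px in table:
        if distance_m < upper_bound:
            return size_px
    return table[-1][1]
```

With this table, a humanoid virtual object 281 m away (as at time a in FIG. 9) would get a 28 px icon, while one beyond 300 m would get the smallest size.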
  • FIG. 9 is a schematic diagram showing the display of an indicator icon according to an embodiment of the present application.
  • As shown in FIG. 9, the virtual scene 90 displayed by the terminal includes a current control object 91 and an indication icon 92 (the inverted triangle icon in FIG. 9) of the target location point, which is the location of a humanoid virtual object controlled by a computer or another user. At time a, the position point corresponding to the humanoid virtual object is 281 m from the current control object, and the terminal displays the indication icon 92 in a first size; at time b, the terminal displays the indication icon 92 in a second size. The size of the indication icon 92 at time a is significantly larger than its size at time b; that is, the terminal displays the indication icon in a size corresponding to the distance information.
  • Optionally, in the display mode shown in FIG. 9, the terminal may not display the distance information in text form; that is, in the scene picture of the virtual scene, the terminal indicates the distance between the target location point and the current control object through the size of the indication icon alone.
  • In summary, in a virtual scene, for a target location point whose indication icon is in the scene picture of the virtual scene, the terminal may display, corresponding to the icon in the scene picture, the distance information between the current control object and the target location point in the form of text or graphics, without requiring the user to open the virtual scene map. This makes the display of the distance information more direct and does not affect the user's other operations in the virtual scene, thereby improving the display effect of the distance information.
  • the terminal may adjust the size of the indication icon of the target location point according to the distance information of the target location point, thereby further improving the indication effect of the distance between the target location point and the current control object.
  • the display form of the distance information and the display logic may also be different for different types of target position points.
  • a subsequent embodiment of the present application will be described by taking the above-mentioned target position point as a position point where the specified virtual object is located, a position point closest to the current control object on the boundary of the designated area, or a marked position point in the virtual scene as an example.
  • Optionally, the terminal may display, in the scene picture of the virtual scene, the distance information of the specified virtual object relative to the current control object, with the distance information displayed corresponding to the first icon.
  • The type of the specified virtual object is not limited in this embodiment of the present application. For example, the specified virtual object may be a virtual object controlled by another user in the same team as the user corresponding to the terminal, or the specified virtual object may be a marked object item (such as a tagged vehicle item or weapon item).
  • In the following, take as an example the case where the specified virtual object is a virtual object controlled by another user in the same team as the user corresponding to the terminal.
  • FIG. 10 shows a flowchart of distance display between teammate-controlled objects provided by an exemplary embodiment of the present application. Take a shooting game in which characters are born in the birth area and then move uniformly into the competitive area for competitive activities. As shown in FIG. 10, in the case of team formation, the distance between the current control object and teammates is not displayed in the birth area. After the current control object enters the competitive area, the terminal first judges the survival state of each teammate control object (that is, each virtual object controlled by a teammate). If no teammate survives, the distance between the current control object and the teammate control objects is not displayed. If a teammate survives, the terminal obtains the coordinate A of the current control object and the coordinate B of the virtual object controlled by the surviving teammate, calculates the distance D between them (for example, D may be the horizontal distance between coordinate A and coordinate B, rounded), and determines whether the distance D is greater than 1 m (that is, the distance threshold is 1 m). If so, the distance D is displayed corresponding to the first icon of the virtual object controlled by the surviving teammate. Optionally, the distance between the current control object and the teammate may not be displayed in the thumbnail map.
  • FIG. 11 is a schematic diagram of a teammate distance display according to an embodiment of the present application.
  • As shown in FIG. 11, the currently surviving teammate control objects are the No. 3 teammate control object (corresponding to the circular icon 1101 labeled 3) and the No. 4 teammate control object (corresponding to the circular icon 1102 labeled 4). The No. 3 teammate control object is within the viewing angle range in front of the current control object, while the No. 4 teammate control object is outside that range. The terminal therefore displays the icon 1101 in the straight-line direction between the current control object and the No. 3 teammate control object, and the text box 1103 on the right side of the icon 1101 displays the distance between the current control object and the No. 3 teammate control object (290 m), while the icon 1102 does not display the distance between the current control object and the No. 4 teammate control object.
  • Optionally, the target location point includes a nearest boundary point, where the nearest boundary point is the location point closest to the current control object among the area boundary points, an area boundary point is a location point on a boundary of the first designated area in the virtual scene, and the current control object is outside the first designated area. The indication icon of the target location point includes a second icon for indicating the relative position between the current control object and the first designated area. When displaying the distance information of the target location point in the scene picture of the virtual scene, the terminal may display the second icon on the boundary of the thumbnail map in the scene picture, where the position of the second icon on the boundary of the thumbnail map indicates the direction of the nearest boundary point relative to the current control object, and display the distance information corresponding to the second icon.
  • the first designated area may be a partial area in the virtual scene.
  • the virtual scene is a certain shooting game scene, and the first designated area may be a “safe area” in the shooting game scene.
  • FIG. 12 shows a safe zone distance display flowchart provided by an exemplary embodiment of the present application.
  • As shown in FIG. 12, the terminal first determines the object state (that is, whether the current control object is inside the safe zone). If the current control object is inside the safe zone, the distance to the safe zone is not displayed. If the current control object is outside the safe zone, the terminal obtains the coordinate A of the current control object and the coordinate B of the center of the circular safe zone, then calculates the distance D between the current control object and the point on the safe zone edge closest to it (for example, D may be the horizontal distance between coordinate A and coordinate B minus the safe zone radius r, rounded), and determines whether the distance D is greater than 1 m (that is, the distance threshold is 1 m). If so, the distance D is displayed corresponding to the second icon indicating the relative direction between the safe zone and the current control object.
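The safe-zone calculation described above (distance to the circle center minus the radius) can be sketched directly. Function name and the convention of returning `None` when inside the zone are assumptions for illustration.

```python
import math

def safe_zone_distance(control_pos, zone_center, zone_radius):
    """Nearest distance D from the current control object to the edge of
    a circular safe zone; None means the object is already inside the
    zone, in which case no distance is displayed.

    control_pos (A), zone_center (B): (x, y) horizontal coordinates.
    """
    dx = control_pos[0] - zone_center[0]
    dy = control_pos[1] - zone_center[1]
    d_center = math.hypot(dx, dy)       # horizontal distance |AB|
    if d_center <= zone_radius:
        return None                     # inside the safe zone
    return int(round(d_center - zone_radius))  # D = |AB| - r, rounded
```

For instance, a control object 1000 m from the center of a 195 m-radius zone is 805 m from the edge, matching the style of the 805 m label in FIG. 13.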
  • FIG. 13 is a schematic diagram showing a safe zone distance display according to an embodiment of the present application.
  • As shown in FIG. 13, a thumbnail map 1301 is displayed in the upper right corner of the game interface, and a running icon 1302 is displayed at the lower edge of the thumbnail map 1301. The position of the running icon 1302 on the edge of the thumbnail map 1301 indicates the orientation of the safe zone relative to the current control object, and the terminal displays the closest distance (805 m) between the current control object and the safe zone in the text box 1303 beside the running icon 1302. Optionally, the distance between the text box 1303 and the running icon 1302 is fixed at 20 pixels, and the text box 1303 moves with the icon 1302: normally the text box 1303 appears on the right side of the icon 1302, but when the running icon 1302 moves to within a certain number of pixels of the right edge, the text box 1303 appears on the left of the icon 1302.
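The edge-aware label placement described above might be implemented like this. The 20-pixel gap comes from the example; the label width and the flip rule (flip when the right-side label would overflow the screen) are assumptions.

```python
def label_anchor(icon_x, icon_width, screen_width, gap_px=20, label_width=60):
    """Return (side, label_x) for a distance text box placed beside an icon.

    Default side is the right of the icon, with a fixed gap_px spacing;
    when a right-side label would run past screen_width, the label flips
    to the left of the icon instead.
    """
    right_x = icon_x + icon_width + gap_px
    if right_x + label_width <= screen_width:
        return ("right", right_x)
    return ("left", icon_x - gap_px - label_width)
```

An icon in the middle of an 800 px screen keeps its label on the right; an icon near the right edge gets it on the left.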
  • Optionally, the target location point includes a marked location point, and the indication icon of the target location point includes a third icon for indicating the marked location point. The third icon may be displayed in an azimuth display area in the scene picture of the virtual scene, where the position of the third icon in the azimuth display area indicates the direction of the marked location point relative to the current control object, and the distance information is displayed corresponding to the third icon.
  • the above-mentioned azimuth display area may also be referred to as a virtual compass.
  • The marked location point may be a location point marked by the user corresponding to the terminal, or it may be a location point marked by another user in the same team as the user corresponding to the terminal.
  • FIG. 14 shows a flow chart of a marker position point distance display provided by an exemplary embodiment of the present application.
  • As shown in FIG. 14, the terminal first determines the marked-position state (that is, whether a marked location point exists). If there is no marked location point in the game scene, the distance to a marked location point is not displayed. If there is a marked location point in the game scene, the terminal obtains the coordinate A of the current control object and the coordinate B of the marked location point, then calculates the distance D between the current control object and the marked location point (for example, D may be the horizontal distance between coordinate A and coordinate B, rounded), and determines whether the distance D is greater than 1 m (that is, the distance threshold is 1 m). If so, the distance D is displayed corresponding to the icon of the marked location point in the virtual compass. Optionally, the distance between the current control object and the marked location point may not be displayed in the thumbnail map.
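Placing the marker icon inside the virtual compass strip amounts to mapping the bearing of the marked point onto a horizontal pixel range. This is a sketch under assumptions: the compass width and its angular span are hypothetical parameters not fixed by the patent.

```python
import math

def compass_x(control_pos, facing_deg, marked_pos,
              compass_width_px=400, half_span_deg=90):
    """Horizontal pixel offset of the marker icon inside the virtual
    compass: the centre of the strip is straight ahead, the edges are
    +/- half_span_deg. Returns None when the marked point's bearing
    falls outside the compass span.
    """
    bearing = math.degrees(math.atan2(marked_pos[1] - control_pos[1],
                                      marked_pos[0] - control_pos[0]))
    diff = (bearing - facing_deg + 180) % 360 - 180   # signed, (-180, 180]
    if abs(diff) > half_span_deg:
        return None
    # Map [-half_span, +half_span] degrees -> [0, compass_width_px] pixels.
    return (diff + half_span_deg) / (2 * half_span_deg) * compass_width_px
```

A marked point straight ahead lands at the centre of the strip; one 90 degrees to the left or right lands at an edge; one behind the control object is not shown.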
  • FIG. 15 is a schematic diagram showing the distance display of a marker position point according to an embodiment of the present application.
  • As shown in FIG. 15, a virtual compass 1501 is displayed at the upper edge of the game interface, and a water-drop icon 1502 is displayed in the virtual compass 1501. The position of the icon 1502 in the virtual compass 1501 indicates the orientation of the marked position point relative to the current control object, and the terminal displays the closest distance (1413 m) between the current control object and the marked position point in the text box 1503 beside the icon 1502. Optionally, the text box 1503 moves with the icon 1502: normally the text box 1503 appears on the right of the icon 1502, but when the icon 1502 moves to within a certain number of pixels of the right edge of the virtual compass 1501, the text box 1503 appears on the left of the icon 1502.
  • Optionally, a marked position point set on the birth area map may not display a distance, and marked position points set by the player on the birth area map are cleared after the current control object boards the aircraft.
  • Optionally, when receiving an operation of expanding the thumbnail map, the terminal may display a map interface configured to show the complete map of the virtual scene, display a fourth icon in the map interface to indicate the location of the marked location point in the complete map, and display the distance information corresponding to the fourth icon.
  • FIG. 16 illustrates a flow chart of a marker position point distance display provided by an exemplary embodiment of the present application.
  • As shown in FIG. 16, after the terminal displays the map interface, the terminal first determines the marked-position state (that is, whether a marked location point exists). If there is no marked location point in the game scene, the distance to a marked location point is not displayed in the map interface. If there is a marked location point in the game scene, the terminal obtains the coordinate A of the current control object and the coordinate B of the marked location point, then calculates the distance D between the current control object and the marked location point (for example, D may be the horizontal distance between coordinate A and coordinate B, rounded), and determines whether the distance D is greater than 1 m (that is, the distance threshold is 1 m). If so, the distance D is displayed in the map interface corresponding to the icon of the marked location point.
  • FIG. 17 illustrates a schematic diagram of a marker position point distance display according to an embodiment of the present application.
  • As shown in FIG. 17, a map interface 1701 is displayed on the upper layer of the game interface, and an inverted water-drop icon 1702 is displayed in the map interface 1701. The position of the icon 1702 in the map interface 1701 indicates the position of the marked location point in the game scene, and the closest distance (1440 m) between the current control object and the marked location point is displayed in the text box 1703 beside the icon 1702.
  • Likewise, a marked position point set on the birth area map may not display a distance in the map interface, and marked position points set by the player on the birth area map are cleared after the current control object boards the aircraft.
  • Optionally, a distance grid may also be displayed on the upper layer of the map interface, with the side length of each grid cell corresponding to a fixed distance in the virtual scene, and the icon of the marked position point and the icon of the current control object displayed in the map interface, so that the user can roughly estimate the distance between the marked position point and the current control object from the number of grid cells between the two icons.
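The grid-based estimate above can be sketched as follows. The patent only says the user "roughly estimates" from the cell count; the choice of Chebyshev cell counting (the larger of the row and column offsets) is an illustrative assumption.

```python
def grid_distance_estimate(icon_a_cell, icon_b_cell, metres_per_cell):
    """Rough distance estimate between two map icons from the number of
    grid cells separating them.

    icon_a_cell, icon_b_cell: (col, row) grid-cell indices of the marked
    point's icon and the current control object's icon.
    metres_per_cell: the fixed distance one grid side corresponds to.
    """
    cells = max(abs(icon_a_cell[0] - icon_b_cell[0]),
                abs(icon_a_cell[1] - icon_b_cell[1]))
    return cells * metres_per_cell
```

With 100 m cells, icons three columns and one row apart read as roughly 300 m.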
  • The distance display in the field of view not only simplifies the player's operations and reduces the frequency with which the player must operate, but also makes the display of the information more direct and convenient.
  • The player's current shortest distance from the safe zone is displayed in real time, assisting the player in making decisions according to the current situation and greatly enhancing the strategy and interactivity of the game.
  • the above scheme can also be applied to other similar multiplayer cooperative games.
  • Through the above solution of the present application, the player's need for teammate information in cooperative games can be effectively addressed, quickly and accurately helping the player locate teammates and make decisions. The cumbersome operations otherwise required to obtain distance information are significantly simplified, improving the player's game fluency, lowering the game's entry barrier, and allowing the player to get started quickly. In addition, the solution can establish effective communication channels and connections between players, helping to build trust and interaction between players and improving the player's recognition of the game and skill growth.
  • FIG. 18 is a block diagram showing the structure of a terminal according to an exemplary embodiment.
  • the terminal may perform all or part of the steps performed by the terminal in the method shown in the corresponding embodiment of FIG. 3 or FIG. 6.
  • the terminal can include:
  • the location information obtaining module 1801 is configured to acquire location information of a target location point in the virtual scene, where the target location point is a location point for which an indication icon exists in a scene picture of the virtual scene, and the scene picture is a picture of the virtual scene observed from the perspective of the current control object;
  • the distance information acquiring module 1802 is configured to acquire distance information according to the location information of the target location point, where the distance information is used to indicate a distance between the target location point and a current control object;
  • the first information display module 1803 is configured to display the distance information corresponding to the indication icon of the target location point in the scene screen of the virtual scene.
  • the target location point includes a location point where the specified virtual object is located, and the indication icon of the target location point includes a first icon for indicating the specified virtual object;
  • the first information display module 1803 includes:
  • a first icon display unit, configured to display the first icon in the straight-line direction of the specified virtual object relative to the current control object in the scene picture of the virtual scene;
  • the first information display unit is configured to display the distance information corresponding to the first icon.
  • Optionally, the target location point includes a nearest boundary point, where the nearest boundary point is the location point closest to the current control object among the area boundary points, an area boundary point is a location point on a boundary of the first designated area in the virtual scene, and the current control object is outside the first designated area; the indication icon of the target location point includes a second icon for indicating the relative position between the current control object and the first designated area; the first information display module 1803 includes:
  • a second icon display unit, configured to display the second icon on a boundary of a thumbnail map in the scene picture of the virtual scene, where the position of the second icon on the boundary of the thumbnail map indicates the direction of the nearest boundary point relative to the current control object;
  • the second information display unit is configured to display the distance information corresponding to the second icon.
  • Optionally, the target location point includes a marked location point, the indication icon of the target location point includes a third icon for indicating the marked location point, and the first information display module 1803 includes:
  • a third icon display unit configured to display the third icon in an azimuth display area in a scene picture of the virtual scene, and a position of the third icon in the azimuth display area indicates the marked a direction of a position point relative to the current control object;
  • the third information display unit is configured to display the distance information corresponding to the third icon.
  • the terminal further includes:
  • a map display module configured to display a complete map when the operation of expanding the thumbnail map is received
  • a second information display module, configured to display a fourth icon in the complete map, where the fourth icon is used to indicate the location of the marked location point in the complete map, and to display the distance information corresponding to the fourth icon.
  • the first information display module 1803 is configured to display the distance information in a text form at a specified position around the target location point in a scene picture of the virtual scene.
  • the terminal further includes:
  • a size determining module configured to acquire, according to the distance information, a target size of the indication icon of the target location point
  • a resizing module configured to adjust a size of the indication icon of the target location point to the target size.
  • Optionally, the first information display module 1803 is configured to perform, when the distance display condition is met, the step of displaying the distance information corresponding to the indication icon of the target location point in the scene picture of the virtual scene;
  • the distance display condition comprises at least one of the following conditions:
  • the current control object is in a second designated area in the virtual scene
  • the distance between the target location point and the current control object is greater than a distance threshold
  • the target location point is within a range of viewing angles in front of the current control object
  • the target location point is outside the scene picture of the virtual scene.
  • FIG. 19 is a block diagram showing the structure of a computer device 1900, according to an exemplary embodiment.
  • the computer device 1900 can be a user terminal, such as a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer.
  • Computer device 1900 may also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, and the like.
  • computer device 1900 includes a processor 1901 and a memory 1902.
  • the processor 1901 can include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
  • the processor 1901 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array).
  • the processor 1901 may also include a main processor and a coprocessor. The main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state.
  • the processor 1901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that needs to be displayed on the display screen.
  • In some embodiments, the processor 1901 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
  • Memory 1902 can include one or more computer readable storage media, which can be non-transitory. Memory 1902 can also include high-speed random access memory, as well as non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transitory computer readable storage medium in the memory 1902 is configured to store at least one instruction, the at least one instruction being executed by the processor 1901 to implement the distance information display method in a virtual scene provided by the method embodiments of the present application.
  • computer device 1900 can also optionally include a peripheral device interface 1903 and at least one peripheral device.
  • the processor 1901, the memory 1902, and the peripheral device interface 1903 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1903 via a bus, signal line, or circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 1904, a touch display screen 1905, a camera 1906, an audio circuit 1907, a positioning component 1908, and a power source 1909.
  • Peripheral device interface 1903 can be used to connect at least one peripheral device associated with I/O (Input/Output) to processor 1901 and memory 1902.
  • In some embodiments, the processor 1901, the memory 1902, and the peripheral device interface 1903 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1901, the memory 1902, and the peripheral device interface 1903 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the RF circuit 1904 is configured to receive and transmit an RF (Radio Frequency) signal, also referred to as an electromagnetic signal.
  • Radio frequency circuit 1904 communicates with the communication network and other communication devices via electromagnetic signals.
  • the radio frequency circuit 1904 converts the electrical signal into an electromagnetic signal for transmission, or converts the received electromagnetic signal into an electrical signal.
  • the radio frequency circuit 1904 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like.
  • Radio frequency circuit 1904 can communicate with other terminals via at least one wireless communication protocol.
  • the wireless communication protocols include, but are not limited to, the World Wide Web, a metropolitan area network, an intranet, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks.
  • the RF circuit 1904 may also include NFC (Near Field Communication) related circuitry, which is not limited in this application.
  • the display 1905 is used to display a UI (User Interface).
  • the UI can include graphics, text, icons, video, and any combination thereof.
  • when the display 1905 is a touch display, the display 1905 also has the ability to collect touch signals at or above the surface of the display 1905.
  • the touch signal can be input to the processor 1901 as a control signal for processing.
  • display 1905 can also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards.
  • In some embodiments, there may be one display screen 1905, disposed on the front panel of computer device 1900; in other embodiments, there may be at least two display screens 1905, disposed on different surfaces of computer device 1900 or in a folded design; in still other embodiments, the display screen 1905 may be a flexible display screen, disposed on a curved or folded surface of computer device 1900. The display screen 1905 may even be set to a non-rectangular irregular shape, that is, a shaped screen.
  • the display screen 1905 can be prepared by using a material such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
  • Camera component 1906 is used to capture images or video.
  • camera assembly 1906 includes a front camera and a rear camera.
  • the front camera is placed on the front panel of the terminal, and the rear camera is placed on the back of the terminal.
  • In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to realize a background blur function by fusing the main camera and the depth-of-field camera, panoramic shooting and VR (Virtual Reality) shooting by fusing the main camera and the wide-angle camera, or other fused shooting functions.
  • camera assembly 1906 can also include a flash.
  • the flash can be a monochrome temperature flash or a two-color temperature flash.
  • the two-color temperature flash is a combination of a warm flash and a cool flash that can be used for light compensation at different color temperatures.
  • the audio circuit 1907 can include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals that are input to the processor 1901 for processing, or input to the radio frequency circuit 1904 for voice communication.
  • For stereo capture or noise reduction, there may be multiple microphones, disposed at different parts of the computer device 1900.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is then used to convert electrical signals from the processor 1901 or the RF circuit 1904 into sound waves.
  • the speaker can be a conventional film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as ranging.
  • audio circuit 1907 can also include a headphone jack.
  • the location component 1908 is used to locate the current geographic location of the computer device 1900 to implement navigation or LBS (Location Based Service).
  • the positioning component 1908 can be a positioning component based on the US GPS (Global Positioning System), China's BeiDou system, or Russia's GLONASS system.
  • a power supply 1909 is used to power various components in the computer device 1900.
  • the power source 1909 can be an alternating current, a direct current, a disposable battery, or a rechargeable battery.
  • when the power source 1909 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery, which is charged through a wired line, or a wireless rechargeable battery, which is charged through a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • computer device 1900 also includes one or more sensors 1910.
  • the one or more sensors 1910 include, but are not limited to, an acceleration sensor 1911, a gyro sensor 1912, a pressure sensor 1913, a fingerprint sensor 1914, an optical sensor 1915, and a proximity sensor 1916.
  • the acceleration sensor 1911 can detect the magnitude of the acceleration on the three coordinate axes of the coordinate system established by the computer device 1900.
  • the acceleration sensor 1911 can be used to detect components of gravity acceleration on three coordinate axes.
  • the processor 1901 can control the touch display screen 1905 to display the user interface in a landscape view or a portrait view according to the gravity acceleration signal collected by the acceleration sensor 1911.
  • the acceleration sensor 1911 can also be used for the acquisition of game or user motion data.
  • the gyro sensor 1912 can detect the body direction and rotation angle of the computer device 1900, and can cooperate with the acceleration sensor 1911 to collect the user's 3D actions on the computer device 1900. Based on the data collected by the gyro sensor 1912, the processor 1901 can implement functions such as motion sensing (such as changing the UI according to the user's tilt operation), image stabilization at the time of shooting, game control, and inertial navigation.
  • Pressure sensor 1913 can be disposed on a side frame of computer device 1900 and/or a lower layer of touch display 1905.
  • When the pressure sensor 1913 is disposed on the side frame of the computer device 1900, it can detect the user's grip signal on the computer device 1900, and the processor 1901 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1913.
  • When the pressure sensor 1913 is disposed on the lower layer of the touch display screen 1905, the operability controls on the UI are controlled by the processor 1901 according to the user's pressure operation on the touch display screen 1905.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 1914 is configured to collect the user's fingerprint, and the processor 1901 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1914, or the fingerprint sensor 1914 identifies the user's identity according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 1901 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1914 may be disposed on the front, back, or side of computer device 1900. When a physical button or vendor logo is provided on computer device 1900, the fingerprint sensor 1914 can be integrated with the physical button or vendor logo.
  • Optical sensor 1915 is used to collect ambient light intensity.
  • the processor 1901 can control the display brightness of the touch display 1905 based on the ambient light intensity acquired by the optical sensor 1915. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1905 is raised; when the ambient light intensity is low, the display brightness of the touch display screen 1905 is lowered.
  • the processor 1901 can also dynamically adjust the shooting parameters of the camera assembly 1906 based on the ambient light intensity acquired by the optical sensor 1915.
  • Proximity sensor 1916, also referred to as a distance sensor, is typically disposed on the front panel of computer device 1900. Proximity sensor 1916 is used to capture the distance between the user and the front of computer device 1900. In one embodiment, when the proximity sensor 1916 detects that the distance between the user and the front of the computer device 1900 is gradually decreasing, the processor 1901 controls the touch display screen 1905 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1916 detects that the distance between the user and the front of the computer device 1900 is gradually increasing, the processor 1901 controls the touch display screen 1905 to switch from the off-screen state to the bright-screen state.
  • Those skilled in the art can understand that the structure shown in FIG. 19 does not constitute a limitation to computer device 1900, which may include more or fewer components than those illustrated, combine some components, or employ a different component arrangement.
  • In an exemplary embodiment, a non-transitory computer readable storage medium including instructions is also provided, for example a memory including at least one instruction, at least one program, a code set, or an instruction set, which may be executed by the processor to perform all or part of the steps of the method illustrated in FIG. 3 or FIG. 6 above.
  • the non-transitory computer readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, or an optical data storage device.

Abstract

A method for displaying distance information in a virtual scene, including: step (31), acquiring position information of a target location point in the virtual scene; step (32), acquiring distance information according to the position information of the target location point; and step (33), displaying, in a scene picture of the virtual scene, the distance information in correspondence with an indication icon of the target location point. With this method, for a target location point that has an indication icon in the scene picture of the virtual scene, the terminal can display, in the scene picture in correspondence with the indication icon, the distance information between the current controlled object and the target location point, thereby improving the display effect of the distance information. A terminal and a computer device for a virtual scene are also disclosed.

Description

Method for displaying distance information in virtual scene, terminal, and computer device
This application claims priority to Chinese Patent Application No. 201810392855.1, filed on April 27, 2018 and entitled "METHOD AND APPARATUS FOR DISPLAYING DISTANCE INFORMATION IN VIRTUAL SCENE, AND COMPUTER DEVICE", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of computer application technologies, and in particular, to a method for displaying distance information in a virtual scene, a terminal, and a computer device.
Background
In many applications that construct virtual scenes (such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter games, and multiplayer online battle arena games), users need to obtain the distance between the current controlled object and a target point.
In the related art, an application constructing a virtual scene can indicate the distance between two user-controlled virtual objects through a map. For example, in a virtual scene, when two or more users form a team, if one of the users needs to know the distance between the current controlled object and a virtual object controlled by a teammate, the user can open a map interface, in which the distance between the teammate-controlled virtual object and the user's current controlled object is displayed.
However, in the related art, the user has to open the map interface to view the distance between the teammate-controlled virtual object and the user's current controlled object. Opening the map takes a certain amount of operation time and inevitably interferes with the user's other operations in the virtual scene, resulting in a poor display effect of the distance information.
Summary
Embodiments of this application provide a method for displaying distance information in a virtual scene, a terminal, and a computer device, which can be used to solve the problem in the related art that the user has to open a map interface to view the distance between a teammate-controlled virtual object and the user's current controlled object, which takes a certain amount of operation time, inevitably interferes with the user's other operations in the virtual scene, and thus results in a poor display effect of the distance information, thereby improving the display effect of the distance information. The technical solutions are as follows:
In one aspect, a method for displaying distance information in a virtual scene is provided, the method being performed by a terminal and including:
acquiring position information of a target location point in the virtual scene, the target location point being a location point for which an indication icon exists in a scene picture of the virtual scene, the scene picture being a picture of the virtual scene observed from the perspective of a current controlled object;
acquiring distance information according to the position information of the target location point, the distance information being used to indicate the distance between the target location point and the current controlled object; and
displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indication icon of the target location point.
In one aspect, a terminal for displaying distance information in a virtual scene is provided, the terminal including:
a position information acquisition module, configured to acquire position information of a target location point in the virtual scene, the target location point being a location point for which an indication icon exists in a scene picture of the virtual scene, the scene picture being a picture of the virtual scene observed from the perspective of a current controlled object;
a distance information acquisition module, configured to acquire distance information according to the position information of the target location point, the distance information being used to indicate the distance between the target location point and the current controlled object; and
a first information display module, configured to display, in the scene picture of the virtual scene, the distance information in correspondence with the indication icon of the target location point.
Optionally, the terminal further includes:
a size determination module, configured to acquire a target size of the indication icon of the target location point according to the distance information; and
a size adjustment module, configured to adjust the size of the indication icon of the target location point to the target size.
Optionally, the first information display module is specifically configured to perform, when a distance display condition is satisfied, the step of displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indication icon of the target location point;
where the distance display condition includes at least one of the following conditions:
the current controlled object is within a second designated region in the virtual scene;
the distance between the target location point and the current controlled object is greater than a distance threshold;
the target location point is within the visible angle range in front of the current controlled object; and
the target location point is outside the scene picture of the virtual scene.
In another aspect, a computer device is provided, the computer device including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the above method for displaying distance information in a virtual scene.
In yet another aspect, a computer readable storage medium is provided, the storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the above method for displaying distance information in a virtual scene.
The technical solutions provided in this application may include the following beneficial effects:
In a virtual scene, for a target location point that has an indication icon in the scene picture of the virtual scene, the terminal can display, in the scene picture in correspondence with the indication icon, the distance information between the current controlled object and the target location point, without requiring the user to open the virtual scene map. This makes the display of the distance information more direct and does not interfere with the user's other operations in the virtual scene, thereby improving the display effect of the distance information.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit this application.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with this application and, together with the specification, serve to explain the principles of this application.
FIG. 1 is a schematic structural diagram of a terminal according to an exemplary embodiment of this application;
FIG. 2 is a schematic diagram of a scene picture of a virtual scene according to an exemplary embodiment of this application;
FIG. 3 is a schematic flowchart of displaying distance information in a virtual scene according to an exemplary embodiment of this application;
FIG. 4 is a schematic diagram of distance information display according to the embodiment shown in FIG. 3;
FIG. 5 is a schematic diagram of another distance information display according to the embodiment shown in FIG. 3;
FIG. 6 is a flowchart of a method for displaying distance information in a virtual scene according to an exemplary embodiment of this application;
FIG. 7 is a schematic diagram of distance information display according to the embodiment shown in FIG. 6;
FIG. 8 is a schematic diagram of another distance information display according to the embodiment shown in FIG. 6;
FIG. 9 is a schematic diagram of indication icon display according to the embodiment shown in FIG. 6;
FIG. 10 is a flowchart of displaying distances between teammate-controlled objects according to an exemplary embodiment of this application;
FIG. 11 is a schematic diagram of teammate distance display according to the embodiment shown in FIG. 10;
FIG. 12 is a flowchart of safe zone distance display according to an exemplary embodiment of this application;
FIG. 13 is a schematic diagram of safe zone distance display according to the embodiment shown in FIG. 12;
FIG. 14 is a flowchart of marked location point distance display according to an exemplary embodiment of this application;
FIG. 15 is a schematic diagram of marked location point distance display according to the embodiment shown in FIG. 14;
FIG. 16 is a flowchart of marked location point distance display according to an exemplary embodiment of this application;
FIG. 17 is a schematic diagram of marked location point distance display according to the embodiment shown in FIG. 16;
FIG. 18 is a structural block diagram of a terminal according to an exemplary embodiment of this application;
FIG. 19 is a structural block diagram of a computer device according to an exemplary embodiment of this application.
Detailed Description
Exemplary embodiments are described in detail here, and examples thereof are shown in the accompanying drawings. When the following description refers to the accompanying drawings, unless otherwise indicated, the same numbers in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application. On the contrary, they are merely examples of physical devices and methods consistent with some aspects of this application as detailed in the appended claims.
Virtual scene: a virtual scene displayed (or provided) when an application runs on a terminal. The virtual scene may be a simulation of the real world, a semi-simulated semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene. The following embodiments are described using a three-dimensional virtual scene as an example, but this is not limiting. Optionally, the virtual scene is also used for a virtual scene battle between at least two virtual characters. Optionally, the virtual scene is also used for a battle between at least two virtual characters using virtual firearms. Optionally, the virtual scene is also used for a battle between at least two virtual characters using virtual firearms within a target region, where the target region continuously shrinks as time passes in the virtual scene.
Virtual object: a movable object in the virtual scene. The movable object may be at least one of a virtual character, a virtual animal, and an anime character. Optionally, when the virtual scene is a three-dimensional virtual environment, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual scene and occupies part of the space in the three-dimensional virtual scene.
A virtual scene is usually generated by an application in a computer device such as a terminal and displayed based on hardware (such as a screen) in the terminal. The terminal may be a mobile terminal such as a smartphone, a tablet computer, or an e-book reader; alternatively, the terminal may be a personal computer device such as a notebook computer or a stationary computer.
Refer to FIG. 1, which shows a schematic structural diagram of a terminal according to an exemplary embodiment of this application. As shown in FIG. 1, the terminal includes a mainboard 110, an external output/input device 120, a memory 130, an external interface 140, a capacitive touch system 150, and a power supply 160.
Processing elements such as a processor and a controller are integrated in the mainboard 110.
The external output/input device 120 may include a display component (such as a display screen), a sound playing component (such as a speaker), a sound collecting component (such as a microphone), and various keys.
The memory 130 stores program code and data.
The external interface 140 may include an earphone interface, a charging interface, a data interface, and the like.
The capacitive touch system 150 may be integrated in the display component or the keys of the external output/input device 120, and is used to detect touch operations performed by the user on the display component or the keys.
The power supply 160 is used to supply power to the other components in the terminal.
In the embodiments of this application, the processor in the mainboard 110 may generate a virtual scene by executing or calling the program code and data stored in the memory, and display the generated virtual scene through the external output/input device 120. During display of the virtual scene, the capacitive touch system 150 may detect touch operations performed when the user interacts with the virtual scene.
The virtual scene may be a three-dimensional virtual scene, or a two-dimensional virtual scene. Taking a three-dimensional virtual scene as an example, refer to FIG. 2, which shows a schematic diagram of a scene picture of a virtual scene according to an exemplary embodiment of this application. As shown in FIG. 2, the scene picture 200 of the virtual scene includes a virtual object 210, an environment picture 220 of the three-dimensional virtual scene, at least one set of virtual control buttons 230, and a virtual object 240. The virtual object 210 may be the current controlled object of the user corresponding to the terminal, and the virtual control buttons 230 are optional control elements, that is, the user can manipulate the virtual object 210 through the virtual control buttons 230; the virtual object 240 may be a non-user-controlled object, that is, the virtual object 240 is controlled by the application itself, or the virtual object 240 may be a virtual object controlled by a user of another terminal. The user can interact with the virtual object 240 by controlling the virtual object 210, for example, controlling the virtual object 210 to attack the virtual object 240.
In FIG. 2, the virtual object 210 and the virtual object 240 are three-dimensional models in the three-dimensional virtual scene, and the environment picture of the three-dimensional virtual scene displayed in the scene picture 200 consists of objects observed from the perspective of the virtual object 210. Exemplarily, as shown in FIG. 2, the displayed environment picture 220 of the three-dimensional virtual scene includes the ground 224, the sky 225, the horizon 223, a hill 221, and a factory building 222.
The virtual object 210 can move instantly under the control of the user. For example, the virtual control button 230 shown in FIG. 2 is a virtual button for controlling the movement of the virtual object 210; when the user touches the virtual control button 230, the virtual object 210 moves in the virtual scene in the direction of the touch point relative to the center of the virtual control button 230.
Refer to FIG. 3, which shows a schematic flowchart of displaying distance information in a virtual scene according to an exemplary embodiment of this application. As shown in FIG. 3, a terminal running the application corresponding to the virtual scene (such as the terminal shown in FIG. 1 above) can display the distance information between the current controlled object and a target location point by performing the following steps.
Step 31: Acquire position information of a target location point in the virtual scene, the target location point being a location point for which an indication icon exists in a scene picture of the virtual scene, the scene picture being a picture of the virtual scene observed from the perspective of the current controlled object.
In the embodiments of this application, the current controlled object refers to the virtual object in the virtual scene that is controlled by the terminal generating the virtual scene. For example, taking a shooting game scene as an example, the current controlled object may be a virtual soldier in the game scene controlled through the terminal by the user corresponding to the current terminal.
Step 32: Acquire distance information according to the position information of the target location point, the distance information being used to indicate the distance between the target location point and the current controlled object.
In the embodiments of this application, the distance between the target location point and the current controlled object may refer to the virtual distance between the target location point and the current controlled object in the virtual scene.
Step 33: Display, in the scene picture of the virtual scene, the distance information in correspondence with the indication icon of the target location point.
With the solution shown in FIG. 3, in a virtual scene, for a target location point that has an indication icon in the scene picture, the terminal can display, in the scene picture in correspondence with the indication icon, the distance information between the current controlled object and the target location point, without requiring the user to open the virtual scene map. This makes the display of the distance information more direct, does not interfere with the user's other operations in the virtual scene, and thereby improves the display effect of the distance information.
In the embodiments of this application, when displaying the distance information in correspondence with the indication icon of the target location point, the terminal may display the distance information in text form at a designated position around the target location point in the scene picture of the virtual scene.
The designated position around the target location point may be immediately to the left of, to the right of, above, or below the target location point, and so on.
For example, refer to FIG. 4, which shows a schematic diagram of distance information display according to an embodiment of this application. As shown in FIG. 4, the virtual scene 40 displayed by the terminal includes a current controlled object 41 and an indication icon 42 of a target location point (the inverted triangle icon in FIG. 4). A display box 43 is displayed immediately to the left of the indication icon 42, and the display box 43 shows numeric text of the distance between the target location point corresponding to the indication icon 42 and the current controlled object 41 (shown as 568m in FIG. 4).
In another possible implementation, when displaying the distance information in correspondence with the indication icon of the target location point, the terminal may display the distance information in graphic form at a designated position around the target location point in the scene picture of the virtual scene.
For example, refer to FIG. 5, which shows another schematic diagram of distance information display according to an embodiment of this application. As shown in FIG. 5, the virtual scene 50 displayed by the terminal includes a current controlled object 51 and an indication icon 52 of a target location point (the inverted triangle icon in FIG. 5). A distance indication graphic 53 is displayed immediately to the left of the indication icon 52. The distance indication graphic 53 consists of one or more bars, and the number of bars in the distance indication graphic 53 indicates how far the corresponding target location point is from the current controlled object 51; for example, the more bars in the distance indication graphic 53, the longer the distance between the target location point and the current controlled object 51.
In the solutions shown in this application, to keep the scene picture of the virtual scene concise, certain conditions may be set for the display of the distance information, and the distance information is displayed in the scene picture of the virtual scene only when the set conditions are satisfied.
Refer to FIG. 6, which shows a flowchart of a method for displaying distance information in a virtual scene according to an exemplary embodiment of this application, taking the case where certain conditions are set for the display of distance information as an example. As shown in FIG. 6, the method for displaying distance information in a virtual scene may include the following steps:
Step 601: Acquire position information of a target location point in the virtual scene, the target location point being a location point for which an indication icon exists in a scene picture of the virtual scene, the scene picture being a picture of the virtual scene observed from the perspective of the current controlled object.
In the embodiments of this application, there may be some location points in the virtual scene that require the user's special attention, such as the location point of a designated virtual object, the location point on the boundary of a designated region closest to the current controlled object, or a marked location point in the virtual scene. To help the user keep track of the directions of these location points, the terminal may display indication icons of these location points in the scene picture of the virtual scene. Any location point with a corresponding indication icon in the virtual scene can serve as the above target location point.
Optionally, the developer of the application corresponding to the virtual scene may preset which types of location points with corresponding indication icons in the virtual scene are target location points. For example, the application developer may preset the location point of a designated virtual object as a target location point, while other location points with corresponding indication icons are not treated as target location points.
Alternatively, the user may also set which types of location points with corresponding indication icons in the virtual scene are target location points. For example, the terminal may show the user a target location point setting interface containing at least one checkable option, each option corresponding to one kind of location point with a corresponding indication icon in the virtual scene. When the user checks the option corresponding to the location point of a designated virtual object in the target location point setting interface, the terminal may set the location point of the designated virtual object as a target location point.
When displaying the scene picture of the virtual scene, the terminal may acquire the position information of the target location point in the virtual scene, where the position information may be coordinates in the coordinate system corresponding to the virtual scene. The coordinate system corresponding to the virtual scene may be a horizontal coordinate system, or a three-dimensional space coordinate system.
In the embodiments of this application, the terminal may acquire the position information of the target location point in the virtual scene in real time; alternatively, when the position of the target location point in the virtual scene is immutable (for example, when the target location point is the location point of a marked building), the terminal may also acquire and cache the position information of the target location point in advance, and directly read the cached position information when the distance information subsequently needs to be displayed.
Step 602: Acquire distance information according to the position information of the target location point, the distance information being used to indicate the distance between the target location point and the current controlled object.
In addition to acquiring the position information of the target location point, the terminal may also acquire the position information of the current controlled object in the virtual scene in real time, and calculate the distance between the current controlled object and the target location point according to the position information of the target location point and the position information of the current controlled object, to obtain the distance information corresponding to the target location point.
For example, taking the case where the position information is coordinates in the coordinate system corresponding to the virtual scene, the terminal may obtain, through simple geometric calculation based on the coordinates of the current controlled object and the coordinates of the target location point, the straight-line horizontal distance between the current controlled object and the target location point, and use the calculated straight-line horizontal distance as the distance information corresponding to the target location point.
Alternatively, when the coordinate system corresponding to the virtual scene is a three-dimensional space coordinate system, the terminal may also obtain, through geometric calculation based on the coordinates of the current controlled object and the coordinates of the target location point, the straight-line three-dimensional space distance between the current controlled object and the target location point, and use the calculated straight-line three-dimensional space distance as the distance information corresponding to the target location point.
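The two distance measures just described — the straight-line horizontal distance and the straight-line three-dimensional space distance — reduce to standard Euclidean formulas. A minimal sketch in Python (the tuple coordinate layouts are an assumption for illustration, not part of the patent):

```python
import math

def horizontal_distance(a, b):
    """Straight-line horizontal distance between two scene points given as ground-plane (x, z) coordinates."""
    return math.hypot(b[0] - a[0], b[1] - a[1])

def spatial_distance(a, b):
    """Straight-line distance between two points in a three-dimensional scene coordinate system (x, y, z)."""
    return math.sqrt(sum((q - p) ** 2 for p, q in zip(a, b)))

# A controlled object at the origin and a target location point 3 units east, 4 units north:
print(horizontal_distance((0, 0), (3, 4)))     # 5.0
print(spatial_distance((0, 0, 0), (1, 2, 2)))  # 3.0
```

Either value can then serve as the distance information corresponding to the target location point, depending on which coordinate system the scene uses.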
Step 603: When a distance display condition is satisfied, display, in the scene picture of the virtual scene, the distance information in correspondence with the indication icon of the target location point.
In the embodiments of this application, before displaying the distance information, the terminal first determines whether a preset distance display condition is satisfied. If the result is that the distance display condition is satisfied, the subsequent distance information display step is performed; if the result is that the distance display condition is not satisfied, the terminal does not display the distance information.
The distance display condition may specifically include the following:
1) The current controlled object is within a second designated region in the virtual scene.
In practical applications, the virtual scene may be divided into multiple regions serving different purposes; accordingly, distance information needs to be displayed in some regions but may not be needed in others. Therefore, in the embodiments of this application, before displaying the distance information, the terminal may first determine whether the current controlled object is within the second designated region where distance information needs to be displayed; if so, the subsequent distance information display step is performed; otherwise, the terminal may not display the distance information.
For example, taking a competitive game as an example, suppose the virtual scene of the game includes a birth region and a competition region. Players' characters are born in the birth region and are uniformly airdropped into the competition region for competitive activities, where the above distance information does not need to be displayed in the birth region. To keep the scene picture concise while the player's character is in the birth region, the terminal performs the subsequent distance information display step only after determining that the player's character is in the competition region.
2) The distance between the target location point and the current controlled object is greater than a distance threshold.
In a virtual scene, when the current controlled object corresponding to the terminal is near the target location point, the user usually does not need to know the exact distance between them. Therefore, in the embodiments of this application, to keep the interface display concise, before displaying the distance information, the terminal may first determine whether the distance between the current controlled object and the target location point is greater than a distance threshold; if so, the subsequent distance information display step is performed; otherwise, the terminal may not display the distance information.
The distance threshold may be preset by the developer. For example, if the developer empirically sets the distance threshold to 10m, then when the distance between the current controlled object corresponding to the terminal and the target location point is greater than 10m, the terminal displays the distance information in correspondence with the indication icon of the target location point; otherwise, the terminal does not display the distance information.
Alternatively, when the distance information is displayed in the form of numeric text rounded to an integer, the distance threshold may be set to 1; that is, when the distance between the current controlled object corresponding to the terminal and the target location point is greater than 1m, the terminal displays the distance information in correspondence with the indication icon of the target location point; otherwise, the terminal does not display the distance information.
Alternatively, the distance threshold may also be set by the user. For example, the terminal may show the user a distance threshold setting interface; after the user selects or fills in a value in the distance threshold setting interface, the terminal sets the distance corresponding to the value as the distance threshold.
3) The target location point is within the visible angle range in front of the current controlled object.
In practical applications, the user usually focuses only on target location points within the visible range in front of the current controlled object. Therefore, in the embodiments of this application, before displaying the distance information between the current controlled object and the target location point, the terminal may first determine whether the target location point is within the visible angle range in front of the current controlled object; if so, the distance information is displayed in correspondence with the indication icon of the target location point; otherwise, the terminal may not display the distance information of the target location point.
For example, refer to FIG. 7, which shows a schematic diagram of distance information display according to an embodiment of this application. As shown in FIG. 7, the virtual scene 70 displayed by the terminal includes a current controlled object 71 and an indication icon 72 of a target location point A (the inverted triangle icon in FIG. 7). In FIG. 7, target location point A is within the visible angle range directly in front of the current controlled object 71. In this case, the terminal displays a display box 73 immediately to the left of the indication icon 72, and the display box 73 shows numeric text of the distance between target location point A and the current controlled object 71. A target location point B also exists in the virtual scene 70, corresponding to an indication icon 74, and target location point B is outside the visible angle range directly in front of the current controlled object 71 (for example, target location point B may be to the side of or behind the current controlled object 71). In this case, the indication icon 74 is displayed at the edge of the scene picture of the virtual scene 70 and indicates the direction of target location point B relative to the current controlled object 71 with an arrow, and the terminal does not display the distance information between the current controlled object 71 and target location point B in correspondence with the indication icon 74.
4) The target location point is outside the scene picture of the virtual scene.
In the embodiments of this application, the target location point being outside the scene picture of the virtual scene may generally mean that the target location point is not directly displayed within the scene picture. For example, when the target location point is outside the visible angle range in front of the current controlled object, or when it is within that visible angle range but beyond the horizon, or when it is within that visible angle range but blocked by other virtual objects within the horizon (such as houses, rocks, trees, or hillsides), the target location point may be regarded as being outside the scene picture of the virtual scene.
For example, refer to FIG. 8, which shows another schematic diagram of distance information display according to an embodiment of this application. As shown in FIG. 8, the virtual scene 80 displayed by the terminal includes a current controlled object 81 and an indication icon 82 of a target location point (the inverted triangle icon in FIG. 8), where the target location point is the location point of a humanoid virtual object controlled by the computer or another user. At moment a, the location point of the humanoid virtual object is within the scene picture of the virtual scene, that is, it can be directly observed by the user; in this case, the terminal does not display the distance information of the location point of the humanoid virtual object. At moment b after moment a, the humanoid virtual object moves behind a house; its location point is now outside the scene picture of the virtual scene and cannot be directly observed by the user. In this case, the terminal displays a display box 83 in correspondence with the indication icon 82 and displays, in text form in the display box 83, the distance information of the location point of the humanoid virtual object.
In practical applications, the above four distance display conditions may be used individually, that is, when the terminal determines that any one of the four conditions is satisfied, the distance information display step may be performed; or they may be partially combined, that is, when the terminal determines that a designated two or three of the four conditions are satisfied, the display step may be performed; or they may all be combined, that is, when the terminal determines that all four conditions are satisfied, the display step may be performed.
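One way to sketch how the four display conditions can be checked individually or in any combination is a simple predicate over a selected subset (the function and parameter names are illustrative assumptions, not taken from the patent):

```python
def should_display_distance(distance, in_second_region, in_view_angle, off_screen,
                            threshold=1.0, required=("threshold",)):
    """Return True when every required distance display condition holds.

    `required` selects which of the four conditions are combined; by default
    only the distance-threshold condition (condition 2) is checked.
    """
    checks = {
        "region": in_second_region,         # condition 1: inside the second designated region
        "threshold": distance > threshold,  # condition 2: farther than the distance threshold
        "view_angle": in_view_angle,        # condition 3: within the forward visible angle range
        "off_screen": off_screen,           # condition 4: outside the rendered scene picture
    }
    return all(checks[name] for name in required)

print(should_display_distance(568, True, True, False))   # True  — beyond the 1 m threshold
print(should_display_distance(0.5, True, True, False))   # False — too close, hide the label
print(should_display_distance(568, False, True, False,
                              required=("region", "threshold")))  # False — not in the region
```

Passing a different `required` tuple corresponds to the "partially combined" and "all combined" cases described above.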
In a possible implementation, the terminal may also adjust the indication icon corresponding to the target location point according to the distance information, to further indicate how far the target location point is from the current controlled object. For example, the terminal may acquire a target size of the indication icon of the target location point according to its distance information, and adjust the size of the indication icon of the target location point to the target size.
A correspondence between the indication icon of the target location point and the distance information may be preset in the terminal. When adjusting the indication icon, the terminal may query the correspondence according to the distance information of the target location point to obtain the target size of the indication icon, and display the indication icon at the queried target size in the scene picture of the virtual scene.
For example, taking the case where the longer the distance indicated by the distance information, the smaller the target size of the corresponding indication icon, refer to FIG. 9, which shows a schematic diagram of indication icon display according to an embodiment of this application. As shown in FIG. 9, the virtual scene 90 displayed by the terminal includes a current controlled object 91 and an indication icon 92 of a target location point (the inverted triangle icon in FIG. 9), where the target location point is the location point of a humanoid virtual object controlled by the computer or another user. At moment a, the location point of the humanoid virtual object is 281m away from the current controlled object, and the terminal displays the indication icon 92 at a first size. At moment b, the humanoid virtual object moves farther away, to a distance of 568m from the current controlled object, and the terminal displays the indication icon 92 at a second size. As can be seen from FIG. 9, the size of the indication icon 92 at moment a is clearly larger than that at moment b.
In FIG. 9 above, while displaying the distance information of the target location point in correspondence with its indication icon, the terminal displays the indication icon at the size corresponding to the distance information. In another possible implementation, when displaying the indication icon of the target location point at the size corresponding to the distance information, the terminal may not display the distance information itself; that is, in the scene picture of the virtual scene, the terminal indicates the distance between the target location point and the current controlled object by the size of the indication icon of the target location point.
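The preset correspondence between distance and icon size can be as simple as a banded lookup table. The distance bands and pixel sizes below are invented for illustration; the patent does not specify concrete values:

```python
# (upper distance bound in meters, icon size in pixels) — illustrative values only,
# ordered so that longer distances map to smaller icons, as in the FIG. 9 example.
SIZE_BANDS = [(100, 48), (300, 36), (600, 24)]
MIN_SIZE = 16  # size used for any distance beyond the last band

def icon_size(distance):
    """Look up the target icon size for a given distance to the current controlled object."""
    for upper, size in SIZE_BANDS:
        if distance <= upper:
            return size
    return MIN_SIZE

print(icon_size(281))   # 36 — nearer target, larger icon
print(icon_size(568))   # 24 — farther target, smaller icon
print(icon_size(1000))  # 16 — beyond the last band
```

A continuous interpolation between band endpoints would work equally well; the table form simply mirrors the queried-correspondence design described above.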
In summary, with the solutions shown in the embodiments of this application, in a virtual scene, for a target location point that has an indication icon in the scene picture, the terminal can display, in the scene picture in correspondence with the indication icon, the distance information between the current controlled object and the target location point in text or graphic form, without requiring the user to open the virtual scene map. This makes the display of the distance information more direct, does not interfere with the user's other operations in the virtual scene, and thereby improves the display effect of the distance information.
In addition, with the solutions shown in the embodiments of this application, before displaying the distance information, the terminal first determines whether the distance display condition is satisfied, and displays the distance information of the target location point only when the condition is satisfied, avoiding unnecessary information display in the scene picture of the virtual scene and improving the conciseness of the scene picture.
Moreover, in the solutions shown in the embodiments of this application, the terminal can adjust the size of the indication icon of the target location point according to its distance information, thereby further improving the effect of indicating how far the target location point is from the current controlled object.
In practical applications, the display form and display logic of the distance information may differ for different types of target location points. Subsequent embodiments of this application are described using, as examples, the cases where the target location point is the location point of a designated virtual object, the location point on the boundary of a designated region closest to the current controlled object, or a marked location point in the virtual scene.
Based on the solution shown in FIG. 3 or FIG. 6 above, in a possible implementation, when the target location point includes the location point of a designated virtual object and the indication icon of the target location point includes a first icon for indicating the designated virtual object, the terminal, when displaying the distance information of the target location point in the scene picture of the virtual scene, may display the first icon in the straight-line direction of the designated virtual object relative to the current controlled object in the scene picture, and display the distance information in correspondence with the first icon.
There may be multiple types of designated virtual objects. For example, the designated virtual object may be a virtual object controlled by another user in the same team as the user corresponding to the terminal, or a marked hostile virtual object, or a designated prop object (such as a marked vehicle prop or weapon prop). The embodiments of this application do not limit the specific form of the designated virtual object.
Taking the case where the designated virtual object is a virtual object controlled by another user in the same team as the user corresponding to the terminal as an example, refer to FIG. 10, which shows a flowchart of displaying distances between teammate-controlled objects according to an exemplary embodiment of this application. Taking a shooting game scene in which characters are born in a birth region and uniformly dropped into a competition region for competitive activities as an example, as shown in FIG. 10, in the team case, the distance between the current controlled object and teammates is not displayed in the birth region. After the current controlled object is dropped into the competition region, the terminal first determines the survival status of the teammate-controlled objects (that is, the virtual objects controlled by teammates). If no teammate survives, the distance between the current controlled object and the teammate-controlled objects is not displayed. If a teammate survives, the terminal collects the coordinates A of the current controlled object and the coordinates B of the virtual object controlled by the surviving teammate, then calculates the distance D between the current controlled object and the virtual object controlled by the surviving teammate (for example, D may be the horizontal distance between coordinates A and coordinates B, rounded to an integer), and determines whether D is greater than 1m (that is, the distance threshold is 1m); if so, the distance D is displayed in correspondence with the first icon of the virtual object controlled by the surviving teammate. The distance between the current controlled object and teammates may not be displayed in the mini-map.
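The FIG. 10 flow — skip dead teammates, compute a rounded horizontal distance, and label only teammates farther than the 1 m threshold — can be sketched as follows (the teammate data layout is an assumption for illustration):

```python
import math

def teammate_distance_labels(current_pos, teammates, threshold=1):
    """Map each surviving teammate's name to a distance label, following the FIG. 10 flow.

    `teammates` is an assumed list of (name, (x, z), alive) tuples.
    """
    labels = {}
    for name, pos, alive in teammates:
        if not alive:  # no distance displayed for dead teammates
            continue
        d = round(math.hypot(pos[0] - current_pos[0], pos[1] - current_pos[1]))
        if d > threshold:  # display only beyond the 1 m threshold
            labels[name] = f"{d}m"
    return labels

squad = [("No. 3", (200, 210), True),   # surviving, 290 m away
         ("No. 4", (0.5, 0), True),     # surviving but within the threshold
         ("No. 2", (50, 50), False)]    # not surviving
print(teammate_distance_labels((0, 0), squad))  # {'No. 3': '290m'}
```

The returned labels would then be drawn next to each teammate's first icon in the scene picture.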
Refer to FIG. 11, which shows a schematic diagram of teammate distance display according to an embodiment of this application. As shown in FIG. 11, the currently surviving teammate-controlled objects are teammate-controlled object No. 3 (corresponding to the circular icon 1101 labeled 3) and teammate-controlled object No. 4 (corresponding to the circular icon 1102 labeled 4). Teammate-controlled object No. 3 is within the visible angle range directly in front of the current controlled object, while teammate-controlled object No. 4 is outside that range. The terminal therefore displays the icon 1101 in the straight-line direction between the current controlled object and teammate-controlled object No. 3 and displays the distance between them (290m) in the text box 1103 to the right of the icon 1101, while the distance between the current controlled object and teammate-controlled object No. 4 is not displayed around the icon 1102.
Based on the solution shown in FIG. 3 or FIG. 6 above, in another possible implementation, the target location point includes a closest boundary point, the closest boundary point being the location point closest to the current controlled object among the region boundary points, the region boundary points being location points on the boundary of a first designated region in the virtual scene, and the current controlled object being outside the first designated region; the indication icon of the target location point includes a second icon for indicating the relative position between the current controlled object and the first designated region. When displaying the distance information of the target location point in the scene picture of the virtual scene, the terminal may display the second icon on the boundary of a mini-map in the scene picture, where the position of the second icon on the boundary of the mini-map indicates the direction of the closest boundary point relative to the current controlled object, and display the distance information in correspondence with the second icon.
The first designated region may be a partial region of the virtual scene. For example, taking a shooting game scene as an example, the first designated region may be the "safe zone" in the shooting game scene.
Taking the case where the first designated region is the safe zone in a shooting game scene as an example, refer to FIG. 12, which shows a flowchart of safe zone distance display according to an exemplary embodiment of this application. As shown in FIG. 12, the terminal first determines the object state (that is, whether the current controlled object is within the safe zone). If the current controlled object is within the safe zone, the distance to the safe zone is not displayed. If the current controlled object is outside the safe zone, the terminal collects the coordinates A of the current controlled object and the coordinates B of the center of the circular safe zone, then calculates the distance D between the current controlled object and the point on the safe zone edge closest to it (for example, D may be the horizontal distance between coordinates A and coordinates B minus the safe zone radius r, rounded to an integer), and determines whether D is greater than 1m (that is, the distance threshold is 1m); if so, the distance D is displayed in correspondence with the second icon indicating the relative direction between the safe zone and the current controlled object.
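The FIG. 12 computation — horizontal distance from the player to the circle center, minus the radius, rounded to an integer — is a one-liner. The coordinates below are invented to reproduce the 805 m value from the FIG. 13 example:

```python
import math

def safe_zone_distance(player, center, radius):
    """Rounded distance from a player outside the circular safe zone to its nearest edge point."""
    return round(math.hypot(center[0] - player[0], center[1] - player[1]) - radius)

d = safe_zone_distance(player=(0, 0), center=(0, 1000), radius=195)
print(d)      # 805, matching the FIG. 13 example value
print(d > 1)  # True — greater than the 1 m threshold, so the distance is displayed
```

A negative result would mean the player is inside the zone, in which case the FIG. 12 flow skips the display entirely.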
Refer to FIG. 13, which shows a schematic diagram of safe zone distance display according to an embodiment of this application. As shown in FIG. 13, a mini-map 1301 is displayed in the upper right corner of the game interface, and a running icon 1302 is displayed at the lower edge of the mini-map 1301. The position of the running icon 1302 at the lower edge of the mini-map 1301 can indicate the direction of the safe zone relative to the current controlled object, and the terminal displays the shortest distance between the current controlled object and the safe zone (805m) in the text box 1303 on one side of the running icon 1302. The distance between the text box 1303 and the running icon 1302 is fixed at 20 pixels, and the text box 1303 moves with the icon 1302: when the running icon 1302 moves to within a certain number of pixels of the left edge, the text box 1303 appears to the right of the icon 1302; when the running icon 1302 moves to within a certain number of pixels of the right edge, the text box 1303 appears to the left of the icon 1302.
Based on the solution shown in FIG. 3 or FIG. 6 above, in yet another possible implementation, the target location point includes a marked location point, and the indication icon of the target location point includes a third icon for indicating the marked location point. When displaying the distance information of the target location point in the scene picture of the virtual scene, the terminal may display the third icon in a direction display region in the scene picture, where the position of the third icon in the direction display region indicates the direction of the marked location point relative to the current controlled object, and display the distance information in correspondence with the third icon.
The direction display region may also be called a virtual compass. The marked location point may be a location point marked by the user corresponding to the terminal, or a location point marked by another user in the same team as the user corresponding to the terminal.
Refer to FIG. 14, which shows a flowchart of marked location point distance display according to an exemplary embodiment of this application. Taking a game scene as an example, as shown in FIG. 14, the terminal first determines the marked location point state (that is, determines whether a marked location point exists). If no marked location point exists in the game scene, the distance to the marked location point is not displayed. If a marked location point exists, the terminal collects the coordinates A of the current controlled object and the coordinates B of the marked location point, then calculates the distance D between the current controlled object and the marked location point (for example, D may be the horizontal distance between coordinates A and coordinates B, rounded to an integer), and determines whether D is greater than 1m (that is, the distance threshold is 1m); if so, the distance D is displayed in the virtual compass in correspondence with the icon of the marked location point. The distance between the current controlled object and the marked location point may not be displayed in the mini-map.
Refer to FIG. 15, which shows a schematic diagram of marked location point distance display according to an embodiment of this application. Taking a shooting game scene in which characters are born in a birth region and uniformly dropped into a competition region for competitive activities as an example, as shown in FIG. 15, a virtual compass 1501 is displayed at the upper edge of the game interface, and an inverted-teardrop icon 1502 is displayed in the virtual compass 1501. The position of the icon 1502 in the virtual compass 1501 can indicate the direction of the marked location point relative to the current controlled object, and the terminal displays the distance between the current controlled object and the marked location point (1413m) in the text box 1503 on one side of the icon 1502. The text box 1503 moves with the icon 1502: when the icon 1502 moves to within a certain number of pixels of the left edge of the virtual compass 1501, the text box 1503 appears to the right of the icon 1502; when the icon 1502 moves to within a certain number of pixels of the right edge of the virtual compass 1501, the text box 1503 appears to the left of the icon 1502. In the solution shown in FIG. 15, as long as the player sets a marked location point on the competition region map, the distance needs to be displayed in the virtual compass, whereas marked location points set on the birth region map may not have their distances displayed; moreover, the marked location points set by the player on the birth region map are cleared after the current controlled object boards the airplane.
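The edge-flipping behavior of the compass text box (and the analogous mini-map text box in FIG. 13) can be sketched as follows; the 20-pixel offset comes from the FIG. 13 description, while the edge margin and default side are invented values:

```python
def label_placement(icon_x, bar_left, bar_right, offset=20, margin=60):
    """Choose which side of the icon the distance text box goes on.

    Near the left edge the label flips to the right of the icon; near the
    right edge it flips to the left; elsewhere it defaults to the right.
    """
    if icon_x - bar_left < margin:      # icon close to the left edge
        return ("right", icon_x + offset)
    if bar_right - icon_x < margin:     # icon close to the right edge
        return ("left", icon_x - offset)
    return ("right", icon_x + offset)

print(label_placement(30, 0, 400))   # ('right', 50)  — flipped right near the left edge
print(label_placement(380, 0, 400))  # ('left', 360)  — flipped left near the right edge
print(label_placement(200, 0, 400))  # ('right', 220) — default placement
```

This keeps the label fixed at the stated offset from the icon while guaranteeing it never gets clipped by the compass or mini-map border.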
Based on the solution shown in FIG. 3 or FIG. 6 above, in yet another possible implementation, when the target location point includes a marked location point, upon receiving an operation of expanding the mini-map, the terminal may show a map interface for showing the complete map of the virtual scene, display a fourth icon in the map interface, the fourth icon being used to indicate the position of the marked location point in the complete map, and display the distance information in correspondence with the fourth icon.
Refer to FIG. 16, which shows a flowchart of marked location point distance display according to an exemplary embodiment of this application. Taking a game scene as an example, as shown in FIG. 16, after the terminal shows the map interface, the terminal first determines the marked location point state (that is, determines whether a marked location point exists). If no marked location point exists in the game scene, the distance to the marked location point is not displayed in the map interface. If a marked location point exists, the terminal collects the coordinates A of the current controlled object and the coordinates B of the marked location point, then calculates the distance D between the current controlled object and the marked location point (for example, D may be the horizontal distance between coordinates A and coordinates B, rounded to an integer), and determines whether D is greater than 1m (that is, the distance threshold is 1m); if so, the distance D is displayed in the map interface in correspondence with the icon of the marked location point.
Refer to FIG. 17, which shows a schematic diagram of marked location point distance display according to an embodiment of this application. As shown in FIG. 17, a map interface 1701 is displayed over the game interface, and an inverted-teardrop icon 1702 is displayed in the map interface 1701. The position of the icon 1702 in the map interface 1701 can indicate the position of the marked location point in the game scene, and the terminal displays the distance between the current controlled object and the marked location point (1440m) in the text box 1703 on one side of the icon 1702. In the solution shown in FIG. 17, as long as the player sets a marked location point on the competition region map, the distance needs to be displayed in the map interface 1701, whereas marked location points set on the birth region map may not have their distances displayed; moreover, the marked location points set by the player on the birth region map are cleared after the current controlled object boards the airplane.
Optionally, when showing the map interface, the terminal may also show a distance grid over the map interface, where the edge length of each grid cell corresponds to a fixed distance in the virtual scene, and show the icon of the marked location point and the icon of the current controlled object in the map interface, so that the user can roughly estimate the distance between the marked location point and the current controlled object from the number of grid cells between the two icons.
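The grid overlay described above lets the user estimate distance by counting cells. With an assumed fixed edge length per cell (the 100 m scale below is illustrative, not from the patent), the estimate is just the cell deltas scaled to meters:

```python
import math

def estimate_from_grid(player_cell, marker_cell, cell_edge_m=100):
    """Rough distance estimate from map grid cell indices; cell_edge_m is an assumed scale."""
    dx = (marker_cell[0] - player_cell[0]) * cell_edge_m
    dy = (marker_cell[1] - player_cell[1]) * cell_edge_m
    return math.hypot(dx, dy)

print(estimate_from_grid((0, 0), (3, 4)))  # 500.0 — icons 3 cells east and 4 cells north apart
```

This mirrors the manual estimate the grid is meant to support, while the exact displayed value still comes from the coordinate-based computation of FIG. 16.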
In virtual scenes, especially competitive shooting game scenes, team play lets players experience the full fun of the game and is an indispensable part of it. During team play, the position and distance information of teammates is undoubtedly an important basis for players' tactical decisions. Through the above solutions of this application, effective distance display can be performed, which not only helps players quickly locate the specific positions of teammates but also helps players make decisions effectively, improving the interactivity and strategy of the game. Meanwhile, players can view in real time the distance between themselves and teammates' markers on the map interface and the virtual compass, gradually giving players a sense of objective and direction and helping them have a better in-game experience. The in-view distance display not only simplifies players' operations and reduces their operation frequency, but also makes the presentation of information more direct and convenient, greatly enhancing its effectiveness. Meanwhile, whenever a player is outside the safe zone, the player's current shortest distance to the safe zone is displayed in real time, helping the player make decisions according to the current situation and greatly enhancing the strategy and interactivity of the game.
In addition to competitive shooting game scenes, the above solutions can also be applied to other similar multiplayer cooperative games. The above solutions of this application can effectively meet players' need to know teammate information in cooperative games, quickly and accurately help players locate teammates and make decisions, significantly simplify the tedious operations for players to obtain distance information, improve the fluency of the game, and lower the entry barrier of the game so that players can get started quickly. In addition, this solution can establish effective communication channels and connections between players, helping build trust and a sense of interaction between players and improving players' recognition of the game and the growth of their skills.
FIG. 18 is a structural block diagram of a terminal according to an exemplary embodiment. The terminal can perform all or part of the steps performed by the terminal in the method shown in the embodiment corresponding to FIG. 3 or FIG. 6. The terminal may include:
a position information acquisition module 1801, configured to acquire position information of a target location point in the virtual scene, the target location point being a location point for which an indication icon exists in a scene picture of the virtual scene, the scene picture being a picture of the virtual scene observed from the perspective of the current controlled object;
a distance information acquisition module 1802, configured to acquire distance information according to the position information of the target location point, the distance information being used to indicate the distance between the target location point and the current controlled object; and
a first information display module 1803, configured to display, in the scene picture of the virtual scene, the distance information in correspondence with the indication icon of the target location point.
Optionally, the target location point includes the location point of a designated virtual object, and the indication icon of the target location point includes a first icon for indicating the designated virtual object; the first information display module 1803 includes:
a first icon display unit, configured to display, in the scene picture of the virtual scene, the first icon in the straight-line direction of the designated virtual object relative to the current controlled object; and
a first information display unit, configured to display the distance information in correspondence with the first icon.
Optionally, the target location point includes a closest boundary point, the closest boundary point being the location point closest to the current controlled object among the region boundary points, the region boundary points being location points on the boundary of a first designated region in the virtual scene, and the current controlled object being outside the first designated region; the indication icon of the target location point includes a second icon for indicating the relative position between the current controlled object and the first designated region; the first information display module 1803 includes:
a second icon display unit, configured to display the second icon on the boundary of a mini-map in the scene picture of the virtual scene, where the position of the second icon on the boundary of the mini-map indicates the direction of the closest boundary point relative to the current controlled object; and
a second information display unit, configured to display the distance information in correspondence with the second icon.
Optionally, the target location point includes a marked location point, and the indication icon of the target location point includes a third icon for indicating the marked location point; the first information display module 1803 includes:
a third icon display unit, configured to display the third icon in a direction display region in the scene picture of the virtual scene, where the position of the third icon in the direction display region indicates the direction of the marked location point relative to the current controlled object; and
a third information display unit, configured to display the distance information in correspondence with the third icon.
Optionally, the terminal further includes:
a map display module, configured to show the complete map upon receiving an operation of expanding the mini-map; and
a second information display module, configured to display a fourth icon in the complete map, the fourth icon being used to indicate the position of the marked location point in the complete map, and display the distance information in correspondence with the fourth icon.
Optionally, the first information display module 1803 is specifically configured to display, in the scene picture of the virtual scene, the distance information in text form at a designated position around the target location point.
Optionally, the terminal further includes:
a size determination module, configured to acquire a target size of the indication icon of the target location point according to the distance information; and
a size adjustment module, configured to adjust the size of the indication icon of the target location point to the target size.
Optionally, the first information display module 1803 is specifically configured to perform, when a distance display condition is satisfied, the step of displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indication icon of the target location point;
where the distance display condition includes at least one of the following conditions:
the current controlled object is within a second designated region in the virtual scene;
the distance between the target location point and the current controlled object is greater than a distance threshold;
the target location point is within the visible angle range in front of the current controlled object; and
the target location point is outside the scene picture of the virtual scene.
FIG. 19 is a structural block diagram of a computer device 1900 according to an exemplary embodiment. The computer device 1900 may be a user terminal, such as a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The computer device 1900 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
Generally, the computer device 1900 includes a processor 1901 and a memory 1902.
本领域技术人员在考虑说明书及实践这里公开的发明后,将容易想到本申请的其它实施方案。本申请旨在涵盖本申请的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本申请的一般性原理并包括本申请未公开的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本申请的真正范围和精神由下面的权利要求指出。
应当理解的是,本申请并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围进行各种修改和改变。本申请的范围仅由所附的权利要求来限制。

Claims (16)

  1. A method for displaying distance information in a virtual scene, the method being performed by a terminal and comprising:
    obtaining position information of a target location point in the virtual scene, the target location point being a location point for which an indicator icon exists in a scene picture of the virtual scene, the scene picture being the picture obtained by observing the virtual scene from the viewing angle of a currently controlled object;
    obtaining distance information according to the position information of the target location point, the distance information indicating the distance between the target location point and the currently controlled object; and
    displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indicator icon of the target location point.
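As a non-limiting illustration of the distance computation and the displayed text in the claim above (the coordinate convention, function names, and text format are assumptions, not details from the application):

```python
import math

def distance_between(target, controlled):
    """Euclidean distance between a target location point and the
    currently controlled object, both given in scene coordinates."""
    return math.sqrt(sum((t - c) ** 2 for t, c in zip(target, controlled)))

def distance_text(meters: float) -> str:
    """Text shown beside the indicator icon; the rounding and the
    'm' suffix are illustrative choices."""
    return f"{round(meters)}m"
```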
  2. The method according to claim 1, wherein the target location point comprises a location point at which a specified virtual object is located, and the indicator icon of the target location point comprises a first icon indicating the specified virtual object; and the displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indicator icon of the target location point comprises:
    displaying, in the scene picture of the virtual scene, the first icon in the straight-line direction of the specified virtual object relative to the currently controlled object; and
    displaying the distance information in correspondence with the first icon.
  3. The method according to claim 1, wherein the target location point comprises a nearest boundary point, the nearest boundary point being, among region boundary points, the location point closest to the currently controlled object, the region boundary points being location points on the boundary of a first specified region in the virtual scene, the currently controlled object being outside the first specified region; the indicator icon of the target location point comprises a second icon indicating the relative position between the currently controlled object and the first specified region; and the displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indicator icon of the target location point comprises:
    displaying the second icon on the boundary of a thumbnail map in the scene picture of the virtual scene, the position of the second icon on the boundary of the thumbnail map indicating the direction of the nearest boundary point relative to the currently controlled object; and
    displaying the distance information in correspondence with the second icon.
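A non-limiting sketch of placing the second icon on the thumbnail map boundary: scale the direction vector toward the nearest boundary point so that its larger component lands exactly on the map border. A square minimap centred on the controlled object is an assumption here:

```python
def minimap_edge_position(dx: float, dz: float, half_size: float = 1.0):
    """Position of the boundary icon on a square thumbnail map of side
    2 * half_size, with the currently controlled object at the centre.
    (dx, dz) is the direction of the nearest boundary point relative to
    the controlled object; scaling by half_size / max-component puts
    the icon exactly on the border, on the correct side."""
    m = max(abs(dx), abs(dz))
    if m == 0.0:
        return (0.0, 0.0)  # degenerate: no direction to indicate
    s = half_size / m
    return (dx * s, dz * s)
```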
  4. The method according to claim 1, wherein the target location point comprises a marked location point, and the indicator icon of the target location point comprises a third icon indicating the marked location point; and the displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indicator icon of the target location point comprises:
    displaying the third icon in a direction display region in the scene picture of the virtual scene, the position of the third icon in the direction display region indicating the direction of the marked location point relative to the currently controlled object; and
    displaying the distance information in correspondence with the third icon.
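One way a compass-style direction display region could position the third icon is by the marked point's bearing relative to the facing direction of the controlled object; this sketch, including its coordinate and angle conventions, is an illustrative assumption and not taken from the application:

```python
import math

def azimuth_bar_offset(controlled, marked, facing_deg: float,
                       bar_width: float = 360.0) -> float:
    """Horizontal offset of the marker on a compass-style strip: the
    bearing of the marked point relative to the controlled object's
    facing direction, wrapped to [-180, 180) and scaled to the strip
    width. Points are (x, z) in scene coordinates; 0 degrees = +z."""
    dx = marked[0] - controlled[0]
    dz = marked[1] - controlled[1]
    bearing = math.degrees(math.atan2(dx, dz))
    rel = (bearing - facing_deg + 180.0) % 360.0 - 180.0
    return rel / 360.0 * bar_width
```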
  5. The method according to claim 4, further comprising:
    upon receiving an operation of expanding the thumbnail map, presenting a map interface, the map interface showing the complete map of the virtual scene;
    displaying a fourth icon in the map interface, the fourth icon indicating the position of the marked location point in the complete map; and
    displaying the distance information in correspondence with the fourth icon.
  6. The method according to any one of claims 1 to 5, wherein the displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indicator icon of the target location point comprises:
    displaying, in the scene picture of the virtual scene, the distance information in text form at a specified position around the target location point.
  7. The method according to any one of claims 1 to 5, further comprising:
    obtaining a target size of the indicator icon of the target location point according to the distance information; and
    adjusting the size of the indicator icon of the target location point to the target size.
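A minimal sketch of deriving the target size from the distance, assuming an inverse-distance law clamped to a readable range; the scaling law and every constant here are illustrative assumptions:

```python
def icon_size_for_distance(distance: float,
                           base_size: float = 32.0,
                           min_size: float = 16.0,
                           max_size: float = 48.0,
                           ref_distance: float = 100.0) -> float:
    """Shrink the indicator icon as the target location point gets
    farther from the controlled object, clamped so it never becomes
    unreadably small or obtrusively large."""
    size = base_size * ref_distance / max(distance, 1.0)
    return max(min_size, min(max_size, size))
```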
  8. The method according to any one of claims 1 to 5, wherein the displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indicator icon of the target location point comprises:
    performing the step of displaying, in the scene picture of the virtual scene, the distance information in correspondence with the indicator icon of the target location point when a distance display condition is satisfied;
    wherein the distance display condition comprises at least one of the following:
    the currently controlled object is within a second specified region in the virtual scene;
    the distance between the target location point and the currently controlled object is greater than a distance threshold;
    the target location point is within the visible angle range in front of the currently controlled object; and
    the target location point is outside the scene picture of the virtual scene.
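The four alternatives above reduce to a single disjunction; as a non-limiting illustration (parameter names are assumptions):

```python
def should_display_distance(in_second_region: bool,
                            distance: float,
                            threshold: float,
                            in_view_angle: bool,
                            outside_scene_picture: bool) -> bool:
    """Per the claim, the distance is displayed when at least one of
    the four listed conditions holds."""
    return (in_second_region
            or distance > threshold
            or in_view_angle
            or outside_scene_picture)
```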
  9. A terminal, comprising:
    a position information obtaining module, configured to obtain position information of a target location point in a virtual scene, the target location point being a location point for which an indicator icon exists in a scene picture of the virtual scene, the scene picture being the picture obtained by observing the virtual scene from the viewing angle of a currently controlled object;
    a distance information obtaining module, configured to obtain distance information according to the position information of the target location point, the distance information indicating the distance between the target location point and the currently controlled object; and
    a first information display module, configured to display, in the scene picture of the virtual scene, the distance information in correspondence with the indicator icon of the target location point.
  10. The terminal according to claim 9, wherein the target location point comprises a location point at which a specified virtual object is located, and the indicator icon of the target location point comprises a first icon indicating the specified virtual object; and the first information display module comprises:
    a first icon display unit, configured to display, in the scene picture of the virtual scene, the first icon in the straight-line direction of the specified virtual object relative to the currently controlled object; and
    a first information display unit, configured to display the distance information in correspondence with the first icon.
  11. The terminal according to claim 9, wherein the target location point comprises a nearest boundary point, the nearest boundary point being, among region boundary points, the location point closest to the currently controlled object, the region boundary points being location points on the boundary of a first specified region in the virtual scene, the currently controlled object being outside the first specified region; the indicator icon of the target location point comprises a second icon indicating the relative position between the currently controlled object and the first specified region; and the first information display module comprises:
    a second icon display unit, configured to display the second icon on the boundary of a thumbnail map in the scene picture of the virtual scene, the position of the second icon on the boundary of the thumbnail map indicating the direction of the nearest boundary point relative to the currently controlled object; and
    a second information display unit, configured to display the distance information in correspondence with the second icon.
  12. The terminal according to claim 9, wherein the target location point comprises a marked location point, and the indicator icon of the target location point comprises a third icon indicating the marked location point; and the first information display module comprises:
    a third icon display unit, configured to display the third icon in a direction display region in the scene picture of the virtual scene, the position of the third icon in the direction display region indicating the direction of the marked location point relative to the currently controlled object; and
    a third information display unit, configured to display the distance information in correspondence with the third icon.
  13. The terminal according to claim 12, further comprising:
    a map presentation module, configured to present the complete map upon receiving an operation of expanding the thumbnail map; and
    a second information display module, configured to display a fourth icon in the complete map, the fourth icon indicating the position of the marked location point in the complete map, and to display the distance information in correspondence with the fourth icon.
  14. The terminal according to any one of claims 9 to 13, wherein the first information display module is specifically configured to display, in the scene picture of the virtual scene, the distance information in text form at a specified position around the target location point.
  15. A computer device, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for displaying distance information in a virtual scene according to any one of claims 1 to 8.
  16. A computer-readable storage medium, storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the method for displaying distance information in a virtual scene according to any one of claims 1 to 8.
PCT/CN2019/078742 2018-04-27 2019-03-19 Method for displaying distance information in virtual scene, terminal, and computer device WO2019205838A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020568588A JP7098001B2 (ja) 2018-04-27 2019-03-19 Method for displaying distance information in a virtual scene, and terminal, computer device, and computer program therefor
US16/910,469 US11224810B2 (en) 2018-04-27 2020-06-24 Method and terminal for displaying distance information in virtual scene

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810392855.1A CN108619721B (zh) 2018-04-27 2018-04-27 Method and apparatus for displaying distance information in virtual scene, and computer device
CN201810392855.1 2018-04-27

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/910,469 Continuation US11224810B2 (en) 2018-04-27 2020-06-24 Method and terminal for displaying distance information in virtual scene

Publications (1)

Publication Number Publication Date
WO2019205838A1 true WO2019205838A1 (zh) 2019-10-31

Family

ID=63694908

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/078742 WO2019205838A1 (zh) 2018-04-27 2019-03-19 虚拟场景中的距离信息显示方法、终端及计算机设备

Country Status (4)

Country Link
US (1) US11224810B2 (zh)
JP (1) JP7098001B2 (zh)
CN (1) CN108619721B (zh)
WO (1) WO2019205838A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113633976B (zh) * 2021-08-16 2023-06-20 腾讯科技(深圳)有限公司 Operation control method, apparatus, device, and computer-readable storage medium

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108245888A * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, and computer device
CN108619721B (zh) 2018-04-27 2020-08-11 腾讯科技(深圳)有限公司 Method and apparatus for displaying distance information in virtual scene, and computer device
CN111383344A * 2018-12-29 2020-07-07 深圳市优必选科技有限公司 Virtual scene generation method and apparatus, computer device, and storage medium
CN109992175B (zh) 2019-04-03 2021-10-26 腾讯科技(深圳)有限公司 Object display method and apparatus for simulating the experience of blind persons, and storage medium
CN110115838B (zh) 2019-05-30 2021-10-29 腾讯科技(深圳)有限公司 Method, apparatus, device, and storage medium for generating mark information in virtual environment
CN110738738B (zh) * 2019-10-15 2023-03-10 腾讯科技(深圳)有限公司 Virtual object marking method in three-dimensional virtual scene, device, and storage medium
CN110781263A * 2019-10-25 2020-02-11 北京无限光场科技有限公司 Housing resource information display method and apparatus, electronic device, and computer storage medium
CN110833694B (zh) * 2019-11-15 2023-04-07 网易(杭州)网络有限公司 Display control method and apparatus in game
CN110917616B (zh) * 2019-11-28 2022-03-08 腾讯科技(深圳)有限公司 Direction prompting method, apparatus, device, and storage medium in virtual scene
CN111672126B (zh) * 2020-05-29 2023-02-10 腾讯科技(深圳)有限公司 Information display method, apparatus, device, and storage medium
CN111672125B (zh) * 2020-06-10 2022-03-01 腾讯科技(深圳)有限公司 Virtual object interaction method and related apparatus
CN112188402A * 2020-09-08 2021-01-05 武汉齐物科技有限公司 Bluetooth-mesh-based riding position sharing method and apparatus, and cycling computer
US11458394B2 (en) * 2020-09-11 2022-10-04 Riot Games, Inc. Targeting of a long-range object in a multiplayer game
CN112484678B * 2020-10-29 2022-03-11 贝壳找房(北京)科技有限公司 Accurate measurement method and apparatus based on virtual three-dimensional space
US20220184506A1 (en) * 2020-11-12 2022-06-16 Tencent Technology (Shenzhen) Company Limited Method and apparatus for driving vehicle in virtual environment, terminal, and storage medium
CN112451969B * 2020-12-04 2023-04-21 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, computer device, and storage medium
CA3107889A1 (en) * 2021-02-02 2022-08-02 Eidos Interactive Corp. Method and system for providing tactical assistance to a player in a shooting video game
CN112891930B * 2021-03-25 2022-11-29 腾讯科技(深圳)有限公司 Information display method, apparatus, device, and storage medium in virtual scene
CN113426131B * 2021-07-02 2023-06-30 腾讯科技(成都)有限公司 Picture generation method and apparatus for virtual scene, computer device, and storage medium
CN113559508A * 2021-07-27 2021-10-29 网易(杭州)网络有限公司 Direction prompting method, apparatus, device, and storage medium for virtual object
US11484793B1 (en) * 2021-09-02 2022-11-01 Supercell Oy Game control
CN113694529A * 2021-09-23 2021-11-26 网易(杭州)网络有限公司 Game picture display method and apparatus, storage medium, and electronic device
CN114042315B * 2021-10-29 2023-06-16 腾讯科技(深圳)有限公司 Graphic display method, apparatus, device, and medium based on virtual scene
CN116999824A * 2022-08-19 2023-11-07 腾讯科技(深圳)有限公司 Guidance method, apparatus, device, medium, and program product in virtual scene
CN117610794B * 2024-01-22 2024-04-19 南昌菱形信息技术有限公司 Scenario simulation training and evaluation system and method for emergencies

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130172083A1 (en) * 2011-12-29 2013-07-04 Zynga Inc. Apparatus, method and computer readable storage medium for collecting doobers in an electronic game
CN105999703A * 2016-05-08 2016-10-12 深圳市奇境信息技术有限公司 Virtual reality scene expansion method
JP2017055995A (ja) * 2015-09-16 2017-03-23 株式会社バンダイナムコエンターテインメント Program, game device, and server system
CN107376339A * 2017-07-18 2017-11-24 网易(杭州)网络有限公司 Interaction method and apparatus for locking a target in a game
CN107596691A * 2017-10-17 2018-01-19 网易(杭州)网络有限公司 Game strategy interaction method and apparatus
CN107789837A * 2017-09-12 2018-03-13 网易(杭州)网络有限公司 Information processing method and apparatus, and computer-readable storage medium
CN108619721A * 2018-04-27 2018-10-09 腾讯科技(深圳)有限公司 Method and apparatus for displaying distance information in virtual scene, and computer device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160187654A1 (en) * 2011-02-28 2016-06-30 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US8641526B1 (en) * 2012-10-05 2014-02-04 Warmaging.net LLP Using input from a mouse device to control a video game vehicle
JP2015019642A 2013-07-23 2015-02-02 幸春 小林 Dog waste collection tool
JP6301707B2 (ja) 2014-04-03 2018-03-28 株式会社カプコン Game program and game system
US10807001B2 (en) * 2017-09-12 2020-10-20 Netease (Hangzhou) Network Co., Ltd. Information processing method, apparatus and computer readable storage medium
CN107957781B (zh) * 2017-12-13 2021-02-09 北京小米移动软件有限公司 Information display method and apparatus

Also Published As

Publication number Publication date
US20200316470A1 (en) 2020-10-08
US11224810B2 (en) 2022-01-18
CN108619721B (zh) 2020-08-11
CN108619721A (zh) 2018-10-09
JP7098001B2 (ja) 2022-07-08
JP2021515348A (ja) 2021-06-17

Similar Documents

Publication Publication Date Title
WO2019205838A1 (zh) Method for displaying distance information in virtual scene, terminal, and computer device
US11703993B2 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
US11221726B2 (en) Marker point location display method, electronic device, and computer-readable storage medium
CN108710525B (zh) Map display method, apparatus, device, and storage medium in virtual scene
WO2019205881A1 (zh) Information display method, apparatus, device, and storage medium in virtual environment
WO2019153836A1 (zh) Method, apparatus, and medium for determining the posture of a virtual object in a virtual environment
CN111921197B (zh) Method, apparatus, terminal, and storage medium for displaying match replay pictures
WO2019109778A1 (zh) Method, apparatus, and terminal for displaying game match results
JP2022509634A (ja) Method, apparatus, and computer program for observing virtual items in a virtual environment
WO2020156252A1 (zh) Method, apparatus, device, and storage medium for constructing buildings in a virtual environment
WO2020151594A1 (zh) Viewing angle rotation method, apparatus, device, and storage medium
CN109407959B (zh) Virtual object control method in virtual scene, device, and storage medium
WO2019184782A1 (zh) Object control method and apparatus in virtual scene, and computer device
WO2022227915A1 (zh) Method, apparatus, device, and storage medium for displaying position marks
JP2023139033A (ja) Viewpoint rotation method, apparatus, terminal, and computer program
JP7186901B2 (ja) Hotspot map display method, apparatus, computer device, and readable storage medium
CN111589141B (zh) Method, apparatus, device, and medium for displaying virtual environment pictures
US20220291791A1 (en) Method and apparatus for determining selected target, device, and storage medium
WO2022237076A1 (zh) Virtual object control method, apparatus, device, and computer-readable storage medium
CN112755517A (zh) Virtual object control method, apparatus, terminal, and storage medium
WO2018192455A1 (zh) Method and apparatus for generating subtitles
CA3160509A1 (en) Virtual object control method, apparatus, device, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19791772

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020568588

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19791772

Country of ref document: EP

Kind code of ref document: A1