WO2021184806A1 - Interactive prop display method, apparatus, terminal, and storage medium - Google Patents

Interactive prop display method, apparatus, terminal, and storage medium

Info

Publication number
WO2021184806A1
Authority
WO
WIPO (PCT)
Prior art keywords
interactive, props, occluded, perspective, prop
Prior art date
Application number
PCT/CN2020/129816
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
杨金昊 (Yang Jinhao)
林凌云 (Lin Lingyun)
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co., Ltd. (腾讯科技(深圳)有限公司)
Priority to JP2022532846A (published as JP2023504650A)
Priority to KR1020227010956A (published as KR20220051014A)
Publication of WO2021184806A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by output arrangements for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting

Definitions

  • This application relates to the field of multimedia technology, and in particular to an interactive prop display method, device, terminal and storage medium.
  • the terminal can display a virtual scene in the interface and display a virtual object in the virtual scene.
  • the virtual object can control interactive props to compete with other virtual objects.
  • The types of interactive props in shooting games are usually diverse, such as machine guns, grenades, aerial gunships, and armed helicopters.
  • the embodiments of the present application provide a method, device, terminal, and storage medium for displaying interactive props.
  • the technical scheme is as follows:
  • a method for displaying interactive props is provided.
  • the method is applied to a terminal, and the method includes:
  • In response to the perspective prop being in the assembled state, detecting whether the viewing-angle range of the controlled virtual object in the virtual scene contains an occluded interactive prop, the perspective prop being used to display occluded interactive props in a perspective manner;
  • in response to the occluded interactive prop being included in the viewing-angle range, detecting whether the occluded interactive prop meets a perspective condition, the perspective condition indicating the condition under which the occluded interactive prop is visible through the perspective prop;
  • in response to the occluded interactive prop meeting the perspective condition, displaying the occluded interactive prop in a perspective manner in the virtual scene.
  • the displaying the occluded interactive props in a perspective manner in the virtual scene includes:
  • displaying, on a target object in the virtual scene, the outline of the occluded interactive prop, the target object being the object that occludes the interactive prop.
  • the displaying the outline of the occluded interactive props includes:
  • the display state of the target component is set to the occlusion culling state, and the rendering mode of the target component is set to allow the perspective effect.
  • the displaying the occluded interactive props in a perspective manner in the virtual scene includes:
  • On the target object in the virtual scene, highlighting the mapping area where the occluded interactive prop is mapped onto the target object, the target object being the object that occludes the interactive prop.
  • The perspective condition is that the components of the prefab of the occluded interactive prop include at least one target component whose material is not a semi-transparent material.
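The perspective condition above can be expressed as a simple predicate over the prefab's components. The sketch below is illustrative only; `Component` and its `is_translucent` flag are assumed names, not the application's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    is_translucent: bool  # True if the component's material is semi-transparent

def meets_perspective_condition(prefab_components):
    """A prop meets the perspective condition if at least one of its
    prefab's components has a material that is NOT semi-transparent."""
    return any(not c.is_translucent for c in prefab_components)

# A sentry gun with an opaque barrel qualifies; a fully translucent
# smoke field does not.
sentry_gun = [Component("barrel", False), Component("glass_sight", True)]
smoke_field = [Component("smoke_volume", True)]
print(meets_perspective_condition(sentry_gun))   # True
print(meets_perspective_condition(smoke_field))  # False
```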
  • Detecting whether the viewing-angle range of the controlled virtual object in the virtual scene contains an occluded interactive prop includes:
  • the method further includes:
  • the displaying the outline of the occluded interactive props on the target object in the virtual scene further includes:
  • The outline (stroke) special effect corresponds to at least one display parameter among edge color, stroke width, glow intensity, glow range, and stroke type.
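The outline display steps and stroke parameters listed above might look like the following engine-agnostic sketch. Every name here (the display states, render modes, and parameter defaults) is a hypothetical stand-in, since the text does not specify an engine API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OutlineEffect:
    # The five display parameters listed in the text; defaults are made up.
    edge_color: tuple = (255, 0, 0)  # RGB edge color
    stroke_width: float = 2.0        # stroke width, in pixels
    glow_intensity: float = 1.5      # glow (luminous) intensity
    glow_range: float = 4.0          # glow (luminous) range, in pixels
    stroke_type: str = "solid"       # e.g. "solid" or "dashed"

@dataclass
class TargetComponent:
    name: str
    display_state: str = "normal"
    render_mode: str = "default"
    outline: Optional[OutlineEffect] = None

def display_outline_through_occluder(component, effect):
    """Apply the two state changes the text describes, then attach the
    outline effect used to draw the occluded prop's contour."""
    component.display_state = "occlusion_culling"
    component.render_mode = "allow_see_through"
    component.outline = effect
    return component

comp = display_outline_through_occluder(TargetComponent("sentry_gun_body"),
                                        OutlineEffect(stroke_type="dashed"))
print(comp.display_state, comp.render_mode)  # occlusion_culling allow_see_through
```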
  • In one aspect, an interactive prop display device is provided, which includes:
  • a detection module, configured to detect, in response to the perspective prop being in the assembled state, whether the viewing-angle range of the controlled virtual object in the virtual scene contains an occluded interactive prop, the perspective prop being used to display occluded interactive props in a perspective manner;
  • the detection module is further configured to detect, in response to the occluded interactive prop being within the viewing-angle range, whether the occluded interactive prop meets a perspective condition, the perspective condition indicating the condition under which the occluded interactive prop is visible through the perspective prop;
  • the perspective display module is configured to display the occluded interactive props in a perspective manner in the virtual scene in response to the occluded interactive props meeting the perspective condition.
  • the see-through display module includes:
  • the outline display unit is configured to display the outline of the occluded interactive prop on the target object in the virtual scene, and the target object is an object that blocks the interactive prop.
  • the display contour unit is used to:
  • the display state of the target component is set to the occlusion culling state, and the rendering mode of the target component is set to allow the perspective effect.
  • the see-through display module is used for:
  • On the target object in the virtual scene, highlighting the mapping area where the occluded interactive prop is mapped onto the target object, the target object being the object that occludes the interactive prop.
  • The perspective condition is that the components of the prefab of the occluded interactive prop include at least one target component whose material is not a semi-transparent material.
  • the detection module is used to:
  • the device further includes:
  • The canceling display module is configured to cancel the display of the occluded interactive prop in the virtual scene in response to the interactive attribute value of the controlled virtual object or of the occluded interactive prop being lower than an attribute threshold.
  • the display contour unit is further used for:
  • The outline (stroke) special effect corresponds to at least one display parameter among edge color, stroke width, glow intensity, glow range, and stroke type.
  • In one aspect, a terminal is provided, including one or more processors and one or more memories, the one or more memories storing at least one piece of program code.
  • The at least one piece of program code is loaded and executed by the one or more processors to implement the operations performed by the interactive prop display method in any of the above possible implementations.
  • In one aspect, a storage medium is provided, storing at least one piece of program code, the at least one piece of program code being loaded and executed by a processor to implement the operations performed by the interactive prop display method in any of the above possible implementations.
  • a computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the interactive prop display method provided in the various optional implementations of the foregoing aspects.
  • When the viewing-angle range of the controlled virtual object in the virtual scene contains an occluded interactive prop, and the occluded interactive prop meets the perspective condition, the occluded interactive prop can be displayed in a perspective manner in the virtual scene, so that the controlled virtual object can see the occluded interactive prop through the obstacle. This expands the controlled virtual object's ability to obtain information, giving virtual objects holding different interactive props a more balanced survival environment, which enhances the fun of the shooting game provided by the terminal, enriches its interaction methods, and optimizes its interaction effects.
  • At the same time, it allows players to identify possible threats in the virtual environment and adopt corresponding battle strategies, encouraging players to control their virtual objects to participate in battle. This improves the usage rate of the controlled virtual object's props, thereby effectively controlling the duration of a single game match and further reducing the processing pressure on the server.
  • FIG. 1 is a schematic diagram of an implementation environment of an interactive prop display method provided by an embodiment of the present application
  • FIG. 2 is a flowchart of a method for displaying interactive props according to an embodiment of the present application
  • Fig. 3 is a schematic diagram of a prop assembly interface provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of an interface for displaying interactive props in a perspective manner according to an embodiment of the present application
  • FIG. 5 is a schematic diagram of an interface for displaying interactive props in a perspective manner according to an embodiment of the present application
  • FIG. 6 is a schematic diagram of an interface for displaying interactive props in a perspective manner according to an embodiment of the present application
  • FIG. 7 is a schematic diagram of an interface for displaying interactive props in a perspective manner according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an interface for displaying interactive props in a perspective manner according to an embodiment of the present application.
  • FIG. 9 is a schematic flowchart of an interactive prop display method provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an interactive prop display device provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
  • The term "at least one" refers to one or more, and "multiple" refers to two or more; for example, multiple first positions refers to two or more first positions.
  • Virtual scene: the virtual scene displayed (or provided) when the application runs on the terminal.
  • the virtual scene may be a simulation environment of the real world, a semi-simulation and semi-fictional virtual environment, or a purely fictitious virtual environment.
  • the virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the embodiment of the present application does not limit the dimensions of the virtual scene.
  • the virtual scene may include the sky, land, sea, etc.
  • the land may include environmental elements such as deserts and cities, and the user can control the virtual objects to move in the virtual scene.
  • Virtual object refers to the movable object in the virtual scene.
  • The movable object may be a virtual character, a virtual animal, a cartoon character, etc., such as a character, animal, plant, oil barrel, wall, or stone displayed in the virtual scene.
  • the virtual object may be a virtual avatar used to represent the user in the virtual scene.
  • the virtual scene may include multiple virtual objects, and each virtual object has its own shape and volume in the virtual scene and occupies a part of the space in the virtual scene.
  • The virtual object may be a player character controlled through operations on the client, an artificial intelligence (AI) configured in the virtual-scene battle through training, or a non-player character (NPC) configured in the virtual scene for interaction.
  • the virtual object may be a virtual character competing in a virtual scene.
  • the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
  • The user can control the virtual object to fall freely in the sky of the virtual scene, to glide, or to open a parachute to fall; to run, jump, crawl, or bend forward on land; and to swim, float, or dive in the ocean.
  • the user can also control the virtual object to take a virtual vehicle to move in the virtual scene.
  • the virtual vehicle can be a virtual car, a virtual aircraft, a virtual yacht, etc.
  • Users can also control virtual objects to interact with other virtual objects through interactive props.
  • The interactive props can be throwing virtual weapons such as grenades, cluster mines, sticky grenades ("sticky bombs"), and laser tripwire mines; shooting virtual weapons such as machine guns, pistols, rifles, and sentry guns; or summoned virtual soldiers (such as mechanical zombies). This application does not specifically limit the types of interactive props.
  • FIG. 1 is a schematic diagram of an implementation environment of a method for displaying interactive props according to an embodiment of the present application.
  • the implementation environment includes: a first terminal 120, a server 140 and a second terminal 160.
  • the first terminal 120 installs and runs an application program supporting virtual scenes.
  • The application can be any one of a first-person shooting game (FPS), a third-person shooting game, a multiplayer online battle arena game (MOBA), a virtual reality application, a three-dimensional map program, a military simulation program, or a multiplayer gun-battle survival game.
  • the first terminal 120 may be a terminal used by the first user.
  • The first user uses the first terminal 120 to operate the first virtual object in the virtual scene to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the first virtual object is a first virtual character, such as a simulated character or an animation character.
  • the first terminal 120 and the second terminal 160 are connected to the server 140 through a wireless network or a wired network.
  • the server 140 may include at least one of one server, multiple servers, a cloud computing platform, or a virtualization center.
  • the server 140 is used to provide background services for applications supporting virtual scenes.
  • the server 140 can take on the main calculation work, and the first terminal 120 and the second terminal 160 can take on the secondary calculation work; or the server 140 can take on the secondary calculation work, and the first terminal 120 and the second terminal 160 can take on the main calculation work.
  • the second terminal 160 installs and runs an application program supporting virtual scenes.
  • the application can be any of FPS, third-person shooter games, MOBA, virtual reality applications, three-dimensional map programs, military simulation programs, or multiplayer gun battle survival games.
  • the second terminal 160 may be a terminal used by a second user.
  • The second user uses the second terminal 160 to operate the second virtual object in the virtual scene to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the second virtual object is a second virtual character, such as a simulated character or an animation character.
  • the first virtual object controlled by the first terminal 120 and the second virtual object controlled by the second terminal 160 are in the same virtual scene.
  • the first virtual object can interact with the second virtual object in the virtual scene.
  • the first virtual object and the second virtual object may be in a hostile relationship.
  • The first virtual object and the second virtual object may belong to different teams or organizations, and virtual objects in a hostile relationship may interact in battle mode by shooting at each other on land.
  • The first terminal 120 controls the first virtual object to place an interactive prop at a position occluded by a target object in the virtual scene.
  • The second terminal 160 is equipped with the perspective prop before the game match starts. After the match starts, when the second terminal 160 controls the second virtual object to move in the virtual scene, if the viewing-angle range of the second virtual object contains an occluded interactive prop, the second terminal 160 detects whether the occluded interactive prop meets the perspective condition; when it does, the occluded interactive prop is displayed in a perspective manner in the virtual scene, so that the second virtual object can observe the interactive props ambushed by the first virtual object in the match. This increases the interest of the game process and brings richer interaction methods and more varied interaction effects.
  • In other embodiments, the first virtual object and the second virtual object may be teammates. For example, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication authority.
  • the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms.
  • the first terminal 120 may generally refer to one of multiple terminals
  • the second terminal 160 may generally refer to one of multiple terminals. This embodiment only uses the first terminal 120 and the second terminal 160 as examples.
  • The device types of the first terminal 120 and the second terminal 160 are the same or different. The device types include: smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, desktop computers, vehicle-mounted terminals, etc.
  • the first terminal 120 and the second terminal 160 may be smart phones or other handheld portable game devices.
  • The following embodiments take a smartphone as an example of the terminal.
  • The number of the aforementioned terminals may be greater or smaller; for example, there may be only one terminal, or there may be dozens or hundreds of terminals, or more.
  • the embodiments of the present application do not limit the number of terminals and device types.
  • the foregoing implementation environment can be built on a blockchain system, which is a new application mode of computer technology such as distributed data storage, peer-to-peer transmission, consensus mechanism, and encryption algorithm.
  • Blockchain, essentially a decentralized database, is a series of data blocks linked using cryptographic methods; each data block contains a batch of network transaction information, used to verify the validity of the information (anti-counterfeiting) and to generate the next block.
  • Both the first terminal 120 and the second terminal 160 may be node devices in the blockchain system, so that whenever any node device controls an interactive prop through the application and generates interaction data, it can upload the interaction data to the blockchain system for persistent storage. Due to the immutability of the blockchain system, storing interaction data there provides higher security.
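As a brief illustration of why a chain of hash-linked blocks makes the stored interaction data tamper-evident, here is a minimal hash-chain sketch. It is illustrative only, not the application's actual implementation; each block commits to the previous block's hash, so editing any earlier record invalidates every later link.

```python
import hashlib
import json

def make_block(prev_hash, records):
    """Build a block whose hash covers both its records and the previous hash."""
    payload = json.dumps({"prev": prev_hash, "records": records}, sort_keys=True)
    return {"prev": prev_hash, "records": records,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def chain_valid(chain):
    """A chain is valid if every block's hash matches its contents and
    every block points at the previous block's actual hash."""
    for i, block in enumerate(chain):
        payload = json.dumps({"prev": block["prev"], "records": block["records"]},
                             sort_keys=True)
        if hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block("0" * 64, [{"event": "prop_placed", "prop": "sentry_gun"}])
chain = [genesis, make_block(genesis["hash"], [{"event": "prop_picked_up"}])]
print(chain_valid(chain))                # True
chain[0]["records"][0]["prop"] = "mine"  # tamper with an earlier record
print(chain_valid(chain))                # False: the stored hash no longer matches
```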
  • Fig. 2 is a flowchart of a method for displaying interactive props according to an embodiment of the present application.
  • this embodiment uses the method applied to a terminal as an example.
  • the terminal may be the first terminal 120 or the second terminal 160 shown in FIG. 1.
  • This embodiment includes the following steps:
  • the terminal starts an application, and displays the prop assembly interface in the application.
  • the application program may be any application program that can support virtual scenes.
  • the application program may be a game client on the terminal, or it may be an embedded applet built in a certain application client.
  • The application program may be any one of FPS, third-person shooter games, MOBA, virtual reality applications, three-dimensional map programs, military simulation programs, or multiplayer gun-battle survival games; the embodiment of the present application does not specifically limit this.
  • The terminal can start the application in response to the user's startup operation on the application. The startup operation may be the user touching the application's icon on the desktop, or the user inputting a startup instruction for the application to an intelligent voice assistant; the startup instruction may be a voice instruction or a text instruction, and the embodiment of the present application does not specifically limit the type of startup instruction.
  • In some embodiments, when the user sets an automatic startup condition for the application, the terminal may automatically start the application upon detecting that the automatic startup condition is met.
  • The automatic startup condition may be starting the application periodically, for example at 8 o'clock every night, or starting the application automatically after the terminal boots.
  • the embodiment of the present application does not specifically limit the automatic start condition of the application program.
  • the prop assembly interface can be used to provide the user with the target props used in the virtual scene.
  • The target prop refers to a passive, persistent virtual prop that can assist the user in battle during the game match; therefore, a target prop can vividly be called a "skill chip".
  • Target props usually include three types: offensive props, defensive props, and auxiliary props. Users can choose personalized target props in the prop assembly interface according to their favorite or habitual combat style.
  • The target props can be assembled by the user before the game match starts. Since a target prop can vividly be called a skill chip, the process of assembling skill chips in the prop assembly interface before the match can likewise vividly be called adding skill points on a skill tree (perk). Of course, in some embodiments, the target prop may also be assembled by the user after the game match starts; the embodiment of the present application does not specifically limit the assembly timing of the target prop.
  • In response to the user's assembly operation on the perspective prop in the prop assembly interface, the terminal sets the perspective prop to the assembled state.
  • The perspective prop is used to display occluded interactive props in a perspective manner, which is equivalent to expanding the user's ability to obtain information in the game match, so it can be called an "engineer skill chip".
  • The occluded interactive props may be interactive props of the friendly camp or of the enemy camp.
  • In the embodiments of the present application, the occluded interactive props are taken as interactive props of the enemy camp as an example, but this should not constitute a specific limitation on the camp of the occluded interactive props.
  • the assembly options of the perspective props and other target props can be displayed in the prop assembly interface.
  • In response to the assembly operation, the terminal sets the perspective prop to the assembled state.
  • The terminal can display the perspective prop in the assembled state in a display mode different from that of the other target props, for example by changing the background color of the perspective prop's assembly option, or by highlighting the outline of the perspective prop's assembly option.
  • similar operations can be used to complete the assembly process for other target props.
  • Fig. 3 is a schematic diagram of a prop assembly interface provided by an embodiment of the present application. Referring to Fig. 3, a plurality of target props 301 to 306 are provided for selection.
  • The perspective prop 301 can be displayed in a way different from the other, unselected target props. For example, a check mark can be added to the display area of the perspective prop 301, or the background color of the display area of the prop 301 can be changed to a background color different from that of the other target props.
  • The above steps 201-202 show a possible implementation of assembling the perspective prop, in which the user autonomously selects and assembles the perspective prop in the prop assembly interface.
  • In other embodiments, a falling animation of the perspective prop can also be displayed in the virtual scene. When the distance between the controlled virtual object and the perspective prop is less than a target threshold, a pick-up option for the perspective prop is displayed in the virtual scene; in response to the user's triggering operation on the pick-up option, the terminal controls the controlled virtual object to pick up and assemble the perspective prop. This adds a transfer-and-pick-up process for the perspective prop within the game match, further increasing the fun of the shooting game and enriching its interaction methods.
  • In response to the perspective prop being in the assembled state, the terminal detects whether the viewing-angle range of the controlled virtual object in the virtual scene contains an occluded interactive prop.
  • In some embodiments, the terminal may first detect whether an interactive prop is included in the viewing-angle range of the controlled virtual object. In response to an interactive prop being included in the viewing-angle range, the terminal obtains the distance between the controlled virtual object and the interactive prop; in response to the distance being less than a distance threshold, the terminal then detects whether a target object exists between the controlled virtual object and the interactive prop, the target object being an object that occludes the interactive prop. In response to a target object being present between the controlled virtual object and the interactive prop, the terminal determines that an occluded interactive prop is included in the viewing-angle range; otherwise, as long as any of the above conditions is not satisfied, the terminal determines that no occluded interactive prop is included in the viewing-angle range.
  • the distance threshold may be any value greater than or equal to 0, and the embodiment of the present application does not specifically limit the value of the distance threshold.
  • the terminal can determine the viewing angle range by detecting the interactive props, detecting the distance between the interactive props and the controlled virtual object, and detecting the target object between the interactive props and the controlled virtual object Whether there are interactive props that are occluded by the target object, so as to quickly and accurately locate the objects that the perspective props can see through (that is, the occluded interactive props).
  • the terminal may first detect whether the controlled virtual object and the interactive props contain the target object. If the controlled virtual object and the interactive props contain the target object, Then check whether the distance between the controlled virtual object and the interactive props is less than the distance threshold. When the distance is less than the distance threshold, the terminal determines that the blocked interactive prop is included in the viewing angle range. Otherwise, as long as any of the above conditions is not met, the terminal Make sure that no hidden interactive props are included in the viewing angle range.
• Alternatively, the terminal may skip the step of detecting the distance between the controlled virtual object and the interactive prop; that is, once it detects that an interactive prop is included in the viewing angle range, it directly detects whether a target object exists between the controlled virtual object and the interactive prop. If a target object exists between them, the terminal determines that the occluded interactive prop is included in the viewing angle range; otherwise, as long as any of the above conditions is not met, the terminal determines that no occluded interactive prop is included in the viewing angle range. In this case, the distance range that the perspective prop can see through is effectively unlimited, which can improve the perspective effect of the perspective prop and simplify the detection operation.
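The layered checks described above can be sketched as follows. This is a minimal illustration only: the helper predicates (`in_view`, `distance`, `occluder_between`) are hypothetical stand-ins for engine queries, not any real API named in the embodiment.

```python
# Hedged sketch of the occlusion-detection flow: an interactive prop counts as
# "occluded within the viewing angle range" only if every enabled check passes.
# All helper callables are hypothetical stand-ins for engine-side queries.

def occluded_prop_in_view(controlled, prop, distance_threshold=None,
                          in_view=None, distance=None, occluder_between=None):
    if not in_view(controlled, prop):              # check 1: within viewing angle
        return False
    if distance_threshold is not None:             # check 2: optional distance cap
        if distance(controlled, prop) >= distance_threshold:
            return False
    # check 3: a target object (occluder) lies between the object and the prop
    return occluder_between(controlled, prop)
```

Passing `distance_threshold=None` corresponds to the variant above in which the distance check is skipped and the see-through range is unbounded.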
• If the perspective prop is assembled before the game match starts, the terminal performs the detection operation in step 203 at the beginning of the game match; if the perspective prop is assembled after the game match starts, the terminal performs the detection operation in step 203 at the moment the assembly of the perspective prop is completed. The embodiment of the present application does not specifically limit the execution time of the detection operation.
• In some embodiments, the perspective prop has certain prop use conditions; that is, the perspective prop is set to a usable state only after the controlled virtual object meets preset conditions.
• The prop use conditions can be pre-configured by the developer. For example, the prop use conditions set for the perspective prop are: the consecutive-kill reward reaches a preset score, or the number of consecutive kills reaches a preset number, etc.
• The use conditions of the perspective prop are illustrated with an example: the user completes the assembly of the perspective prop for the controlled virtual object on the prop assembly interface and then controls the controlled virtual object to enter the game; at this time, the perspective prop is in an unavailable state. Assuming the preset number is 5, when the number of consecutive kills reaches 5, the terminal determines that the prop use conditions of the perspective prop are met and sets the use state of the perspective prop to the available state.
  • the prop display control corresponding to the perspective virtual prop can be displayed in the game interface.
• While the perspective prop is unavailable, the prop display control is displayed in gray or white; once the use state of the perspective prop changes to available, the prop display control is highlighted to promptly remind the user of the change in the use state of the perspective prop.
• In response to the occluded interactive prop being included in the viewing angle range, the terminal detects whether the occluded interactive prop meets a perspective condition, the perspective condition indicating the condition under which the occluded interactive prop is visible relative to the perspective prop.
• The perspective condition may be that at least one component of the prefab of the occluded interactive prop contains a target component whose material is not a semi-transparent material. That is, as long as any target component whose material is not a semi-transparent material exists among the at least one component, the occluded interactive prop is visible relative to the perspective prop; in other words, the occluded interactive prop can be detected by the perspective prop. Conversely, if the materials of all of the at least one component are semi-transparent materials, the occluded interactive prop is not visible to the perspective prop and cannot be detected by it.
• The terminal may obtain the renderer (Renderer) of each of the at least one component of the prefab (Prefab) of the occluded interactive prop, save the renderers in an array, and traverse the array, judging whether the material (Material) of each renderer is a semi-transparent material. If the materials of all renderers are semi-transparent materials, the terminal determines that the occluded interactive prop does not meet the perspective condition, which means no outline effect is added to the occluded interactive prop. Otherwise, if there is at least one renderer whose material is not a semi-transparent material, the terminal determines that the occluded interactive prop meets the perspective condition and can add the outline effect to the occluded interactive prop.
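The traversal just described reduces to an "any non-semi-transparent material" test over the prefab's renderers. A minimal sketch, using plain booleans (`True` = semi-transparent) in place of real renderer/material objects:

```python
# Sketch of the perspective-condition check over a prefab's renderers.
# renderer_materials: one boolean per component's renderer,
# True meaning its material is semi-transparent.

def meets_perspective_condition(renderer_materials):
    # The prop is detectable as soon as any component's material
    # is NOT semi-transparent.
    return any(not semi for semi in renderer_materials)

def outline_targets(renderer_materials):
    # Indices of the target components that should receive the outline effect.
    return [i for i, semi in enumerate(renderer_materials) if not semi]
```

This also covers the mixed case mentioned below: for a prop whose components are partly semi-transparent, `outline_targets` picks out exactly the non-semi-transparent components.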
• Some interactive props may have some components made of semi-transparent materials and other components that are not; in this case, the terminal obtains the outlines of the components that are not semi-transparent materials and adds the outline effect to them.
• Optionally, the terminal may also set the perspective condition so that all components of the prefab of the occluded interactive prop are the above-mentioned target components; that is, only when it is determined that the materials of all renderers corresponding to a certain interactive prop are not semi-transparent materials is the interactive prop determined to meet the perspective condition. In this way, the perspective condition of interactive props can be controlled more strictly.
• Optionally, the terminal may also set the perspective condition so that the occluded interactive prop hits any prop in a see-through list; that is, only props in the see-through list can be detected by the perspective prop. The see-through list can include consecutive-kill reward items such as anti-aircraft guns, sentry machine guns, and armed helicopters; soldier skills and tactical items such as explosion-proof devices, laser trip mines, and throwing axes; and virtual vehicles such as aerial gunboats. The embodiment of this application does not specifically limit the content of the see-through list.
• The see-through list can be preset by the developer and delivered to and stored in the terminal, so that the terminal can subsequently determine, according to the see-through list, whether an interactive prop meets the perspective condition.
  • the see-through list contains 10 interactive props, and each interactive prop corresponds to a preset score or a preset number.
• When the user assembles the perspective prop and enters the game, the terminal records the number of virtual objects (or enemy virtual objects) consecutively killed by the controlled virtual object after entering the game. If this number reaches the preset number corresponding to an interactive prop, the interactive prop in the see-through list is unlocked and is correspondingly determined to meet the perspective condition. Adding this way of unlocking interactive props in the see-through list raises the threshold and authority for interactive props to meet the perspective condition, preventing the perspective prop from being abused and threatening the fairness of battles between different virtual objects.
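The kill-count unlocking scheme can be sketched as a simple threshold lookup. The list contents and the threshold values below are illustrative assumptions, not values stated in the embodiment:

```python
# Sketch of unlocking see-through-list entries by consecutive-kill count.
# Prop names and preset numbers are hypothetical examples.

SEE_THROUGH_LIST = {
    "sentry_machine_gun": 4,    # preset number of consecutive kills
    "anti_aircraft_gun": 7,
    "armed_helicopter": 10,
}

def unlocked_props(consecutive_kills):
    """Props whose perspective condition is unlocked at this kill count."""
    return {name for name, need in SEE_THROUGH_LIST.items()
            if consecutive_kills >= need}
```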
• In response to the occluded interactive prop meeting the perspective condition, the terminal displays the occluded interactive prop in the virtual scene in a perspective manner.
• When displaying the occluded interactive prop in perspective, the terminal may perform the following step: the terminal displays the outline of the occluded interactive prop on the target object in the virtual scene, the target object being the object that occludes the interactive prop.
• When displaying the outline of the occluded interactive prop, the terminal may determine, among the at least one component of the prefab of the occluded interactive prop, the target components whose material is not a semi-transparent material, set the display state of each target component to the occlusion culling state, and set the rendering mode of the target component to allow the perspective effect.
• In this way, the outline of the occluded interactive prop can be displayed on the target object, which accurately adds the outline effect to the interactive prop.
• The terminal obtains the renderers (Renderer) of all components on the prefab (Prefab) of the occluded interactive prop through the above step 204. From these, the terminal can obtain the renderers of the target components whose material is not a semi-transparent material, set the display state (passType) of each such renderer to the occlusion culling state (ForwardPass), and set the rendering mode of the renderer to allow the perspective effect.
  • the above setting method for the renderer can be represented by the following code:
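The code listing itself is not reproduced in this extract. As a rough, non-authoritative sketch of the setup just described, plain Python records below stand in for the renderer objects; the `passType`/`ForwardPass` names mirror the terms used in the description and are not a real engine API:

```python
# Rough stand-in for the renderer setup described above. Each renderer is a
# plain dict; "ForwardPass" and the allow-perspective flag mirror the terms
# in the description, not actual engine fields.

def enable_outline(renderers):
    for r in renderers:
        if not r["semi_transparent"]:          # only non-semi-transparent targets
            r["passType"] = "ForwardPass"      # display state: occlusion culling
            r["allow_perspective"] = True      # rendering mode: allow see-through
    return renderers
```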
• FIG. 4 is a schematic diagram of an interface for displaying interactive props in perspective according to an embodiment of the present application. Please refer to FIG. 4. The controlled virtual object is equipped with the perspective prop. Because there is an occluded virtual gunship 401 within the viewing angle range of the controlled virtual object, and the virtual gunship 401 meets the perspective condition, the outline of the virtual gunship 401 is displayed on the ceiling 402 that occludes it (that is, on the target object).
• Because the virtual armed helicopter 401 has great lethality, it can be issued in a triggered manner as a consecutive-kill reward item.
• FIG. 5 is a schematic diagram of an interface for displaying interactive props in a perspective manner according to an embodiment of the present application. Please refer to FIG. 5. The perspective prop has been assembled. Because there is an occluded anti-aircraft gun 501 within the field of view of the controlled virtual object, and the anti-aircraft gun 501 meets the perspective condition, the outline of the anti-aircraft gun 501 is displayed on the wall 502 (that is, the target object) that occludes it.
• FIG. 6 is a schematic diagram of an interface for displaying interactive props in a perspective manner provided by an embodiment of the present application. Please refer to FIG. 6. In the virtual scene 600, taking the occluded interactive prop being a sentry machine gun as an example, the controlled virtual object is equipped with the perspective prop. Because the occluded sentry machine gun 601 is within the viewing angle of the controlled virtual object and meets the perspective condition, the outline of the sentry machine gun 601 is displayed on the cover 602 that occludes it (that is, on the target object).
• FIG. 7 is a schematic diagram of an interface for displaying interactive props in a perspective manner provided by an embodiment of the present application. Please refer to FIG. 7.
  • the explosion-proof device here refers to an interactive prop used to prevent throwing virtual weapons (such as grenade) from exploding.
• FIG. 8 is a schematic diagram of an interface for displaying interactive props in a perspective manner provided by an embodiment of the present application. Please refer to FIG. 8. The outline of the virtual vehicle 801 is displayed on the wall 802 (that is, the target object) that occludes it.
• Through the above process, the outline of the occluded interactive prop can be obtained, and an outline effect is added to the interactive prop according to the outline; in this way, the outline of the occluded interactive prop is displayed on the target object.
• In the process of adding the outline effect to the occluded interactive prop, the terminal can finely configure display parameters such as the outline color, the near-range outline width, the far-range outline width, the near-to-far transition distance, and the luminous intensity.
  • all the display parameters can be set to the default value.
• Regarding the outline color of the outline effect: because the outline effect is displayed on the target object (that is, the opaque obstacle between the interactive prop and the controlled virtual object), an outline color that is easy to distinguish from the target object makes it easier for the controlled virtual object (user) to find the interactive prop. Optionally, the terminal can obtain the target color of the target object and use a color different from the target color (or with a large gap from it) as the outline color of the outline effect, so as to avoid the situation where the outline color is so close to the color of the target object that the controlled virtual object ignores the outline effect, which would impair the function of the perspective prop.
• Regarding the outline width of the outline effect: the outline width can be set to be positively correlated with the distance between the interactive prop and the controlled virtual object; that is, the farther the interactive prop is from the controlled virtual object, the wider its outline is set, so that the user can perceive the outline of the interactive prop more clearly, thereby improving the clarity of the outline relative to the user.
• For example, when the distance between the interactive prop and the controlled virtual object is 10 m, the relative value of the outline width of the interactive prop's outline effect is set to 1; when the distance is 50 m, the relative value of the outline width is set to 10. The larger the relative value of the outline width, the wider the actual outline.
• Similarly, the luminous intensity can be set to be positively correlated with the distance between the interactive prop and the controlled virtual object; that is, the farther the interactive prop is from the controlled virtual object, the stronger the luminous intensity of the outline effect; conversely, the closer the distance, the weaker the luminous intensity. In this way, even when the interactive prop is far from the controlled virtual object, its outline still has good distinguishability and clarity, further optimizing the function of the perspective prop.
• In the above process, the relative state between the controlled virtual object and the interactive prop can be obtained to adjust the display parameters of the interactive prop's outline effect, for example, adjusting the outline width and luminous intensity according to the relative distance, which improves the clarity and distinguishability of the outline effect relative to the user and further optimizes the function of the perspective prop.
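The distance-dependent width can be sketched from the two anchors given above (10 m → relative width 1, 50 m → relative width 10); interpolating linearly between the anchors and clamping outside them is an assumption of this sketch, since the embodiment only requires a positive correlation:

```python
# Sketch of a distance-dependent outline width. The (10 m, 1) and (50 m, 10)
# anchor points come from the example above; linear interpolation with
# clamping is an illustrative choice, not mandated by the description.

def outline_width(distance_m, near=(10.0, 1.0), far=(50.0, 10.0)):
    d0, w0 = near
    d1, w1 = far
    t = (distance_m - d0) / (d1 - d0)
    t = min(max(t, 0.0), 1.0)        # clamp outside the [10 m, 50 m] range
    return w0 + t * (w1 - w0)
```

The luminous intensity described above could be mapped from distance in exactly the same way, with its own anchor values.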
• In some embodiments, when displaying the occluded interactive prop in perspective, the terminal may perform the following step: on the target object in the virtual scene, the terminal highlights the mapping area where the occluded interactive prop is mapped on the target object, the target object being the object that occludes the interactive prop.
• By highlighting the mapping area of the occluded interactive prop on the target object, the terminal effectively highlights the position of the occluded interactive prop on the target object, which can vividly be referred to as adding a "sticker special effect" to the occluded interactive prop, enabling users to intuitively observe hidden interactive props.
• Optionally, the terminal may add a graphic sticker to the mapping area, play a perspective animation in the mapping area, display text prompt information in the mapping area, or display the occluded interactive prop in the mapping area in a perspective holographic imaging manner. The embodiment of the present application does not specifically limit the manner of highlighting the mapping area.
• Optionally, the terminal can also set different perspective display modes for different types of interactive props. For example, when the interactive prop is a virtual vehicle, only the outline of the virtual vehicle is displayed on the target object, while for a virtual weapon, the mapping area where the virtual weapon is mapped on the target object is highlighted. This provides a richer variety of perspective display modes.
• In response to the interactive attribute value of the controlled virtual object or the occluded interactive prop being lower than an attribute threshold, the terminal cancels displaying the occluded interactive prop in the virtual scene.
  • the terminal can set interactive attribute values for interactive props or controlled virtual objects.
  • the interactive attribute values can be virtual blood volume, virtual integrity, virtual health, etc.
• During interaction, for example when the interactive prop or the controlled virtual object is attacked, the terminal can deduct a certain value from its interactive attribute value. In response to the interactive attribute value being lower than the attribute threshold, the terminal needs to turn off the perspective effect of the occluded interactive prop, that is, cancel the display of the occluded interactive prop in the virtual scene.
• Optionally, the terminal sets the renderer of the occluded interactive prop to an inactive (deactivated) state.
• In addition, when the terminal detects that the interactive prop has left the field of view of the controlled virtual object, or that, because the controlled virtual object has moved, there is no longer a target object (obstacle) between the interactive prop and the controlled virtual object, the terminal also needs to turn off the perspective effect of the occluded interactive prop, that is, cancel the outline effect of the interactive prop.
• Through the method provided by the embodiment of the present application, under the condition that the perspective prop is in the assembled state, if it is detected that the viewing angle range of the controlled virtual object in the virtual scene contains an occluded interactive prop and the occluded interactive prop meets the perspective condition, the occluded interactive prop can be displayed in perspective in the virtual scene, so that the controlled virtual object can see the occluded interactive prop through the obstacle. This expands the controlled virtual object's ability to obtain information, makes the living environment of virtual objects holding different interactive props more balanced, enhances the fun of the shooting game provided by the terminal, enriches the interaction methods of the shooting game, and optimizes the interaction effect of the shooting game. At the same time, players can identify potentially threatening interactive props in the complex virtual environment and plan their action routes based on the locations of those interactive props, which shortens the time users need to control virtual objects to move through the complex environment and increases the movement speed of virtual objects in the virtual environment, thereby reducing the duration of a single game match and saving terminal resources.
• FIG. 9 is a schematic flowchart of an interactive prop display method provided by an embodiment of the present application. Taking the perspective prop being a perspective detection skill chip as an example, the flowchart 900 includes the following steps:
• Step 1: The user selects a perk outside the game match. For example, the user can select the perspective detection skill chip on the perk chip skill page.
• Step 2: The game starts. The user has completed assembling the perspective detection skill chip for the controlled virtual object outside the game; correspondingly, when the controlled virtual object enters the game, it carries the perspective detection skill chip.
• Step 3: Determine whether the controlled virtual object (the virtual object currently controlled by the user) is facing an enemy device; if it is facing an enemy device, perform step 4; otherwise, end the process.
  • the terminal judges whether interactive props are included in the viewing angle of the controlled virtual object.
• Optionally, the terminal also needs to determine whether the distance between the controlled virtual object and the interactive prop is less than the distance threshold; that is, the controlled virtual object cannot be too far from the interactive prop, otherwise the interactive prop cannot be seen through. In addition, the terminal needs to confirm that the interactive attribute value of the controlled virtual object is higher than the attribute threshold, that is, that the controlled virtual object is in a living state.
• Step 4: Determine whether there is a view-blocking obstacle (that is, a target object) between the enemy device and the controlled virtual object. If there is such an obstacle, perform step 5; otherwise, end the process.
• Step 5: Determine whether the material of the enemy device can be displayed in perspective; if it can, perform step 6; otherwise, end the process.
• Step 6: Display the perspective effect for the enemy device.
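Steps 3 through 6 form a simple decision chain: the perspective effect is shown for an enemy device only if every check passes. A minimal sketch, with the three checks passed in as precomputed booleans (hypothetical stand-ins for the terminal-side detections):

```python
# Sketch of the flowchart 900 decision chain (steps 3-6). The three inputs
# stand in for the terminal's facing, occlusion, and material checks.

def should_show_perspective(facing_enemy_device, obstacle_between,
                            material_displayable):
    if not facing_enemy_device:       # step 3: facing an enemy device?
        return False
    if not obstacle_between:          # step 4: occluding obstacle present?
        return False
    if not material_displayable:      # step 5: material allows perspective?
        return False
    return True                       # step 6: show the perspective effect
```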
• Through the method provided by the embodiment of the present application, the user can detect the locations of enemy devices through the perspective detection skill chip, thereby seeing through the enemy's combat layout, avoiding its strengths, and choosing a suitable attack method so that the enemy's combat strategy is defeated. This can improve the living environment of the disadvantaged side in an unbalanced game, provide more interesting gameplay for the entire shooting game, and bring users a better gaming experience; at the same time, it allows players to identify potential threats in the virtual environment.
• FIG. 10 is a schematic structural diagram of an interactive prop display device provided by an embodiment of the present application. Please refer to FIG. 10. The device includes:
• the detection module 1001, configured to detect, in response to the perspective prop being in the assembled state, whether the viewing angle range of the controlled virtual object in the virtual scene contains an occluded interactive prop, the perspective prop being used to display occluded interactive props in a perspective manner;
• the detection module 1001 being further configured to detect, in response to the occluded interactive prop being within the viewing angle range, whether the occluded interactive prop meets the perspective condition, the perspective condition indicating the condition under which the occluded interactive prop is visible relative to the perspective prop;
• the perspective display module 1002, configured to display, in response to the occluded interactive prop meeting the perspective condition, the occluded interactive prop in a perspective manner in the virtual scene.
• With the device provided by the embodiment of the present application, under the condition that the perspective prop is in the assembled state, if it is detected that the viewing angle range of the controlled virtual object in the virtual scene contains an occluded interactive prop and the occluded interactive prop meets the perspective condition, the occluded interactive prop can be displayed in perspective in the virtual scene, so that the controlled virtual object can see the occluded interactive prop through the obstacle. This expands the controlled virtual object's ability to obtain information, makes the living environment of virtual objects holding different interactive props more balanced, enhances the fun of the shooting game provided by the terminal, enriches the interaction methods of the shooting game, and optimizes the interaction effect of the shooting game. At the same time, players can identify threatening interactive props that may exist in the virtual environment and adopt a corresponding battle strategy based on the locations of those interactive props, which improves the survival rate of virtual objects while prompting players to control virtual objects to actively participate in battle, thereby increasing the use rate of interactive props by controlled virtual objects, which can effectively control the duration of a single game match.
  • the see-through display module 1002 includes:
  • the outline display unit is used to display the outline of the occluded interactive prop on the target object in the virtual scene, and the target object is an object that occludes the interactive prop.
  • the display contour unit is used to:
  • the see-through display module 1002 is used to:
• on the target object in the virtual scene, highlight the mapping area where the occluded interactive prop is mapped on the target object, the target object being the object that occludes the interactive prop.
  • the perspective condition is that at least one component of the prefabricated part of the occluded interactive prop includes a target component whose material is not a semi-transparent material.
  • the detection module 1001 is used to:
• obtain the distance between the controlled virtual object and the interactive prop; in response to the distance being less than the distance threshold, detect whether a target object exists between the controlled virtual object and the interactive prop, the target object being an object that occludes the interactive prop;
• in response to the target object existing between the controlled virtual object and the interactive prop, determine that the occluded interactive prop is included in the viewing angle range; otherwise, determine that the occluded interactive prop is not included in the viewing angle range.
  • the device further includes:
• the cancel display module, configured to cancel the display of the occluded interactive prop in the virtual scene in response to the interactive attribute value of the controlled virtual object or the occluded interactive prop being lower than the attribute threshold.
  • the display contour unit is further used for:
• the outline effect corresponds to at least one display parameter among the outline color, the outline width, the luminous intensity, the luminous range, and the outline type.
• When the interactive prop display device provided in the above embodiment displays interactive props, the division into the above functional modules is only used as an example for illustration. In practical applications, the above functions can be allocated to different functional modules as required; that is, the internal structure of the terminal is divided into different functional modules to complete all or part of the functions described above. In addition, the interactive prop display device provided in the above embodiment and the embodiments of the interactive prop display method belong to the same concept.
  • FIG. 11 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
• the terminal 1100 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer.
  • the terminal 1100 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
  • the terminal 1100 includes a processor 1101 and a memory 1102.
  • the processor 1101 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
• the processor 1101 can be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1101 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit, central processing unit); the coprocessor is A low-power processor used to process data in the standby state.
• the processor 1101 may be integrated with a GPU (Graphics Processing Unit), and the GPU is used for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1101 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1102 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1102 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1102 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 1101 to implement the interactive props provided in the various embodiments of the present application. Display method.
  • the terminal 1100 may optionally further include: a peripheral device interface 1103 and at least one peripheral device.
  • the processor 1101, the memory 1102, and the peripheral device interface 1103 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1103 through a bus, a signal line, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 1104, a display screen 1105, a camera component 1106, an audio circuit 1107, a positioning component 1108, and a power supply 1109.
  • the peripheral device interface 1103 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1101 and the memory 1102.
  • the processor 1101, the memory 1102, and the peripheral device interface 1103 are integrated on the same chip or circuit board; in some other embodiments, any one of the processor 1101, the memory 1102, and the peripheral device interface 1103 or The two can be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1104 is used to receive and transmit RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1104 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1104 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1104 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
  • the radio frequency circuit 1104 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes but is not limited to: World Wide Web, Metropolitan Area Network, Intranet, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area network and/or WiFi (Wireless Fidelity, wireless fidelity) network.
  • the radio frequency circuit 1104 may also include a circuit related to NFC (Near Field Communication), which is not limited in this application.
  • the display screen 1105 is used to display UI (User Interface, user interface).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1105 also has the ability to collect touch signals on or above the surface of the display screen 1105.
  • the touch signal may be input to the processor 1101 as a control signal for processing.
  • the display screen 1105 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
• In some embodiments, there may be one display screen 1105, which is provided on the front panel of the terminal 1100; in other embodiments, there may be at least two display screens 1105, which are respectively arranged on different surfaces of the terminal 1100 or adopt a folded design; in still other embodiments, the display screen 1105 may be a flexible display screen disposed on a curved or folding surface of the terminal 1100. The display screen 1105 can even be set in a non-rectangular irregular pattern, that is, a special-shaped screen.
  • the display screen 1105 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
  • the camera assembly 1106 is used to capture images or videos.
  • the camera assembly 1106 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 1106 may also include a flash.
  • the flash can be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
  • the audio circuit 1107 may include a microphone and a speaker.
  • the microphone is used to collect sound waves from the user and the environment, convert them into electrical signals, and input them to the processor 1101 for processing or to the radio frequency circuit 1104 for voice communication. For stereo collection or noise reduction, there may be multiple microphones, respectively arranged at different parts of the terminal 1100.
  • the microphone can also be an array microphone or an omnidirectional collection microphone.
  • the speaker is used to convert the electrical signal from the processor 1101 or the radio frequency circuit 1104 into sound waves.
  • the speaker can be a traditional thin-film speaker or a piezoelectric ceramic speaker.
  • when the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as distance measurement.
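The inaudible-wave distance measurement mentioned above reduces to a time-of-flight calculation. The sketch below is a rough illustration; the speed-of-sound constant and the function name are assumptions, not anything specified in this application:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at about 20 degrees Celsius

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to a reflecting object from the delay between emitting
    an inaudible pulse and hearing its echo; the pulse travels out and
    back, hence the division by two."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0
```

For example, a 10 ms round trip corresponds to roughly 1.7 m.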
  • the audio circuit 1107 may also include a headphone jack.
  • the positioning component 1108 is used to locate the current geographic location of the terminal 1100 to implement navigation or LBS (Location Based Services).
  • the positioning component 1108 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
  • the power supply 1109 is used to supply power to various components in the terminal 1100.
  • the power source 1109 may be alternating current, direct current, disposable batteries, or rechargeable batteries.
  • the rechargeable battery may support wired charging or wireless charging.
  • a wired rechargeable battery is a battery charged through a wired line
  • a wireless rechargeable battery is a battery charged through a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal 1100 further includes one or more sensors 1110.
  • the one or more sensors 1110 include, but are not limited to: an acceleration sensor 1111, a gyroscope sensor 1112, a pressure sensor 1113, a fingerprint sensor 1114, an optical sensor 1115, and a proximity sensor 1116.
  • the acceleration sensor 1111 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1100.
  • the acceleration sensor 1111 can be used to detect the components of gravitational acceleration on three coordinate axes.
  • the processor 1101 may control the display screen 1105 to display the user interface in a horizontal view or a vertical view according to the gravitational acceleration signal collected by the acceleration sensor 1111.
  • the acceleration sensor 1111 may also be used for the collection of game or user motion data.
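The horizontal/vertical view decision described above can be sketched by comparing the gravity components on the device axes. This is a minimal illustration; the axis convention and function name are assumptions rather than anything specified here:

```python
def ui_orientation(ax: float, ay: float, az: float) -> str:
    """Pick a view from gravity components (m/s^2) on the terminal's axes.

    Assumed convention: x points right and y points up when the terminal
    is upright, so gravity loads mostly onto y in portrait and mostly
    onto x in landscape.
    """
    return "landscape" if abs(ax) > abs(ay) else "portrait"
```

With the terminal held upright, `ui_orientation(0.1, -9.7, 0.3)` reports portrait, since nearly all of the roughly 9.8 m/s^2 of gravity falls on the y axis.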
  • the gyroscope sensor 1112 can detect the body direction and rotation angle of the terminal 1100, and the gyroscope sensor 1112 can cooperate with the acceleration sensor 1111 to collect the user's 3D actions on the terminal 1100. Based on the data collected by the gyroscope sensor 1112, the processor 1101 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
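The cooperation between the gyroscope and the accelerometer described above is commonly realized with a complementary filter: integrated gyroscope rates are smooth but drift over time, while accelerometer tilt is drift-free but noisy. The following is an illustrative fusion step, not something this application specifies; the blend factor is an assumed tuning value:

```python
def fuse_pitch(pitch_prev: float, gyro_rate: float,
               accel_pitch: float, dt: float,
               alpha: float = 0.98) -> float:
    """One complementary-filter step (angles in degrees, dt in seconds):
    trust the short-term gyroscope integration with weight alpha, and
    pull slowly toward the accelerometer's absolute tilt estimate."""
    gyro_estimate = pitch_prev + gyro_rate * dt  # integrate angular rate
    return alpha * gyro_estimate + (1.0 - alpha) * accel_pitch
```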
  • the pressure sensor 1113 may be arranged on the side frame of the terminal 1100 and/or the lower layer of the display screen 1105.
  • the processor 1101 performs left- and right-hand recognition or shortcut operations according to the holding signal collected by the pressure sensor 1113.
  • the processor 1101 can control the operability controls on the UI interface according to the pressure operation of the user on the display screen 1105.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
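One way the holding signal can drive left- and right-hand recognition is to compare pressure on the two side frames. The heuristic below is purely illustrative; the palm-presses-harder assumption and the margin value are not from this application:

```python
def holding_hand(left_frame: float, right_frame: float,
                 margin: float = 0.1) -> str:
    """Classify the holding hand from side-frame pressure readings
    (arbitrary units), assuming the palm side presses harder than the
    fingertips on the opposite side."""
    if right_frame - left_frame > margin:
        return "right"
    if left_frame - right_frame > margin:
        return "left"
    return "unknown"  # readings too close to call
```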
  • the fingerprint sensor 1114 is used to collect the user's fingerprint.
  • the processor 1101 identifies the user's identity based on the fingerprint collected by the fingerprint sensor 1114, or the fingerprint sensor 1114 identifies the user's identity based on the collected fingerprint. When the identity is recognized as trusted, the processor 1101 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 1114 may be provided on the front, back or side of the terminal 1100. When a physical button or a manufacturer logo is provided on the terminal 1100, the fingerprint sensor 1114 may be integrated with the physical button or the manufacturer logo.
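The trusted-identity gate on sensitive operations described above amounts to a simple authorization check; the operation names and the function below are hypothetical stand-ins for illustration:

```python
# Hypothetical names for the sensitive operations listed in the text.
SENSITIVE_OPERATIONS = {"unlock_screen", "view_encrypted_info",
                        "download_software", "pay", "change_settings"}

def authorize(identity_trusted: bool, operation: str) -> bool:
    """Allow a sensitive operation only after the fingerprint match has
    established a trusted identity."""
    return identity_trusted and operation in SENSITIVE_OPERATIONS
```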
  • the optical sensor 1115 is used to collect the ambient light intensity.
  • the processor 1101 may control the display brightness of the display screen 1105 according to the ambient light intensity collected by the optical sensor 1115. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1105 is increased; when the ambient light intensity is low, the display brightness of the display screen 1105 is decreased.
  • the processor 1101 may also dynamically adjust the shooting parameters of the camera assembly 1106 according to the ambient light intensity collected by the optical sensor 1115.
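The brightness control described above is a monotone mapping from ambient illuminance to screen brightness. The sketch below uses a logarithmic ramp between two breakpoints; the curve shape, the lux values, and the function name are illustrative assumptions (shipping devices use tuned response tables):

```python
import math

def display_brightness(lux: float, low: float = 50.0,
                       high: float = 10000.0) -> float:
    """Map ambient illuminance (lux) to a display brightness in [0, 1]:
    higher ambient light gives a brighter screen."""
    if lux <= low:
        return 0.1                       # dim floor in dark rooms
    if lux >= high:
        return 1.0                       # full brightness in sunlight
    # log-scale interpolation between the two breakpoints
    t = (math.log(lux) - math.log(low)) / (math.log(high) - math.log(low))
    return 0.1 + 0.9 * t
```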
  • the proximity sensor 1116, also called a distance sensor, is usually arranged on the front panel of the terminal 1100.
  • the proximity sensor 1116 is used to collect the distance between the user and the front of the terminal 1100.
  • when the proximity sensor 1116 detects that the distance between the user and the front of the terminal 1100 gradually decreases, the processor 1101 controls the display screen 1105 to switch from the on-screen state to the off-screen state; when the proximity sensor 1116 detects that the distance between the user and the front of the terminal 1100 gradually increases, the processor 1101 controls the display screen 1105 to switch from the off-screen state to the on-screen state.
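The two proximity cases amount to a tiny state update. The sketch below treats "gradually decreases/increases" as a simple comparison of consecutive distance samples, which is a simplifying assumption:

```python
def screen_state(prev_distance_cm: float, distance_cm: float,
                 current_state: str) -> str:
    """Turn the screen off as the user approaches the front of the
    terminal and back on as the user moves away."""
    if distance_cm < prev_distance_cm:   # user getting closer
        return "off"
    if distance_cm > prev_distance_cm:   # user moving away
        return "on"
    return current_state                 # no change in distance
```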
  • the structure shown in FIG. 11 does not constitute a limitation on the terminal 1100, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
  • a computer-readable storage medium is also provided, such as a memory including at least one program code, which can be executed by a processor in a terminal to complete the interactive prop display method in the foregoing embodiments.
  • the computer-readable storage medium may be a ROM (Read-Only Memory), a RAM (Random-Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, or an optical data storage device.
  • the embodiments of the present application also provide a computer program product or computer program.
  • the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the interactive prop display method provided in the various optional implementations of the foregoing aspects.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
PCT/CN2020/129816 2020-03-17 2020-11-18 Interactive prop display method and apparatus, terminal, and storage medium WO2021184806A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2022532846A JP2023504650A (ja) 2020-03-17 2020-11-18 Interaction prop display method, apparatus, terminal, and computer program
KR1020227010956A KR20220051014A (ko) 2020-03-17 2020-11-18 Interactive prop display method and apparatus, and terminal and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010187990.X 2020-03-17
CN202010187990.XA CN111408133B (zh) 2020-03-17 2020-03-17 Interactive prop display method and apparatus, terminal, and storage medium

Publications (1)

Publication Number Publication Date
WO2021184806A1 (zh)

Family

ID=71486140

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/129816 WO2021184806A1 (zh) 2020-03-17 2020-11-18 Interactive prop display method and apparatus, terminal, and storage medium

Country Status (4)

Country Link
JP (1) JP2023504650A (ja)
KR (1) KR20220051014A (ja)
CN (1) CN111408133B (ja)
WO (1) WO2021184806A1 (ja)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111408133B (zh) * 2020-03-17 2021-06-29 腾讯科技(深圳)有限公司 Interactive prop display method and apparatus, terminal, and storage medium
CN111760285B (zh) * 2020-08-13 2023-09-26 腾讯科技(深圳)有限公司 Virtual scene display method, apparatus, device, and medium
CN112090069B (zh) * 2020-09-17 2022-09-30 腾讯科技(深圳)有限公司 Information prompt method and apparatus in a virtual scene, electronic device, and storage medium
CN112107859B (zh) * 2020-09-18 2022-07-01 腾讯科技(深圳)有限公司 Prop control method and apparatus, storage medium, and electronic device
CN112295234B (zh) * 2020-11-03 2023-07-25 腾讯音乐娱乐科技(深圳)有限公司 Method and apparatus for acquiring game props
CN112330823B (zh) * 2020-11-05 2023-06-16 腾讯科技(深圳)有限公司 Virtual prop display method, apparatus, device, and readable storage medium
CN112684883A (zh) * 2020-12-18 2021-04-20 上海影创信息科技有限公司 Method and system for distinguishing and processing multi-user objects
CN113262492B (zh) * 2021-04-28 2024-02-02 网易(杭州)网络有限公司 Game data processing method and apparatus, and electronic terminal

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101681524A (zh) * 2007-06-05 2010-03-24 科乐美数码娱乐株式会社 Image display control device, game device, program, and recording medium on which the program is recorded
US20110306417A1 (en) * 2010-06-14 2011-12-15 Nintendo Co., Ltd. 2d imposters for simplifying processing of plural animation objects in computer graphics generation
US20120302341A1 (en) * 2011-05-23 2012-11-29 Nintendo Co., Ltd. Game system, game process method, game device, and storage medium storing game program
CN105126343A (zh) * 2015-08-27 2015-12-09 网易(杭州)网络有限公司 Mask display method and device for a 2D game
CN109550247A (zh) * 2019-01-09 2019-04-02 网易(杭州)网络有限公司 Method and apparatus for adjusting a virtual scene in a game, electronic device, and storage medium
CN111408133A (zh) * 2020-03-17 2020-07-14 腾讯科技(深圳)有限公司 Interactive prop display method and apparatus, terminal, and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4394202B2 (ja) * 1999-07-21 2010-01-06 株式会社バンダイナムコゲームス Image generation system and information storage medium
CN101908232B (zh) * 2010-07-30 2012-09-12 重庆埃默科技有限责任公司 Interactive scene simulation system and scene virtual simulation method
CN103489214A (zh) * 2013-09-10 2014-01-01 北京邮电大学 Virtual-real occlusion handling method based on virtual model preprocessing in an augmented reality system
JP6482996B2 (ja) * 2015-09-15 2019-03-13 株式会社カプコン Game program and game system
US10540941B2 (en) * 2018-01-30 2020-01-21 Magic Leap, Inc. Eclipse cursor for mixed reality displays

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114100128A (zh) * 2021-12-09 2022-03-01 腾讯科技(深圳)有限公司 Prop special-effect display method and apparatus, computer device, and storage medium
CN114100128B (zh) * 2021-12-09 2023-07-21 腾讯科技(深圳)有限公司 Prop special-effect display method and apparatus, computer device, and storage medium
CN114786025A (zh) * 2022-04-01 2022-07-22 北京达佳互联信息技术有限公司 Live-streaming data processing method and apparatus, computer device, and medium
CN114786025B (zh) * 2022-04-01 2024-01-02 北京达佳互联信息技术有限公司 Live-streaming data processing method and apparatus, computer device, and medium

Also Published As

Publication number Publication date
KR20220051014A (ko) 2022-04-25
JP2023504650A (ja) 2023-02-06
CN111408133A (zh) 2020-07-14
CN111408133B (zh) 2021-06-29

Similar Documents

Publication Publication Date Title
WO2021184806A1 (zh) Interactive prop display method and apparatus, terminal, and storage medium
CN111589131B (zh) Virtual character control method, apparatus, device, and medium
CN108434736B (zh) Equipment display method, apparatus, device, and storage medium in a virtual environment battle
WO2021143259A1 (zh) Virtual object control method, apparatus, and device, and readable storage medium
CN110917619B (zh) Interactive prop control method, apparatus, terminal, and storage medium
KR102619439B1 (ko) Method for controlling a virtual object and related apparatus
CN110585710B (zh) Interactive prop control method, apparatus, terminal, and storage medium
WO2020244415A1 (zh) Method, apparatus, and medium for controlling a virtual object to discard a virtual item
CN111589124B (zh) Virtual object control method, apparatus, terminal, and storage medium
CN110917623B (zh) Interactive information display method, apparatus, terminal, and storage medium
CN110507990B (zh) Virtual aircraft-based interaction method, apparatus, terminal, and storage medium
CN113289331B (zh) Virtual prop display method, apparatus, electronic device, and storage medium
CN111596838B (zh) Service processing method, apparatus, computer device, and computer-readable storage medium
JP7250403B2 (ja) Virtual scene display method, apparatus, terminal, and computer program
CN113713383B (zh) Throwing prop control method, apparatus, computer device, and storage medium
CN112138384A (zh) Method, apparatus, terminal, and storage medium for using a virtual throwing prop
CN113680060B (zh) Virtual picture display method, apparatus, device, medium, and computer program product
CN111672106A (zh) Virtual scene display method, apparatus, computer device, and storage medium
CN111249726B (zh) Method, apparatus, device, and readable medium for operating virtual props in a virtual environment
CN111921190A (zh) Prop equipping method, apparatus, terminal, and storage medium for a virtual object
CN111659122B (zh) Virtual resource display method, apparatus, electronic device, and storage medium
JPWO2021143259A5 (ja)
CN112717394A (zh) Aiming mark display method, apparatus, device, and storage medium
CN111530075A (zh) Picture display method, apparatus, device, and medium for a virtual environment
CN113144600B (zh) Virtual object control method, apparatus, device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20925526

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20227010956

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2022532846

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22/02/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20925526

Country of ref document: EP

Kind code of ref document: A1