WO2020042746A1 - Method, device, terminal and computer-readable storage medium for picking up virtual items in a virtual environment - Google Patents

Method, device, terminal and computer-readable storage medium for picking up virtual items in a virtual environment

Info

Publication number
WO2020042746A1
WO2020042746A1 (PCT/CN2019/094208)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
item
area
items
virtual item
Prior art date
Application number
PCT/CN2019/094208
Other languages
English (en)
French (fr)
Inventor
张雅
周西洋
李熠琦
文晗
黄荣灏
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2020042746A1
Priority to US17/006,358 (US20200393953A1)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/56Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/837Shooting of targets
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6045Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076Shooting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Definitions

  • the present application relates to the field of human-computer interaction, and in particular, to a method, a device, a terminal, and a computer-readable storage medium for picking up virtual items in a virtual environment.
  • a virtual object for example, a virtual character
  • a virtual object can acquire a virtual item in a virtual environment through actions such as picking up and purchasing.
  • a user controls a virtual object to move to a position where the virtual item is located, and the virtual item can be picked up by automatic picking or active picking.
  • When the virtual environment includes multiple virtual items located at different positions, the user needs to control the virtual object to move to the vicinity of each virtual item's location to pick up the virtual items. For example, the virtual environment displayed on the display interface displays virtual item 1, virtual item 2, and virtual item 3. The user needs to control the virtual object to move near the location of the virtual item 1 and pick up the virtual item 1, then control the virtual object to move to the vicinity of the location of the virtual item 2 and pick up the virtual item 2, and then control the virtual object to move to the location of the virtual item 3 and pick up the virtual item 3, thereby completing the pickup of the virtual item 1, the virtual item 2, and the virtual item 3.
  • a method, a device, a terminal, and a computer-readable storage medium for picking up virtual items in a virtual environment are provided.
  • a method for picking up virtual items in a virtual environment, executed by a terminal, the method includes:
  • a device for picking up virtual items in a virtual environment includes:
  • a display module configured to display a user interface, where the user interface displays the virtual environment and virtual objects located in the virtual environment;
  • an obtaining module, configured to obtain, according to a first instruction triggered by an interactive operation on the user interface, an operation trajectory formed by the interactive operation on the user interface, and to acquire, when the operation trajectory forms a closed area, at least two target virtual items located in the closed area in the virtual environment;
  • a processing module, configured to gather the target virtual items at a specified position in the virtual environment, and, when the virtual object moves to the specified position, control the virtual object to pick up the target virtual items.
  • a terminal includes a processor and a memory, where the memory stores at least one instruction, and the instruction is loaded and executed by the processor to perform the method for picking up a virtual item in a virtual environment.
  • a computer-readable storage medium stores at least one instruction, and the instruction is loaded and executed by a processor to perform the method for picking up virtual items in a virtual environment.
  • FIG. 1 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • FIG. 2 is a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • FIG. 3 is a schematic diagram of a user interface of a method for picking up a virtual item in a virtual environment provided in the related art
  • FIG. 4 is a flowchart of a method for picking up a virtual item in a virtual environment according to an exemplary embodiment of the present application
  • FIG. 5 is a schematic diagram of a user interface of a method for picking up a virtual item in a virtual environment according to an exemplary embodiment of the present application
  • FIG. 6 is a schematic diagram of an operation track of a method for picking up a virtual item in a virtual environment according to an exemplary embodiment of the present application
  • FIG. 7 is a schematic diagram of virtual item aggregation according to a method for picking up virtual items in a virtual environment according to an exemplary embodiment of the present application.
  • FIG. 8 is a schematic diagram of picking up a virtual item according to a method for picking up a virtual item in a virtual environment according to an exemplary embodiment of the present application;
  • FIG. 9 is a flowchart of a method for picking up a virtual item in a virtual environment according to an exemplary embodiment of the present application.
  • FIG. 10 is a schematic diagram of a pickable area of a virtual item according to an exemplary embodiment of the present application.
  • FIG. 11 is a schematic diagram of a setting page of a virtual item that does not need to be picked up according to an exemplary embodiment of the present application;
  • FIG. 12 is a schematic diagram of acquiring a touch signal according to an exemplary embodiment of the present application.
  • FIG. 13 is a flowchart of touch signal processing steps provided by an exemplary embodiment of the present application.
  • FIG. 14 is a schematic diagram of a touch event provided by an exemplary embodiment of the present application.
  • FIG. 16 is a schematic diagram of a method for picking up virtual items in a two-dimensional multiplayer battle game provided by an exemplary embodiment of the present application;
  • FIG. 17 is a structural block diagram of an apparatus for picking up a virtual item in a virtual environment according to an exemplary embodiment of the present application.
  • FIG. 18 is a schematic structural diagram of a terminal provided by an exemplary embodiment of the present application.
  • Virtual environment: a virtual environment that is displayed (or provided) when an application runs on a terminal.
  • the virtual environment can be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
  • the virtual environment is also used for a virtual environment battle between at least two virtual characters.
  • the virtual environment is further used for at least two virtual characters to use virtual guns for battle.
  • the virtual environment is also used for at least two virtual characters to battle against each other within a target area, and the target area shrinks continuously as time passes in the virtual environment.
  • the following embodiments take the virtual environment as a two-dimensional virtual environment as an example, but it is not limited thereto.
  • Virtual object refers to a movable object in a virtual environment.
  • the movable object may be at least one of a virtual character, a virtual animal, and an anime character.
  • When the virtual environment is a three-dimensional virtual environment, the virtual object is a three-dimensional model created based on skeletal animation technology.
  • Each virtual object has its own shape and volume in the three-dimensional virtual environment, occupying a part of the space in the three-dimensional virtual environment.
  • Virtual item: refers to an item assembled or carried by a virtual object.
  • the virtual item may be a backpack equipped with a virtual object, a weapon equipped with a virtual object, or a medicine carried by the virtual object.
  • When the virtual item is assembled on the virtual object, for example when the virtual object has a backpack equipped at the assembly position, the user owns the virtual item.
  • Backpack box: virtual items that the user cannot assemble on the virtual object, such as medicines, bandages, or extra weapons that cannot be assembled, can be stored in the backpack box.
  • In this case, the virtual object carries the virtual item.
  • Item type: a property of a virtual item that corresponds to the type of the virtual item.
  • For example, the item type corresponding to a virtual firearm is weapon, and the item type corresponding to a virtual medicine is supply.
  • Number of items: the number of virtual items of a given type.
  • For example, a virtual object is equipped with a pistol, the virtual pistol is loaded with 12 pistol bullets, and the virtual object also carries 100 pistol bullets. The number of items corresponding to the pistol is 1, and the number of items corresponding to the pistol bullet is 112.
  • Recovery item: a virtual item that can restore the physical value of the virtual object.
  • the physical value is a property of the virtual object.
  • When the physical value of the virtual object is 0, the virtual object loses its combat power in the virtual environment and cannot continue to interact.
  • the upper limit of the physical value of the virtual object is 100.
  • For example, the physical value of the virtual object is reduced to 60. If the virtual object picks up and uses a recovery item with a recovery value of 30, the physical value of the virtual object returns to 90. Generally, recovery from a recovery item cannot raise the physical value of the virtual object above its upper limit.
  • Upgrade item: a virtual item that can increase the experience value of the virtual object; the virtual object can increase its level by increasing its experience value.
  • the experience value and level are attributes of the virtual object, and each level corresponds to a different experience value.
  • When the experience value of the virtual object exceeds the upper limit corresponding to the current level, the level of the virtual object is increased.
  • For example, the current level of the virtual object is level 1, the upper limit of the experience value corresponding to level 1 is 30, and the current experience value of the virtual object is 20. After the virtual object picks up and uses an upgrade item with an experience value of 35, the experience value increases to 55.
  • Since 55 exceeds the upper limit of 30 corresponding to level 1, the level of the virtual object is upgraded to level 2.
  • the terminal in the embodiments of the present application may be a desktop computer, a laptop portable computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on.
  • An application that supports a virtual environment is installed and run in the terminal, such as an application that supports a three-dimensional virtual environment.
  • the application may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, and a MOBA game.
  • the application program may be a stand-alone version of an application program, such as a stand-alone version of a 3D game program, or may be an online version of an application program.
  • FIG. 1 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • the terminal 100 includes: a processor 110 and a memory 120, and the processor 110 and the memory 120 implement communication through a bus or other manners.
  • the memory 120 stores an operating system 121 and an application program 122.
  • the operating system 121 is basic software that provides application programs 122 with secure access to computer hardware.
  • the application 122 is an application that supports a virtual environment.
  • the application 122 is an application that supports a three-dimensional virtual environment.
  • the application 122 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a third-person shooting (TPS) game, a first-person shooting (FPS) game, a MOBA game, and a multiplayer shooter survival game.
  • the application 122 may be a stand-alone application, such as a stand-alone 3D game program.
  • FIG. 2 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 200 includes a first terminal 220, a server 240, and a second terminal 260.
  • the first terminal 220 installs and runs an application program that supports a virtual environment.
  • the application can be any one of a virtual reality application, a three-dimensional map application, a military simulation application, a TPS game, an FPS game, a MOBA game, and a multiplayer shooter survival game.
  • the first terminal 220 is a terminal used by a first user.
  • the first user uses the first terminal 220 to control a first virtual object located in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the first virtual object is a first virtual character, such as a simulated character or an anime character.
  • the first terminal 220 is connected to the server 240 through a wireless network or a wired network.
  • the server 240 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
  • the server 240 is configured to provide a background service for an application supporting a three-dimensional virtual environment.
  • the server 240 undertakes the main calculation work, and the first terminal 220 and the second terminal 260 undertake the secondary calculation work; or, the server 240 undertakes the secondary calculation work, and the first terminal 220 and the second terminal 260 undertake the main calculation work;
  • the server 240, the first terminal 220, and the second terminal 260 use a distributed computing architecture for collaborative computing.
  • the second terminal 260 installs and runs an application that supports a virtual environment.
  • the application may be any one of a virtual reality application, a three-dimensional map application, a military simulation application, an FPS game, a MOBA game, and a multiplayer shooter survival game.
  • the second terminal 260 is a terminal used by the second user.
  • the second user uses the second terminal 260 to control a second virtual object located in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the second virtual object is a second virtual character, such as a simulated character or an anime character.
  • The first virtual character and the second virtual character are in the same virtual environment.
  • The first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights.
  • The first virtual character and the second virtual character may also belong to different teams, different organizations, or two hostile groups.
  • the applications installed on the first terminal 220 and the second terminal 260 are the same, or the applications installed on the two terminals are the same type of applications on different control system platforms.
  • the first terminal 220 may refer to one of a plurality of terminals
  • the second terminal 260 may refer to one of a plurality of terminals.
  • This embodiment only uses the first terminal 220 and the second terminal 260 as examples.
  • the device types of the first terminal 220 and the second terminal 260 are the same or different, and the device types include at least one of: game consoles, desktop computers, smartphones, tablets, e-book readers, MP3 players, MP4 players, and laptop computers.
  • the terminal is a desktop computer.
  • the number of the foregoing terminals may be larger or smaller.
  • there may be only one terminal, or there may be dozens or hundreds of terminals, or an even larger number.
  • the embodiment of the present application does not limit the number of terminals and the types of equipment.
  • FIG. 3 illustrates a schematic diagram of a method for picking up virtual items in a virtual environment in the related art.
  • A user interface 330 is displayed on the display screen 320 of the terminal 310, and a virtual environment 340 is displayed in the user interface 330.
  • the virtual environment 340 includes a virtual object 350, a first virtual item 341, a second virtual item 342, and a third virtual item 343.
  • a method for picking up a virtual item in a virtual environment in the related art is:
  • Control the virtual object 350 to move from the initial position A0 to the first position A1 near the position B where the first virtual item 341 is located, and pick up the first virtual item 341; then control the virtual object 350 to move from the first position A1 to the second position A2 near the position C of the second virtual item 342, and pick up the second virtual item 342; then control the virtual object 350 to move from the second position A2 to the third position A3 near the position D of the third virtual item 343, and pick up the third virtual item 343, thereby completing the pickup of the first virtual item 341, the second virtual item 342, and the third virtual item 343.
  • the virtual object 350 in FIG. 3 is a virtual character controlled by the terminal 310.
  • For example, the first virtual item 341 may be a recovery item, the second virtual item 342 may be an upgrade item, and the third virtual item 343 may be a weapon.
  • The terminal 310 controls the virtual character 350 to move to the position where the recovery item 341 is located, and increases the physical value of the virtual character 350 according to the recovery value corresponding to the recovery item 341.
  • The terminal 310 controls the virtual character 350 to move to the position where the upgrade item 342 is located, and increases the experience value of the virtual character 350 according to the experience value corresponding to the upgrade item 342.
  • the terminal 310 controls the virtual character 350 to move to the position where the weapon 343 is located to pick up and assemble the weapon 343.
  • FIG. 4 illustrates a flowchart of a method for picking up a virtual item in a virtual environment according to an exemplary embodiment of the present application.
  • the method is executed by the first terminal 220 or the second terminal 260 in the embodiment of FIG. 2, and the method includes:
  • Step 401 Display a user interface.
  • the user interface displays a virtual environment and virtual objects located in the virtual environment.
  • a terminal runs an application program that supports a virtual environment
  • a user interface of the application program is displayed on a display screen of the terminal, and the user interface displays a virtual environment and a virtual object located in the virtual environment.
  • an application program supporting the two-dimensional virtual environment runs on the terminal 510, and a user interface 530 is displayed on a display screen 520 of the terminal 510.
  • a virtual environment 540, a virtual object 550, a virtual joystick 560, and a function control 570 are displayed.
  • the virtual environment 540 includes a first virtual item 541, a second virtual item 542, and a third virtual item 543.
  • the virtual object 550 can move in the virtual environment 540 according to the movement signal triggered on the virtual joystick 560; the virtual object 550 can also perform actions on other virtual objects or virtual items in the virtual environment 540 according to the operation signal triggered on the function control 570, such as picking up virtual items or hitting other virtual objects.
  • Step 402 Obtain an operation trajectory formed by the interactive operation on the user interface according to a first instruction triggered by the interactive operation on the user interface.
  • An interactive operation is a continuous operation performed by a user in a user interface.
  • an interactive operation may be a continuous sliding operation performed by a user on the touch screen of a terminal, or an operation in which the user clicks a mouse button, drags a certain distance on the user interface, and then releases the button. This embodiment takes a sliding operation as an example of the interactive operation.
  • When the user performs a sliding operation on the user interface, the terminal receives a first instruction triggered by the sliding operation, and determines, according to the first instruction, the operation track formed by the sliding operation on the user interface.
  • For example, the user performs a sliding operation on the user interface from the starting position X1 to the ending position X2 to trigger a first instruction, and the terminal 510 obtains the operation track 580 formed by the sliding operation in the user interface.
  • Step 403 When the operation trajectory forms a closed area, obtain at least two target virtual items located in the closed area in the virtual environment.
  • the terminal can detect whether the operation track forms a closed area in any of the following ways:
  • Method 1: the terminal acquires the starting region 581 formed by the user pressing at the starting position X1 and the ending region 582 formed by lifting the finger at the ending position X2, and detects whether there is an intersection between the starting region 581 and the ending region 582. If there is an intersection between the starting region 581 and the ending region 582, it is determined that the operation trajectory forms a closed area; if there is no intersection, it is determined that the operation trajectory does not form a closed area.
  • Method 2: the terminal detects whether there is an intersection between the starting region 581 and the ending region 582. If there is no intersection, it is determined that the operation trajectory does not form a closed area; if there is an intersection, the area ratio of the intersection is calculated. If the area ratio exceeds an area ratio threshold, it is determined that the operation trajectory forms a closed area; if the area ratio does not exceed the area ratio threshold, it is determined that the operation trajectory does not form a closed area.
  • Optionally, the area ratio is at least one of: the ratio of the intersection area to the area of the starting region, the ratio of the intersection area to the area of the ending region, or the ratio of the intersection area to the total area (the sum of the areas of the starting region and the ending region).
  • Method 3: the terminal obtains a first distance between the center position of the starting region 581 and the center position of the ending region 582, and detects whether the first distance is less than a first distance threshold. If the first distance is less than the first distance threshold, it is determined that the operation trajectory forms a closed area; if the first distance is not less than the first distance threshold, it is determined that the operation trajectory does not form a closed area.
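  • The three checks above can be illustrated with a short sketch (illustrative only, not taken from the original description; the Region type, the circular approximation of the contact regions, and the threshold values are assumptions):

```typescript
// Illustrative sketch of the three closed-area checks; the start/end contact
// regions are approximated as circles, which is an assumption for this sketch.
interface Point { x: number; y: number; }
interface Region { center: Point; radius: number; }

// Area of the intersection of two circular regions (0 when they do not overlap).
function intersectionArea(a: Region, b: Region): number {
  const d = Math.hypot(a.center.x - b.center.x, a.center.y - b.center.y);
  if (d >= a.radius + b.radius) return 0;                  // disjoint regions
  if (d <= Math.abs(a.radius - b.radius)) {                // one region inside the other
    const r = Math.min(a.radius, b.radius);
    return Math.PI * r * r;
  }
  const r1 = a.radius, r2 = b.radius;
  const alpha = 2 * Math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1));
  const beta = 2 * Math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2));
  return 0.5 * (r1 * r1 * (alpha - Math.sin(alpha)) + r2 * r2 * (beta - Math.sin(beta)));
}

// Check 1: the start and end regions intersect at all.
function closesByIntersection(start: Region, end: Region): boolean {
  return intersectionArea(start, end) > 0;
}

// Check 2: the intersection's area ratio (taken over the start region here) exceeds a threshold.
function closesByAreaRatio(start: Region, end: Region, ratioThreshold = 0.3): boolean {
  const inter = intersectionArea(start, end);
  return inter > 0 && inter / (Math.PI * start.radius * start.radius) > ratioThreshold;
}

// Check 3: the first distance between the region centers is below a first distance threshold.
function closesByCenterDistance(start: Region, end: Region, firstDistanceThreshold = 40): boolean {
  const d = Math.hypot(start.center.x - end.center.x, start.center.y - end.center.y);
  return d < firstDistanceThreshold;
}
```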
  • When the terminal determines that the operation trajectory forms a closed area, it acquires the virtual items located inside the closed area as target virtual items.
  • For example, the first virtual item 541 and the third virtual item 543 are included in the closed area 583 formed by the operation track 580, and the second virtual item 542 is located outside the closed area 583, so the terminal determines that the first virtual item 541 and the third virtual item 543 are target virtual items.
  • Step 404 Gather the target virtual items at a specified position in the virtual environment.
  • The specified position can be set according to actual needs.
  • the specified position may be any position in the closed area touched by the user, or the position closest to the virtual object in the closed area calculated by the terminal, or the center position of the closed area calculated by the terminal.
  • The target virtual items, that is, the first virtual item 541 and the third virtual item 543, are moved to the center position O.
  • Optionally, the terminal may receive a second instruction triggered by a touch operation at a touched position in the closed area, and use the touched position as the designated position. For example, if the user touches a position in the closed area, that touch position can be used as the designated position.
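  • As one possible illustration of using the center of the closed area as the designated position, the centroid of the polygon traced by the operation track could be computed as below (a generic sketch under the assumption that the track is stored as ordered points; not the original implementation):

```typescript
interface Point { x: number; y: number; }

// Centroid of the simple polygon formed by the closed operation track (shoelace formula).
// Falls back to the average of the track points when the signed area degenerates to ~0.
function closedAreaCenter(track: Point[]): Point {
  let area = 0, cx = 0, cy = 0;
  for (let i = 0; i < track.length; i++) {
    const p = track[i];
    const q = track[(i + 1) % track.length];
    const cross = p.x * q.y - q.x * p.y;
    area += cross;
    cx += (p.x + q.x) * cross;
    cy += (p.y + q.y) * cross;
  }
  area /= 2;
  if (Math.abs(area) < 1e-6) {
    const n = track.length;
    return { x: track.reduce((s, p) => s + p.x, 0) / n, y: track.reduce((s, p) => s + p.y, 0) / n };
  }
  return { x: cx / (6 * area), y: cy / (6 * area) };
}
```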
  • Step 405 When the virtual object moves to the specified position, control the virtual object to pick up the target virtual item.
  • the terminal controls the virtual object to obtain the target virtual item by manually or automatically picking it up.
  • For example, the user can control the virtual object 550 to move from the initial position A0 to the designated position O by touching the virtual joystick 560, so as to automatically pick up the first virtual item 541 and the third virtual item 543.
  • In summary, according to the first instruction triggered by the interactive operation on the user interface, the operation trajectory formed by the interactive operation in the user interface is obtained.
  • When the operation trajectory forms a closed area, the target virtual items in the closed area of the virtual environment are gathered at a specified position.
  • When the virtual object moves to the specified position, the virtual items gathered at the specified position are picked up. Therefore, the pickup efficiency of virtual items is improved, thereby improving the efficiency of human-computer interaction.
  • FIG. 9 illustrates a flowchart of a method for picking up a virtual item in a virtual environment according to an exemplary embodiment of the present application.
  • the method is executed by the first terminal 220 or the second terminal 260 in the embodiment of FIG. 2, and the method includes:
  • Step 901 Display a user interface.
  • the user interface displays a virtual environment and virtual objects located in the virtual environment.
  • For the step of displaying the user interface by the terminal, refer to step 401 in the embodiment of FIG. 4.
  • Step 902 Obtain an operation trajectory formed by the interactive operation on the user interface according to the first instruction triggered by the interactive operation on the user interface.
  • When a user performs a sliding operation on the user interface, the terminal receives a first instruction triggered by the sliding operation, and determines, based on the first instruction, the operation track formed by the sliding operation in the user interface.
  • For example, the user performs a sliding operation on the user interface from the starting position X1 to the ending position X2 to trigger a first instruction, and the terminal 510 obtains the operation track 580 formed by the sliding operation in the user interface.
  • Optionally, the first instruction triggered by the user's sliding operation on the touch display screen includes three instruction events: a touch start event when a finger is pressed on the touch display screen, a touch move event when the finger slides on the touch display screen, and a touch end event when the finger leaves the touch display screen.
  • the terminal obtains the start region 581 according to the touch start event and the end region 582 according to the touch end event.
  • Step 903 It is detected whether the operation track forms a closed area.
  • the manner in which the terminal detects whether the operation trajectory forms a closed area may be any one of three manners in step 403 in the embodiment of FIG. 4.
  • Step 904 When it is determined that the operation trajectory forms a closed area, at least two target virtual items located in the closed area are used as candidate virtual items.
  • the virtual item corresponds to a pickable area.
  • the pickable area is the area occupied by the virtual item in the virtual environment.
  • the terminal can determine the candidate virtual item based on the pickable area according to the needs.
  • Optionally, the terminal uses a virtual item whose pickable area has an intersection with the closed area as a candidate virtual item.
  • the first virtual item 541 corresponds to a pickable area 5410
  • the second virtual item 542 corresponds to a pickable area 5420
  • the third virtual item 543 corresponds to a pickable area 5430.
  • The pickable area 5410 and the pickable area 5430 are located in the closed area 583; therefore, the intersection of the pickable area 5410 and the closed area 583 is the pickable area 5410, and the intersection of the pickable area 5430 and the closed area 583 is the pickable area 5430.
  • The pickable area 5420 is located outside the closed area 583 and has no intersection with the closed area, so the first virtual item 541 and the third virtual item 543 are candidate virtual items.
  • Optionally, the terminal uses a virtual item whose pickable area has its center position located in the closed area as a candidate virtual item.
  • The center position of the pickable area refers to the geometric center of the area occupied by the virtual item.
  • For example, the terminal obtains the coordinates of the center position P1 of the pickable area 5410 corresponding to the first virtual item 541, the coordinates of the center position P2 of the pickable area 5420 corresponding to the second virtual item 542, and the coordinates of the center position P3 of the pickable area 5430 corresponding to the third virtual item 543. Based on the coordinates of P1, P2, and P3 and the coordinates of the pixels corresponding to the closed area, it is detected that P1 and P3 are located in the closed area 583 and P2 is located outside the closed area 583, so it is determined that the first virtual item 541 and the third virtual item 543 are candidate virtual items.
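  • The second determination method can be sketched with a standard ray-casting point-in-polygon test (illustrative only; the VirtualItem shape and the polygon representation of the closed area are assumptions, not details from the original text):

```typescript
interface Point { x: number; y: number; }
// Assumed item shape: each virtual item carries the center position of its pickable area.
interface VirtualItem { id: string; pickableCenter: Point; }

// Ray-casting test: is a point inside the polygon formed by the closed operation track?
function pointInClosedArea(p: Point, polygon: Point[]): boolean {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const a = polygon[i], b = polygon[j];
    const crosses =
      (a.y > p.y) !== (b.y > p.y) &&
      p.x < ((b.x - a.x) * (p.y - a.y)) / (b.y - a.y) + a.x;
    if (crosses) inside = !inside;
  }
  return inside;
}

// Keep items whose pickable-area center position lies inside the closed area.
function selectCandidates(items: VirtualItem[], closedArea: Point[]): VirtualItem[] {
  return items.filter(item => pointInClosedArea(item.pickableCenter, closedArea));
}
```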
  • Step 905 It is detected whether there is a virtual item in the candidate virtual item that does not need to be picked up.
  • The terminal detects whether the candidate virtual items include a virtual item that does not need to be picked up. When the candidate virtual items include a virtual item that does not need to be picked up, proceed to step 906a; when the candidate virtual items do not include a virtual item that does not need to be picked up, proceed to step 906b.
  • A virtual item that does not need to be picked up refers to a virtual item whose item type is set in the application as a type that is not required to be picked up.
  • Virtual items that do not need to be picked up can be preset by the application or set by the user. For example, if a user is not good at using a pistol, a virtual item of a pistol type may be set as a virtual item that does not need to be picked up.
  • the terminal displays an item setting page 531 in the user interface 530, and the item setting page displays Item type 1, item type 2, and item type 3; after receiving the item type determination signal triggered on item type 1, the terminal determines that the virtual item corresponding to item type 1 is a virtual item that does not need to be picked up.
  • Optionally, the terminal detects whether any candidate virtual item is of the same type as an item carried by the virtual object whose number of items exceeds the quantity threshold.
  • When the candidate virtual items include a virtual item whose number of items exceeds the quantity threshold, that virtual item is regarded as an excess item, and the excess item is treated as a virtual item that does not need to be picked up.
  • the items carried by the virtual object have a corresponding number of items.
  • a virtual object can carry a rifle, a pistol, 120 rifle bullets, and 120 pistol bullets.
  • the type of items corresponding to rifles and pistols is firearms
  • the type of items corresponding to rifle bullets is rifle ammunition
  • the type of items corresponding to pistol bullets is pistol ammunition; the quantity threshold for firearms is 1, the quantity threshold for rifle ammunition is 119, and the quantity threshold for pistol ammunition is 119.
  • virtual item 2 is a virtual item that does not need to be picked up.
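  • A minimal sketch of this filtering step, assuming the excluded item types and per-type quantity thresholds are kept in a small policy object (the names below are illustrative and not from the original text):

```typescript
interface CandidateItem { id: string; type: string; }

interface PickupPolicy {
  excludedTypes: Set<string>;               // item types the user marked as not needing pickup
  quantityThresholds: Map<string, number>;  // per-type quantity thresholds
  carried: Map<string, number>;             // number of items already carried, keyed by item type
}

// Remove candidates that do not need to be picked up: excluded types, and types
// whose carried quantity already exceeds the quantity threshold (excess items).
function filterTargets(candidates: CandidateItem[], policy: PickupPolicy): CandidateItem[] {
  return candidates.filter(item => {
    if (policy.excludedTypes.has(item.type)) return false;
    const threshold = policy.quantityThresholds.get(item.type);
    const carried = policy.carried.get(item.type) ?? 0;
    if (threshold !== undefined && carried > threshold) return false; // excess item
    return true;
  });
}
```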
  • Step 906a The candidate virtual items other than the virtual items that do not need to be picked up are used as target virtual items.
  • After the terminal determines the virtual items that do not need to be picked up among the candidate virtual items, it uses the candidate virtual items other than those as the target virtual items.
  • Step 906b The candidate virtual items are used as the target virtual items.
  • When the candidate virtual items do not include a virtual item that does not need to be picked up, all the candidate virtual items are taken as the target virtual items.
  • Step 907 The center position of the closed area is used as the designated position, and the target virtual items are gathered at the designated position.
  • The target virtual items, that is, the first virtual item 541 and the third virtual item 543, are moved to the center position O.
  • Optionally, the terminal obtains the area occupied by all the target virtual items with the coordinates of the center position O as the center, determines the aggregation area occupied by all the target virtual items, and then either moves each target virtual item to a random position in the aggregation area or moves each target virtual item to the nearest position in the aggregation area.
  • the terminal displays at least two target virtual items at the specified position in the form of an aggregation icon.
  • Not all the aggregated target virtual items need to be displayed individually at the designated position; they can be displayed as an aggregation icon. For example, after the target virtual item 1, the target virtual item 2, and the target virtual item 3 are gathered at the designated position O, only one target virtual item icon is displayed, or a preset aggregation icon is displayed to represent the target virtual item 1, the target virtual item 2, and the target virtual item 3.
  • the terminal displays virtual items belonging to the same item type among at least two target virtual items as an icon.
  • For example, the item types of the target virtual item 1 and the target virtual item 2 are backpacks, and the item type of the target virtual item 3 is a firearm. After the target virtual item 1, the target virtual item 2, and the target virtual item 3 are gathered at the designated position O, only the icons of the target virtual item 1 (or the target virtual item 2) and the target virtual item 3 are displayed.
  • Optionally, the terminal displays, on the icon of target virtual items of the same type, the number of target virtual items of that type. For example, "x2" is displayed on the icon of the target virtual item 1, indicating that the icon of the target virtual item 1 represents two target virtual items of the same type.
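  • The grouping behind these aggregation icons can be sketched as follows (illustrative only; the TargetItem and AggregationIcon shapes are assumptions):

```typescript
interface TargetItem { id: string; type: string; }
interface AggregationIcon { type: string; count: number; } // one icon per item type plus an "xN" count

// Group the gathered target items by item type so the designated position shows
// one icon per type with a count, instead of every individual item image.
function buildAggregationIcons(targets: TargetItem[]): AggregationIcon[] {
  const counts = new Map<string, number>();
  for (const item of targets) {
    counts.set(item.type, (counts.get(item.type) ?? 0) + 1);
  }
  return [...counts.entries()].map(([type, count]) => ({ type, count }));
}

// Example matching the text: two backpacks and one firearm gathered at position O
// -> [{ type: "backpack", count: 2 }, { type: "firearm", count: 1 }]
```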
  • Step 908 It is detected whether the second distance between the virtual object and the designated position is less than a second distance threshold.
  • the virtual object has corresponding coordinates in the virtual environment.
  • the terminal calculates a second distance between the virtual object and the designated position according to the coordinates of the virtual object and the coordinates of the designated position, and determines whether the virtual object moves to the designated position according to the second distance.
  • Step 909 When the second distance is less than the second distance threshold, the target virtual item is automatically picked up.
  • the terminal automatically picks up the target virtual item when it determines that the virtual object moves to the designated position.
  • For example, the user can control the virtual object 550 to move from the initial position A0 to the designated position O by touching the virtual joystick 560, and the first virtual item 541 and the third virtual item 543 are then picked up automatically.
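  • A minimal sketch of the distance check in steps 908 and 909 (the pickUp callback stands in for whatever pickup routine the application actually uses; it is not part of the original text):

```typescript
interface Point { x: number; y: number; }

// Pick up automatically once the virtual object is within the second distance
// threshold of the designated position; returns whether pickup was triggered.
function tryAutoPickup(
  objectPos: Point,
  designatedPos: Point,
  secondDistanceThreshold: number,
  pickUp: () => void,
): boolean {
  const secondDistance = Math.hypot(objectPos.x - designatedPos.x, objectPos.y - designatedPos.y);
  if (secondDistance < secondDistanceThreshold) {
    pickUp(); // control the virtual object to pick up the gathered target items
    return true;
  }
  return false;
}
```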
  • Step 910 It is detected whether the target virtual items include a recovery item.
  • After the target virtual items are picked up by the virtual object, the terminal detects whether the picked-up target virtual items include a recovery item. If they include a recovery item, proceed to step 911; if they do not include a recovery item, proceed to step 912.
  • Step 911 Increase the physical value of the virtual object according to the recovery value corresponding to the recovery item.
  • The terminal increases the physical value of the virtual object according to the recovery value corresponding to the recovery item.
  • For example, the target virtual items include a recovery item 1 and a recovery item 2, where the recovery value corresponding to the recovery item 1 is 10, the recovery value corresponding to the recovery item 2 is 15, the physical value of the virtual object is 40, and the maximum physical value of the virtual object is 100.
  • The terminal increases the physical value of the virtual object from 40 to 50 according to the recovery value of the recovery item 1, and increases it from 50 to 65 according to the recovery value of the recovery item 2. It should be noted that the order in which the terminal increases the physical value of the virtual object according to recovery items with different recovery values is not limited.
  • Step 912 Detect whether the target virtual item includes an upgraded item.
  • The terminal detects whether the target virtual items picked up by the virtual object include an upgrade item. If the target virtual items include an upgrade item, proceed to step 913; if the target virtual items do not include an upgrade item, the process ends.
  • Step 913 Increase the experience value of the virtual object according to the experience value corresponding to the upgraded item.
  • the terminal increases the experience value of the virtual object according to the experience value corresponding to the upgraded item.
  • When the experience value of the virtual object exceeds the upper limit of the experience value of the current level, the virtual object is upgraded to the corresponding level according to the increased experience value.
  • For example, the target virtual items include upgrade item 1 and upgrade item 2, where the experience value corresponding to upgrade item 1 is 100, the experience value corresponding to upgrade item 2 is 150, the experience value of the virtual object is 500, the level of the virtual object is level 1, and the maximum experience value corresponding to level 1 is 600.
  • The terminal increases the experience value of the virtual object from 500 to 600 according to the experience value of upgrade item 1, and increases the experience value of the virtual object from 600 to 750 according to the experience value of upgrade item 2. Since the experience value of 750 exceeds the upper limit of the experience value of level 1, the level of the virtual object rises from level 1 to level 2. It should be noted that the order in which the terminal increases the experience value of the virtual object according to upgrade items with different experience values is not limited.
  • Optionally, the terminal may perform steps 910 and 911 first and then perform steps 912 and 913, or the terminal may perform steps 912 and 913 before performing steps 910 and 911, which is not limited herein.
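  • The effect of recovery and upgrade items on the virtual object can be sketched as follows (a sketch only: the per-level experience cap function is a placeholder, and whether experience resets on level-up is not specified by the text):

```typescript
interface CharacterState {
  physical: number;     // current physical value
  maxPhysical: number;  // upper limit of the physical value
  level: number;
  experience: number;   // cumulative experience value
}

// Placeholder per-level experience cap; the text's examples use caps of 30 and 600.
const levelExperienceCap = (level: number): number => 600 * level;

// Recovery item: physical value rises but never above its upper limit.
function applyRecoveryItem(state: CharacterState, recoveryValue: number): void {
  state.physical = Math.min(state.physical + recoveryValue, state.maxPhysical);
}

// Upgrade item: experience rises, and the level increases while the experience
// exceeds the cap of the current level.
function applyUpgradeItem(state: CharacterState, experienceValue: number): void {
  state.experience += experienceValue;
  while (state.experience > levelExperienceCap(state.level)) {
    state.level += 1;
  }
}

// Walk-through of the examples above.
const hero: CharacterState = { physical: 40, maxPhysical: 100, level: 1, experience: 500 };
applyRecoveryItem(hero, 10); // physical 40 -> 50
applyRecoveryItem(hero, 15); // physical 50 -> 65
applyUpgradeItem(hero, 100); // experience 500 -> 600, still level 1 (600 does not exceed the cap)
applyUpgradeItem(hero, 150); // experience 600 -> 750, exceeds 600 -> level 2
```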
  • In summary, according to the first instruction triggered by the interactive operation on the user interface, the operation trajectory formed by the interactive operation in the user interface is obtained.
  • When the operation trajectory forms a closed area, the target virtual items in the closed area of the virtual environment are gathered at a specified position.
  • When the virtual object moves to the specified position, the virtual items gathered at the specified position are picked up. Therefore, the pickup efficiency of virtual items is improved, thereby improving the efficiency of human-computer interaction.
  • In addition, the virtual items that do not need to be picked up are determined among the candidate virtual items and removed from the candidate virtual items, and the remaining candidate virtual items are used as the target virtual items.
  • This prevents the terminal from gathering candidate virtual items that do not need to be picked up at the specified position, and prevents the virtual object from picking up unneeded virtual items when it moves to the specified position, thereby improving the pickup efficiency of virtual items and further improving the efficiency of human-computer interaction.
  • In addition, when the target virtual items are aggregated at the specified position, the at least two target virtual items are displayed at the specified position in the form of an aggregation icon, thereby reducing the number of images displayed on the same screen of the terminal and reducing the occupation of terminal resources, which improves the running fluency of the application to a certain extent.
  • Likewise, target virtual items of the same item type among the at least two target virtual items are displayed as a single icon, so that in some cases the number of images displayed on the same screen of the terminal can be reduced and the resource occupation of the terminal is reduced, thereby improving the running fluency of the application to a certain extent.
  • Although the steps in the flowcharts of FIGS. 4 and 9 are displayed sequentially in the direction of the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated in this document, the execution order of these steps is not strictly limited, and these steps can be performed in other orders. Moreover, at least part of the steps in FIGS. 4 and 9 may include multiple sub-steps or multiple stages. These sub-steps or stages are not necessarily performed at the same moment, but may be performed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or alternately with at least part of other steps or with sub-steps or stages of other steps.
  • the terminal may acquire the operation track of the touch and slide operation through a combination of the hardware level and the program level.
  • the principle is as follows:
  • The principle by which the touch display screen detects the sliding operation is as follows: the touch display screen 520 of the terminal 510 is plated with a driving electrode 1210 and a receiving electrode 1220, and a driving pulse is applied between the driving electrode 1210 and the receiving electrode 1220 by a driving buffer, forming a low-voltage AC electric field.
  • When a finger touches the touch display screen 520, a conductive capacitance is formed between the finger and the dielectric layer 1230 of the touch display screen 520, because the human body conducts electricity.
  • Current from the driving electrode 1210 and the receiving electrode 1220 flows to the point where the finger touches the touch display screen 520, and a trigger signal is generated between the inner layer and the outer layer of the touch display screen 520 through the intermediate metal oxide; the central processing unit of the terminal obtains the operation track of the finger's sliding operation from the trigger signal.
  • Step 1301 The terminal obtains the original signal of the contact area.
  • The original signal is the original touch signal of the contact area.
  • Step 1302 The terminal filters out the interference signal to obtain a filtered touch signal.
  • Step 1303 The terminal calculates the pressure points of the filtered touch signal to obtain the pressure distribution of the touch signal.
  • Step 1304 The terminal establishes the touch area according to the pressure distribution of the touch signal.
  • Step 1305 The terminal obtains touch coordinates according to the touch area, determines the coordinates of the operation track according to the touch coordinates, and then determines the edge coordinates of the closed area and the pixel coordinates enclosed by the edge coordinates.
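  • Steps 1303 to 1305 can be illustrated with a generic sketch that turns a filtered pressure grid into a touch coordinate via a pressure-weighted centroid (the grid layout and noise floor are assumptions, not details from the original text):

```typescript
type PressureGrid = number[][]; // grid[row][col] = filtered pressure reading for one sensing cell

// Derive a touch coordinate from the pressure distribution: cells above a noise
// floor form the touch area, and their pressure-weighted centroid is the coordinate.
function touchCoordinateFromPressure(grid: PressureGrid, noiseFloor = 0.05): { x: number; y: number } | null {
  let sum = 0, sx = 0, sy = 0;
  for (let row = 0; row < grid.length; row++) {
    for (let col = 0; col < grid[row].length; col++) {
      const p = grid[row][col];
      if (p <= noiseFloor) continue; // below the noise floor: outside the touch area
      sum += p;
      sx += col * p;
      sy += row * p;
    }
  }
  if (sum === 0) return null;          // no contact detected
  return { x: sx / sum, y: sy / sum }; // pressure-weighted centroid as the touch coordinate
}
```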
  • a touch event is triggered in the operating system of the terminal.
  • the touch event (touch) in the operating system of the terminal is triggered when the user places his finger on the screen, slides on the screen, or moves away from the screen.
  • Touch events can have the following types:
  • touchstart event triggered when a finger starts to touch the screen, even when one finger is already on the screen, the event will be triggered when another finger touches the screen.
  • touchmove event touch movement event: triggered continuously when the finger is swiped on the screen. During this event, calling the preventDefault () event can prevent scrolling.
  • touchend event touch end event: triggered when the finger is removed from the screen.
  • touchcancel event (touch cancel event): triggered when the system stops tracking touch.
  • the application program in the terminal may obtain the operation trajectory of the drawing operation performed in the path drawing interface through the touch event obtained at the program level.
  • FIG. 14 illustrates a schematic diagram of determining an operation track according to a touch event according to an embodiment of the present application.
  • For example, the terminal can obtain the operation track of a drawing operation performed in the path drawing interface according to the coordinates corresponding to the touch start event, the touch end event, and the touch move events between the touch start event and the touch end event.
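  • In a browser-style environment, collecting such an operation track from the touch events named above could look like the following sketch (assumes a DOM TouchEvent environment; the element and the onTrackComplete callback are placeholders, not part of the original text):

```typescript
interface TrackPoint { x: number; y: number; }

// Record the operation track between a touch start event and a touch end event,
// using the coordinates reported by the touch move events in between.
function watchOperationTrack(
  element: HTMLElement,
  onTrackComplete: (track: TrackPoint[]) => void,
): void {
  let track: TrackPoint[] = [];

  element.addEventListener("touchstart", (e: TouchEvent) => {
    const t = e.touches[0];
    track = [{ x: t.clientX, y: t.clientY }]; // the start region comes from the first point
  });

  element.addEventListener(
    "touchmove",
    (e: TouchEvent) => {
      e.preventDefault(); // keep the page from scrolling while the track is drawn
      const t = e.touches[0];
      track.push({ x: t.clientX, y: t.clientY });
    },
    { passive: false },
  );

  element.addEventListener("touchend", () => {
    onTrackComplete(track); // the end region comes from the last recorded point
    track = [];
  });
}
```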
  • In an illustrative example, the application in the embodiment of the present application is a three-dimensional multiplayer shooting survival game.
  • The display screen 1520 of the terminal 1510 displays a user interface 1530 of the three-dimensional multiplayer shooting survival game.
  • The user interface 1530 displays a virtual environment 1540, a virtual object 1550 corresponding to the terminal 1510, a first weapon bar 1531 and a second weapon bar 1532 corresponding to the virtual object 1550, a virtual joystick 1560, and a function control 1570.
  • The virtual environment 1540 includes a first virtual item 1541 (rifle bullets) and a second virtual item 1542 (a first rifle). The first weapon bar 1531 and the second weapon bar 1532 are used to display the weapons carried by the virtual object 1550, and the second rifle 1543 displayed in the first weapon bar 1531 is the rifle carried by the virtual object 1550. For the functions of the virtual joystick 1560 and the function control 1570, refer to the foregoing embodiments.
  • After receiving the first instruction triggered by an uninterrupted operation on the user interface 1530, the terminal 1510 obtains the operation track 1580 formed by the interactive operation and determines, according to the start area 1581 and the end area 1582 of the operation track, that the operation track 1580 forms a closed area 1583. The rifle bullets 1541 and the first rifle 1542 located in the closed area 1583 are then gathered at the center of the closed area 1583. When the virtual object 1550 moves to the center of the closed area 1583, the terminal 1510 controls the virtual object 1550 to pick up the rifle bullets 1541 and the first rifle 1542; the image of the first rifle 1542 is then displayed in the second weapon bar 1532, and the number next to the rifle-bullet icon is updated from 50 before picking up to 120 after picking up.
  • In another illustrative example, the application in the embodiment of the present application is a two-dimensional multiplayer battle game.
  • The display screen 1620 of the terminal 1610 displays a user interface 1630 of the two-dimensional multiplayer battle game.
  • The user interface 1630 displays a virtual environment 1640, a virtual character 1650 corresponding to the terminal 1610, a weapon bar 1631 and a physical strength value display bar 1632 corresponding to the virtual character 1650, a virtual joystick 1660, and a function control 1670.
  • The virtual environment 1640 includes a first virtual item 1641 (a recovery item) and a second virtual item 1642 (a sword).
  • The weapon bar 1631 is used to display the weapon carried by the virtual character 1650, and the physical strength value display bar 1632 is used to display the physical strength value of the virtual character 1650, where the dark portion represents the current physical strength value of the virtual character 1650 and the blank portion represents the difference between the current physical strength value and the upper limit of the physical strength value.
  • For the functions of the virtual joystick 1660 and the function control 1670, refer to the foregoing embodiments.
  • After receiving the first instruction triggered by an uninterrupted operation on the user interface 1630, the terminal 1610 obtains the operation track 1680 formed by the interactive operation and determines, according to the start area 1681 and the end area 1682 of the operation track, that the operation track 1680 forms a closed area 1683. The recovery item 1641 and the sword 1642 located in the closed area 1683 are then gathered at the center of the closed area 1683.
  • When the virtual character 1650 moves to the center of the closed area 1683, the terminal 1610 controls the virtual character 1650 to pick up the recovery item 1641 and the sword 1642; the image of the sword 1642 is then displayed in the weapon bar 1631, and the current physical strength value in the physical strength value display bar 1632 is increased.
  • FIG. 17 shows a structural block diagram of an apparatus for picking up virtual items in a virtual environment according to an exemplary embodiment of the present application.
  • the device may be implemented as the first terminal 220 or the second terminal 260 in the embodiment of FIG. 2 through software, hardware, or a combination of the two.
  • the device includes a display module 1710, an acquisition module 1720, a processing module 1730, and a receiving module 1740.
  • the display module 1710 is configured to display a user interface.
  • the user interface displays a virtual environment and virtual objects located in the virtual environment.
  • the obtaining module 1720 is configured to obtain, according to a first instruction triggered by an interactive operation on the user interface, an operation track formed by the interactive operation on the user interface.
  • When the operation track forms a closed area, at least two target virtual items located in the closed area in the virtual environment are acquired.
  • a processing module 1730 is configured to gather the target virtual items at a specified position in the virtual environment; when the virtual object moves to the specified position, control the virtual object to pick up the target virtual item.
  • the virtual item has a pickable area
  • The obtaining module 1720 is further configured to take, among the virtual items, the virtual items whose pickable areas intersect the closed area as candidate virtual items; or to take, among the virtual items, the virtual items whose pickable-area center positions are within the closed area as candidate virtual items; and to take the candidate virtual items that need to be picked up as the target virtual items.
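  • A minimal sketch of the two candidate-selection variants is given below, assuming each pickable area is modeled as a circle (center plus radius) and the closed area as the polygon formed by the operation-track points; the data shapes and the ray-casting test are illustrative assumptions rather than the patent's implementation.

```typescript
// Sketch of candidate selection over a closed area.
interface Point { x: number; y: number; }
interface VirtualItem { id: string; pickableCenter: Point; pickableRadius: number; }

// Ray-casting test: is point p inside the polygon?
function pointInPolygon(p: Point, polygon: Point[]): boolean {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const a = polygon[i], b = polygon[j];
    const crosses =
      (a.y > p.y) !== (b.y > p.y) &&
      p.x < ((b.x - a.x) * (p.y - a.y)) / (b.y - a.y) + a.x;
    if (crosses) inside = !inside;
  }
  return inside;
}

// Distance from point p to segment ab.
function distToSegment(p: Point, a: Point, b: Point): number {
  const dx = b.x - a.x, dy = b.y - a.y;
  const lenSq = dx * dx + dy * dy;
  const t = lenSq === 0 ? 0 : Math.max(0, Math.min(1, ((p.x - a.x) * dx + (p.y - a.y) * dy) / lenSq));
  return Math.hypot(p.x - (a.x + t * dx), p.y - (a.y + t * dy));
}

// Variant 1: the pickable area (a circle here) intersects the closed area.
function pickableAreaIntersects(item: VirtualItem, polygon: Point[]): boolean {
  if (pointInPolygon(item.pickableCenter, polygon)) return true;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    if (distToSegment(item.pickableCenter, polygon[j], polygon[i]) <= item.pickableRadius) return true;
  }
  return false;
}

// Variant 2: only the center of the pickable area must be inside the closed area.
function centerInsideClosedArea(item: VirtualItem, polygon: Point[]): boolean {
  return pointInPolygon(item.pickableCenter, polygon);
}

function candidateItems(items: VirtualItem[], polygon: Point[], useIntersection: boolean): VirtualItem[] {
  return items.filter((item) =>
    useIntersection ? pickableAreaIntersects(item, polygon) : centerInsideClosedArea(item, polygon));
}
```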
  • the virtual item has an item type
  • the obtaining module 1720 is further configured to determine whether a candidate virtual item includes a virtual item that does not need to be picked up.
  • A virtual item that does not need to be picked up refers to a virtual item whose item type is set, in the application, as an item type that does not need to be picked up.
  • When the candidate virtual items include virtual items that do not need to be picked up, the candidate virtual items other than the virtual items that do not need to be picked up are used as the target virtual items; or, when the candidate virtual items do not include virtual items that do not need to be picked up, the candidate virtual items are used as the target virtual items.
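  • A minimal sketch of this filtering step, assuming the item types marked on the item setting page are kept in a set; the ItemType names and field names are illustrative assumptions.

```typescript
// Sketch: drop candidates whose item type was marked "do not pick up" on the
// item setting page.
type ItemType = "weapon" | "ammo" | "supply" | "backpack"; // illustrative type names
interface CandidateItem { id: string; type: ItemType; }

function targetItems(candidates: CandidateItem[], excludedTypes: Set<ItemType>): CandidateItem[] {
  // If nothing is excluded, all candidates become target items unchanged.
  return candidates.filter((item) => !excludedTypes.has(item.type));
}

// Example: the user marked "supply" as a type that does not need to be picked up.
const excluded = new Set<ItemType>(["supply"]);
```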
  • the display module 1710 is further configured to display an item setting page.
  • the receiving module 1740 is configured to receive an item type determination signal triggered on an item setting page.
  • The processing module 1730 is further configured to determine, according to the item type determination signal, the item types that do not need to be picked up.
  • The virtual item corresponds to an item quantity.
  • The obtaining module 1720 is further configured to determine whether the candidate virtual items include a virtual item whose item quantity exceeds a quantity threshold; when the candidate virtual items include a virtual item whose item quantity exceeds the quantity threshold, the virtual item whose item quantity exceeds the quantity threshold is taken as an excess item, and the candidate virtual items other than the excess item are used as the target virtual items; or, when the candidate virtual items do not include an excess item, the candidate virtual items are used as the target virtual items.
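  • The following sketch illustrates the quantity-threshold check, treating a candidate as an excess item when the virtual object already carries items of the same type in a quantity exceeding the per-type threshold (following the rifle and ammunition example in the description); field names and threshold values are assumptions.

```typescript
// Sketch: a candidate is an "excess item" when the virtual object already
// carries an item of the same type and the carried quantity exceeds the
// per-type quantity threshold.
interface Candidate { id: string; type: string; quantity: number; }

const quantityThreshold: Record<string, number> = {
  firearm: 1,
  rifleAmmo: 119,
  pistolAmmo: 119,
};

function removeExcessItems(
  candidates: Candidate[],
  carried: Map<string, number>,          // item type -> quantity already carried
): Candidate[] {
  return candidates.filter((item) => {
    const carriedQty = carried.get(item.type) ?? 0;
    const threshold = quantityThreshold[item.type] ?? Number.POSITIVE_INFINITY;
    const isExcess = carriedQty > threshold;   // already over the limit for this type
    return !isExcess;                          // keep only items that are not excess
  });
}

// Example mirroring the description: carrying one rifle and 120 rifle bullets,
// so a candidate stack of rifle ammunition is excess, while a pistol and pistol
// ammunition remain target items.
const carried = new Map<string, number>([["firearm", 1], ["rifleAmmo", 120]]);
```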
  • The receiving module 1740 is further configured to receive a second instruction triggered by a touch operation at a touched position within the closed area.
  • the processing module 1730 is further configured to use the touched position as the specified position according to the second instruction.
  • the obtaining module 1720 is further configured to obtain a center position of the closed area.
  • The processing module 1730 is further configured to use the center position of the closed area as the specified position.
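  • The patent does not fix a formula for the center position; the sketch below assumes the closed area is the polygon formed by the operation-track points and uses the standard polygon centroid, with an average-of-points fallback for degenerate tracks.

```typescript
// Sketch: one way to obtain the "center position" of the closed area.
interface Point { x: number; y: number; }

function closedAreaCenter(polygon: Point[]): Point {
  let area = 0, cx = 0, cy = 0;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const cross = polygon[j].x * polygon[i].y - polygon[i].x * polygon[j].y;
    area += cross;
    cx += (polygon[j].x + polygon[i].x) * cross;
    cy += (polygon[j].y + polygon[i].y) * cross;
  }
  area /= 2;
  if (area === 0) {
    // Degenerate track: fall back to the average of the points.
    const n = polygon.length;
    return {
      x: polygon.reduce((s, p) => s + p.x, 0) / n,
      y: polygon.reduce((s, p) => s + p.y, 0) / n,
    };
  }
  return { x: cx / (6 * area), y: cy / (6 * area) };
}
```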
  • The processing module 1730 is further configured to, when the target virtual items include a recovery item, increase the physical strength value of the virtual object according to the recovery value corresponding to the recovery item.
  • The processing module 1730 is further configured to, when the target virtual items include an upgrade item, increase the experience value of the virtual object according to the experience value corresponding to the upgrade item.
  • The display module 1710 is further configured to display, after the target virtual items are gathered at the specified position in the virtual environment, the at least two target virtual items at the specified position in the form of an aggregation icon.
  • The display module 1710 is further configured to display, after the target virtual items are gathered at the specified position in the virtual environment, the target virtual items of the same item type among the at least two target virtual items as one icon.
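  • A minimal sketch of this aggregation, grouping gathered target items by item type and attaching a count so that, for example, two backpacks and a firearm are shown as a backpack icon marked "x2" plus a firearm icon; the DisplayIcon shape is an illustrative assumption.

```typescript
// Sketch: collapse gathered target items of the same item type into one display
// icon with a count.
interface TargetItem { id: string; type: string; }
interface DisplayIcon { type: string; count: number; }

function aggregateIcons(items: TargetItem[]): DisplayIcon[] {
  const counts = new Map<string, number>();
  for (const item of items) {
    counts.set(item.type, (counts.get(item.type) ?? 0) + 1);
  }
  return [...counts.entries()].map(([type, count]) => ({ type, count }));
}

// Two backpacks and one firearm gathered at the specified position become a
// backpack icon with count 2 plus a firearm icon with count 1.
const icons = aggregateIcons([
  { id: "item1", type: "backpack" },
  { id: "item2", type: "backpack" },
  { id: "item3", type: "firearm" },
]);
```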
  • the interactive operation is a sliding operation on a touch display screen of the terminal.
  • the operation track of the sliding operation includes a starting area and an ending area.
  • The starting area is the area formed on the display screen of the terminal by the finger press-down operation in the sliding operation, and the ending area is the area formed on the display screen of the terminal by the finger lift-up operation in the sliding operation.
  • the processing module 1730 is further configured to determine whether the operation track forms a closed area according to the starting area and the ending area.
  • The processing module 1730 is further configured to obtain a first distance between the center position of the starting area and the center position of the ending area, and to determine, when the first distance is less than a first distance threshold, that the operation track forms a closed area.
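  • A minimal sketch of this first-distance check; the threshold value is an illustrative assumption.

```typescript
// Sketch: the track is considered closed when the centers of the start and end
// areas are close enough.
interface Point { x: number; y: number; }

const FIRST_DISTANCE_THRESHOLD = 40; // pixels; assumed value

function formsClosedAreaByDistance(startCenter: Point, endCenter: Point): boolean {
  const firstDistance = Math.hypot(endCenter.x - startCenter.x, endCenter.y - startCenter.y);
  return firstDistance < FIRST_DISTANCE_THRESHOLD;
}
```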
  • The processing module 1730 is further configured to obtain the area of the intersection between the starting area and the ending area, and to determine, when the area ratio of the intersection exceeds an area-ratio threshold, that the operation track forms a closed area.
  • The area ratio is the ratio of the area of the intersection to the sum area, and the sum area is the sum of the area of the starting area and the area of the ending area.
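  • A minimal sketch of the area-ratio variant, assuming the start and end contact areas are approximated as circles; the circle model and the threshold value are illustrative assumptions.

```typescript
// Sketch: closed-area test based on how much the start and end contact areas overlap.
interface Circle { x: number; y: number; r: number; }

const AREA_RATIO_THRESHOLD = 0.2; // assumed value

// Area of the lens formed by two intersecting circles.
function circleIntersectionArea(a: Circle, b: Circle): number {
  const d = Math.hypot(b.x - a.x, b.y - a.y);
  if (d >= a.r + b.r) return 0;                 // no overlap
  if (d <= Math.abs(a.r - b.r)) {               // one circle inside the other
    const r = Math.min(a.r, b.r);
    return Math.PI * r * r;
  }
  const alpha = Math.acos((d * d + a.r * a.r - b.r * b.r) / (2 * d * a.r));
  const beta = Math.acos((d * d + b.r * b.r - a.r * a.r) / (2 * d * b.r));
  return (
    a.r * a.r * alpha +
    b.r * b.r * beta -
    0.5 * Math.sqrt((-d + a.r + b.r) * (d + a.r - b.r) * (d - a.r + b.r) * (d + a.r + b.r))
  );
}

function formsClosedAreaByOverlap(start: Circle, end: Circle): boolean {
  const intersection = circleIntersectionArea(start, end);
  const sumArea = Math.PI * start.r * start.r + Math.PI * end.r * end.r;
  const areaRatio = intersection / sumArea;     // intersection area / sum of the two areas
  return areaRatio > AREA_RATIO_THRESHOLD;
}
```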
  • The processing module 1730 is further configured to obtain a second distance between the virtual object and the specified position, and to automatically pick up the target virtual items when the second distance is less than a second distance threshold.
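  • A minimal sketch of the automatic pickup on proximity; the threshold value and the pickUp callback are illustrative assumptions.

```typescript
// Sketch: automatic pickup once the virtual object is close enough to the
// specified position where the target items were gathered.
interface Point { x: number; y: number; }

const SECOND_DISTANCE_THRESHOLD = 1.5; // in virtual-environment units; assumed value

function maybeAutoPickUp(
  objectPosition: Point,
  specifiedPosition: Point,
  pickUp: () => void,               // hands the gathered target items to the virtual object
): void {
  const secondDistance = Math.hypot(
    specifiedPosition.x - objectPosition.x,
    specifiedPosition.y - objectPosition.y,
  );
  if (secondDistance < SECOND_DISTANCE_THRESHOLD) {
    pickUp();
  }
}
```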
  • FIG. 18 shows a structural block diagram of a terminal 1800 provided by an exemplary embodiment of the present application.
  • The terminal 1800 may be a portable mobile terminal, such as a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player.
  • the terminal 1800 may also be called other names such as user equipment and portable terminal.
  • the terminal 1800 includes a processor 1801 and a memory 1802.
  • the processor 1801 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
  • The processor 1801 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1801 may also include a main processor and a co-processor.
  • the main processor is a processor for processing data in the awake state, also referred to as a CPU (Central Processing Unit).
  • The coprocessor is a low-power processor for processing data in a standby state.
  • The processor 1801 may be integrated with a GPU (Graphics Processing Unit), and the GPU is responsible for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1801 may further include an AI (Artificial Intelligence) processor, and the AI processor is configured to process computing operations related to machine learning.
  • the memory 1802 may include one or more computer-readable storage media, which may be tangible and non-transitory.
  • the memory 1802 may further include a high-speed random access memory, and a non-volatile memory, such as one or more disk storage devices, flash storage devices.
  • The non-transitory computer-readable storage medium in the memory 1802 is configured to store at least one instruction, and the at least one instruction is executed by the processor 1801 to implement the method for picking up virtual items in a virtual environment provided in this application.
  • the terminal 1800 may further include: a peripheral device interface 1803 and at least one peripheral device.
  • the peripheral device includes at least one of a radio frequency circuit 1804, a touch display 1805, a camera 1806, an audio circuit 1807, a positioning component 1808, and a power supply 1809.
  • the terminal 1800 further includes one or more sensors 1810.
  • the one or more sensors 1810 include, but are not limited to, an acceleration sensor 1811, a gyro sensor 1812, a pressure sensor 1813, a fingerprint sensor 1814, an optical sensor 1815, and a proximity sensor 1816.
  • The structure shown in FIG. 18 does not constitute a limitation on the terminal 1800, which may include more or fewer components than shown, combine certain components, or use a different component arrangement.
  • The present application also provides a computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the method for picking up virtual items in a virtual environment provided by the foregoing method embodiments.
  • the present application also provides a computer program product containing instructions that, when run on a computer, causes the computer to execute the method for picking up virtual items in a virtual environment as described in the above aspects.
  • the program may be stored in a computer-readable storage medium.
  • the storage medium mentioned may be a read-only memory, a magnetic disk or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method, apparatus, terminal, and computer-readable storage medium for picking up virtual items in a virtual environment, belonging to the field of human-computer interaction. The method includes: displaying a user interface, where the user interface displays a virtual environment and a virtual object located in the virtual environment (401); obtaining, according to a first instruction triggered by an interactive operation on the user interface, an operation track formed by the interactive operation on the user interface (402); when the operation track forms a closed area, obtaining at least two target virtual items located in the closed area in the virtual environment (403); gathering the target virtual items at a specified position in the virtual environment (404); and when the virtual object moves to the specified position, controlling the virtual object to pick up the target virtual items (405).

Description

在虚拟环境中对虚拟物品进行拾取的方法、装置、终端和计算机可读存储介质
本申请要求于2018年08月31日提交中国专利局,申请号为2018110149627,发明名称为“在虚拟环境中对虚拟物品进行拾取的方法、装置及终端”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及人机交互领域,特别涉及一种在虚拟环境中对虚拟物品进行拾取的方法、装置、终端和计算机可读存储介质。
背景技术
在诸如智能手机、平板电脑、台式计算机之类的终端上,存在很多基于虚拟环境的应用程序,如虚拟现实应用程序、三维地图应用程序、军事仿真应用程序、第三人称射击游戏(Third-Personal Shooting Game,TPS游戏)、第一人称射击游戏(First-person shooting game,FPS游戏)、多人在线战术竞技游戏(Multiplayer Online Battle Arena Games,MOBA游戏)等。在上述应用程序中,虚拟对象(例如:虚拟人物)可以通过捡拾、购买等动作对虚拟环境中的虚拟物品进行获取。
通常,在虚拟环境中,用户控制虚拟对象移动至虚拟物品所在的位置,可通过自动拾取或主动拾取的方式拾取虚拟物品。当虚拟环境中包括位于不同位置的多个虚拟物品时,用户需要控制虚拟对象移动至每个虚拟物品所在的位置附近对虚拟物品进行拾取。例如,在显示界面显示的虚拟环境中显示有虚拟物品1、虚拟物品2以及虚拟物品3,用户需要控制虚拟对象移动至虚拟物品1所在的位置附近,对虚拟物品1进行拾取后,再控制虚拟对象移动至虚拟物品2所在的位置附近,对虚拟物品2进行拾取后,再控制虚拟对象移动至虚拟物品3所在的位置附近,对虚拟物品3进行拾取,从而完成对虚拟物品1、虚拟物品2以及虚拟物品3的拾取。
由于相关技术中需要控制虚拟对象移动到每一个虚拟物品的所在位置附近 对虚拟物品进行拾取,用户需要多次操作,导致人机交互效率较低。
发明内容
根据本申请的各种实施例,提供一种在虚拟环境中对虚拟物品进行拾取的方法、装置、终端和计算机可读存储介质。
一种在虚拟环境中对虚拟物品进行拾取的方法,由终端执行,所述方法包括:
显示用户界面,所述用户界面中显示有所述虚拟环境和位于所述虚拟环境中的虚拟对象;
根据对所述用户界面的交互操作触发的第一指令,获取所述交互操作在所述用户界面形成的操作轨迹;
当所述操作轨迹形成封闭区域时,获取所述虚拟环境中位于所述封闭区域内的至少两个目标虚拟物品;
将所述目标虚拟物品聚集在所述虚拟环境中的指定位置;
当所述虚拟对象移动至所述指定位置时,控制所述虚拟对象拾取得到所述目标虚拟物品。
一种在虚拟环境中对虚拟物品进行拾取的装置,所述装置包括:
显示模块,用于显示用户界面,所述用户界面中显示有所述虚拟环境和位于所述虚拟环境中的虚拟对象;
获取模块,用于根据对所述用户界面的交互操作触发的第一指令,获取所述交互操作在所述用户界面形成的操作轨迹;当所述操作轨迹形成封闭区域时,获取所述虚拟环境中位于所述封闭区域内的至少两个目标虚拟物品;
处理模块,用于将所述目标虚拟物品聚集在所述虚拟环境中的指定位置;当所述虚拟对象移动至所述指定位置时,控制所述虚拟对象拾取得到所述目标虚拟物品。
一种终端,所述终端包括处理器和存储器,所述存储器中存储有至少一条指令,所述指令由所述处理器加载并执行所述在虚拟环境中对虚拟物品进行拾取的方法。
一种计算机可读存储介质,所述存储介质中存储有至少一条指令,所述指 令由处理器加载并执行所述在虚拟环境中对虚拟物品进行拾取的方法。
本申请的一个或多个实施例的细节在下面的附图和描述中提出。本申请的其它特征、目的和优点将从说明书、附图以及权利要求书变得明显。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请一个示例性实施例提供的终端的结构框图;
图2是本申请一个示例性实施例提供的计算机系统的结构框图;
图3是相关技术中提供的在虚拟环境中对虚拟物品进行拾取的方法的用户界面示意图;
图4是本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的方法的流程图;
图5是本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的方法的用户界面示意图;
图6是本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的方法的操作轨迹示意图;
图7是本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的方法的虚拟物品聚集示意图;
图8是本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的方法的虚拟物品拾取示意图;
图9是本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的方法的流程图;
图10是本申请一个示例性实施例提供的虚拟物品的可拾取区域示意图;
图11是本申请一个示例性实施例提供的不需要拾取的虚拟物品的设置页面示意图;
图12是本申请一个示例性实施例提供的触控信号获取示意图;
图13是本申请一个示例性实施例提供的触控信号处理步骤图;
图14是本申请一个示例性实施例提供的触控事件示意图;
图15是本申请一个示例性实施例提供的三维多人枪战类生存游戏中的虚拟物品拾取方法;
图16是本申请一个示例性实施例提供的二维多人对战游戏中的虚拟物品拾取方法;
图17是本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的装置的结构框图;
图18是本申请一个示例性实施例提供的终端的结构示意图。
具体实施方式
为了使本申请的目的、技术方案及优点更加清楚明白,以下结合附图及实施例,对本申请进行进一步详细说明。应当理解,此处所描述的具体实施例仅仅用以解释本申请,并不用于限定本申请。
首先,对本申请实施例涉及的若干个名词进行简单介绍:
虚拟环境:是应用程序在终端上运行时显示(或提供)的虚拟环境。该虚拟环境可以是对真实世界的仿真环境,也可以是半仿真半虚构的三维环境,还可以是纯虚构的三维环境。虚拟环境可以是二维虚拟环境、2.5维虚拟环境和三维虚拟环境中的任意一种。可选地,该虚拟环境还用于至少两个虚拟角色之间的虚拟环境对战。可选地,该虚拟环境还用于至少两个虚拟角色之间使用虚拟枪械进行对战。可选地,该虚拟环境还用于在目标区域范围内,至少两个虚拟角色之间使用虚拟枪械进行对战,该目标区域范围会随虚拟环境中的时间推移而不断变小。下述实施例以虚拟环境是二维虚拟环境来举例说明,但对此不加以限定。
虚拟对象:是指在虚拟环境中的可活动对象。该可活动对象可以是虚拟人物、虚拟动物、动漫人物中的至少一种。可选地,当虚拟环境为三维虚拟环境时,虚拟对象是基于动画骨骼技术创建的三维立体模型。每个虚拟对象在三维虚拟环境中具有自身的形状和体积,占据三维虚拟环境中的一部分空间。
虚拟物品:是指虚拟对象装配或携带的物品。例如,虚拟物品可以是虚拟 对象装配的背包,也可以是虚拟对象装配的武器,也可以是虚拟对象携带的药品等。可选的,在虚拟环境的应用程序界面中具有装配位和背包格,用户将虚拟物品存储在装配位中时即装配了该虚拟物品,当虚拟对象的装配位上存储有背包时即拥有了背包格,用户无法装配在虚拟对象上的虚拟物品都可以存储在背包格中,例如,药品、绷带或多余的无法装配的武器,当虚拟物品存储在背包格中时,虚拟对象即携带了该虚拟物品。
物品类型:是虚拟物品具有的一种属性,该属性与虚拟物品的类型相对应。例如,虚拟枪械对应的物品类型是武器,虚拟药品对应的类型是补给。
物品数量:是虚拟物品对应的数量。例如,虚拟对象装配有1把手枪,虚拟手枪中装配有12颗手枪子弹,同时,该虚拟对象还携带有100颗手枪子弹,则手枪对应的物品数量为1,手枪子弹对应的物品数量为112。
恢复物品:是一种能够回复虚拟对象体力值的虚拟物品。其中,体力值是虚拟对象的一种属性,当虚拟对象的体力值为0时,虚拟对象在虚拟环境中将失去战斗力无法继续互动。示例性的,虚拟对象的体力值上限为100,当虚拟对象经历战斗或高处跌落或其它方式造成虚拟对象受到伤害后,虚拟对象的体力值减少为60,若虚拟对象拾取且使用了恢复值为30的恢复物品后,虚拟对象的体力值回复到90。通常恢复物品对虚拟对象的体力值回复不能超过虚拟对象的体力值上限。
升级物品:是一种能够增加虚拟对象经验值的虚拟物品,虚拟对象可通过增加经验值实现等级的提升。其中,经验值和等级是虚拟对象的属性,每个等级对应不同的经验值,当虚拟对象的经验值超过当前等级的经验值上限时,虚拟对象的等级获得提升。示例性的,虚拟对象当前的等级为1级,1级对应的经验值上限为30,虚拟对象当前的经验值为20,当虚拟对象拾取并使用了经验值为35的升级物品后,经验值增加到55,由于1级的等级对应的经验值上限为30,则虚拟对象的等级升级为2级。
本申请实施例中的终端可以是台式计算机、膝上型便携计算机、手机、平板电脑、电子书阅读器、MP3(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)播放器、MP4(Moving Picture Experts Group  Audio Layer IV,动态影像专家压缩标准音频层面4)播放器等等。该终端中安装和运行有支持虚拟环境的应用程序,比如支持三维虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、TPS游戏、FPS游戏、MOBA游戏中的任意一种。可选地,该应用程序可以是单机版的应用程序,比如单机版的3D游戏程序,也可以是网络联机版的应用程序。
请参考图1,其示出了本申请一个示例性实施例提供的终端的结构框图。该终端100包括:处理器110以及存储器120,处理器110和存储器120通过总线或其它方式实现通信。其中,存储器120中存储有操作系统121和应用程序122。
操作系统121是为应用程序122提供对计算机硬件的安全访问的基础软件。
应用程序122是支持虚拟环境的应用程序。可选地,应用程序122是支持三维虚拟环境的应用程序。该应用程序122可以是虚拟现实应用程序、三维地图程序、军事仿真程序、第三人称射击游戏(Third-Personal Shooting Game,TPS)、第一人称射击游戏(First-person shooting game,FPS)、MOBA游戏、多人枪战类生存游戏中的任意一种。该应用程序122可以是单机版的应用程序,比如单机版的3D游戏程序。
请参考图2,其示出了本申请一个示例性实施例提供的计算机系统的结构框图。该计算机系统200包括:第一终端220、服务器240和第二终端260。
第一终端220安装和运行有支持虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图应用程序、军事仿真应用程序、TPS游戏、FPS游戏、MOBA游戏、多人枪战类生存游戏中的任意一种。第一终端220是第一用户使用的终端,第一用户使用第一终端220控制位于虚拟环境中的第一虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷中的至少一种。示意性的,第一虚拟对象是第一虚拟人物,比如仿真人物角色或动漫人物角色。
第一终端220通过无线网络或有线网络与服务器240相连。
服务器240包括一台服务器、多台服务器、云计算平台和虚拟化中心中的至少一种。服务器240用于为支持三维虚拟环境的应用程序提供后台服务。可 选地,服务器240承担主要计算工作,第一终端220和第二终端260承担次要计算工作;或者,服务器240承担次要计算工作,第一终端220和第二终端260承担主要计算工作;或者,服务器240、第一终端220和第二终端260三者之间采用分布式计算架构进行协同计算。
第二终端260安装和运行有支持虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图应用程序、军事仿真应用程序、FPS游戏、MOBA游戏、多人枪战类生存游戏中的任意一种。第二终端260是第二用户使用的终端,第二用户使用第二终端260控制位于虚拟环境中的第二虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷中的至少一种。示意性的,第二虚拟对象是第二虚拟人物,比如仿真人物角色或动漫人物角色。
可选地,第一虚拟人物和第二虚拟人物处于同一虚拟环境中。可选地,第一虚拟人物和第二虚拟人物可以属于同一个队伍、同一个组织、具有好友关系或具有临时性的通讯权限。可选地,第一虚拟人物和第二虚拟人物也可以属于不同队伍、不同组织、或具有敌对性的两个团体。
可选地,第一终端220和第二终端260上安装的应用程序是相同的,或两个终端上安装的应用程序是不同控制系统平台的同一类型应用程序。第一终端220可以泛指多个终端中的一个,第二终端260可以泛指多个终端中的一个,本实施例仅以第一终端220和第二终端260来举例说明。第一终端220和第二终端260的设备类型相同或不同,该设备类型包括:游戏主机、台式计算机、智能手机、平板电脑、电子书阅读器、MP3播放器、MP4播放器和膝上型便携计算机中的至少一种。以下实施例以终端是台式计算机来举例说明。
本领域技术人员可以知晓,上述终端的数量可以更多或更少。比如上述终端可以仅为一个,或者上述终端为几十个或几百个,或者更多数量。本申请实施例对终端的数量和设备类型不加以限定。
请参考图3,其示出了相关技术中的在虚拟环境中对虚拟物品进行拾取的方法的示意图,如图3所示,终端310的显示屏320上显示有用户界面330,用户界面330上显示有虚拟环境340,虚拟环境340中包括虚拟对象350、第一虚拟 物品341、第二虚拟物品342以及第三虚拟物品343。相关技术中的在虚拟环境中对虚拟物品进行拾取的方法为:
控制虚拟对象350由初始位置A 0移动至第一虚拟物品341所在的位置B附近的第一位置A 1,对第一虚拟物品341进行拾取;然后控制虚拟对象350由第一位置A 1移动至第二虚拟物品342所在位置C附近的第二位置A 2,对第二虚拟物品342进行拾取;然后控制虚拟对象350由第二位置A 2移动至第三虚拟物品342所在位置D附近的第三位置A 3,对第三虚拟物品343进行拾取,从而完成对第一虚拟物品341、第二虚拟物品342以及第三虚拟物品343的拾取。
以该虚拟环境对应的应用程序为二维多人对战游戏为例进行说明,图3中的虚拟对象350是终端310控制的虚拟人物,第一虚拟物品341可以是恢复物品,第二虚拟物品342可以是升级物品,第三虚拟物品343可以是武器。终端310控制虚拟人物350移动至恢复物品341所在的位置,根据恢复物品341对应的恢复值增加虚拟人物350的体力值,终端310控制虚拟人物350移动至升级物品342所在的位置,根据升级物品342对应的经验值增加虚拟人物350的经验值,终端310控制虚拟人物350移动至武器343所在的位置拾取并装配该武器343。
不难看出,相关技术中需要控制虚拟对象移动到每一个虚拟物品的所在位置附近对虚拟物品进行拾取,用户需要多次操作,导致人机交互效率较低。
请参考图4,其示出了本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的方法的流程图。如图4所示,该方法由图2实施例中的第一终端220或第二终端260执行,该方法包括:
步骤401,显示用户界面,该用户界面中显示有虚拟环境和位于虚拟环境中的虚拟对象。
当终端运行有支持虚拟环境的应用程序时,该终端的显示屏上显示有该应用程序的用户界面,该用户界面中显示有虚拟环境和位于虚拟环境中的虚拟对象。
以虚拟环境为二维虚拟环境为例,如图5所示,终端510中运行有支持该二维虚拟环境的应用程序,终端510的显示屏520中显示有用户界面530,该用 户界面530中显示有虚拟环境540、虚拟对象550、虚拟操作杆560以及功能控件570,其中,虚拟环境540包括第一虚拟物品541、第二虚拟物品542以及第三虚拟物品543。
虚拟对象550可根据在虚拟操作杆560上触发的移动信号在虚拟环境540中进行移动;虚拟对象550可通过在功能控件570上触发的操作信号对虚拟环境540中的其它虚拟对象,或虚拟物品执行动作,例如拾取虚拟物品,击打其它虚拟对象等。
步骤402,根据对用户界面的交互操作触发的第一指令,获取该交互操作在用户界面形成的操作轨迹。
交互操作是用户在用户界面进行的持续性的操作,例如,交互操作可以是用户在终端的触控显示屏上持续性地滑动操作,也可以是用户点击鼠标在用户界面拖动一定距离后释放的操作。本实施例以该交互操作为滑动操作做示例性说明。
当用户在用户界面进行滑动操作时,终端接收该滑动操作触发的第一指令,根据该第一指令确定该滑动操作在用户界面形成的操作轨迹。
如图6所示,用户发起由用户界面由起始位置X 1滑动至结束位置X 2的滑动操作,触发第一指令,终端510接收该第一指令后,获取得到该滑动操作在用户界面形成的操作轨迹580。
步骤403,当操作轨迹形成封闭区域时,获取虚拟环境中位于该封闭区域内的至少两个目标虚拟物品。
终端可通过以下方式中的任意一种检测操作轨迹是否形成封闭区域:
(1)终端获取用户在起始位置X 1按下形成的起始区域581和在结束位置X 2抬起手指形成的结束区域582,检测起始区域581和结束区域582之间是否具有交集,若起始区域581和结束区域582之间具有交集,则确定操作轨迹形成封闭区域,若起始区域581和结束区域582之间不具有交集,则确定操作轨迹没有形成封闭区域。
(2)终端检测起始区域581和结束区域582之间是否具有交集,若起始区域581和结束区域582之间不具有交集,则确定操作轨迹没有形成封闭区域;若起始区域581和结束区域582之间具有交集,则检测该交集的面积占比,若 该面积占比超过面积占比阈值时,确定操作轨迹形成封闭区域,若该面积占比没有超过面积占比阈值时,则确定操作轨迹没有形成封闭区域。其中,面积占比是交集面积和起始区域面积的比值、交集面积和结束区域面积的比值或交集面积和总面积(起始区域面积和结束区域的面积之和)的比值中的至少一种。
(3)终端获取起始区域581的中心位置和结束区域582的中心位置之间的第一距离,检测第一距离是否小于第一距离阈值,若第一距离小于第一距离阈值,则确定操作轨迹形成封闭区域;若第一距离不小于第一距离阈值,则确定操作轨迹没有形成封闭区域。
当终端确定操作轨迹形成封闭区域时,获取位于该封闭区域内部的虚拟物品作为目标虚拟物品。示例性的,如图6所示,在操作轨迹580形成的封闭区域583内具有第一虚拟物品541以及第三虚拟物品543,而第二虚拟物品542位于封闭区域583外,终端确定第一虚拟物品541以及第三虚拟物品543为目标虚拟物品。
步骤404,将目标虚拟物品聚集在虚拟环境中的指定位置。
指定位置可由实际需求进行设置。例如,指定位置可以是用户触控的封闭区域中的任意位置,也可以是终端计算得到封闭区域中离虚拟对象最近的位置,也可以是终端计算得到封闭区域的中心位置。
示例性的,如图7所示,终端计算得到封闭区域583的中心位置O后,将目标虚拟物品,即第一虚拟物品541以及第三虚拟物品543移动聚集至该中心位置O。
终端可通过接收在封闭区域内的被触控位置的触控操作触发的第二指令,将该被触控位置作为指定位置。例如,用户在封闭区域中触控一个触控位置,该触控位置即可作为指定位置。
步骤405,当虚拟对象移动至指定位置时,控制虚拟对象拾取得到目标虚拟物品。
当虚拟对象移动至该指定位置时,终端控制虚拟对象可通过手动或自动拾取的方式获取得到目标虚拟物品。
示例性的,如图8所示,用户可通过触控虚拟操作杆560控制虚拟对象550由初始位置A 0移动至指定位置O处,自动拾取得到第一虚拟物品541以及第三 虚拟物品543。
综上所述,本申请实施例中,通过根据在用户界面的交互操作触发的第一指令获取该交互操作在用户界面形成的操作轨迹,当操作轨迹形成为封闭区域时,将位于封闭区域内的虚拟环境中的虚拟物品聚集在指定位置,当虚拟对象移动至该指定位置附近时,拾取聚集在该指定位置的虚拟物品,由于不需要控制虚拟对象移动至每一个虚拟物品所在的位置进行拾取,因此提高了虚拟物品的拾取效率,从而提高了人机交互的效率。
请参考图9,其示出了本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的方法的流程图。如图9所示,该方法由图2实施例中的第一终端220或第二终端260执行,该方法包括:
步骤901,显示用户界面,该用户界面中显示有虚拟环境和位于虚拟环境中的虚拟对象。
终端显示用户界面的步骤可参考图4实施例中的步骤401。
步骤902,根据对用户界面的交互操作触发的第一指令,获取该交互操作在用户界面形成的操作轨迹。
以该交互操作是在终端的触控显示屏上的滑动操作为例,当用户在用户界面进行滑动操作时,终端接收该滑动操作触发的第一指令,根据该第一指令确定该滑动操作在用户界面形成的操作轨迹。
如图6所示,用户发起由用户界面由起始位置X 1滑动至结束位置X 2的滑动操作,触发第一指令,终端510接收该第一指令后,获取得到该滑动操作在用户界面形成的操作轨迹580。
示例性的,用户在触控显示屏上的滑动操作触发的第一指令包括三个指令事件:手指在触控显示屏上按下的触控开始事件(touchstart)、手指在触控显示屏上滑动时的滑动事件(touchmove)以及手指从触控显示屏上抬起时的触控结束事件(touchend),终端根据触控开始事件获取得到起始区域581,根据触控结束事件得到结束区域582,根据滑动事件得到操作轨迹580。
步骤903,检测操作轨迹是否形成封闭区域。
终端检测操作轨迹是否形成封闭区域的方式可以是图4实施例中的步骤403 中的三种方式中的任意一种。
步骤904,当确定操作轨迹形成封闭区域时,将位于该封闭区域内的至少两个目标虚拟物品作为候选虚拟物品。
虚拟物品对应有可拾取区域,可拾取区域是虚拟物品在虚拟环境中占据的面积,终端可根据需求依据可拾取区域确定候选虚拟物品。
可选的,在目标虚拟物品中,终端将可拾取区域与封闭区域之间交集区域内的虚拟物品作为候选虚拟物品。
示例性的,如图10所示,第一虚拟物品541对应有可拾取区域5410,第二虚拟物品542对应有可拾取区域5420,第三虚拟物品543对应有可拾取区域5430,其中,可拾取区域5410和可拾取区域5430位于封闭区域583内,因此可拾取区域5410与封闭区域583的交集即为可拾取区域5410,可拾取区域5430与封闭区域583的交集即为可拾取区域5430,而可拾取区域5420位于封闭区域583外且与封闭区域并无交集,故第一虚拟物品541与第二虚拟物品542为候选虚拟物品。
可选的,在目标虚拟物品中,终端将位于可拾取区域内、且的可拾取区域中心位置在封闭区域内的虚拟物品作为候选虚拟物品。其中,可拾取区域中心位置指的是可拾取区域的中心位置。
示例性的,如图10所示,终端获取第一虚拟物品541对应的可拾取区域5410的中心位置P1的坐标,第二虚拟物品542的可拾取区域5420的中心位置P2的坐标,第三虚拟物品543的可拾取区域5430的中心位置P3的坐标,根据P1、P2、P2的坐标以及封闭区域对应的像素点坐标,检测得到P1、P3位于封闭区域583内,P2位于封闭区域583外,从而确定第一虚拟物品541与第二虚拟物品542为候选虚拟物品。
步骤905,检测候选虚拟物品中是否具有不需要拾取的虚拟物品。
可选的,终端检测候选虚拟物品中是否包括不需要拾取的虚拟物品,当候选虚拟物品中包括不需要拾取的虚拟物品时,进入步骤906a;当候选虚拟物品中不包括不需要拾取的虚拟物品时,进入步骤906b。
其中,不需要拾取的虚拟物品是指物品类型是应用程序中设置的不需要拾取的物品类型的虚拟物品。不需要拾取的虚拟物品可以由应用程序预设,也可 以由用户设置。例如,用户不善于使用手枪,可将手枪类型的虚拟物品设置为不需要拾取的虚拟物品。
示例性的,如图11所示,终端在接收滑动操作触发的第一指令之前,在设置控件上触发物品设置信号后,终端在用户界面530中显示物品设置页面531,物品设置页面中显示有物品类型1、物品类型2、物品类型3;终端接收到在物品类型1上触发的物品类型确定信号后,确定物品类型1对应的虚拟物品为不需要拾取的虚拟物品。
可选的,终端在确定候选虚拟物品之后,检测是否具有与虚拟对象所携带的物品相同、且物品数量超过数量阈值的虚拟物品,当候选虚拟物品中包括物品数量超过数量阈值的虚拟物品时,将物品数量超过数量阈值的虚拟物品作为超额物品,并将超额物品作为不需要拾取的虚拟物品。
示例性的,虚拟对象携带的物品具有对应的物品数量。例如,虚拟对象可以携带一把步枪、一把手枪、120颗步枪子弹以及120颗手枪子弹,步枪和手枪对应的物品类型为枪械,步枪子弹对应的物品类型为步枪弹药,手枪子弹对应的物品类型为手枪弹药,则枪械对应的数量阈值为1,步枪弹药对应的数量阈值为119,手枪弹药对应的数量阈值为119。当虚拟对象携带有一把步枪、120颗步枪子弹时,若候选虚拟物品1为手枪、候选虚拟物品2为120颗步枪弹药、候选虚拟物品3为120颗手枪子弹,由于步枪弹药是超过数量阈值的虚拟物品,则虚拟物品2为不需要拾取的虚拟物品。
步骤906a,将候选虚拟物品中除去不需要拾取的虚拟物品的其它候选虚拟物品作为目标虚拟物品。
当终端确定候选虚拟物品中不需要拾取的虚拟物品后,将候选虚拟物品中除不需要拾取的虚拟物品之外的其它候选虚拟物品作为目标虚拟物品。
步骤906b,将候选虚拟物品作为目标虚拟物品。
当终端确定候选虚拟物品中不存在不需要拾取的虚拟物品后,将候选虚拟物品作为目标虚拟物品。
步骤907,将封闭区域的中心位置作为指定位置,将目标虚拟物品聚集在该指定位置。
示例性的,如图7所示,终端计算得到封闭区域583的中心位置O后,将 目标虚拟物品,即第一虚拟物品541以及第三虚拟物品543移动聚集至该中心位置O。例如,终端根据每个目标虚拟物品占据的区域的面积,得到以中心位置O的坐标为中心,所有目标虚拟物品所需要占据的区域的面积,确定所有目标虚拟物品占据的聚集区域,将每个目标虚拟物品随机移动至聚集区域内的任一位置,或,将每个目标虚拟物品移动至距离其最近的聚集区域的位置。
可选的,终端将目标虚拟物品聚集在指定位置后,以聚集图标的形式在指定位置显示至少两个目标虚拟物品。
聚集后的所有目标虚拟物品不需要全部显示在指定位置,可以以一个聚集图标的方式显示。例如,目标虚拟物品1、目标虚拟物品2以及目标虚拟物品3聚集到指定位置O之后,仅显示一个目标虚拟物品的图标,或显示预设的聚集图标指代目标虚拟物品1、目标虚拟物品2以及目标虚拟物品3。
可选的,终端将目标虚拟物品聚集在虚拟环境中的指定位置之后,将至少两个目标虚拟物品中属于相同物品类型的虚拟物品显示为一个图标。
示例性的,目标虚拟物品1和目标虚拟物品2的物品类型都是背包,目标虚拟物品3的物品类型是枪械,当目标虚拟物品1、目标虚拟物品2以及目标虚拟物品3聚集到指定位置O之后,只显示目标虚拟物品1(或目标虚拟物品2)和目标虚拟物品3的图标。可选的,终端在相同类型的目标虚拟物品的图标上显示相同类型的目标虚拟物品的个数。例如,在目标虚拟物品1的图标上显示“x2”,表示目标虚拟物品1的标识代表两个相同类型的目标虚拟物品。
步骤908,检测虚拟对象和指定位置之间的第二距离是否小于第二距离阈值。
虚拟对象在虚拟环境中具有对应的坐标,终端根据虚拟对象的坐标和指定位置的坐标计算虚拟对象和指定位置之间的第二距离,根据第二距离判断虚拟对象是否移动至指定位置。
步骤909,当第二距离小于第二距离阈值时,自动拾取得到目标虚拟物品。
当虚拟对象和指定位置之间的第二距离小于第二距离阈值时,终端确定虚拟对象移动至指定位置时,自动拾取得到目标虚拟物品。
示例性的,如图8所示,用户可通过触控虚拟操作杆560控制虚拟对象550由初始位置A 0移动至指定位置O处,自动拾取得到第一虚拟物品541以及第三 虚拟物品543。
步骤910,检测目标虚拟物品是否包括恢复物品。
在虚拟对象拾取得到目标虚拟物品后,终端检测虚拟对象拾取的目标虚拟物品中,是否包括恢复物品。若目标虚拟物品中包括恢复物品,进入步骤911,若目标虚拟物品中不包括恢复物品,进入步骤912。
步骤911,根据恢复物品对应的恢复值增加虚拟对象的体力值。
当目标虚拟物品中包括恢复物品时,终端根据恢复物品对应的恢复值增加虚拟对象的体力值。
示例性的,若目标虚拟物品中包括恢复物品1和恢复物品2,其中,恢复物品1对应的恢复值为10,恢复物品2对应的恢复值为15,虚拟对象的体力值为40,虚拟对象的体力值上限为100。终端根据恢复物品1的恢复值将虚拟对象的体力值由40增加到50,根据恢复物品2的恢复值将虚拟对象的体力值由50增加到65。需要说明的是,终端根据不同恢复值的恢复物品增加虚拟对象的体力值的顺序不加限定。
步骤912,检测目标虚拟物品中是否包括升级物品。
终端检测虚拟对象获取得到的目标虚拟物品中是否包括升级物品,若目标虚拟物品中包括升级物品,进入步骤913,若目标虚拟物品中不包括恢复物品,则停止步骤。
步骤913,根据升级物品对应的经验值增加虚拟对象的经验值。
当目标虚拟物品中包括升级物品时,终端根据升级物品对应的经验值增加虚拟对象的经验值,当虚拟对象经验值增加到超过当前等级的经验值上限后,虚拟对象根据增加后的经验值升级到对应的等级。
示例性的,若目标虚拟物品中包括升级物品1和升级物品2,其中,升级物品1对应的经验值为100,升级物品2对应的经验值为150,虚拟对象的经验值为500,虚拟对象的等级为1,等级1对应的经验值上限为600。终端根据升级物品1的经验值将虚拟对象的经验值由500增加到600,根据升级物品2的经验值将虚拟对象的经验值由600增加到750,由于750的经验值超过了等级1的经验值上限,因此虚拟对象的等级由等级1上升到等级2。需要说明的是,终端根据不同经验值的升级物品增加虚拟对象的经验值的顺序不加限定。
需要说明的是,终端可以先执行步骤911和步骤912,再执行步骤913和步骤914,或,终端可以先执行步骤913和步骤914,再执行步骤911和步骤912,在此不做限定。
综上所述,本申请实施例中,通过根据在用户界面的交互操作触发的第一指令获取该交互操作在用户界面形成的操作轨迹,当操作轨迹形成为封闭区域时,将位于封闭区域内的虚拟环境中的虚拟物品聚集在指定位置,当虚拟对象移动至该指定位置附近时,拾取聚集在该指定位置的虚拟物品,由于不需要控制虚拟对象移动至每一个虚拟物品所在的位置进行拾取,因此提高了虚拟物品的拾取效率,从而提高了人机交互的效率。
可选的,本申请实施例中,通过将操作轨迹形成的封闭区域内的虚拟物品作为候选虚拟物品,确定候选虚拟物品中不需要拾取的虚拟物品,将候选虚拟物品中除去不需要拾取的虚拟物品的其它候选虚拟物品作为目标虚拟物品,避免了终端将不需要拾取的候选虚拟物品聚集在指定位置,使虚拟对象移动至指定位置时对不需要拾取的虚拟物品进行拾取,从而提高了虚拟物品拾取的效率,进而提高了人机交互的效率。
可选的,本申请实施例中,通过将目标虚拟物品聚集在指定位置后,以聚集图标的形式在指定位置显示至少两个目标虚拟物品,从而减少了终端同屏显示的图像个数,降低了终端的资源占用,从而在一定程度上提高了应用程序的运行流畅度。
可选的,本申请实施例中,通过将目标虚拟物品聚集在虚拟环境中的指定位置之后,将至少两个目标虚拟物品中,物品类型相同的目标虚拟物品显示为一个图标,从而在某些情况下能够减少终端同屏显示的图像个数,降低了终端的资源占用,从而在一定程度上提高了应用程序的运行流畅度。
应该理解的是,虽然图4、9的流程图中的各个步骤按照箭头的指示依次显示,但是这些步骤并不是必然按照箭头指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,这些步骤可以以其它的顺序执行。而且,图4、9中的至少一部分步骤可以包括多个子步骤或者多个阶段,这些子步骤或者阶段并不必然是在同一时刻执行完成,而是可以在不同的时刻执行,这些子步骤或者阶段的执行顺序也不必然是依次进行,而是可以与其它 步骤或者其它步骤的子步骤或者阶段的至少一部分轮流或者交底地执行。
本申请实施例中,终端可以通过硬件层面和程序层面结合获取触控滑动操作的操作轨迹,原理如下:
一、硬件层面:
如图12所示,触摸显示屏幕检测滑动操作的原理为:终端510的触摸显示屏520镀有驱动电极1210和接收电极1220,驱动电极1210和接收电极1220之间由驱动缓冲器提供驱动脉冲,形成低压交流电场。当手指接触触控显示屏520时,由于人体导电,手指与触控显示屏520的电介质层1230之间形成一个藕合电容,驱动电极1210和接收电极1220发出的电流流向手指接触触控显示屏520的触点,触控显示屏520内层和外层经过中间的金属氧化物之间产生触发信号,终端的中央处理器通过该触发信号得到手指滑动操作的操作轨迹。
如图13所示,手指与触控屏幕形成一个接触区域,终端通过接触区域获取得到触控坐标的过程为:步骤1301,终端获取接触区域的原始信号,该原始信号为接触区域的原始触控信号,其包括干扰信号;步骤1302,终端对干扰信号进行滤波,得到滤波后的触控信号;步骤1303,计算滤波后的触控信号的压力点,得到触控信号的压力分布;步骤1304,根据触控信号的压力分布建立触控区域;步骤1305,根据触控区域,获取得到触控坐标,根据触控坐标,即可确定操作轨迹的坐标,进而确定封闭区域的边缘坐标以及封闭区域的边缘坐标包围的像素点坐标。
二、程序层面:
当上述硬件层面检测到用户触控时,终端的操作系统中会触发触控事件。其中,终端的操作系统中的触控事件(touch)会在用户手指放在屏幕的时候、在屏幕上滑动的时候或者是从屏幕上移开的时候触发。触控事件可以有如下几种:
touchstart事件(触控开始事件):当手指开始触控屏幕时触发,即使在已经有一个手指放在屏幕上的情况下,当有其它手指触控屏幕时,也会触发该事件。
touchmove事件(触控移动事件):当手指在屏幕上滑动的时候连续触发。在该事件发生期间,调用preventDefault()事件可以阻止滚动。
touchend事件(触控结束事件):当手指从屏幕上离开的时候触发。
touchcancel事件(触控取消事件):当系统停止跟踪触控的时候触发。
终端中的应用程序可以通过上述程序层面获得的触控事件获取在路径绘制界面中执行的绘制操作的操作轨迹。比如,请参考图14,其示出了本申请实施例涉及的一种根据触控事件确定操作轨迹的示意图。如图14所示,终端可以根据触控开始事件、触控结束事件以及触控开始事件和触控结束事件之间的触控移动事件各自对应的坐标获取到在路径绘制界面中执行的绘制操作的操作轨迹。
在一个示例性的例子中,本申请实施例中的应用程序是三维多人枪战类生存游戏,如图15所示,终端1510的显示屏1520上显示有该三维多人枪战类生存游戏的用户界面1530,该用户界面1530中显示有虚拟环境1540、终端1510对应的虚拟对象1550、虚拟对象1550对应的第一武器栏1531和第二武器栏1532、虚拟操作杆1560以及功能控件1570。其中,虚拟环境1540中包括第一虚拟物品1541(步枪子弹)、第二虚拟物品1542(第一步枪),第一武器栏1531和第二武器栏1532用于显示虚拟对象1550携带的武器,第一武器栏1531中显示的第二步枪1543即为虚拟对象1550携带的步枪。虚拟操作杆1560以及功能控件1570的作用可参考上述实施例。
终端1510在接收到在用户界面1530的非间断操作触发的第一指令后,获取该交互操作形成的操作轨迹1580,根据该操作轨迹的起始区域1581和结束区域1582确定该操作轨迹1580形成封闭区域1583后,将位于封闭区域1583内的步枪子弹1541和第一步枪1542聚集在封闭区域1583的中心位置,当虚拟对象1550移动至封闭区域1583的中心位置,终端1510控制虚拟对象1550拾取得到步枪子弹1541和第一步枪1542后,第二武器栏1532中显示第一步枪1542的图像,步枪子弹图标对应的数字由拾取前的50更新为拾取后的120。
在一个示例性的例子中,本申请实施例中的应用程序是二维多人对战游戏,如图16所示,终端1610的显示屏1620上显示有该二维多人对战游戏的用户界面1630,该用户界面1630中显示有虚拟环境1640、终端1610对应的虚拟人物 1650、虚拟人物1650对应的武器栏1631以及体力值显示栏1632、虚拟操作杆1660以及功能控件1670。其中,虚拟环境1640中包括第一虚拟物品1641(即恢复物品)、第二虚拟物品1642(宝剑),武器栏1631用于显示虚拟人物1650携带的武器,体力值显示栏1632用于显示虚拟人物1650的体力值,其中深色的表示虚拟人物1650的当前体力值,空白部分表示当前体力值与体力值上限的差值。虚拟操作杆1660以及功能控件1670的作用可参考上述实施例。
终端1610在接收到在用户界面1630的非间断操作触发的第一指令后,获取该交互操作形成的操作轨迹1680,根据该操作轨迹的起始区域1681和结束区域1682确定该操作轨迹1680形成封闭区域1683后,将位于封闭区域1683内的恢复物品1641和宝剑1642聚集在封闭区域1683的中心位置,当虚拟人物1650移动至封闭区域1683的中心位置,终端1610控制虚拟人物1650拾取得到恢复物品1641和宝剑1642后,武器栏1631中显示宝剑1642的图像,体力值显示栏1632中的当前体力值获得了增加。
请参考图17,其示出了本申请一个示例性实施例提供的在虚拟环境中对虚拟物品进行拾取的装置的结构框图。该装置可以通过软件、硬件或者两者的结合实现成为图2实施例中的第一终端220或第二终端260。该装置包括显示模块1710、获取模块1720、处理模块1730以及接收模块1740。
显示模块1710,用于显示用户界面,该用户界面中显示有虚拟环境和位于虚拟环境中的虚拟对象。
获取模块1720,用于根据对用户界面的交互操作触发的第一指令,获取该交互操作在用户界面形成的操作轨迹。当操作轨迹形成封闭区域时,获取虚拟环境中位于封闭区域内的至少两个目标虚拟物品。
处理模块1730,用于将目标虚拟物品聚集在虚拟环境中的指定位置;当虚拟对象移动至指定位置时,控制虚拟对象拾取得到目标虚拟物品。
在一个可选的实施例中,虚拟物品具有可拾取区域;
获取模块1720,还用于将虚拟物品中,可拾取区域与封闭区域具有交集的虚拟物品作为候选虚拟物品;或,将虚拟物品中,可拾取区域的中心位置在封闭区域内的虚拟物品作为候选虚拟物品;将候选虚拟物品中属于需要拾取的虚 拟物品作为目标虚拟物品。
在一个可选的实施例中,虚拟物品具有物品类型;
获取模块1720,还用于确定候选虚拟物品中是否包括不需要拾取的虚拟物品,不需要拾取的虚拟物品是指物品类型是应用程序中设置的不需要拾取的物品类型的虚拟物品;当候选虚拟物品中包括不需要拾取的虚拟物品时,将候选虚拟物品中除去不需要拾取的虚拟物品的其它候选虚拟物品作为目标虚拟物品;或,当候选虚拟物品中不包括不需要拾取的虚拟物品时,将候选虚拟物品作为目标虚拟物品。
在一个可选的实施例中,显示模块1710,还用于显示物品设置页面。
接收模块1740,用于接收在物品设置页面触发的物品类型确定信号。
处理模块1730,还用于根据物品类型确定信号,确定不需要拾取的物品类型。
在一个可选的实施例中,虚拟物品对应有物品数量;
获取模块1720,还用于确定候选虚拟物品中,是否具有物品数量超过数量阈值的虚拟物品;当候选虚拟物品中包括物品数量超过数量阈值的虚拟物品时,将物品数量超过数量阈值的虚拟物品作为超额物品,将候选虚拟物品中除去超额物品的其它候选虚拟物品作为目标虚拟物品;或,当候选虚拟物品中不包括超额物品时,将候选虚拟物品作为目标虚拟物品。
在一个可选的实施例中,接收模块1740,还用于接收在封闭区域内的被触控位置的触控操作所触发的第二指令。
处理模块1730,还用于根据第二指令,将该被触控位置作为指定位置。
在一个可选的实施例中,获取模块1720,还用于获取封闭区域的中心位置。
处理模块1730,还用于将封闭区域的中心位置作为指定位置。
在一个可选的实施例中,处理模块1730,还用于当目标虚拟物品中包括恢复物品时,根据恢复物品对应的恢复值增加虚拟对象的体力值。
在一个可选的实施例中,处理模块1730,还用于当目标虚拟物品中包括升级物品时,根据升级物品对应的经验值增加虚拟对象的经验值。
在一个可选的实施例中,显示模块1710,还用于将目标虚拟物品聚集在虚拟环境中的指定位置之后,以聚集图标的形式在指定位置显示至少两个目标虚 拟物品。
在一个可选的实施例中,显示模块1710,还用于将目标虚拟物品聚集在虚拟环境中的指定位置之后,将至少两个目标虚拟物品中,物品类型相同的目标虚拟物品显示为一个图标。
在一个可选的实施例中,交互操作是在终端的触控显示屏上的滑动操作,该滑动操作的操作轨迹包括起始区域以及结束区域,起始区域是滑动操作中的手指按下操作在终端的显示屏上形成的区域,结束区域是滑动操作中的手指抬起操作在终端的显示屏上形成的区域;
处理模块1730,还用于根据起始区域以及结束区域确定操作轨迹是否形成封闭区域。
在一个可选的实施例中,处理模块1730,还用于获取起始区域的中心位置和结束区域的中心位置之间的第一距离;当第一距离小于第一距离阈值时,确定操作轨迹形成封闭区域。
在一个可选的实施例中,处理模块1730,还用于获取起始区域和结束区域之间的交集的面积;当交集的面积占比超过面积占比阈值时,确定操作轨迹形成封闭区域,面积占比是交集的面积与和面积的比值,和面积是起始区域的面积和结束区域的面积的和。
在一个可选的实施例中,处理模块1730,还用于获取虚拟对象和指定位置之间的第二距离;当第二距离小于第二距离阈值时,自动拾取得到目标虚拟物品。
请参考图18,其示出了本申请一个示例性实施例提供的终端1800的结构框图。该终端1800可以是便携式移动终端,比如:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器。终端1800还可能被称为用户设备、便携式终端等其他名称。
通常,终端1800包括有:处理器1801和存储器1802。
处理器1801可以包括一个或多个处理核心,比如4核心处理器、8核心处 理器等。处理器1801可以采用DSP(Digital Signal Processing,数字信号处理)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)、PLA(Programmable Logic Array,可编程逻辑阵列)中的至少一种硬件形式来实现。处理器1801也可以包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称CPU(Central Processing Unit,中央处理器);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器1801可以在集成有GPU(Graphics Processing Unit,图像处理器),GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中,处理器1801还可以包括AI(Artificial Intelligence,人工智能)处理器,该AI处理器用于处理有关机器学习的计算操作。
存储器1802可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是有形的和非暂态的。存储器1802还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1802中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1801所执行以实现本申请中提供的在虚拟环境中对虚拟物品进行拾取的方法。
在一些实施例中,终端1800还可选包括有:外围设备接口1803和至少一个外围设备。具体地,外围设备包括:射频电路1804、触摸显示屏1805、摄像头1806、音频电路1807、定位组件1808和电源1809中的至少一种。
在一些实施例中,终端1800还包括有一个或多个传感器1810。该一个或多个传感器1810包括但不限于:加速度传感器1811、陀螺仪传感器1812、压力传感器1813、指纹传感器1814、光学传感器1815以及接近传感器1816。
本领域技术人员可以理解,图18中示出的结构并不构成对终端1800的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
本申请还提供一种计算机可读存储介质,所述存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现上述方法实施例提供 的在虚拟环境中对虚拟物品进行拾取的方法。
可选地,本申请还提供了一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述各方面所述的在虚拟环境中对虚拟物品进行拾取的方法。
应当理解的是,在本文中提及的“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上实施例的各技术特征可以进行任意的组合,为使描述简洁,未对上述实施例中的各个技术特征所有可能的组合都进行描述,然而,只要这些技术特征的组合不存在矛盾,都应当认为是本说明书记载的范围。
以上实施例仅表达了本申请的几种实施方式,其描述较为具体和详细,但并不能因此而理解为对发明专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本申请构思的前提下,还可以做出若干变形和改进,这些都属于本申请的保护范围。因此,本申请专利的保护范围应以所附权利要求为准。

Claims (20)

  1. 一种在虚拟环境中对虚拟物品进行拾取的方法,由终端执行,其特征在于,所述方法包括:
    显示用户界面,所述用户界面中显示有虚拟环境和位于所述虚拟环境中的虚拟对象;
    根据对所述用户界面的交互操作触发的第一指令,获取所述交互操作在所述用户界面形成的操作轨迹;
    当所述操作轨迹形成封闭区域时,获取所述虚拟环境中位于所述封闭区域内的至少两个目标虚拟物品;
    将所述目标虚拟物品聚集在所述虚拟环境中的指定位置;
    当所述虚拟对象移动至所述指定位置时,控制所述虚拟对象拾取得到所述目标虚拟物品。
  2. 根据权利要求1所述的方法,其特征在于,所述虚拟物品对应有可拾取区域;所述获取所述虚拟环境中位于所述封闭区域内的至少两个目标虚拟物品,包括:
    在所述目标虚拟物品中,将位于所述可拾取区域与所述封闭区域之间交集区域内的虚拟物品作为候选虚拟物品;或,在所述目标虚拟物品中,将位于所述可拾取区域内、且可拾取区域中心位置在所述封闭区域的虚拟物品作为所述候选虚拟物品;
    将所述候选虚拟物品中属于需要拾取的虚拟物品作为所述目标虚拟物品。
  3. 根据权利要求2所述的方法,其特征在于,所述虚拟物品具有物品类型;所述将所述候选虚拟物品中属于需要拾取的虚拟物品作为所述目标虚拟物品,包括:
    当所述候选虚拟物品中包括不需要拾取的虚拟物品时,将所述候选虚拟物品中除所述不需要拾取的虚拟物品之外的其它候选虚拟物品作为所述目标虚拟物品。
  4. 根据权利要求3所述的方法,其特征在于,所述获取所述交互操作在所述用户界面形成的操作轨迹之前,还包括:
    显示物品设置页面;
    接收在所述物品设置页面触发的物品类型确定信号;
    根据所述物品类型确定信号,确定不需要拾取的物品类型。
  5. 根据权利要求2所述的方法,其特征在于,所述虚拟物品对应有物品数量;所述将所述候选虚拟物品中属于需要拾取的虚拟物品作为所述目标虚拟物品,包括:
    确定所述候选虚拟物品中,是否具有与所述虚拟对象所携带的物品相同、且物品数量超过数量阈值的虚拟物品;
    当所述候选虚拟物品中包括所述物品数量超过数量阈值的虚拟物品时,将所述物品数量超过所述数量阈值的虚拟物品作为超额物品,将所述候选虚拟物品中除去所述超额物品的其它候选虚拟物品作为所述目标虚拟物品。
  6. 根据权利要求1至5任一所述的方法,其特征在于,所述将所述目标虚拟物品聚集在所述虚拟环境中的指定位置,包括:
    接收在所述封闭区域内的被触控位置的触控操作所触发的第二指令;
    根据所述第二指令,将所述被触控位置作为所述指定位置。
  7. 根据权利要求1至5任一所述的方法,其特征在于,所述将所述目标虚拟物品聚集在所述虚拟环境中的指定位置,包括:
    获取所述封闭区域的中心位置;
    将所述封闭区域的中心位置作为所述指定位置。
  8. 根据权利要求1至5任一所述的方法,其特征在于,所述控制所述虚拟对象拾取得到所述目标虚拟物品之后,还包括:
    当所述目标虚拟物品中包括恢复物品时,根据所述恢复物品对应的恢复值增加所述虚拟对象的体力值。
  9. 根据权利要求1至5任一所述的方法,其特征在于,所述控制所述虚拟对象拾取得到所述目标虚拟物品之后,还包括:
    当所述目标虚拟物品中包括升级物品时,根据所述升级物品对应的经验值增加所述虚拟对象的经验值。
  10. 根据权利要求1至5任一所述的方法,其特征在于,所述将所述目标虚拟物品聚集在所述虚拟环境中的指定位置之后,还包括:
    以聚集图标的形式在所述指定位置显示所述至少两个目标虚拟物品。
  11. 根据权利要求1至5任一所述的方法,其特征在于,所述虚拟物品具有物品类型;所述将所述目标虚拟物品聚集在所述虚拟环境中的指定位置之后,还包括:
    将所述至少两个目标虚拟物品中属于相同物品类型的虚拟物品显示为一个图标。
  12. 根据权利要求1至11任一所述的方法,其特征在于,所述交互操作是在终端的触控显示屏上的滑动操作,所述操作轨迹包括起始区域以及结束区域,所述起始区域是所述滑动操作中的手指按下操作在所述终端的显示屏上形成的区域,所述结束区域是所述滑动操作中的手指抬起操作在所述终端的显示屏上形成的区域;
    所述当所述操作轨迹形成封闭区域时,获取所述虚拟环境中位于所述封闭区域内的至少两个目标虚拟物品之前,包括:
    根据所述起始区域以及所述结束区域确定所述操作轨迹是否形成所述封闭区域。
  13. 根据权利要求12所述的方法,其特征在于,所述根据所述起始区域以及所述结束区域确定所述操作轨迹是否形成所述封闭区域,包括:
    获取所述起始区域的中心位置和所述结束区域的中心位置之间的第一距离;
    当所述第一距离小于第一距离阈值时,确定所述操作轨迹形成所述封闭区域。
  14. 根据权利要求12所述的方法,其特征在于,所述根据所述起始区域以及所述结束区域确定所述操作轨迹是否形成所述封闭区域,包括:
    获取所述起始区域和所述结束区域之间的交集的面积;
    当所述交集的面积占比超过面积占比阈值时,确定所述操作轨迹形成所述封闭区域,所述面积占比是所述交集的面积与和面积的比值,所述和面积是所述起始区域的面积和所述结束区域的面积的和。
  15. 根据权利要求1至11任一所述的方法,其特征在于,所述当所述虚拟对象移动至所述指定位置时,控制所述虚拟对象拾取得到所述目标虚拟物品,包括:
    获取所述虚拟对象和所述指定位置之间的第二距离;
    当所述第二距离小于所述第二距离阈值时,自动拾取得到所述目标虚拟物品。
  16. 一种在虚拟环境中对虚拟物品进行拾取的装置,其特征在于,所述装置包括:
    显示模块,用于显示用户界面,所述用户界面中显示有所述虚拟环境和位于所述虚拟环境中的虚拟对象;
    获取模块,用于根据对所述用户界面的交互操作触发的第一指令,获取所述交互操作在所述用户界面形成的操作轨迹;当所述操作轨迹形成封闭区域时,获取所述虚拟环境中位于所述封闭区域内的至少两个目标虚拟物品;
    处理模块,用于将所述目标虚拟物品聚集在所述虚拟环境中的指定位置;当所述虚拟对象移动至所述指定位置时,控制所述虚拟对象拾取得到所述目标虚拟物品。
  17. 一种终端,其特征在于,所述终端包括处理器和存储器,所述存储器中存储有至少一条指令,所述指令由所述处理器加载并执行以实现以下步骤:
    显示用户界面,所述用户界面中显示有虚拟环境和位于所述虚拟环境中的虚拟对象;
    根据对所述用户界面的交互操作触发的第一指令,获取所述交互操作在所述用户界面形成的操作轨迹;
    当所述操作轨迹形成封闭区域时,获取所述虚拟环境中位于所述封闭区域内的至少两个目标虚拟物品;
    将所述目标虚拟物品聚集在所述虚拟环境中的指定位置;
    当所述虚拟对象移动至所述指定位置时,控制所述虚拟对象拾取得到所述目标虚拟物品。
  18. 如权利要求17所述的终端,其特征在于,所述指令由所述处理器加载并执行以实现以下步骤:
    在所述虚拟物品中,将位于所述可拾取区域与所述封闭区域之间交集区域 内的虚拟物品作为候选虚拟物品;或,在所述虚拟物品中,将位于所述可拾取区域内、且可拾取区域中心位置在所述封闭区域的虚拟物品作为所述候选虚拟物品;
    将所述候选虚拟物品中属于需要拾取的虚拟物品作为所述目标虚拟物品。
  19. 一种计算机可读存储介质,其特征在于,所述存储介质中存储有至少一条指令,所述指令由处理器加载并执行以实现以下步骤:
    显示用户界面,所述用户界面中显示有虚拟环境和位于所述虚拟环境中的虚拟对象;
    根据对所述用户界面的交互操作触发的第一指令,获取所述交互操作在所述用户界面形成的操作轨迹;
    当所述操作轨迹形成封闭区域时,获取所述虚拟环境中位于所述封闭区域内的至少两个目标虚拟物品;
    将所述目标虚拟物品聚集在所述虚拟环境中的指定位置;
    当所述虚拟对象移动至所述指定位置时,控制所述虚拟对象拾取得到所述目标虚拟物品。
  20. 如权利要求19所述的计算机可读存储介质,其特征在于,所述指令由所述处理器加载并执行以实现以下步骤:
    在所述虚拟物品中,将位于所述可拾取区域与所述封闭区域之间交集区域内的虚拟物品作为候选虚拟物品;或,在所述虚拟物品中,将位于所述可拾取区域内、且可拾取区域中心位置在所述封闭区域的虚拟物品作为所述候选虚拟物品;
    将所述候选虚拟物品中属于需要拾取的虚拟物品作为所述目标虚拟物品。
PCT/CN2019/094208 2018-08-31 2019-07-01 在虚拟环境中对虚拟物品进行拾取的方法、装置、终端和计算机可读存储介质 WO2020042746A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/006,358 US20200393953A1 (en) 2018-08-31 2020-08-28 Method and apparatus, computer device, and storage medium for picking up a virtual item in a virtual environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811014962.7A CN109126129B (zh) 2018-08-31 2018-08-31 在虚拟环境中对虚拟物品进行拾取的方法、装置及终端
CN201811014962.7 2018-08-31

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/006,358 Continuation US20200393953A1 (en) 2018-08-31 2020-08-28 Method and apparatus, computer device, and storage medium for picking up a virtual item in a virtual environment

Publications (1)

Publication Number Publication Date
WO2020042746A1 true WO2020042746A1 (zh) 2020-03-05

Family

ID=64825993

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/094208 WO2020042746A1 (zh) 2018-08-31 2019-07-01 在虚拟环境中对虚拟物品进行拾取的方法、装置、终端和计算机可读存储介质

Country Status (3)

Country Link
US (1) US20200393953A1 (zh)
CN (1) CN109126129B (zh)
WO (1) WO2020042746A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113499586A (zh) * 2021-07-08 2021-10-15 网易(杭州)网络有限公司 一种游戏中的信息提示方法、装置、电子设备及存储介质
CN113537443A (zh) * 2021-09-17 2021-10-22 深圳市信润富联数字科技有限公司 虚拟角色养成方法、装置、设备及可读存储介质
CN113813599A (zh) * 2021-08-27 2021-12-21 腾讯科技(深圳)有限公司 虚拟角色的控制方法和装置、存储介质及电子设备
CN114210057A (zh) * 2021-11-02 2022-03-22 腾讯科技(深圳)有限公司 虚拟道具的拾取处理方法、装置、设备、介质及程序产品

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110420453B (zh) * 2019-01-14 2023-07-14 网易(杭州)网络有限公司 虚拟对象运动控制方法及装置、存储介质、电子设备
CN110270098B (zh) * 2019-06-21 2023-06-23 腾讯科技(深圳)有限公司 控制虚拟对象对虚拟物品进行标记的方法、装置及介质
CN110507993B (zh) * 2019-08-23 2020-12-11 腾讯科技(深圳)有限公司 控制虚拟对象的方法、装置、设备及介质
CN111151001B (zh) * 2019-12-26 2023-04-25 网易(杭州)网络有限公司 虚拟物品的处理方法、装置、存储介质和电子装置
CN111167124A (zh) * 2019-12-31 2020-05-19 腾讯科技(深圳)有限公司 虚拟道具获取方法、装置、存储介质及电子装置
JP7185670B2 (ja) * 2020-09-02 2022-12-07 株式会社スクウェア・エニックス ビデオゲーム処理プログラム、及びビデオゲーム処理システム
CN112190922A (zh) * 2020-10-22 2021-01-08 网易(杭州)网络有限公司 虚拟物品的处理方法、装置、存储介质及电子装置
US11327630B1 (en) * 2021-02-04 2022-05-10 Huawei Technologies Co., Ltd. Devices, methods, systems, and media for selecting virtual objects for extended reality interaction
CN113262475A (zh) * 2021-06-07 2021-08-17 网易(杭州)网络有限公司 游戏中的虚拟道具使用方法、装置、设备及存储介质
CN113413599A (zh) * 2021-07-01 2021-09-21 网易(杭州)网络有限公司 一种虚拟物品管理方法、装置、终端及存储介质
CN113546425B (zh) * 2021-07-27 2024-03-01 网易(杭州)网络有限公司 游戏中虚拟物品处理方法、装置、终端和存储介质
CN113694513A (zh) * 2021-08-16 2021-11-26 网易(杭州)网络有限公司 游戏中拾取物品的方法、装置、存储介质及电子设备
CN113384901B (zh) * 2021-08-16 2022-01-18 北京蔚领时代科技有限公司 交互程序实例处理方法、装置、计算机设备及存储介质
CN113986099B (zh) * 2021-10-22 2023-08-18 网易(杭州)网络有限公司 虚拟物品的交互方法、装置、计算机可读介质及电子设备
CN113975798B (zh) * 2021-11-09 2023-07-04 北京字跳网络技术有限公司 一种交互控制方法、装置以及计算机存储介质
CN114201499B (zh) * 2022-02-18 2022-05-24 树根互联股份有限公司 一种数据上传方法、装置及计算机设备

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160001184A1 (en) * 2014-07-04 2016-01-07 Trendy Entertainment Multi-platform overlay and library system and methods
CN106527887A (zh) * 2016-10-18 2017-03-22 腾讯科技(深圳)有限公司 虚拟物体选取方法、装置及vr系统
CN107837531A (zh) * 2017-09-28 2018-03-27 网易(杭州)网络有限公司 信息处理方法、装置、电子设备及存储介质

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6357023B2 (ja) * 2014-06-06 2018-07-11 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理装置の制御方法および情報処理システム
CN106201235A (zh) * 2015-04-29 2016-12-07 宇龙计算机通信科技(深圳)有限公司 对象选择的方法、装置及终端
US10052561B2 (en) * 2016-03-21 2018-08-21 Roblox Corporation Tracking and recommendation system for online gaming
CN108159696B (zh) * 2017-12-19 2021-12-28 网易(杭州)网络有限公司 信息处理方法、装置、电子设备及存储介质
CN108459811B (zh) * 2018-01-09 2021-03-16 网易(杭州)网络有限公司 虚拟道具的处理方法、装置、电子设备及存储介质

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160001184A1 (en) * 2014-07-04 2016-01-07 Trendy Entertainment Multi-platform overlay and library system and methods
CN106527887A (zh) * 2016-10-18 2017-03-22 腾讯科技(深圳)有限公司 虚拟物体选取方法、装置及vr系统
CN107837531A (zh) * 2017-09-28 2018-03-27 网易(杭州)网络有限公司 信息处理方法、装置、电子设备及存储介质

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113499586A (zh) * 2021-07-08 2021-10-15 网易(杭州)网络有限公司 一种游戏中的信息提示方法、装置、电子设备及存储介质
CN113499586B (zh) * 2021-07-08 2024-04-12 网易(杭州)网络有限公司 一种游戏中的信息提示方法、装置、电子设备及存储介质
CN113813599A (zh) * 2021-08-27 2021-12-21 腾讯科技(深圳)有限公司 虚拟角色的控制方法和装置、存储介质及电子设备
CN113813599B (zh) * 2021-08-27 2023-07-14 腾讯科技(深圳)有限公司 虚拟角色的控制方法和装置、存储介质及电子设备
CN113537443A (zh) * 2021-09-17 2021-10-22 深圳市信润富联数字科技有限公司 虚拟角色养成方法、装置、设备及可读存储介质
CN113537443B (zh) * 2021-09-17 2022-01-07 深圳市信润富联数字科技有限公司 虚拟角色养成方法、装置、设备及可读存储介质
CN114210057A (zh) * 2021-11-02 2022-03-22 腾讯科技(深圳)有限公司 虚拟道具的拾取处理方法、装置、设备、介质及程序产品
CN114210057B (zh) * 2021-11-02 2023-07-25 腾讯科技(深圳)有限公司 虚拟道具的拾取处理方法、装置、设备、介质及程序产品

Also Published As

Publication number Publication date
CN109126129A (zh) 2019-01-04
CN109126129B (zh) 2022-03-08
US20200393953A1 (en) 2020-12-17

Similar Documents

Publication Publication Date Title
WO2020042746A1 (zh) 在虚拟环境中对虚拟物品进行拾取的方法、装置、终端和计算机可读存储介质
US20220047941A1 (en) Virtual object control method and apparatus, device, and storage medium
WO2021218516A1 (zh) 虚拟对象控制方法、装置、设备及存储介质
US11577171B2 (en) Method and apparatus for prompting that virtual object is attacked, terminal, and storage medium
JP2022533051A (ja) 仮想オブジェクトの制御方法、装置、デバイス及びコンピュータプログラム
US9764226B2 (en) Providing enhanced game mechanics
WO2022247592A1 (zh) 虚拟道具的切换方法、装置、终端及存储介质
US9004997B1 (en) Providing enhanced game mechanics
WO2022037529A1 (zh) 虚拟对象的控制方法、装置、终端及存储介质
WO2023138192A1 (zh) 控制虚拟对象拾取虚拟道具的方法、终端及存储介质
WO2023066003A1 (zh) 虚拟对象的控制方法、装置、终端、存储介质及程序产品
CN113546417A (zh) 一种信息处理方法、装置、电子设备和存储介质
JP2024519880A (ja) 仮想環境画面の表示方法、装置、端末及びコンピュータプログラム
TW202224740A (zh) 虛擬對象互動模式的選擇方法、裝置、設備、媒體及產品
JP7384521B2 (ja) 仮想オブジェクトの制御方法、装置、コンピュータ機器及びコンピュータプログラム
CN114307129A (zh) 游戏交互控制方法、装置、设备及介质
JP7419400B2 (ja) 仮想オブジェクトの制御方法、装置、端末及びコンピュータプログラム
CN114307130A (zh) 游戏交互控制方法、装置、设备及介质
CN115970282A (zh) 虚拟镜头的控制方法、装置、存储介质及计算机设备
CN116943148A (zh) 游戏场景中发送信号的方法、装置、存储介质及电子装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19853939

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19853939

Country of ref document: EP

Kind code of ref document: A1