CN107765986B - Information processing method and device of game system


Publication number
CN107765986B
Authority
CN
China
Prior art keywords
component
moving
virtual lens
rotation angle
determining
Prior art date
Legal status
Active
Application number
CN201711057750.2A
Other languages
Chinese (zh)
Other versions
CN107765986A
Inventor
尹骏
李光源
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201711057750.2A
Publication of CN107765986A
Application granted
Publication of CN107765986B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/837Shooting of targets

Abstract

The invention provides an information processing method and device for a game system. The method comprises the following steps: providing a view angle control area on a graphical user interface and detecting a touch operation acting on the view angle control area; acquiring the moving distance, moving speed and moving direction of the contact corresponding to the touch operation; determining the rotation angle of the virtual lens according to the moving distance and the moving speed, and determining the rotation direction of the virtual lens according to the moving direction; and rotating the virtual lens according to the rotation angle and the rotation direction to adjust the game scene image displayed on the graphical user interface. Because the rotation angle of the lens is determined from both the moving distance and the moving speed, a slowly moving contact rotates the virtual lens by a small angle, while a quickly moving contact rotates it by a large angle without having to travel a large distance. The rotation of the virtual lens can therefore be controlled more accurately, and the operation is more tolerant of error.

Description

Information processing method and device of game system
Technical Field
The embodiment of the invention relates to the technical field of games, in particular to an information processing method and device of a game system.
Background
First-person shooter (FPS) games are shooting games played from the subjective perspective of the player. Instead of manipulating a virtual character shown on the screen as in other game genres, the player directly experiences the visual impact of the game, which greatly enhances the sense of initiative and realism. When an FPS runs on a handheld device whose only input surface is a touch screen, every operation must be performed on that touch screen, so how to operate an FPS on a touch screen is one of the main difficulties.
One current approach is the following. Referring to fig. 1, the screen of the handheld device is divided into a left half and a right half. The left half is used to control movement in the game: the player presses and moves a finger on the left half of the screen. The right half is responsible for lens rotation and shooting: the player presses and moves a finger on the right half to control the movement of the lens, and taps a designated area (e.g., the shooting icon in fig. 1) to perform a shooting operation. When the lens is controlled in this way, its rotation angle is directly proportional to the distance the player's finger moves on the right half of the screen, i.e., the lens is controlled by the formula: rotation angle of the lens = constant factor × moving distance of the finger on the screen.
Frequently moving the lens is one of the defining features of an FPS, and lens rotation is closely tied to the various shooting operations. Suppose, however, that the player is suddenly shot in the scene but cannot see the shooter within the current view. The player then needs to swing the lens quickly and over a wide range to find the potential shooter, which calls for a large constant factor. Conversely, when the player is aiming, the lens must be rotated slowly and finely to line up on the enemy and complete the shot, which calls for a small constant factor. The scheme above cannot satisfy both requirements at the same time, so the accuracy with which the rotation angle of the lens can be controlled is low.
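Purely as an illustration of the prior-art mapping discussed above (not part of the original disclosure), the following Python sketch shows why a single constant factor cannot serve both needs; the constant and the function name are assumptions introduced here:

    # Prior-art mapping: rotation angle depends only on finger travel distance.
    CONSTANT_FACTOR = 1.5  # degrees of rotation per unit of finger travel (assumed value)

    def prior_art_rotation_angle(finger_move_distance):
        # A large factor gives fast wide turns but coarse aiming;
        # a small factor gives fine aiming but slow wide turns.
        return CONSTANT_FACTOR * finger_move_distance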
Disclosure of Invention
The embodiment of the invention provides an information processing method and device for a game system, so that the rotation angle of a virtual lens can be controlled more accurately and the operation is more tolerant of error.
In a first aspect, an embodiment of the present invention provides an information processing method for a game system, which is applied to a touch terminal capable of presenting a graphical user interface, where content displayed on the graphical user interface includes a game scene image captured through a virtual lens, and the method includes:
providing a visual angle control area on the graphical user interface, and detecting touch operation acting on the visual angle control area;
acquiring the moving distance, the moving speed and the moving direction of a contact corresponding to the touch operation;
determining the rotation angle of the virtual lens according to the moving distance and the moving speed, and determining the rotation direction of the virtual lens according to the moving direction;
and according to the rotating angle and the rotating direction of the virtual lens, rotating the virtual lens to adjust the game scene image displayed on the graphical user interface.
In a second aspect, an embodiment of the present invention provides an information processing apparatus of a game system, applied to a touch terminal capable of presenting a graphical user interface, where content displayed on the graphical user interface includes a game scene image captured through a virtual lens, the apparatus including:
the detection module is used for providing a visual angle control area on the graphical user interface and detecting touch operation acting on the visual angle control area;
the acquisition module is used for acquiring the moving distance, the moving speed and the moving direction of the contact corresponding to the touch operation;
the determining module is used for determining the rotation angle of the virtual lens according to the moving distance and the moving speed and determining the rotation direction of the virtual lens according to the moving direction;
and the processing module is used for rotating the virtual lens to adjust the game scene image displayed on the graphical user interface according to the rotating angle and the rotating direction of the virtual lens.
In a third aspect, an embodiment of the present invention provides a touch terminal, including: a graphical user interface, a memory, and a processor; the content displayed by the graphical user interface comprises a game scene image captured through a virtual lens;
a memory for storing program instructions;
the processor is configured to implement the scheme provided by the embodiment of the present invention in the first aspect when the program instructions are executed.
In a fourth aspect, an embodiment of the present invention provides a storage medium, including: a readable storage medium and a computer program for implementing the solution according to the embodiment of the present invention of the first aspect.
In a fifth aspect, an embodiment of the present invention provides a program product, where the program product includes a computer program, where the computer program is stored in a readable storage medium, and at least one processor of a touch terminal can read the computer program from the readable storage medium, and execute the computer program by the at least one processor, so that the touch terminal implements the solution provided in the embodiment of the present invention in the first aspect.
The embodiment of the invention provides an information processing method and device for a game system: a touch operation acting on the graphical user interface is detected; the moving distance, moving speed and moving direction of the contact corresponding to the touch operation are acquired; the rotation angle of the virtual lens is determined according to the moving distance and the moving speed, and the rotation direction of the virtual lens is determined according to the moving direction; and the virtual lens is rotated according to the rotation angle and the rotation direction to adjust the game scene image displayed on the graphical user interface. Since the rotation angle of the virtual lens in this embodiment is determined from both the moving distance and the moving speed, when the user needs to fine-tune the virtual lens the contact is moved slowly and the virtual lens rotates by a small angle; when the user needs to move the virtual lens over a wide range, the contact is moved quickly, the virtual lens rotates by a large angle, and the contact does not have to travel a large distance. This solves the problem in the prior art that fine adjustment and wide-range adjustment of the rotation angle of the virtual lens cannot be satisfied at the same time; the rotation angle of the virtual lens is controlled more accurately in this embodiment, and the operation is more tolerant of error.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a diagram illustrating interaction of an FPS on a handheld device according to the prior art;
FIG. 2 is a flowchart of an information processing method of a game system according to an embodiment of the present invention;
FIG. 3 is a flowchart of an information processing method of a game system according to another embodiment of the present invention;
fig. 4 is a schematic view illustrating a rotation direction of a virtual lens according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the moving direction of a contact on a graphical user interface according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an information processing apparatus of a game system according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a touch terminal according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 2 is a flowchart of an information processing method of a game system according to an embodiment of the present invention, and as shown in fig. 2, the method of the embodiment may be applied to a touch terminal capable of presenting a graphical user interface, and the method of the embodiment may include:
s101, providing a visual angle control area on the graphical user interface, and detecting touch operation acting on the visual angle control area.
The method of this embodiment can be applied to a touch terminal capable of presenting a graphical user interface, such as a touch-screen computer, a smartphone, a tablet computer, or a game console. The graphical user interface is an important component of the touch terminal and is the interface through which the user interacts with it: the user can operate on the graphical user interface, for example to control a game running on the touch terminal, and the graphical user interface displays related content. The content displayed by the graphical user interface includes a game scene image captured through a virtual lens. The graphical user interface also provides a view angle control area, which is used to control the game view angle. The view angle control area belongs to the display area of the graphical user interface and may be a partial area or the whole area of the graphical user interface; this embodiment does not limit it.
In this embodiment, when the user wants to control the virtual lens so as to change the game scene image displayed on the graphical user interface, the user performs a touch operation on the view angle control area. The touch operation is used to trigger rotation control of the virtual lens, and it may be a finger touch operation, a click operation performed with a mouse, or a click operation performed with a stylus.
S102, acquiring the moving distance, the moving speed and the moving direction of the contact corresponding to the touch operation.
In this embodiment, a contact is formed in the view angle control area at the position where the user performs the touch operation. As the user carries out the touch operation, the contact moves within the view angle control area, and this embodiment detects that movement. Because the movement of the contact determines the rotation angle of the virtual lens, this embodiment acquires the moving distance, moving speed and moving direction of the contact within the view angle control area.
S103, determining the rotation angle of the virtual lens according to the moving distance and the moving speed, and determining the rotation direction of the virtual lens according to the moving direction.
In this embodiment, after the moving distance and the moving speed of the contact in the view angle control area are acquired, they reflect how the contact moved, and the rotation angle of the virtual lens is determined from them. In addition, after the moving direction of the contact in the view angle control area is acquired, the rotation direction of the virtual lens is determined from it, since the moving direction of the contact and the rotation direction of the virtual lens can satisfy a certain relationship. For example: if the moving direction of the contact is vertical, the rotation direction of the virtual lens is a vertical rotation direction; if the moving direction of the contact is horizontal, the rotation direction of the virtual lens is a horizontal rotation direction.
The rotation angle is proportional to the moving distance, i.e., the larger the moving distance is, the larger the rotation angle is, and the smaller the moving distance is, the smaller the rotation angle is.
The rotation angle is proportional to the moving speed, i.e., the larger the moving speed, the larger the rotation angle, and the smaller the moving speed, the smaller the rotation angle.
In some embodiments, one possible implementation of S103 is to determine the rotation angle according to the moving distance, the moving speed and a preset factor. The rotation angle, the moving distance, the moving speed and the preset factor can satisfy a certain functional relationship; for example, the rotation angle can be determined as the product of the moving distance, the moving speed and the preset factor.
For example: rotation angle = moving distance × moving speed × preset factor.
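As a minimal sketch of the relation just given (written in Python for this description; the function name and argument names are illustrative assumptions, not part of the original disclosure), the rotation angle grows with both the travel distance and the travel speed of the contact, so a slow drag yields a small angle while a fast drag of the same length yields a proportionally larger one:

    def rotation_angle(move_distance, move_speed, preset_factor):
        # Rotation angle is proportional to both the distance moved and the
        # speed of the movement, scaled by a preset factor.
        return move_distance * move_speed * preset_factor

    # A slow, short drag produces a small angle; a fast drag of the same
    # length produces a much larger angle (assumed example values).
    fine_tune = rotation_angle(0.05, 0.2, 100.0)
    fast_turn = rotation_angle(0.05, 2.0, 100.0)  # ten times larger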
S104, rotating the virtual lens according to the rotation angle and the rotation direction of the virtual lens to adjust the game scene image displayed on the graphical user interface.
In this embodiment, after the rotation angle and the rotation direction are determined, the virtual lens is controlled to rotate by the rotation angle determined in S103 and in the rotation direction determined in S103. Because rotating the virtual lens changes its viewing angle, and the game scene corresponding to that viewing angle changes accordingly, the game scene image displayed on the graphical user interface is adjusted; the adjusted image is the game scene image corresponding to the virtual lens after rotation. The user can judge from the game scene image displayed on the graphical user interface whether the virtual lens has been rotated to the desired view.
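A minimal sketch of step S104 (written in Python for this description; the camera object, its yaw/pitch fields and the sign convention are assumptions rather than part of the original disclosure): the computed angle components are applied to the virtual lens along the determined direction, and the scene is then redrawn from the new orientation.

    class VirtualLens:
        def __init__(self):
            self.yaw_deg = 0.0    # horizontal orientation of the lens
            self.pitch_deg = 0.0  # vertical orientation of the lens

        def rotate(self, horizontal_angle, vertical_angle,
                   horizontal_sign, vertical_sign):
            # horizontal_sign / vertical_sign encode the rotation direction
            # (+1 or -1) derived from the movement direction of the contact.
            self.yaw_deg += horizontal_sign * horizontal_angle
            self.pitch_deg += vertical_sign * vertical_angle
            # The game engine would then redraw the game scene image from the
            # updated lens orientation on the graphical user interface.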
In this embodiment, a touch operation acting on the view angle control area of the graphical user interface is detected; the moving distance, moving speed and moving direction of the corresponding contact are acquired; the rotation angle of the virtual lens is determined from the moving distance and the moving speed, and its rotation direction from the moving direction; and the virtual lens is rotated accordingly to adjust the game scene image displayed on the graphical user interface. Since the rotation angle of the virtual lens is determined from both the moving distance and the moving speed, when the user needs to fine-tune the virtual lens (i.e., fine-tune the viewing angle) the contact is moved slowly and the virtual lens rotates by a small angle; when the user needs to move the virtual lens over a wide range (i.e., adjust the viewing angle on a large scale), the contact is moved quickly and the virtual lens rotates by a large angle without the contact having to travel a large distance. This solves the problem in the prior art that fine and wide-range adjustment of the rotation angle of the virtual lens (i.e., adjustment of the game scene image) cannot both be satisfied; the rotation angle of the virtual lens is controlled more accurately in this embodiment, and the operation is more tolerant of error.
In some embodiments, one way to obtain the moving speed of the contact is: acquire the movement time of the contact in the view angle control area, where the movement time is the length of time over which the contact travels the moving distance (the moving distance can equally be regarded as the distance the contact moves within the movement time); then determine the moving speed from the moving distance and the movement time. For example: moving speed = moving distance ÷ movement time. In some examples the rotation angle = moving distance × moving speed × preset factor, and substituting the moving speed gives rotation angle = moving distance × moving distance × preset factor ÷ movement time.
In some embodiments, the movement time is the length of time for which the graphical user interface displays every N frames of game scene images, where N is an integer greater than 0. Game scene images are displayed frame by frame. The moving distance of the contact in the view angle control area is the difference between the position of the contact at the previous detection and its position at the current detection, and the movement time between the two detections is the time taken to display the N frames of game scene images rendered in between; in other words, one touch detection is performed each time N frames of game scene images are displayed. For example: if the game scene image is rendered 30 times per second, 30 frames are displayed per second and touch detection in the view angle control area is usually also performed 30 times per second, so when 1 frame is displayed between the previous detection and the current detection the movement time can be taken as 1/30 s.
Therefore, this embodiment obtains the distance the contact moves within the time taken to display every N frames of game scene images; determines the moving speed from that distance and that time; obtains the rotation angle from the moving distance and the moving speed; determines the rotation direction from the moving direction of the contact within the same interval; and then, according to the rotation angle and the rotation direction, rotates the virtual lens within the interval of every N frames so as to adjust the game scene images displayed on the graphical user interface. The lens therefore follows the movement of the contact in real time, which improves the user experience.
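The per-interval behaviour described above can be sketched as follows (written in Python for this description; the helper function is an assumption, and the 30 frames-per-second and N = 1 figures follow the example in the text): one touch detection is performed each time N frames are displayed, and the rotation angle is computed from the distance the contact moved during that interval.

    FRAME_RATE = 30            # frames rendered per second (example from the text)
    N = 1                      # detect touch once every N displayed frames
    dt = N / FRAME_RATE        # movement time for one detection interval

    def on_detection(previous_pos, current_pos, preset_factor):
        # Distance moved by the contact between the previous and the current
        # detection; speed is that distance divided by the movement time.
        move_distance = abs(current_pos - previous_pos)
        move_speed = move_distance / dt
        # Rotation angle for this interval; the lens therefore follows the
        # contact in (near) real time.
        return move_distance * move_speed * preset_factor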
Fig. 3 is a flowchart of an information processing method of a game system according to another embodiment of the present invention. As shown in fig. 3, in the method of this embodiment the moving distance of the contact includes a horizontal direction movement distance component and a vertical direction movement distance component; the moving speed includes a horizontal direction movement speed component and a vertical direction movement speed component; and, accordingly, the rotation angle includes a horizontal direction rotation angle component and a vertical direction rotation angle component. In some cases the horizontal direction movement distance component may be 0; the contact can then be regarded as not having moved in the horizontal direction, and accordingly the horizontal direction movement speed component is 0 and the horizontal direction rotation angle component is also 0. Similarly, in some cases the vertical direction movement distance component may be 0; the contact can then be regarded as not having moved in the vertical direction, and accordingly the vertical direction movement speed component is 0 and the vertical direction rotation angle component is also 0.
The method of the embodiment may include:
s201, providing a visual angle control area on a graphical user interface, and detecting touch operation acting on the visual angle control area.
In this embodiment, a specific implementation process of S201 may refer to related descriptions in the embodiment shown in fig. 2, and details are not described here.
S202, acquiring a horizontal moving distance component, a vertical moving distance component and a moving direction of the touch point corresponding to the touch operation.
In this embodiment, the horizontal direction movement distance component and the vertical direction movement distance component of the contact within the movement time are acquired as the contact moves in the view angle control area.
In some embodiments, because the screen resolutions of different touch terminals differ, touch terminals with different screen sizes could give a different feel to the virtual lens rotation control. To give different touch terminals the same virtual lens rotation control experience, the moving distance of the contact in the view angle control area can be converted into a screen ratio, since the screen ratio is a quantity that does not differ across touch terminals with different graphical user interface sizes.
Accordingly, the horizontal direction movement distance component is the ratio of K1 to M1, where K1 is the number of horizontal pixels in the view angle control area that the contact passes over within the movement time, and M1 is the number of horizontal pixels of the view angle control area. For example, half of the horizontal extent of the graphical user interface may be used for controlling the rotation of the virtual lens, although this embodiment is not limited thereto. The screen resolution of a touch terminal is generally expressed as width of the screen resolution × height of the screen resolution. Therefore, if the touch terminal is a mobile phone or a tablet computer, which is held in landscape mode when the scheme of this embodiment is executed, M1 = height of the screen resolution of the touch terminal ÷ 2; if the touch terminal is a touch-screen notebook computer or the like, M1 = width of the screen resolution of the touch terminal ÷ 2.
The vertical direction movement distance component is the ratio of K2 to M2, where K2 is the number of vertical pixels in the view angle control area that the contact passes over within the movement time, and M2 is the number of vertical pixels of the view angle control area. For example, the whole vertical extent of the view angle control area may be used for controlling the rotation of the virtual lens, although this embodiment is not limited thereto. Therefore, if the touch terminal is a mobile phone or a tablet computer, which is held in landscape mode when the scheme of this embodiment is executed, M2 = width of the screen resolution of the touch terminal; if the touch terminal is a touch-screen notebook computer or the like, M2 = height of the screen resolution of the touch terminal.
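A minimal sketch of the screen-ratio conversion (written in Python for this description; the variable names are assumptions, and the 750 × 1334 landscape-phone layout follows the example above): pixel travel is divided by the pixel extent of the view angle control area so that the same gesture feels the same on screens of different resolutions.

    # Example: a 750 x 1334 phone held in landscape, with half of the
    # horizontal extent and the full vertical extent used for view control.
    M1 = 1334 // 2   # horizontal pixels of the view angle control area
    M2 = 750         # vertical pixels of the view angle control area

    def distance_components(k1_pixels, k2_pixels):
        # K1 / M1 and K2 / M2 are screen ratios, so they are comparable
        # across touch terminals with different resolutions.
        horizontal_component = k1_pixels / M1
        vertical_component = k2_pixels / M2
        return horizontal_component, vertical_component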
S203, determining the horizontal direction moving speed component of the contact according to the moving time of the contact and the horizontal direction moving distance component, and determining the vertical direction moving speed component of the contact according to the moving time of the contact and the vertical direction moving distance component.
For example: the horizontal direction movement distance component ÷ the movement time is determined as the horizontal direction movement speed component, and the vertical direction movement distance component ÷ the movement time is determined as the vertical direction movement speed component.
S204, determining a horizontal direction rotation angle component of the virtual lens according to the horizontal direction movement distance component and the horizontal direction movement speed component; determining a vertical direction rotation angle component of the virtual lens according to the vertical direction movement distance component and the vertical direction movement speed component; and determining the rotation angle of the virtual lens according to the horizontal rotation angle component and the vertical rotation angle component.
In this embodiment, after the horizontal direction movement distance component and the horizontal direction movement speed component are determined, the horizontal direction rotation angle component of the virtual lens is determined based on the horizontal direction movement distance component and the horizontal direction movement speed component. After the vertical direction moving distance component and the vertical direction moving speed component are determined, the vertical direction rotation angle component of the virtual lens is determined according to the vertical direction moving distance component and the vertical direction moving speed component.
In practice, the movement of the contact corresponding to the touch operation in the view angle control area is usually neither purely horizontal nor purely vertical; for example, the path of the contact within a given interval (say, the interval of one frame of image) has some slope or curvature. For this situation, the horizontal and vertical components of the contact's movement in the view angle control area are computed separately and then superposed to determine the rotation angle of the virtual lens. In this way, the user is given a higher tolerance for error.
In some embodiments, one way to determine the horizontal direction rotation angle component is to determine it from the product of the horizontal direction movement distance component, the horizontal direction movement speed component and a first preset factor. For example: horizontal direction rotation angle component = horizontal direction movement distance component × horizontal direction movement speed component × first preset factor. Optionally, to prevent an excessive horizontal direction movement speed component from producing an excessive horizontal direction rotation angle component, the horizontal direction rotation angle component of each frame may be capped at a first angle threshold, for example 30 degrees.
In some embodiments, one way to determine the vertical direction rotation angle component is to determine it from the product of the vertical direction movement distance component, the vertical direction movement speed component and a second preset factor. For example: vertical direction rotation angle component = vertical direction movement distance component × vertical direction movement speed component × second preset factor. Optionally, to prevent an excessive vertical direction movement speed component from producing an excessive vertical direction rotation angle component, this embodiment may also cap the vertical direction rotation angle component of each frame at a second angle threshold, for example 15 degrees.
The first preset factor and the second preset factor may be the same or different.
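Combining S203 and S204, the per-axis computation with the optional per-frame caps can be sketched as follows (written in Python for this description; the function, the example input values and the factor values are assumptions, while the 30° and 15° caps are the example thresholds given above):

    def axis_rotation_angle(distance_component, move_time,
                            preset_factor, max_angle_per_frame):
        # Speed component = distance component / movement time (S203).
        speed_component = distance_component / move_time
        # Angle component = distance x speed x preset factor (S204),
        # optionally capped so a single frame never rotates too far.
        angle = distance_component * speed_component * preset_factor
        return min(angle, max_angle_per_frame)

    # Assumed example per-frame values: screen-ratio distances and 1/30 s.
    dx, dy, dt = 0.10, 0.04, 1.0 / 30.0
    first_factor, second_factor = 13.5, 12.0  # assumed example factors
    horizontal_angle = axis_rotation_angle(dx, dt, first_factor, 30.0)
    vertical_angle = axis_rotation_angle(dy, dt, second_factor, 15.0)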
In some embodiments, to ensure that quickly moving the contact horizontally across a distance of half the screen width of the graphical user interface rotates the virtual lens exactly 180 degrees horizontally, and thus meets the need to quickly swing the virtual lens to directly behind the current field of view, this embodiment sets the first preset factor to the product of the number of horizontal pixels of the graphical user interface and a first preset value.
Different preset factors can be set in advance on different touch terminals and the movement effect of the virtual lens debugged, where a "fast" contact movement can be taken as 0.1 m/s. In practice it was found that, with some error allowed (the distance need not be exactly half the screen width; a deviation of 10% is acceptable), taking a mobile phone as the touch terminal with a screen resolution of, for example, 750 × 1334 (so that the height of the screen resolution is 1334 and, in landscape mode, the graphical user interface has 1334 horizontal pixels), a preset factor of 18000 makes the horizontal movement of the lens meet the above requirement. Therefore, in this embodiment the first preset value can be set to 18000 ÷ 1334² ≈ 0.0101, and accordingly the first preset factor equals the number of horizontal pixels of the graphical user interface × 0.0101.
In some embodiments, to ensure that quickly moving the contact vertically across a distance of the whole screen height of the graphical user interface rotates the virtual lens exactly 90 degrees vertically, and thus meets the need to quickly swing the virtual lens to directly above or below the current field of view, the second preset factor is set to the product of the number of vertical pixels of the graphical user interface and a second preset value.
Different preset factors can likewise be set in advance on different touch terminals and the movement effect of the virtual lens debugged, again taking a "fast" contact movement as 0.1 m/s. In practice it was found that, with some error allowed (the distance need not be exactly the full screen height; a deviation of 10% is acceptable), taking a mobile phone as the touch terminal with a screen resolution of, for example, 750 × 1334 (so that the width of the screen resolution is 750 and, in landscape mode, the graphical user interface has 750 vertical pixels), a preset factor of 9000 makes the vertical movement of the lens meet the above requirement. Therefore, in this embodiment the second preset value can be set to 9000 ÷ 750² = 0.016, and accordingly the second preset factor equals the number of vertical pixels of the graphical user interface × 0.016.
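A sketch of the calibration described in the last two paragraphs (written in Python for this description; the 18000 and 9000 figures and the 750 × 1334 landscape example come from the text, while the helper name is an assumption introduced here):

    def preset_factors(horizontal_pixels, vertical_pixels):
        # First preset value = 18000 / 1334^2 (about 0.0101), from the
        # horizontal calibration; second preset value = 9000 / 750^2 = 0.016,
        # from the vertical calibration on the reference phone.
        first_preset_value = 18000 / (1334 ** 2)
        second_preset_value = 9000 / (750 ** 2)
        # Preset factor = pixel count of the target device x preset value.
        first_factor = horizontal_pixels * first_preset_value
        second_factor = vertical_pixels * second_preset_value
        return first_factor, second_factor

    # On the 750 x 1334 reference phone held in landscape mode:
    print(preset_factors(1334, 750))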
S205, determining the rotation direction of the virtual lens according to the moving direction.
In this embodiment, the moving direction of the contact may include a horizontal moving direction component and a vertical moving direction component. Accordingly, this embodiment determines the horizontal rotating direction component of the virtual lens from the horizontal moving direction component of the contact, and the vertical rotating direction component of the virtual lens from the vertical moving direction component of the contact. For example, as shown in fig. 4, the horizontal rotating direction component of the virtual lens means that the virtual lens rotates around the y-axis, and the vertical rotating direction component means that the virtual lens rotates around the z-axis. The horizontal moving direction component of the contact may be a horizontal leftward movement component or a horizontal rightward movement component, and the vertical moving direction component of the contact may be a vertical upward movement component or a vertical downward movement component, for example as shown in fig. 5.
In one implementation, if the horizontal moving direction component of the contact is the horizontal rightward moving component, the present embodiment determines that the horizontal rotating direction component of the virtual lens is horizontal rotation in the clockwise direction according to the horizontal rightward moving component of the contact. If the horizontal moving direction component of the contact is the horizontal leftward moving component, the present embodiment determines that the horizontal rotating direction component of the virtual lens is horizontal rotation in the counterclockwise direction according to the horizontal leftward moving component of the contact.
In another implementation manner, if the horizontal moving direction component of the contact is a horizontal rightward moving component, the present embodiment determines that the horizontal rotating direction component of the virtual lens is a horizontal rotation in a counterclockwise direction according to the horizontal rightward moving component of the contact. If the horizontal moving direction component of the contact is the horizontal leftward moving component, the present embodiment determines that the horizontal rotating direction component of the virtual lens is horizontal rotation in the clockwise direction according to the horizontal leftward moving component of the contact.
In one implementation, if the vertical moving direction component of the contact is a vertical upward moving component, the embodiment determines that the vertical rotating direction component of the virtual lens is a vertical rotation in a clockwise direction according to the vertical upward moving component of the contact. If the vertical movement direction component of the contact is the vertical downward movement component, the present embodiment determines the vertical rotation direction component of the virtual lens to be a vertical rotation in the counterclockwise direction according to the vertical downward movement component of the contact.
In another implementation manner, if the vertical moving direction component of the contact is a vertical upward moving component, the embodiment determines that the vertical rotating direction component of the virtual lens is a vertical rotation in a counterclockwise direction according to the vertical upward moving component of the contact. If the vertical movement direction component of the contact is the vertical downward movement component, the embodiment determines the vertical rotation direction component of the virtual lens to be vertical rotation in the clockwise direction according to the vertical downward movement component of the contact.
The present embodiment may determine the rotation direction of the virtual lens according to the horizontal rotation direction component and the vertical rotation direction component of the virtual lens.
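One of the direction mappings described above can be sketched as follows (written in Python for this description; the sign convention follows the first implementation in each pair and the function name is an assumption):

    def rotation_direction(dx, dy):
        # dx > 0: contact moved right -> horizontal rotation clockwise (about the y-axis)
        # dx < 0: contact moved left  -> horizontal rotation counter-clockwise
        # dy > 0: contact moved up    -> vertical rotation clockwise (about the z-axis, per fig. 4)
        # dy < 0: contact moved down  -> vertical rotation counter-clockwise
        horizontal_sign = 1 if dx > 0 else (-1 if dx < 0 else 0)
        vertical_sign = 1 if dy > 0 else (-1 if dy < 0 else 0)
        return horizontal_sign, vertical_sign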
S206, according to the rotating angle and the rotating direction of the virtual lens, rotating the virtual lens to adjust the game scene image displayed on the graphical user interface.
In this embodiment, a specific implementation process of S206 may refer to related descriptions in the embodiment shown in fig. 2, and is not described herein again.
In this embodiment, the rotation angle of the virtual lens is determined from the moving distance and the moving speed by the above scheme. Therefore, when the user needs to fine-tune the virtual lens, the contact is moved slowly (i.e., its moving speed is low), the virtual lens rotates by a small angle, and the correspondingly adjusted game scene image is displayed on the graphical user interface. When the user needs to move the virtual lens over a wide range, the contact is moved quickly (i.e., its moving speed is high) and, without having to travel a large distance, the virtual lens rotates by a large angle and the correspondingly adjusted game scene image is displayed on the graphical user interface. This solves the problem in the prior art that fine adjustment and wide-range adjustment of the rotation angle of the lens cannot both be satisfied; the rotation angle of the virtual lens is controlled more accurately in this embodiment, and the operation is more tolerant of error.
To verify the lens movement effect of the embodiment of the present invention, the embodiment was compared with the prior art (for the touch terminals involved, all test data were gathered manually, so some error cannot be excluded), as shown in Table 1.
Table 1
(The comparison table is provided as an image, Figure GDA0002316508110000121, in the original publication.)
The comparison shows that the scheme of the embodiment of the invention overcomes the defects of the prior art. For large movements of the virtual lens, the experience is not greatly different from the prior art. For small movements of the virtual lens, the prior-art scheme can only be controlled by moving a very small distance, which a player often cannot do precisely with a finger; if the finger moves too far, the player has to move it back again. The scheme of the embodiment of the invention allows a larger movement distance for the same adjustment and therefore offers a higher tolerance for error.
Fig. 6 is a schematic structural diagram of an information processing apparatus of a game system according to an embodiment of the present invention, and as shown in fig. 6, an information processing apparatus 300 of the game system according to the embodiment is applied to a touch terminal capable of presenting a graphical user interface, where content displayed by the graphical user interface includes a game scene image captured through a virtual lens, and the information processing apparatus 300 of the game system may include: a detection module 310, an acquisition module 320, a determination module 330, and a processing module 340.
The detecting module 310 is configured to provide a view angle control area on the graphical user interface, and detect a touch operation applied to the view angle control area.
The obtaining module 320 is configured to obtain a moving distance, a moving speed, and a moving direction of a contact corresponding to the touch operation.
A determining module 330, configured to determine a rotation angle of the virtual lens according to the moving distance and the moving speed, and determine a rotation direction of the virtual lens according to the moving direction.
The processing module 340 is configured to rotate the virtual lens according to the rotation angle and the rotation direction of the virtual lens to adjust the game scene image displayed on the graphical user interface.
In some embodiments, the obtaining module 320 is specifically configured to: acquiring the moving time of the contact in the visual angle control area, wherein the moving time is the time length of the contact moving the moving distance; and determining the moving speed according to the moving distance and the moving time.
In some embodiments, the movement distance comprises a horizontal direction movement distance component and a vertical direction movement distance component; the moving speed includes: a horizontal direction moving velocity component and a vertical direction moving velocity component; the rotation angle comprises a horizontal rotation angle component and a vertical rotation angle component;
the determining module 330 is specifically configured to: determining a horizontal direction rotation angle component of the virtual lens according to the horizontal direction movement distance component and the horizontal direction movement speed component; determining a vertical direction rotation angle component of the virtual lens according to the vertical direction movement distance component and the vertical direction movement speed component; and determining the rotation angle of the virtual lens according to the rotation angle component in the horizontal direction and the rotation angle component in the vertical direction.
In some embodiments, the horizontal direction movement distance component is: the ratio of K1 to M1; wherein the K1 is the number of horizontal pixels in the view angle control area passed by the contact point within the movement time, and the M1 is the number of horizontal pixels in the view angle control area;
the vertical direction movement distance component is: the ratio of K2 to M2; wherein the K2 is the number of vertical pixels in the view angle control area through which the contact passes within the movement time, and the M2 is the number of vertical pixels in the view angle control area.
In some embodiments, the determining module 330 is specifically configured to: and determining the horizontal direction rotation angle component of the virtual lens according to the product of the horizontal direction movement distance component, the horizontal direction movement speed component and a first preset factor.
In some embodiments, the first predetermined factor is a product of a number of horizontal pixels of the graphical user interface and a first predetermined value.
In some embodiments, the determining module 330 is specifically configured to: and determining the vertical direction rotation angle component of the virtual lens according to the product of the vertical direction movement distance component, the vertical direction movement speed component and a second preset factor.
In some embodiments, the second predetermined factor is a product of a number of vertical pixels of the graphical user interface and a second predetermined value.
The apparatus of this embodiment may be configured to implement the technical solutions of the above method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 7 is a schematic structural diagram of a touch terminal according to an embodiment of the present invention. As shown in fig. 7, the touch terminal 400 of this embodiment may include: a graphical user interface 410, a memory 420, and a processor 430. The content displayed by the graphical user interface 410 includes a game scene image captured through a virtual lens.
A memory 420 for storing program instructions.
The processor 430 is configured to implement the following steps when the program instructions are executed:
providing a view angle control area on the graphical user interface 410, and detecting touch operation acting on the view angle control area;
acquiring the moving distance, the moving speed and the moving direction of a contact corresponding to the touch operation;
determining the rotation angle of the virtual lens according to the moving distance and the moving speed, and determining the rotation direction of the virtual lens according to the moving direction;
and according to the rotating angle and the rotating direction of the virtual lens, rotating the virtual lens to adjust the game scene image displayed by the graphical user interface 410.
In some embodiments, the processor 430 is specifically configured to: acquiring the moving time of the contact in the visual angle control area, wherein the moving time is the time length of the contact moving the moving distance; and determining the moving speed according to the moving distance and the moving time.
In some embodiments, the movement distance comprises a horizontal direction movement distance component and a vertical direction movement distance component; the moving speed includes: a horizontal direction moving velocity component and a vertical direction moving velocity component; the rotation angle comprises a horizontal rotation angle component and a vertical rotation angle component;
the processor 430 is specifically configured to:
determining a horizontal direction rotation angle component of the virtual lens according to the horizontal direction movement distance component and the horizontal direction movement speed component;
determining a vertical direction rotation angle component of the virtual lens according to the vertical direction movement distance component and the vertical direction movement speed component;
and determining the rotation angle of the virtual lens according to the rotation angle component in the horizontal direction and the rotation angle component in the vertical direction.
In some embodiments, the horizontal direction movement distance component is: the ratio of K1 to M1; wherein the K1 is the number of horizontal pixels in the view angle control area passed by the contact point within the movement time, and the M1 is the number of horizontal pixels in the view angle control area;
the vertical direction movement distance component is: the ratio of K2 to M2; wherein the K2 is the number of vertical pixels in the view angle control area through which the contact passes within the movement time, and the M2 is the number of vertical pixels in the view angle control area.
In some embodiments, the processor 430 is specifically configured to:
and determining the horizontal rotation angle of the virtual lens according to the product of the horizontal movement distance component, the horizontal movement speed component and a first preset factor.
In some embodiments, the first predetermined factor is a product of a number of horizontal pixels of the gui 410 and a first predetermined value.
In some embodiments, the processor 430 is specifically configured to:
and determining the vertical direction rotation angle of the virtual lens according to the product of the vertical direction movement distance component, the vertical direction movement speed component and a second preset factor.
In some embodiments, the second predetermined factor is a product of the number of vertical pixels of the gui 410 and a second predetermined value.
The touch terminal of this embodiment may be configured to execute the technical solutions of the above method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media capable of storing program codes, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (16)

1. An information processing method of a game system, applied to a touch terminal capable of presenting a graphical user interface, displayed content of the graphical user interface including a game scene image captured through a virtual lens, the method comprising:
providing a visual angle control area on the graphical user interface, and detecting touch operation acting on the visual angle control area;
acquiring the moving distance, the moving speed and the moving direction of a contact corresponding to the touch operation;
determining the rotation angle of the virtual lens according to the moving distance and the moving speed, and determining the rotation direction of the virtual lens according to the moving direction;
rotating the virtual lens according to the rotation angle and the rotation direction of the virtual lens to adjust the game scene image displayed on the graphical user interface;
the moving distance comprises a horizontal moving distance component and a vertical moving distance component; the moving speed includes: a horizontal direction moving velocity component and a vertical direction moving velocity component; the rotation angle includes a horizontal rotation angle component and a vertical rotation angle component, and determining the rotation angle of the virtual lens according to the moving distance and the moving speed includes:
determining a horizontal direction rotation angle component of the virtual lens according to the horizontal direction movement distance component and the horizontal direction movement speed component;
determining a vertical direction rotation angle component of the virtual lens according to the vertical direction movement distance component and the vertical direction movement speed component;
and determining the rotation angle of the virtual lens according to the rotation angle component in the horizontal direction and the rotation angle component in the vertical direction.
2. The method according to claim 1, wherein the obtaining of the moving speed of the touch point corresponding to the touch operation comprises: acquiring the moving time of the contact in the visual angle control area, wherein the moving time is the time length of the contact moving the moving distance; and determining the moving speed according to the moving distance and the moving time.
3. The method of claim 1, wherein the horizontal direction movement distance component is: the ratio of K1 to M1; wherein the K1 is the number of horizontal pixels in the view angle control area passed by the contact point within the movement time, and the M1 is the number of horizontal pixels in the view angle control area;
the vertical direction movement distance component is: the ratio of K2 to M2; wherein the K2 is the number of vertical pixels in the view angle control area through which the contact passes within the movement time, and the M2 is the number of vertical pixels in the view angle control area.
4. The method according to claim 1 or 3, wherein the determining a horizontal direction rotation angle component of the virtual lens according to the horizontal direction movement distance component and the horizontal direction movement speed component comprises:
and determining the horizontal direction rotation angle component of the virtual lens according to the product of the horizontal direction movement distance component, the horizontal direction movement speed component and a first preset factor.
5. The method of claim 4, wherein the first preset factor is the product of the number of horizontal pixels of the graphical user interface and a first predetermined value.
6. The method according to claim 1 or 3, wherein determining the vertical direction rotation angle component of the virtual lens according to the vertical direction movement distance component and the vertical direction movement speed component comprises:
determining the vertical direction rotation angle component of the virtual lens according to the product of the vertical direction movement distance component, the vertical direction movement speed component and a second preset factor.
7. The method of claim 6, wherein the second preset factor is the product of the number of vertical pixels of the graphical user interface and a second predetermined value.
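Claims 4 to 7 scale the product of each distance component and speed component by a preset factor tied to the pixel size of the graphical user interface. A minimal sketch under assumed tuning values (0.05 and 0.04 are arbitrary placeholders, not values taught by the patent) might look as follows:

```python
# Illustrative sketch of claims 4-7; the predetermined values below are
# arbitrary placeholders chosen only to make the example runnable.
from typing import Tuple


def preset_factors(gui_width_px: int, gui_height_px: int,
                   first_value: float = 0.05,
                   second_value: float = 0.04) -> Tuple[float, float]:
    """First/second preset factors: the horizontal/vertical pixel count of the
    graphical user interface multiplied by a predetermined value."""
    return gui_width_px * first_value, gui_height_px * second_value


def rotation_angle_components(dx: float, vx: float,
                              dy: float, vy: float,
                              f1: float, f2: float) -> Tuple[float, float]:
    """Rotation angle components of the virtual lens (claims 4 and 6):
    moving distance component x moving speed component x preset factor."""
    return dx * vx * f1, dy * vy * f2


if __name__ == "__main__":
    f1, f2 = preset_factors(gui_width_px=1920, gui_height_px=1080)
    # Reusing the earlier example: dx = 0.5, vx = 2.0 per second.
    horizontal, vertical = rotation_angle_components(0.5, 2.0, 0.2, 0.8, f1, f2)
    print(horizontal, vertical)  # 96.0 and ~6.912 (units depend on the engine)
```

Tying the preset factor to the pixel count of the graphical user interface makes the resulting rotation roughly consistent across screens of different resolutions.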
8. An information processing apparatus of a game system, applied to a touch terminal capable of presenting a graphical user interface, the displayed content of the graphical user interface including a game scene image captured through a virtual lens, the apparatus comprising:
a detection module, configured to provide a visual angle control area on the graphical user interface and detect a touch operation acting on the visual angle control area;
an acquisition module, configured to acquire the moving distance, the moving speed and the moving direction of a contact corresponding to the touch operation;
a determining module, configured to determine the rotation angle of the virtual lens according to the moving distance and the moving speed, and determine the rotation direction of the virtual lens according to the moving direction;
a processing module, configured to rotate the virtual lens according to the rotation angle and the rotation direction of the virtual lens to adjust the game scene image displayed on the graphical user interface;
the moving distance comprises a horizontal direction movement distance component and a vertical direction movement distance component; the moving speed comprises a horizontal direction movement speed component and a vertical direction movement speed component; and the rotation angle comprises a horizontal direction rotation angle component and a vertical direction rotation angle component;
the determining module is specifically configured to: determine a horizontal direction rotation angle component of the virtual lens according to the horizontal direction movement distance component and the horizontal direction movement speed component; determine a vertical direction rotation angle component of the virtual lens according to the vertical direction movement distance component and the vertical direction movement speed component; and determine the rotation angle of the virtual lens according to the horizontal direction rotation angle component and the vertical direction rotation angle component.
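The apparatus of claim 8 mirrors the method as four cooperating modules. Purely as a structural illustration (class, method and attribute names are invented, and `control_area.contains` and `camera.rotate` are assumed interfaces), the split could be expressed as:

```python
# Non-authoritative structural sketch of the module split in claim 8.

class ViewAngleController:
    """Groups the detection, acquisition, determining and processing modules
    of claim 8 around assumed `camera` and `control_area` objects."""

    def __init__(self, camera, control_area):
        self.camera = camera              # the virtual lens to be rotated
        self.control_area = control_area  # visual angle control area on the GUI

    def detect(self, touch_event) -> bool:
        """Detection module: accept only touches inside the control area."""
        return self.control_area.contains(touch_event.position)

    def acquire(self, touch_event):
        """Acquisition module: moving distance, speed and direction of the contact."""
        return touch_event.distance, touch_event.speed, touch_event.direction

    def determine(self, distance, speed, direction):
        """Determining module: rotation angle from distance and speed,
        rotation direction from the moving direction."""
        angle = (distance[0] * speed[0], distance[1] * speed[1])
        return angle, direction

    def process(self, angle, direction) -> None:
        """Processing module: rotate the virtual lens so that the game scene
        image on the graphical user interface is adjusted."""
        self.camera.rotate(angle, direction)

    def handle(self, touch_event) -> None:
        """Run the four modules in sequence for one touch operation."""
        if self.detect(touch_event):
            distance, speed, direction = self.acquire(touch_event)
            angle, direction = self.determine(distance, speed, direction)
            self.process(angle, direction)
```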
9. The apparatus of claim 8, wherein the acquisition module is specifically configured to: acquire the moving time of the contact in the visual angle control area, wherein the moving time is the length of time over which the contact moves the moving distance; and determine the moving speed according to the moving distance and the moving time.
10. The apparatus of claim 8, wherein the horizontal direction movement distance component is the ratio of K1 to M1, wherein K1 is the number of horizontal pixels in the visual angle control area through which the contact passes within the moving time, and M1 is the number of horizontal pixels of the visual angle control area;
and the vertical direction movement distance component is the ratio of K2 to M2, wherein K2 is the number of vertical pixels in the visual angle control area through which the contact passes within the moving time, and M2 is the number of vertical pixels of the visual angle control area.
11. The apparatus according to claim 8 or 10, wherein the determining module is specifically configured to determine the horizontal direction rotation angle component of the virtual lens according to the product of the horizontal direction movement distance component, the horizontal direction movement speed component and a first preset factor.
12. The apparatus of claim 11, wherein the first preset factor is the product of the number of horizontal pixels of the graphical user interface and a first predetermined value.
13. The apparatus according to claim 8 or 10, wherein the determining module is specifically configured to determine the vertical direction rotation angle component of the virtual lens according to the product of the vertical direction movement distance component, the vertical direction movement speed component and a second preset factor.
14. The apparatus of claim 13, wherein the second preset factor is the product of the number of vertical pixels of the graphical user interface and a second predetermined value.
15. A touch terminal, comprising: a graphical user interface, a memory, and a processor; wherein the content displayed by the graphical user interface comprises a game scene image captured through a virtual lens;
the memory is configured to store program instructions; and
the processor is configured to execute the program instructions to implement the steps of the method according to any one of claims 1 to 7.
16. A storage medium, comprising: a readable storage medium and a computer program, the computer program being used to implement the information processing method of the game system according to any one of claims 1 to 7.
CN201711057750.2A 2017-11-01 2017-11-01 Information processing method and device of game system Active CN107765986B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711057750.2A CN107765986B (en) 2017-11-01 2017-11-01 Information processing method and device of game system

Publications (2)

Publication Number Publication Date
CN107765986A (en) 2018-03-06
CN107765986B 2020-05-19

Family

ID=61272077

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711057750.2A Active CN107765986B (en) 2017-11-01 2017-11-01 Information processing method and device of game system

Country Status (1)

Country Link
CN (1) CN107765986B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108525296B (en) * 2018-04-24 2019-12-06 网易(杭州)网络有限公司 Information processing method and device in virtual reality game and processor
CN108635857B (en) 2018-05-18 2022-04-22 腾讯科技(深圳)有限公司 Interface display method and device, electronic device and computer readable storage medium
CN109011559B (en) * 2018-07-26 2021-09-07 网易(杭州)网络有限公司 Method, device, equipment and storage medium for controlling virtual carrier in game
CN110448906A * 2018-11-13 2019-11-15 网易(杭州)网络有限公司 Control method and device for visual angle in game, and touch control terminal
CN110448898A * 2018-11-13 2019-11-15 网易(杭州)网络有限公司 Control method and device for virtual character in game, and electronic device
CN109847354B (en) * 2018-12-19 2020-05-22 网易(杭州)网络有限公司 Method and device for controlling virtual lens in game
CN110347163B (en) * 2019-08-07 2022-11-18 京东方科技集团股份有限公司 Control method and device of unmanned equipment and unmanned control system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6699127B1 (en) * 2000-06-20 2004-03-02 Nintendo Of America Inc. Real-time replay system for video game
CN1511306A * 2001-05-21 2004-07-07 Image processing apparatus and game apparatus
CN102414641A (en) * 2009-05-01 2012-04-11 微软公司 Altering a view perspective within a display environment
US9480921B2 (en) * 2014-03-12 2016-11-01 Wargaming.Net Limited Potential damage indicator of a targeted object
CN103955279A (en) * 2014-05-19 2014-07-30 腾讯科技(深圳)有限公司 Viewing angle feedback method and terminal
CN107029428A * 2016-02-04 2017-08-11 网易(杭州)网络有限公司 Control system, method and terminal for a shooting game
CN107029425A * 2016-02-04 2017-08-11 网易(杭州)网络有限公司 Control system, method and terminal for a shooting game
CN106528029A (en) * 2016-12-17 2017-03-22 北京小米移动软件有限公司 Display method and device

Also Published As

Publication number Publication date
CN107765986A (en) 2018-03-06

Similar Documents

Publication Publication Date Title
CN107765986B (en) Information processing method and device of game system
CN110215690B (en) Visual angle switching method and device in game scene and electronic equipment
US10846936B2 (en) Image display method and device
US9189082B2 (en) Enhanced handheld screen-sensing pointer
US8477099B2 (en) Portable data processing apparatus
EP2998848A1 (en) Method, device, and apparatus for controlling screen rotation
US20150062178A1 (en) Tilting to scroll
JP6419278B1 (en) Control device, control method, and program
CN110251936B (en) Method and device for controlling virtual camera in game and storage medium
US11477432B2 (en) Information processing apparatus, information processing method and storage medium
US11297303B2 (en) Control apparatus, control method, and storage medium
US20170097692A1 (en) Display control apparatus and method for controlling the same
US20170154467A1 (en) Processing method and device for playing video
CN109753145B (en) Transition animation display method and related device
CN116301485A (en) Icon display method and device
CN113259742B (en) Video bullet screen display method and device, readable storage medium and computer equipment
JP2019057291A (en) Control device, control method, and program
CN112870714A (en) Map display method and device
US20150281585A1 (en) Apparatus Responsive To At Least Zoom-In User Input, A Method And A Computer Program
US7513700B2 (en) Image element identifier
CN110610454A (en) Method and device for calculating perspective projection matrix, terminal device and storage medium
US10074401B1 (en) Adjusting playback of images using sensor data
US20120038636A1 (en) Appearance of an object
WO2024041515A1 (en) Bullet-screen comment display method and apparatus
US20230396748A1 (en) Image processing apparatus, image processing method, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant