CN108553894B - Display control method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN108553894B
CN108553894B (application number CN201810425435.9A; application publication CN108553894A)
Authority
CN
China
Prior art keywords
point
touch
map area
touch event
coordinate point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810425435.9A
Other languages
Chinese (zh)
Other versions
CN108553894A (en)
Inventor
吴静仪
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201810425435.9A
Publication of CN108553894A
Application granted
Publication of CN108553894B


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/2145: Input arrangements for video game devices characterised by their sensors for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/537: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, using indicators, e.g. showing the condition of a game character on screen
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F2300/1075: Features characterized by input arrangements for converting player-generated signals into game device control signals, adapted to detect the point of contact of the player on a surface, using a touch screen
    • A63F2300/303: Features characterized by output arrangements for receiving control signals generated by the game device, for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/308: Details of the user interface
    • A63F2300/6045: Methods for processing data by generating or executing the game program, for mapping control signals received from the input arrangement into game commands

Abstract

The disclosure relates to a display control method and apparatus, an electronic device, and a storage medium in the technical field of human-computer interaction. The method comprises the following steps: providing a map area on an interactive interface; detecting a first touch event acting on the map area, and determining, according to the touch point of the first touch event, the corresponding coordinate point in the map area; magnifying and displaying, at a preset position on the interactive interface, a preset region of the map area containing the coordinate point, to obtain a partial enlarged view; and identifying the coordinate point in the partial enlarged view. The method and apparatus enable the coordinate point to be located precisely in the map area, improving positioning accuracy.

Description

Display control method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a display control method, a display control apparatus, an electronic device, and a computer-readable storage medium.
Background
With the rapid development of mobile communication technology, a large number of game applications have emerged. In many game scenarios, a player controlling a virtual character needs to view the positions of multiple NPCs (Non-Player Characters) on a small map, or to navigate automatically to a specific coordinate point through the small map.
In the related art, for example in the application scenario shown in Fig. 1, when a user's finger clicks on the map, the center point of the clicked area is directly used as the target point and an icon is marked on it. This approach has several drawbacks. First, the finger occludes the interactive interface during the click, which makes operation inconvenient and invites mis-operation. Second, because the click covers an area and the finger blocks the user's line of sight, the user cannot tell during the operation whether the clicked position is the desired coordinate point; only after clicking can the user roughly judge, from the marker on the map, whether the position is accurate. Third, verifying that the coordinate point corresponding to the click is accurate requires checking the marker on the map, which adds operation steps and lowers operation efficiency.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide a display control method and apparatus, an electronic device, and a storage medium, which overcome, at least to some extent, a problem that a coordinate point cannot be accurately located on a map due to limitations and defects of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the present disclosure, there is provided a display control method including: providing a map area on an interactive interface; detecting a first touch event acting on the map area, and determining, according to the touch point of the first touch event, the corresponding coordinate point in the map area; magnifying and displaying, at a preset position on the interactive interface, a preset region of the map area containing the coordinate point, to obtain a partial enlarged view; and identifying the coordinate point in the partial enlarged view.
In an exemplary embodiment of the present disclosure, the method further comprises: if a second touch event starting from the touch point of the first touch event is detected, updating the partial enlarged view according to the position of the real-time touch point of the second touch event.
In an exemplary embodiment of the present disclosure, the method further comprises: identifying, in the updated partial enlarged view, the coordinate point in the map area corresponding to the real-time touch point.
In an exemplary embodiment of the present disclosure, the method further comprises: determining the coordinate point in the map area corresponding to the end position of the second touch event as the selected point in the map area.
In an exemplary embodiment of the present disclosure, the method further comprises: stopping display of the partial enlarged view when the touch point disappears.
In an exemplary embodiment of the present disclosure, after the preset region containing the coordinate point is magnified and displayed to obtain the partial enlarged view, the method further comprises: detecting a third touch event acting on the partial enlarged view, and determining the selected point in the map area according to the third touch event.
In an exemplary embodiment of the present disclosure, the method further comprises: stopping display of the partial enlarged view when a fourth touch event acting on the partial enlarged view is detected.
In an exemplary embodiment of the present disclosure, the method further comprises: determining the position reached by offsetting the touch point a preset distance in a designated direction as the preset position corresponding to the touch point.
In an exemplary embodiment of the present disclosure, magnifying and displaying the preset region containing the coordinate point comprises: magnifying and displaying the preset region of the map area centered on the coordinate point.
According to an aspect of the present disclosure, there is provided a display control apparatus including: a map providing module for providing a map area on the interactive interface; a coordinate point determination module for detecting a first touch event acting on the map area and determining, according to the touch point of the first touch event, the corresponding coordinate point in the map area; an enlarged view acquisition module for magnifying and displaying, at a preset position on the interactive interface, a preset region of the map area containing the coordinate point, to obtain a partial enlarged view; and a coordinate point identification module for identifying the coordinate point in the partial enlarged view.
In an exemplary embodiment of the present disclosure, the apparatus further includes: an enlarged view updating module for detecting a second touch event starting from the touch point of the first touch event and updating the partial enlarged view according to the position of the real-time touch point of the second touch event.
In an exemplary embodiment of the present disclosure, the apparatus further includes: an identification updating module for identifying, in the updated partial enlarged view, the coordinate point in the map area corresponding to the real-time touch point.
In an exemplary embodiment of the present disclosure, the apparatus further includes: a first selection module for determining the coordinate point in the map area corresponding to the end position of the second touch event as the selected point in the map area.
In an exemplary embodiment of the present disclosure, the apparatus further includes: a first stopping module for stopping display of the partial enlarged view when the touch point disappears.
In an exemplary embodiment of the present disclosure, the apparatus further includes: a second selection module for detecting a third touch event acting on the partial enlarged view and determining the selected point in the map area according to the third touch event.
In an exemplary embodiment of the present disclosure, the apparatus further includes: a second stopping module for stopping display of the partial enlarged view when a fourth touch event acting on the partial enlarged view is detected.
In an exemplary embodiment of the present disclosure, the apparatus further includes: a preset position determining module for determining the position reached by offsetting the touch point a preset distance in a designated direction as the preset position corresponding to the touch point.
In an exemplary embodiment of the present disclosure, the enlarged view acquisition module includes: a magnification control module for magnifying and displaying the preset region of the map area centered on the coordinate point.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform any one of the above display control methods via execution of the executable instructions.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the display control method of any one of the above.
In the display control method, display control apparatus, electronic device, and computer-readable storage medium provided in exemplary embodiments of the present disclosure, when a first touch event acting on the map area is detected, the coordinate point corresponding to its touch point is determined in the map area; a preset region containing the coordinate point is then magnified and displayed at a preset position on the interactive interface to obtain a partial enlarged view; and the coordinate point is identified in that view. First, by magnifying the region containing the coordinate point and identifying the coordinate point in the partial enlarged view, the user can precisely see and confirm the position of the coordinate point in the map area, improving operation accuracy. Second, because the partial enlarged view is displayed at a preset position, it does not occlude the interactive interface, which avoids mis-operation and makes operation more convenient. Third, the coordinate point can be located precisely through the first touch event alone, simplifying the operation steps and improving efficiency.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 is a schematic view of an interface displaying coordinate points in the related art;
Fig. 2 schematically illustrates a display control method in an exemplary embodiment of the disclosure;
Fig. 3 schematically illustrates an interface providing a partial enlarged view in an exemplary embodiment of the disclosure;
Fig. 4 schematically illustrates a block diagram of a display control apparatus in an exemplary embodiment of the disclosure;
Fig. 5 schematically illustrates a block diagram of an electronic device in an exemplary embodiment of the disclosure;
Fig. 6 schematically illustrates a program product in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the exemplary embodiment, a display control method is first provided. The interactive interface may be obtained by executing a software application on the processor of a terminal and rendering it on the terminal's display. The terminal can be any electronic device with a touch screen, such as a smartphone, tablet, notebook computer, desktop computer, or smart TV. Referring to Fig. 2, the display control method may include the following steps:
in step S210, a map area is provided on the interactive interface;
in step S220, a first touch event acting on the map area is detected, and a coordinate point corresponding to a touch point in the map area is determined according to the touch point of the first touch event;
in step S230, at a preset position on the interactive interface, a preset region including the coordinate point in the map region is displayed in an enlarged manner to obtain a local enlarged view;
in step S240, the coordinate point is identified in the partial enlarged view.
The display control method provided in this exemplary embodiment has several benefits. First, by magnifying the region containing the coordinate point and identifying the coordinate point in the partial enlarged view, the user can precisely see and confirm the position of the coordinate point in the map area, improving operation accuracy. Second, displaying the partial enlarged view at a preset position avoids occluding the interactive interface, which prevents mis-operation and improves convenience. Third, the coordinate point can be located precisely through the first touch event alone, simplifying operation steps and improving efficiency.
Next, each step of the present exemplary embodiment is explained in detail with reference to Figs. 2 and 3.
In step S210, a map area is provided on the interactive interface.
In this exemplary embodiment, the application scenario may be a game application, or another scenario that includes a real map, such as map navigation; the game application is used as the example here. A map area may first be provided at a suitable location on the interactive interface. The map area may be a two-dimensional thumbnail of the game scene, displaying essential information such as NPCs, virtual characters, the layout of scene buildings, and the orientation of the game scene. The user can also mark positions, set travel routes, and perform similar operations in the map area to interact with other virtual characters. For example, when a user sets a position mark point in the map area, the mark is also displayed synchronously in the map areas of the user's teammates, or of other virtual characters within a preset distance, so that position information is shared synchronously.
The map area may also be a small map in the upper-left or upper-right corner of the interactive interface that indicates where the current screen lies within the whole game application and displays the player's current position and clicked position; it may likewise be a virtual map area displayed on the interactive interface. The map area displayed on the interactive interface is used as the example below.
In step S220, a first touch event acting on the map area is detected, and a coordinate point corresponding to the touch point in the map area is determined according to the touch point of the first touch event.
In this example, the first touch event may be, for example, a click operation or a press operation, where the press may be light or heavy; the press operation is not particularly limited here, as long as the pressing force exceeds a preset threshold. The click operation is used as the example below.
Specifically, the system may detect whether a click operation is received at any position on the interactive interface; when a click is detected, a position sensor can determine whether it falls within the provided map area; and when it does, the system can determine the position coordinates of the click's touch point through the position sensor or a coordinate system.
Meanwhile, the coordinate point corresponding to the click's touch point is determined in the map area. For example, if the touch point of the click is point A, then, since the coordinate point in the map area corresponds to the touch point, the coordinate point may also be point A. To improve operability, the coordinate point may instead be a point within a predetermined error margin of the touch point.
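The hit-test and touch-to-map conversion described above can be sketched as follows. This is an illustrative assumption, not structures named in the disclosure: `MapArea`, its fields, and the linear world mapping are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class MapArea:
    """Hypothetical on-screen map area (step S220 sketch)."""
    x: float            # left edge of the map area on the interactive interface
    y: float            # top edge
    width: float        # on-screen size of the map area
    height: float
    world_width: float  # size of the game world the map represents
    world_height: float

    def contains(self, tx: float, ty: float) -> bool:
        """Judge whether a touch point falls inside the map area."""
        return (self.x <= tx <= self.x + self.width
                and self.y <= ty <= self.y + self.height)

    def to_map_coordinate(self, tx: float, ty: float) -> tuple:
        """Convert an on-screen touch point to a coordinate point in the map,
        assuming a simple linear (uniform-scale) mapping."""
        u = (tx - self.x) / self.width
        v = (ty - self.y) / self.height
        return (u * self.world_width, v * self.world_height)

area = MapArea(x=10, y=10, width=100, height=100,
               world_width=1000, world_height=1000)
print(area.to_map_coordinate(60, 60))  # (500.0, 500.0)
```

A touch landing outside the area is simply ignored by the hit test, which matches the detection order described above: first check whether the click falls in the map area, then convert it.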
Next, in step S230, at a preset position on the interactive interface, a preset region including the coordinate point in the map region is displayed in an enlarged manner to obtain a partially enlarged view.
In the exemplary embodiment, to avoid mis-operation caused by the finger occluding the interactive interface during the click, the corresponding coordinate point may be presented at a preset position on the interactive interface. The preset position may be, for example, any position on the interactive interface that does not coincide with the touch point of the click, or a position associated with that touch point; the associated preset position is used as the example here. In one embodiment of the invention, the preset position may be determined in either of the following ways:
in the first mode, a developer can set a mapping relationship between a touch point of a click operation on a map area and a preset position in advance by writing a program, and the mapping relationship can be obtained by addition, subtraction, multiplication, division or other operations. For example, when the coordinates of the touch point of the click operation are (X1, Y1), the coordinates of the preset position may be determined as (X1, 2Y1) according to the mapping relationship; in addition, the coordinates of the preset position may also be determined as other numerical values according to actual requirements, which is not particularly limited in the exemplary embodiment.
And secondly, determining the position of the touch point subjected to the clicking operation after the touch point deviates a preset distance to the specified direction as a preset position corresponding to the touch point. The designated direction can be set to any direction according to actual requirements, such as 9 o 'clock direction, 12 o' clock direction and the like; the predetermined distance may also be any suitable distance, such as 5 centimeters, 10 centimeters, and so forth. It should be noted that the designated direction and the preset distance may be set in advance by a program, as long as shielding of a map area corresponding to a touch point position of the click operation is avoided. In addition, the offset preset position may be determined by X-50 px and Y-60 px. For example, if the coordinates of the touch point of the clicking operation are (X1, Y1), the coordinates of the preset position may be determined to be (X1, 2Y1) by offsetting the touch point by a preset distance Y1 along the 12 o' clock direction. It should be noted that the preset position may also be determined in other manners, as long as the problem of shielding the interactive interface by the finger of the user during the clicking operation can be avoided.
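The second way, a fixed pixel offset in a designated direction, can be sketched as below. The function name, default offsets (taken from the X - 50 px / Y - 60 px example), screen size, and the clamping to screen bounds are all illustrative assumptions rather than requirements of the disclosure.

```python
def preset_position(touch, dx=-50.0, dy=-60.0, screen=(1280.0, 720.0)):
    """Offset the touch point by a preset distance in a designated direction
    (here: 50 px left, 60 px up in a y-down screen coordinate system),
    clamped so the magnified view stays on the screen."""
    x = min(max(touch[0] + dx, 0.0), screen[0])
    y = min(max(touch[1] + dy, 0.0), screen[1])
    return (x, y)
```

Clamping is an added assumption: without it, a touch near the screen edge would push the partial enlarged view off-screen, defeating the purpose of the offset.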
The preset region can be understood as a local range of the map area containing the coordinate point; it may be, for example, a circular region, a square region, or a region of arbitrary shape centered on the determined coordinate point. For example, if the coordinate point of the map area is (X1, Y1), the preset region may be a range of any shape and size containing (X1, Y1); its abscissa range may be [X1 - m, X1 + m] and its ordinate range [Y1 - n, Y1 + n]. Referring to Fig. 3, the touch point of the click operation and the corresponding coordinate point in the map area are both point A, and the preset region containing point A is region 301. For the user's convenience, the correspondence between a coordinate point and its preset region may be set in advance, for example as a circular region with a radius of 5 centimeters centered on the coordinate point.
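The rectangular form of the preset region, [X1 - m, X1 + m] x [Y1 - n, Y1 + n], can be sketched as a hypothetical helper; clipping to the map bounds is an added assumption so that points near the map edge still yield a valid region.

```python
def preset_region(cx, cy, m, n, map_w, map_h):
    """Axis-aligned preset region centered on coordinate point (cx, cy),
    extending m horizontally and n vertically, clipped to the map bounds.
    Returns (x0, y0, x1, y1)."""
    x0 = max(cx - m, 0.0)
    y0 = max(cy - n, 0.0)
    x1 = min(cx + m, map_w)
    y1 = min(cy + n, map_h)
    return (x0, y0, x1, y1)
```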
Next, the selected preset region may be magnified to obtain the corresponding partial enlarged view. Specifically, the preset region may be magnified by a preset factor set according to actual requirements, for example 2x or 4x, as long as the detail it contains (object information, position information, the surrounding environment, and so on) is displayed clearly. Magnifying the preset region containing the coordinate point into a partial enlarged view lets the user precisely see and confirm the position of the coordinate point in the map area, improving operation accuracy. For example, referring to Fig. 3, the preset region 301 containing point A may be magnified 2x to obtain the partial enlarged view 302. The partial enlarged view may also be displayed distinctively, for example with a bold outline or a color fill, to distinguish it from the map area so that the user can recognize it quickly without visual interference.
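The 2x magnification of region 301 into view 302 amounts to a uniform scale about the region's top-left corner; a hypothetical sketch (the function name and origin convention are assumptions) showing where a map point lands inside the magnified view:

```python
def magnify_point(px, py, region, scale=2.0):
    """Map a point inside the preset region (x0, y0, x1, y1) to its position
    inside the partial enlarged view, whose origin is the view's top-left
    corner and whose content is the region scaled by `scale`."""
    x0, y0, _, _ = region
    return ((px - x0) * scale, (py - y0) * scale)
```

Applying the same transform to every pixel of the region yields the whole view; applying it to the coordinate point alone gives the position at which to draw the prompt identifier described below.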
Next, in step S240, the coordinate point is identified in the partial enlarged view.
In order to display the coordinate point corresponding to the touch point of the user's click more intuitively and accurately, the coordinate point may be identified in the partial enlarged view. For example, a prompt identifier may be provided in the partial enlarged view so that the coordinate point is visually marked by it. The prompt identifier may include only an icon identifier, such as the icon identifier 303 shown in fig. 3; it may include only a coordinate identifier, so as to clearly and accurately display the position coordinates of the coordinate point, such as the coordinate identifier 304 shown in fig. 3; or it may include both an icon identifier and a coordinate identifier, to help the user accurately view the coordinate point in the map area corresponding to the touch point of the click operation and thereby complete the operation efficiently and accurately. In addition, the coordinate point in the map may be identified in other manners, which this example does not limit.
In addition, after the partial enlarged view is displayed according to the touch point of the first touch event, its position can be updated in real time, improving the interactivity of the game application. Specifically, a second touch event with the touch point of the first touch event as its starting point is detected, and the partial enlarged view is updated according to the position of the real-time touch point of the second touch event. The second touch event may be, for example, a sliding operation or a dragging operation; the sliding operation is taken as the example here.
In this example embodiment, while the partial enlarged view is displayed at the preset position of the touch point, it may be detected whether a sliding operation continuous with the first touch event is received; the sliding operation may start from the position of the touch point of the first touch event and may end at any position. If a sliding operation with the touch point of the first touch event as its starting point is detected, the real-time position of the touch point of the sliding operation can be acquired, the display position of the partial enlarged view can be updated according to that real-time position, and the preset area corresponding to the partial enlarged view and the coordinate point contained in it can be updated in real time. For example, if the starting point of the sliding operation is point A and the real-time touch point is point B, the preset area in the interactive interface is updated from the preset area 301 including point A to the preset area 308 including point B, and the partial enlarged view 302 is updated to the partial enlarged view 309.
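The real-time update during a slide can be sketched as follows. This is a schematic under stated assumptions: the function names are illustrative, and the offset (dx, dy) stands in for the "preset position" relationship between the touch point and the view:

```python
# Hypothetical sketch of the update loop for the second touch event
# (slide): each real-time touch point re-centers the preset area and
# moves the display position of the partial enlarged view.

def update_on_slide(touch_points, m, n, dx=0, dy=-60):
    """For each real-time touch point, compute the new preset-area
    bounds and the display position of the partial enlarged view."""
    states = []
    for (tx, ty) in touch_points:
        bounds = (tx - m, tx + m, ty - n, ty + n)   # re-centered preset area
        display_pos = (tx + dx, ty + dy)            # view offset from the finger
        states.append((bounds, display_pos))
    return states

# Slide from point A (50, 50) to point B (70, 40):
states = update_on_slide([(50, 50), (60, 45), (70, 40)], m=10, n=10)
print(states[-1])  # ((60, 80, 30, 50), (70, -20))
```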
Based on this, in order to more intuitively and accurately display the coordinate point corresponding to the real-time touch point in the user sliding process, the coordinate point corresponding to the real-time touch point may be identified in the updated partial enlarged view. The specific implementation method is the same as the implementation method in step S240, and is not described herein again. Note that the coordinate identification 304 identifying the coordinate point in the partially enlarged view needs to be updated to the coordinate identification 305.
Further, when it is detected that the second touch event ends, the touch point at which the second touch event ends may be acquired, and the coordinate point corresponding to that touch point in the map area may be taken as the point finally selected in the map area. For example, if the touch point of the click operation is point A, the start point of the slide operation is point A, and the end point of the slide operation is point B, the coordinate point corresponding to point B may be used as the finally selected point in the map area. In this way, the coordinate point selected by the user in the map area can be adjusted through the coherent operations of clicking, sliding, and lifting the finger, giving the user an opportunity to modify the selection and enhancing interactivity. After the coordinate point in the map area is accurately determined, the system can automatically find a path or perform other functions according to the coordinate point selected by the user; in other map scenes, such as a map navigation scene, detailed map information of the selected coordinate point can also be displayed, so that the user can accurately determine the coordinate point they want to go to.
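The click-slide-lift selection described above can be sketched as a small event loop. The event names ("down", "move", "up") are illustrative, not platform APIs:

```python
# Hypothetical sketch: resolving the finally selected map point from a
# sequence of touch events (press at A, optional slide moves, lift).

def selected_point(events):
    """Return the last touch position before the lift ("up") event;
    this becomes the selected coordinate point in the map area."""
    last = None
    for kind, pos in events:
        if kind in ("down", "move"):
            last = pos
        elif kind == "up":
            return last
    return None  # touch never ended normally

events = [("down", (10, 10)), ("move", (14, 12)), ("move", (20, 15)), ("up", None)]
print(selected_point(events))  # (20, 15) -- the slide end point B wins
```

A plain click with no slide degenerates to the same rule: the "down" position is the last position before "up", so point A is selected.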
Based on the above description, if the touch points of the click operation and the slide operation disappear, a coordinate point can no longer be determined from a touch point; at this time, it can be considered that the user no longer needs to precisely check a particular coordinate point. Therefore, to reduce the display controls on the interactive interface, avoid the partial enlarged view occluding other interface information, and improve screen utilization, display of the partial enlarged view may be stopped. In this example, display of the partial enlarged view can be stopped simply by lifting the finger, making the operation simple and convenient and improving operation efficiency.
It should be added that the coordinate point in the map area can also be displayed accurately in another way, specifically as follows: after a preset area containing the coordinate point in the map is displayed in an enlarged manner to obtain a partial enlarged view, if a third touch event acting on the partial enlarged view is detected, a point selected in the map area is determined according to the third touch event. The third touch event may be, for example, a click operation or a press operation; the click operation is taken as the example here. If a click operation is detected in the obtained partial enlarged view, the touch point of the click operation can be mapped into the map area according to a preset rule, so that the position coordinates of the selected point in the map area, namely the coordinate point, can be determined from the position coordinates of the touch point of the click operation.
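One plausible form of the "preset rule" for mapping a tap inside the enlarged view back into the map area is to invert the enlargement transform. This is a sketch under assumptions; the patent does not specify the rule, and all names here are illustrative:

```python
# Hypothetical "preset rule": map a tap inside the partial enlarged view
# (third touch event) back to map-area coordinates by undoing the scale.

def view_to_map(tap, view_origin, area_origin, multiple):
    """Map a tap position inside the enlarged view to map-area coordinates."""
    vx, vy = view_origin   # top-left of the partial enlarged view on screen
    ax, ay = area_origin   # top-left of the preset area in the map
    tx, ty = tap
    return (ax + (tx - vx) / multiple, ay + (ty - vy) / multiple)

# A tap 20 px right and 10 px down inside a 2x enlarged view of the
# preset area whose top-left map coordinate is (80, 65):
print(view_to_map((120, 40), (100, 30), (80, 65), 2))  # (90.0, 70.0)
```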
Further, if a fourth touch event acting on the partial enlarged view is detected, display of the partial enlarged view is stopped. The fourth touch event may act on the partial enlarged view itself, or on a preset control at a preset position of the partial enlarged view. When it acts on the partial enlarged view, the fourth touch event may be a double click, a slide, or another operation, as long as it differs from the third touch event. The preset control may be a close control located at any position around the partial enlarged view, such as its left side, right side, or upper right corner; when the fourth touch event acts on such a preset control, it may be a click operation or any other operation. If a click operation on a close control of the partial enlarged view is detected, display of the partial enlarged view may be stopped.
The present disclosure also provides a display control device. Referring to fig. 4, the display control apparatus 400 may include:
the map providing module 401 may be configured to provide a map area on the interactive interface;
a coordinate point determining module 402, configured to detect a first touch event acting on the map area, and determine a coordinate point corresponding to a touch point in the map area according to the touch point of the first touch event;
the enlarged image obtaining module 403 may be configured to, at a preset position on the interactive interface, enlarge and display a preset area, which includes the coordinate point, in the map area to obtain a local enlarged image;
a coordinate point identification module 404 may be configured to identify the coordinate point in the enlarged partial view.
In an exemplary embodiment of the present disclosure, the apparatus 400 further includes: and the magnified image updating module is used for detecting a second touch event taking the touch point of the first touch event as a starting point and updating the partial magnified image according to the position of the real-time touch point of the second touch event.
In an exemplary embodiment, the apparatus 400 may further include: and the identification updating module can be used for identifying coordinate points corresponding to the real-time touch points in the map area in the updated local enlarged view.
In an exemplary embodiment, the apparatus 400 further comprises: the first selection module may be configured to determine a coordinate point in the map area corresponding to the end position of the second touch event as a selected point in the map area.
In an exemplary embodiment, the apparatus 400 further comprises: the first stopping module may be configured to stop displaying the partially enlarged image when the touch point is detected to disappear.
In an exemplary embodiment, the apparatus 400 further comprises: the second selection module may be configured to detect a third touch event applied to the enlarged partial view, and determine a selected point in the map area according to the third touch event.
In an exemplary embodiment, the apparatus 400 further comprises: the second stopping module may be configured to detect a fourth touch event applied to the partial enlarged image, and stop displaying the partial enlarged image.
In an exemplary embodiment, the apparatus 400 further comprises: the preset position determining module may be configured to determine a position of the touch point after the touch point deviates to the designated direction by a preset distance as the preset position corresponding to the touch point.
In an exemplary embodiment, the enlarged image obtaining module 403 includes: a magnification control module, which may be configured to magnify and display the preset area centered on the coordinate point in the map area.
It should be noted that, the details of each module in the display control device have been described in detail in the corresponding display control method, and therefore, the details are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module," or "system."
An electronic device 500 according to this embodiment of the invention is described below with reference to fig. 5. The electronic device 500 shown in fig. 5 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 5, the electronic device 500 is embodied in the form of a general purpose computing device. The components of the electronic device 500 may include, but are not limited to: the at least one processing unit 510, the at least one memory unit 520, and a bus 530 that couples various system components including the memory unit 520 and the processing unit 510.
Wherein the storage unit stores program code that is executable by the processing unit 510 to cause the processing unit 510 to perform steps according to various exemplary embodiments of the present invention as described in the above section "exemplary methods" of the present specification. For example, the processing unit 510 may perform the steps as shown in fig. 2: in step S210, a map area is provided on the interactive interface; in step S220, a first touch event acting on the map area is detected, and a coordinate point corresponding to a touch point in the map area is determined according to the touch point of the first touch event; in step S230, at a preset position on the interactive interface, a preset region including the coordinate point in the map region is displayed in an enlarged manner to obtain a local enlarged view; in step S240, the coordinate point is identified in the partial enlarged view.
The memory unit 520 may include a readable medium in the form of a volatile memory unit, such as a random access memory unit (RAM) 5201 and/or a cache memory unit 5202, and may further include a read-only memory unit (ROM) 5203.
Storage unit 520 may also include a program/utility 5204 having a set (at least one) of program modules 5205, such program modules 5205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 530 may be one or more of any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 500 may also communicate with one or more external devices 600 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 500, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 500 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 550. Also, the electronic device 500 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 560. As shown, the network adapter 560 communicates with the other modules of the electronic device 500 over the bus 530. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 500, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 6, a program product 700 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. A display control method, comprising:
providing a map area on an interactive interface, wherein the map area is a two-dimensional plane thumbnail of a game scene;
detecting a first touch event acting on the map area, and determining a coordinate point corresponding to a touch point in the map area according to the touch point of the first touch event;
magnifying and displaying a preset area which contains the coordinate point and takes the coordinate point as a center in the map area at a preset position on the interactive interface to obtain a local enlarged view of the map area; the preset position is a position associated with a touch point of a first touch event;
identifying the coordinate point in the enlarged partial view;
and if a second touch event taking the touch point of the first touch event as a starting point is detected, updating the local enlarged image and the display position of the local enlarged image according to the position of the real-time touch point of the second touch event.
2. The display control method according to claim 1, characterized in that the method further comprises:
and in the updated local enlarged view, identifying a coordinate point corresponding to the real-time touch point in the map area.
3. The display control method according to claim 1, characterized in that the method further comprises:
determining a coordinate point in the map area corresponding to the end position of the second touch event as a selected point in the map area.
4. The display control method according to any one of claims 1 to 3, characterized in that the method further comprises:
and stopping displaying the local enlarged image when the touch point disappears.
5. The display control method according to claim 1, wherein after the preset region including the coordinate point in the map is displayed enlarged to obtain a partially enlarged view, the method further comprises:
and detecting a third touch event acting on the partially enlarged image, and determining a selected point in the map area according to the third touch event.
6. The display control method according to claim 5, characterized in that the method further comprises:
and stopping displaying the local enlarged image when a fourth touch event acting on the local enlarged image is detected.
7. The display control method according to claim 1, characterized in that the method further comprises:
and determining the position of the touch point after the touch point deviates a preset distance to the designated direction as the preset position corresponding to the touch point.
8. A display control apparatus, characterized by comprising:
the map providing module is used for providing a map area on the interactive interface, and the map area is a two-dimensional plane thumbnail of a game scene;
the coordinate point determination module is used for detecting a first touch event acting on the map area and determining a coordinate point corresponding to the touch point in the map area according to the touch point of the first touch event;
the enlarged image acquisition module is used for magnifying and displaying a preset area which contains the coordinate point and takes the coordinate point as a center in the map area at a preset position on the interactive interface so as to obtain a partial enlarged image of the map area; the preset position is a position associated with a touch point of a first touch event;
a coordinate point identification module for identifying the coordinate point in the partial enlarged image;
and the magnified image updating module is used for updating the local magnified image and the display position of the local magnified image according to the position of the real-time touch point of the second touch event when the second touch event with the touch point of the first touch event as a starting point is detected.
9. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the display control method of any one of claims 1-7 via execution of the executable instructions.
10. A computer-readable storage medium on which a computer program is stored, the computer program, when being executed by a processor, implementing the display control method according to any one of claims 1 to 7.
CN201810425435.9A 2018-05-07 2018-05-07 Display control method and device, electronic equipment and storage medium Active CN108553894B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810425435.9A CN108553894B (en) 2018-05-07 2018-05-07 Display control method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810425435.9A CN108553894B (en) 2018-05-07 2018-05-07 Display control method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN108553894A CN108553894A (en) 2018-09-21
CN108553894B true CN108553894B (en) 2022-02-18

Family

ID=63538070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810425435.9A Active CN108553894B (en) 2018-05-07 2018-05-07 Display control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108553894B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109865286B (en) * 2019-02-20 2023-02-28 网易(杭州)网络有限公司 Information processing method and device in game and storage medium
CN111729298A (en) * 2020-06-22 2020-10-02 网易(杭州)网络有限公司 Map control method and device, electronic equipment and storage medium
CN111744197B (en) * 2020-08-07 2022-03-15 腾讯科技(深圳)有限公司 Data processing method, device and equipment and readable storage medium
CN112057848A (en) * 2020-09-10 2020-12-11 网易(杭州)网络有限公司 Information processing method, device, equipment and storage medium in game
US11554323B2 (en) * 2020-09-11 2023-01-17 Riot Games, Inc. System and method for precise positioning with touchscreen gestures
CN112927141A (en) * 2021-03-25 2021-06-08 追创科技(苏州)有限公司 Setting method and device of forbidden region, storage medium and electronic device
CN113082702A (en) * 2021-04-15 2021-07-09 网易(杭州)网络有限公司 Game display control method and electronic equipment
CN116841448A (en) * 2022-03-25 2023-10-03 追觅创新科技(苏州)有限公司 Map display method and device, storage medium and electronic device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982039A (en) * 2011-09-06 2013-03-20 上海无戒空间信息技术有限公司 Edit method of map and edit device thereof
CN104252529A (en) * 2014-09-04 2014-12-31 百度在线网络技术(北京)有限公司 Method and device for loading map annotations
CN104731883B (en) * 2015-03-11 2017-11-17 北京农业信息技术研究中心 Method for displaying network map and system
CN106289298A (en) * 2015-05-21 2017-01-04 比亚迪股份有限公司 The display packing of point of interest and device, onboard navigation system
CN104990560A (en) * 2015-07-31 2015-10-21 小米科技有限责任公司 Navigation route generation method and device
CN107715454B (en) * 2017-09-01 2018-12-21 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
How to use the 自由之战 map? Introduction to the 自由之战 map; 爱吃鱼的善欧巴; 《http://u.360.cn/gl/article/99265/》; 20150609; pages 1-3 of the text, figures 1-5 *
虎牙亦轩 - tutorial on vision positions with the 橘子辅助 tool; 虎牙亦轩; 《https://www.bilibili.com/video/BV14W411e7LZ?from=search&seid=9016870648799553496》; 20180126; video at 03:12-03:18 *

Also Published As

Publication number Publication date
CN108553894A (en) 2018-09-21

Similar Documents

Publication Publication Date Title
CN108553894B (en) Display control method and device, electronic equipment and storage medium
CN107913520B (en) Information processing method, information processing device, electronic equipment and storage medium
CN108287657B (en) Skill applying method and device, storage medium and electronic equipment
US9098942B2 (en) Legend indicator for selecting an active graph series
CN107748641B (en) Numerical value adjustment control method and device, electronic equipment and storage medium
CN110090444B (en) Game behavior record creating method and device, storage medium and electronic equipment
CN110075519B (en) Information processing method and device in virtual reality, storage medium and electronic equipment
US9823835B2 (en) Controlling display object on display screen
CN111324252B (en) Display control method and device in live broadcast platform, storage medium and electronic equipment
CN107632769B (en) Map display method and device, electronic equipment and storage medium
CN108228065B (en) Method and device for detecting UI control information and electronic equipment
CN109542323B (en) Interaction control method and device based on virtual scene, storage medium and electronic equipment
US20150350360A1 (en) Feedback layer for native content display in virtual desktop infrastructure
CN110807161A (en) Page framework rendering method, device, equipment and medium
US20160274786A1 (en) Mobile gesture reporting and replay with unresponsive gestures identification and analysis
CN112579187A (en) Optimization method and device for cold start of application program
CN110399443B (en) Map editing method and device, mobile platform and storage medium
CN110427139B (en) Text processing method and device, computer storage medium and electronic equipment
CN109460175B (en) Method and device for moving application icon, electronic terminal and medium
CN104025001A (en) Resize handle activation for resizable portions of a user interface
CN112631474B (en) Method and device for moving elements in page, medium and equipment
CN111836093A (en) Video playing method, device, equipment and medium
CN108845924B (en) Control response area display control method, electronic device, and storage medium
WO2016081280A1 (en) Method and system for mouse pointer to automatically follow cursor
CN110737417B (en) Demonstration equipment and display control method and device of marking line of demonstration equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant