CN113467682B - Method, device, terminal and storage medium for controlling movement of map covering - Google Patents

Info

Publication number
CN113467682B
CN113467682B (application CN202110777898.3A)
Authority
CN
China
Prior art keywords
map
screen
covering
movement
longitude
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110777898.3A
Other languages
Chinese (zh)
Other versions
CN113467682A (en)
Inventor
林杉
吕小豹
郑文兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd filed Critical Beijing Sankuai Online Technology Co Ltd
Priority to CN202110777898.3A
Publication of CN113467682A
Application granted
Publication of CN113467682B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00Measuring or testing not otherwise provided for
    • G01D21/02Measuring two or more variables by means not covered by a single other subclass


Abstract

The disclosure provides a method, a device, a terminal and a storage medium for controlling movement of a map overlay, and belongs to the field of internet technology. The method comprises the following steps: in the process of controlling the map component layer to move along with a specified trigger event, acquiring a first offset of the finger movement in the specified trigger event; controlling the movement of the map overlay in the container layer according to the type of the specified trigger event and the first offset; and correcting the movement of the map overlay according to a second offset of the movement of the map component layer, so that the map overlay moves synchronously with the map component layer. The method separates the drawing of the object information from the drawing of the map overlay. While the map component layer is moved based on the specified trigger event, the movement of the map overlay is first controlled according to the offset of the finger movement; the error between the finger and the map component layer is then corrected according to the offset of the map component layer's own movement, achieving synchronous movement of the map overlay and the map component layer.

Description

Method, device, terminal and storage medium for controlling movement of map covering
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a method, an apparatus, a terminal, and a storage medium for controlling movement of a map overlay.
Background
When providing navigation services based on map applications, it is generally necessary to draw map overlays such as start point icons, end point icons, user current location icons, text, bubble information, etc. on a navigation route onto a map application interface.
In practical applications, when a user wants to view other objects or areas, the user can drag or zoom the map application interface. Because the interface displays not only the object information but also the map overlays, controlling the overlays to move synchronously when the interface is operated becomes key to improving the user experience of the map application.
Disclosure of Invention
The embodiment of the disclosure provides a method, a device, a terminal and a storage medium for controlling movement of a map covering. The technical scheme is as follows:
in a first aspect, there is provided a method of controlling movement of a map overlay, the method comprising:
displaying a map application interface, wherein the map application interface comprises a map component layer and a container layer, the map component layer is displayed with object information, and the container layer is displayed with a map cover;
in response to a specified trigger event on the map application interface, acquiring a first offset of the finger movement in the specified trigger event in the process of controlling the map component layer to move along with the specified trigger event;
controlling the map overlay in the container layer to move according to the type of the specified trigger event and the first offset;
acquiring a second offset of the movement of the map component layer every other preset time period;
and correcting the movement of the map covering according to the second offset so that the map covering moves synchronously with the map component layer.
In another embodiment of the present disclosure, before displaying the map application interface, the method further includes:
acquiring longitude and latitude of the map covering;
converting the longitude of the map overlay to a screen abscissa and converting the latitude of the map overlay to a screen ordinate;
drawing the map covering on the container layer based on the screen coordinate corresponding to the map covering.
In another embodiment of the present disclosure, the converting the longitude of the map overlay to a screen abscissa comprises:
Acquiring the maximum longitude, the minimum longitude and the screen width on a map application interface displayed on a current screen;
applying the following formula to convert the longitude of the map overlay to a screen abscissa based on the maximum longitude, the minimum longitude, the screen width, and the longitude of the map overlay:

x = (λ − λ_min) / (λ_max − λ_min) × w

wherein x represents the screen abscissa corresponding to the map overlay, λ represents the longitude of the map overlay, λ_max represents the maximum longitude, λ_min represents the minimum longitude, and w represents the screen width.
In another embodiment of the present disclosure, the converting the latitude of the map overlay to the screen ordinate comprises:
acquiring the maximum latitude, the minimum latitude and the screen height on the map application interface displayed on the current screen;
acquiring a maximum screen ordinate corresponding to the maximum latitude, a minimum screen ordinate corresponding to the minimum latitude and a to-be-processed screen ordinate corresponding to the map covering;
according to the maximum screen ordinate, the minimum screen ordinate, the screen ordinate to be processed and the screen height, applying the following formula to convert the latitude of the map overlay into the screen ordinate:

y = (y_max − y_m) / (y_max − y_min) × h

wherein y represents the screen ordinate corresponding to the map overlay, y_m represents the screen ordinate to be processed, y_max represents the maximum screen ordinate, y_min represents the minimum screen ordinate, and h represents the screen height.
In another embodiment of the present disclosure, the drawing the map overlay onto the container layer based on the screen coordinate corresponding to the map overlay includes:
obtaining style information of a locally stored map overlay;
determining a target display position of the map covering on the container layer based on the screen coordinate corresponding to the map covering;
and drawing the map covering to a target display position on the container layer according to the style information.
In another embodiment of the present disclosure, the specified trigger event is a single-finger drag event, and the first offset includes a movement direction and a first distance of the finger movement;
the controlling the map overlay in the container layer to move according to the type of the specified trigger event and the first offset includes:
controlling the map overlay to move the first distance in the movement direction.
In another embodiment of the present disclosure, the specified trigger event comprises a two-finger zoom-out event or a two-finger zoom-in event, and the first offset comprises a second distance of the finger movement;
the controlling the map overlay in the container layer to move according to the type of the specified trigger event and the first offset includes:
acquiring the initial distance of the two fingers;
calculating the scaling of the two fingers according to the initial distance and the second distance;
when the specified trigger event is a two-finger zoom-out event, taking the center point of the current screen as the zoom center, and controlling the map overlay to move toward the center point according to the zoom scale;
and when the specified trigger event is a two-finger zoom-in event, taking the center point as the zoom center, and controlling the map overlay to move away from the center point according to the zoom scale.
In a second aspect, there is provided an apparatus for controlling movement of a map overlay, the apparatus comprising:
the map application interface comprises a map component layer and a container layer, wherein the map component layer displays object information, and the container layer displays a map covering;
The acquisition module is used for responding to a specified trigger event on the map application interface, and acquiring a first offset of finger movement in the specified trigger event in the process of controlling the map component layer to move along with the specified trigger event;
the control module is used for controlling the map covering in the container layer to move according to the type of the specified trigger event and the first offset;
the obtaining module is further configured to obtain a second offset of the movement of the map component layer every preset time period;
and the correction module is used for correcting the movement of the map covering according to the second offset so as to enable the map covering to move synchronously with the map component layer.
In another embodiment of the present disclosure, the apparatus further comprises:
the acquisition module is also used for acquiring the longitude and latitude of the map covering;
the conversion module is used for converting the longitude of the map covering into a screen horizontal coordinate and converting the latitude of the map covering into a screen vertical coordinate;
and the drawing module is used for drawing the map covering on the container layer based on the screen coordinate corresponding to the map covering.
In another embodiment of the present disclosure, the conversion module is configured to obtain a maximum longitude, a minimum longitude, and a screen width on a map application interface displayed on a current screen; applying the following formula to convert the longitude of the map overlay to a screen abscissa based on the maximum longitude, the minimum longitude, the screen width, and the longitude of the map overlay:
x = (λ − λ_min) / (λ_max − λ_min) × w

wherein x represents the screen abscissa corresponding to the map overlay, λ represents the longitude of the map overlay, λ_max represents the maximum longitude, λ_min represents the minimum longitude, and w represents the screen width.
In another embodiment of the present disclosure, the conversion module is configured to obtain a maximum latitude, a minimum latitude, and a screen height on the map application interface displayed on a current screen; acquiring a maximum screen ordinate corresponding to the maximum latitude, a minimum screen ordinate corresponding to the minimum latitude and a to-be-processed screen ordinate corresponding to the map covering; according to the maximum screen ordinate, the minimum screen ordinate, the screen ordinate to be processed and the screen height, applying the following formula to convert the latitude of the map overlay into the screen ordinate:
y = (y_max − y_m) / (y_max − y_min) × h

wherein y represents the screen ordinate corresponding to the map overlay, y_m represents the screen ordinate to be processed, y_max represents the maximum screen ordinate, y_min represents the minimum screen ordinate, and h represents the screen height.
In another embodiment of the present disclosure, the rendering module is configured to obtain style information of a locally stored map overlay; determining a target display position of the map covering on the container layer based on the screen coordinate corresponding to the map covering; and drawing the map covering to a target display position on the container layer according to the style information.
In another embodiment of the present disclosure, the specified trigger event is a single-finger drag event, and the first offset includes a movement direction and a first distance of the finger movement;
the control module is used for controlling the map overlay to move the first distance in the movement direction.
In another embodiment of the present disclosure, the specified trigger event comprises a two-finger zoom-out event or a two-finger zoom-in event, and the first offset comprises a second distance of the finger movement;
the control module is used for acquiring the initial distance of the double fingers; calculating the scaling of the two fingers according to the initial distance and the second distance; when the specified trigger event is a double-finger zoom-out event, taking the central point of the current screen as a zooming center, and controlling the map covering object to zoom out to the central point according to the zooming proportion; and when the specified trigger event is a double-finger amplification event, the central point is used as a zooming center, and the map covering is controlled to deviate from the central point to be amplified according to the zooming proportion.
In a third aspect, a terminal is provided, which comprises a processor and a memory, wherein at least one program code is stored in the memory, and the at least one program code is loaded and executed by the processor to implement the method for controlling movement of a map cover according to the first aspect.
In a fourth aspect, there is provided a computer readable storage medium having stored therein at least one program code, which is loaded and executed by a processor, to implement the method of controlling movement of a map overlay of the first aspect.
The technical scheme provided by the embodiment of the disclosure has the following beneficial effects:
in the process of controlling the map component layer to move based on the specified trigger event, the finger and the map component layer do not move synchronously but with a certain deviation. Therefore, while the offset of the map component layer's movement cannot yet be acquired, the movement of the map overlay is controlled according to the offset of the finger movement, ensuring that the map overlay keeps moving; and once the offset of the map component layer's movement can be acquired, the error between the finger and the map component layer is corrected according to that offset.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
FIG. 1 is a flow chart of a method of controlling movement of a map overlay provided by an embodiment of the present disclosure;
FIG. 2 is a flow chart of another method of controlling movement of a map overlay provided by embodiments of the present disclosure;
FIG. 3 is a schematic diagram of a map application interface provided by an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of another mapping application interface provided by embodiments of the present disclosure;
FIG. 5 is a flow chart of a method of controlling movement of a map overlay provided by an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an apparatus for controlling movement of a map overlay according to an embodiment of the present disclosure;
fig. 7 shows a block diagram of a terminal according to an exemplary embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the present disclosure more apparent, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.
It is to be understood that the terms "a plurality," "each," and "any" as used in the embodiments of the present disclosure refer, respectively, to two or more, to every one of the corresponding plurality, and to any one of the corresponding plurality. For example, if a plurality of words includes 10 words, "each word" refers to every one of the 10 words, and "any word" refers to any one of the 10 words.
The embodiment of the present disclosure provides a method for controlling movement of a map overlay, and referring to fig. 1, a flow of the method provided by the embodiment of the present disclosure includes:
101. and displaying the map application interface.
The map application interface comprises a map component layer and a container layer, wherein the map component layer is provided with object information, and the container layer is provided with a map covering.
102. In response to a specified trigger event on the map application interface, a first offset of finger movement in the specified trigger event is obtained in the process of controlling the map component layer to move along with the specified trigger event.
103. And controlling the movement of the map covering in the container layer according to the type of the specified trigger event and the first offset.
104. And acquiring a second offset of the movement of the map component layer every preset time period.
105. And correcting the movement of the map covering according to the second offset so that the map covering moves synchronously with the map component layer.
According to the method provided by the embodiment of the disclosure, in the process of controlling the map component layer to move based on the specified trigger event, the finger and the map component layer do not move synchronously but with a certain deviation. Therefore, while the offset of the map component layer's movement cannot yet be acquired, the movement of the map overlay is controlled according to the offset of the finger movement, ensuring that the map overlay keeps moving; and once the offset of the map component layer's movement can be acquired, the error between the finger and the map component layer is corrected according to that offset.
In another embodiment of the present disclosure, before displaying the map application interface, the method further includes:
acquiring longitude and latitude of a map covering;
converting the longitude of the map overlay into a screen abscissa and converting the latitude of the map overlay into a screen ordinate;
and drawing the map covering on the container layer based on the corresponding screen coordinate of the map covering.
In another embodiment of the present disclosure, converting the longitude of the map overlay to a screen abscissa comprises:
acquiring the maximum longitude, the minimum longitude and the screen width on a map application interface displayed on a current screen;
according to the maximum longitude, the minimum longitude, the screen width and the longitude of the map covering, applying the following formula to convert the longitude of the map covering into the screen abscissa:
x = (λ − λ_min) / (λ_max − λ_min) × w

wherein x represents the screen abscissa corresponding to the map overlay, λ represents the longitude of the map overlay, λ_max represents the maximum longitude, λ_min represents the minimum longitude, and w represents the screen width.
In another embodiment of the present disclosure, converting the latitude of the map overlay to the screen ordinate comprises:
acquiring the maximum latitude, the minimum latitude and the screen height on a map application interface displayed by a current screen;
acquiring a maximum screen ordinate corresponding to the maximum latitude, a minimum screen ordinate corresponding to the minimum latitude and a to-be-processed screen ordinate corresponding to the map covering;
according to the maximum screen ordinate, the minimum screen ordinate, the screen ordinate to be processed and the screen height, applying the following formula to convert the latitude of the map overlay into the screen ordinate:
y = (y_max − y_m) / (y_max − y_min) × h

wherein y represents the screen ordinate corresponding to the map overlay, y_m represents the screen ordinate to be processed, y_max represents the maximum screen ordinate, y_min represents the minimum screen ordinate, and h represents the screen height.
In another embodiment of the present disclosure, drawing a map overlay onto a container layer based on screen coordinates corresponding to the map overlay includes:
obtaining style information of a locally stored map overlay;
determining a target display position of the map covering on the container layer based on the screen coordinates corresponding to the map covering;
and drawing the map covering on the target display position on the container layer according to the style information.
In another embodiment of the present disclosure, the specified trigger event is a single-finger drag event, and the first offset comprises a movement direction and a first distance of the finger movement;
controlling the movement of the map overlay in the container layer according to the type of the specified trigger event and the first offset, comprising:
controlling the map overlay to move the first distance in the movement direction.
In another embodiment of the present disclosure, the specified trigger event comprises a two-finger zoom-out event or a two-finger zoom-in event, and the first offset comprises a second distance of the finger movement;
Controlling the movement of the map overlay in the container layer according to the type of the specified trigger event and the first offset, comprising:
acquiring the initial distance of the two fingers;
calculating the scaling of the two fingers according to the initial distance and the second distance;
when the specified trigger event is a two-finger zoom-out event, taking the center point of the current screen as the zoom center, and controlling the map overlay to move toward the center point according to the zoom scale;
and when the specified trigger event is a two-finger zoom-in event, taking the center point as the zoom center, and controlling the map overlay to move away from the center point according to the zoom scale.
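The two-finger case above can be sketched with a simple proportional rule. The patent says only that the overlay moves "according to the zoom scale", so the linear mapping and the function names below are illustrative assumptions.

```typescript
// Zoom scale from the initial and current distances between the two
// fingers: greater than 1 means zoom in, less than 1 means zoom out.
function pinchScale(initialDistance: number, currentDistance: number): number {
  return currentDistance / initialDistance;
}

// Move an overlay point relative to the zoom center: toward the center
// when scale < 1 (two-finger zoom out), away from it when scale > 1
// (two-finger zoom in).
function scaleAboutCenter(
  point: { x: number; y: number },
  center: { x: number; y: number },
  scale: number
): { x: number; y: number } {
  return {
    x: center.x + (point.x - center.x) * scale,
    y: center.y + (point.y - center.y) * scale,
  };
}
```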
All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.
The embodiment of the present disclosure provides a method for controlling movement of a map overlay, and referring to fig. 2, a flow of the method provided by the embodiment of the present disclosure includes:
201. the terminal obtains the longitude and latitude of the map covering.
The map overlays comprise the start point icon, end point icon, user current position icon, text, bubble information and the like on the navigation route. After the user inputs a start position and an end position in the start and end input boxes provided by the map application interface, the map application client reads them and, combined with the map database, plans a navigation route from the start position to the end position for the user. For any map overlay on the navigation route, the terminal acquires the longitude and latitude of the overlay based on the geographic coordinate system, a coordinate system consisting of longitude and latitude that can represent any position on the earth.
202. The terminal converts the longitude of the map overlay to the screen abscissa and converts the latitude of the map overlay to the screen ordinate.
In order to facilitate drawing of the map overlay on the navigation route on the map application interface, the terminal needs to convert the map overlay from a longitude and latitude representation form to a screen coordinate representation form. The terminal will typically convert the longitude of the map overlay to the screen abscissa and the latitude of the map overlay to the screen ordinate.
When the terminal converts the longitude of the map covering into the screen abscissa, the following method can be adopted:
20211. and the terminal acquires the maximum longitude, the minimum longitude and the screen width on the map application interface displayed on the current screen.
Based on the map application interface displayed on the current screen, the terminal acquires the longitude of two points positioned at the upper left corner and the lower right corner of the map application interface, so that the maximum longitude and the minimum longitude of the map application interface displayed on the current screen are obtained. For example, in the map application interface shown in fig. 3, when the terminal acquires the maximum longitude and the minimum longitude of the map application interface, the terminal may acquire the longitude of two points, namely, the upper left corner and the lower right corner of the map application interface.
The terminal acquires screen display parameters including screen width, screen height and the like, and then acquires the screen width from the screen display parameters.
20212. The terminal converts the longitude of the map covering into a screen abscissa by applying the following formula according to the maximum longitude, the minimum longitude, the screen width and the longitude of the map covering:
x = (λ − λ_min) / (λ_max − λ_min) × w

where x represents the screen abscissa corresponding to the map overlay, λ represents the longitude of the map overlay, λ_max denotes the maximum longitude, λ_min represents the minimum longitude, and w represents the screen width.
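The conversion of step 20212 can be expressed directly in code; the function name and argument order are illustrative.

```typescript
// Linear mapping of longitude to a screen abscissa, per step 20212:
// x = (λ − λ_min) / (λ_max − λ_min) × w
function lonToScreenX(
  lon: number,        // λ: longitude of the map overlay
  lonMin: number,     // λ_min: minimum longitude on the visible interface
  lonMax: number,     // λ_max: maximum longitude on the visible interface
  screenWidth: number // w: screen width in pixels
): number {
  return ((lon - lonMin) / (lonMax - lonMin)) * screenWidth;
}
```

For a 400 px wide screen showing longitudes 116.3 to 116.5, an overlay at longitude 116.4 lands at roughly the horizontal midpoint.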
When the terminal converts the latitude of the map covering into the screen ordinate, the following method can be adopted:
20221. the terminal acquires the maximum latitude, the minimum latitude and the screen height on a map application interface displayed by a current screen.
Based on the map application interface displayed by the current screen, the terminal acquires the latitudes of two points, namely the upper left corner and the lower right corner, on the screen, so that the maximum latitude and the minimum latitude of the map application interface displayed by the current screen are obtained. For example, in the map application interface shown in fig. 3, when the terminal acquires the maximum latitude and the minimum latitude of the map application interface, the terminal may acquire the latitudes of two points, namely the upper left corner and the lower right corner of the map application interface.
The terminal acquires screen display parameters including screen width, screen height and the like, and further acquires the screen height from the screen display parameters.
20222. The terminal obtains a maximum screen ordinate corresponding to the maximum latitude, a minimum screen ordinate corresponding to the minimum latitude and a to-be-processed screen ordinate corresponding to the map covering.
Based on the obtained maximum latitude, the terminal applies the following formula to convert the maximum latitude into a corresponding maximum screen ordinate:

y_max = ln(tan(π/4 + φ_max/2))

wherein y_max represents the maximum screen ordinate, and φ_max represents the maximum latitude.
Based on the obtained minimum latitude, the terminal applies the following formula to convert the minimum latitude into a corresponding minimum screen ordinate:

y_min = ln(tan(π/4 + φ_min/2))

wherein y_min represents the minimum screen ordinate, and φ_min represents the minimum latitude.
Based on the obtained latitude of the map overlay, the terminal applies the following formula to convert the latitude of the map overlay into the screen ordinate to be processed:

y_m = ln(tan(π/4 + φ/2))

wherein y_m represents the screen ordinate to be processed, and φ represents the latitude of the map overlay.
20223. The terminal converts the latitude of the map cover into the screen ordinate by applying the following formula according to the maximum screen ordinate, the minimum screen ordinate, the screen ordinate to be processed and the screen height:
y = (y_max − y_m) / (y_max − y_min) × h

wherein y represents the screen ordinate corresponding to the map overlay, y_m represents the screen ordinate to be processed, y_max denotes the maximum screen ordinate, y_min represents the minimum screen ordinate, and h represents the screen height.
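Steps 20221-20223 can be combined in code. Note the intermediate "screen ordinate to be processed" is assumed here to be the standard Mercator projection of latitude, since the patent renders those intermediate formulas as images; the function names are also illustrative.

```typescript
// Mercator projection of a latitude (in radians) to an intermediate
// ordinate; this choice of projection is an assumption.
function mercatorY(latRad: number): number {
  return Math.log(Math.tan(Math.PI / 4 + latRad / 2));
}

// Step 20223: interpolate between the projected maximum and minimum
// ordinates so the maximum latitude maps to the top of the screen (y = 0)
// and the minimum latitude to the bottom (y = h).
function latToScreenY(
  latRad: number,      // latitude of the map overlay
  latMinRad: number,   // minimum latitude on the visible interface
  latMaxRad: number,   // maximum latitude on the visible interface
  screenHeight: number // h: screen height in pixels
): number {
  const yMax = mercatorY(latMaxRad); // maximum screen ordinate
  const yMin = mercatorY(latMinRad); // minimum screen ordinate
  const yM = mercatorY(latRad);      // screen ordinate to be processed
  return ((yMax - yM) / (yMax - yMin)) * screenHeight;
}
```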
203. And drawing the map covering on the container layer by the terminal based on the screen coordinate corresponding to the map covering.
After the longitude and latitude of the map overlay are converted into screen coordinates in step 202, the terminal draws the overlay on the container layer based on the screen coordinates corresponding to the map overlay. The container layer sits above the map component layer in the map application interface: the container layer is used to display map overlays, while the map component layer is used to display the object information of the map, where objects include buildings, streets, parks, lakes, and the like. So that the objects in the map component remain visible, the container layer is typically transparent.
Based on the screen coordinates corresponding to the map covering, when the terminal draws the map covering on the container layer, the following method can be adopted:
2031. the terminal acquires style information of a locally stored map overlay.
In the embodiment of the present disclosure, the terminal may store the style information of the map overlay locally, where the style information includes HTML (HyperText Markup Language), CSS (Cascading Style Sheets), and the like. Because the style information is stored locally, the terminal obtains it directly from local storage rather than through an Application Programming Interface (API) provided by the Software Development Kit (SDK) of the map application. Moreover, the APIs provided by the SDKs of different map applications differ, so when switching map applications, developers must first become familiar with each SDK's documentation, which raises development cost; obtaining the style information locally avoids this cost.
2032. And based on the corresponding screen coordinates of the map covering, the terminal determines the target display position of the map covering on the container layer.
And the terminal determines the position indicated by the screen coordinate of the map covering as the target display position of the map covering on the container layer.
2033. And drawing the map covering on the target display position on the container layer by the terminal according to the style information.
According to the style information, the terminal draws the map overlay at the target display position on the container layer, thereby completing the drawing of the map overlay; the map overlay and the object information are located on different layers.
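Since the container layer is an ordinary transparent layer above the map component, drawing an overlay at its target position amounts to absolutely positioning a styled element at the computed screen coordinates. A minimal sketch of that idea follows; the helper name and style fields are hypothetical and not an API defined by the patent:

```python
def overlay_style(x: float, y: float, extra_css: dict) -> dict:
    """Compose the inline style that places an overlay element on the
    transparent container layer at screen coordinates (x, y)."""
    style = {
        "position": "absolute",   # positioned relative to the container layer
        "left": f"{x}px",
        "top": f"{y}px",
    }
    style.update(extra_css)       # locally stored style info (step 2031)
    return style
```

Because the placement is plain CSS positioning, any locally stored HTML/CSS style can be merged in, which is what frees the overlay's appearance from the map SDK's drawing API.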
204. And the terminal displays a map application interface.
The map application interface comprises a map component layer and a container layer, wherein the map component layer is provided with object information, and the container layer is provided with a map covering. The display effect of the map application interface can be seen in fig. 4.
205. In response to a specified trigger event on the map application interface, the terminal acquires a first offset of finger movement in the specified trigger event in the process of controlling the map component layer to move along with the specified trigger event.
The designated trigger event refers to an event capable of triggering the map application interface to move, and includes any one of a single-finger drag event, a two-finger zoom-out event, a two-finger zoom-in event, and the like. When a designated trigger event is detected on the map application interface, the terminal controls the map component layer to move along with the event and, during the movement, obtains the first offset of the finger's movement in the event. The terminal may acquire this first offset through the touch sensor of the screen on which the map application interface is displayed.
206. And the terminal controls the map covering in the container layer to move according to the type of the specified trigger event and the first offset.
According to the type of the designated trigger event, the terminal controls the movement of the map overlay in the container layer in cases that include, but are not limited to, the following:

In the first case, the designated trigger event is a single-finger drag event.

When the designated trigger event is a single-finger drag event, a single finger drags the map application interface along a designated direction. During the drag, the terminal obtains the first offset, which includes the designated direction and the first distance moved by the finger, and then controls the map overlay in the container layer to move the first distance along the designated direction.
In the second case, the specified triggering event is a two-finger zoom-out event.
When the designated trigger event is a two-finger zoom-out event, the terminal obtains the initial distance between the two fingers. While the two fingers perform the zoom-out operation on the map application interface, the terminal obtains the second distance moved by the fingers, subtracts that second distance from the initial distance to obtain the current distance between the two fingers, and calculates the ratio of the current distance to the initial distance to obtain the two-finger zoom scale. Taking the center point of the current screen as the zoom center, the terminal then controls the map overlay to shrink toward the center point according to the zoom scale.
In the third case, the designated trigger event is a two-finger zoom-in event.

When the designated trigger event is a two-finger zoom-in event, the terminal obtains the initial distance between the two fingers. While the two fingers perform the zoom-in operation on the map application interface, the terminal obtains the second distance moved by the fingers, adds that second distance to the initial distance to obtain the current distance between the two fingers, and calculates the ratio of the current distance to the initial distance to obtain the two-finger zoom scale. Taking the center point of the current screen as the zoom center, the terminal then controls the map overlay to expand away from the center point according to the zoom scale.
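The three cases of step 206 can be summarized as one dispatch that turns the first offset into either a translation or a center-anchored scale. A simplified sketch, with names invented for illustration:

```python
def handle_trigger_event(event_type, initial_dist=None, second_dist=0.0,
                         direction=(0.0, 0.0), first_dist=0.0):
    """Return ('move', (dx, dy)) for a drag, or ('scale', ratio) for a
    two-finger gesture. A ratio below 1 shrinks the overlay toward the
    screen center; a ratio above 1 expands it away from the center."""
    if event_type == "drag":
        dx, dy = direction                       # unit direction of the drag
        return ("move", (dx * first_dist, dy * first_dist))
    if event_type == "zoom_out":
        current = initial_dist - second_dist     # fingers moved closer together
        return ("scale", current / initial_dist)
    if event_type == "zoom_in":
        current = initial_dist + second_dist     # fingers moved farther apart
        return ("scale", current / initial_dist)
    raise ValueError(f"unknown trigger event: {event_type}")
```

For example, fingers starting 200 px apart that close by 50 px yield a zoom scale of 0.75, and the overlay is scaled toward the screen center by that ratio.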
207. And the terminal acquires a second offset of the movement of the map component layer every a preset time period.
The preset time period may be 20 milliseconds, 30 milliseconds, or the like. The terminal performs map view polling once every preset time period; each poll obtains the maximum and minimum coordinates of the current screen (the upper-left and lower-right corners), and the second offset of the map component layer's movement between two consecutive polls is calculated by comparing these coordinates with the corresponding coordinates obtained in the previous poll.
208. And the terminal corrects the movement of the map covering according to the second offset so that the map covering moves synchronously with the map component layer.
For a designated trigger event on the map application interface, the map component layer does not begin to move as soon as the finger does: only when the finger has moved a preset distance (for example, 5 pixels or 6 pixels) is the event judged not to be a click event, and only then does the map component layer start to move. The movement of the finger and the movement of the map component layer are therefore not synchronized. Correspondingly, in the embodiment of the present disclosure, before the second offset of the map component layer's movement has been obtained, the map overlay in the container layer is moved according to the first offset of the finger's movement, so the movement of the map component layer and the movement of the container layer are not synchronized either. To make the two layers move synchronously, the terminal corrects the movement of the map overlay according to the second offset of the map component layer's movement, so that the map component layer and the container layer move the same distance, achieving the effect of synchronous movement.
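The interplay of steps 205 to 208 — move the overlay optimistically by the finger offset, then snap it to the map component layer's actual offset on each poll — can be sketched as a simplified one-dimensional model. The class and method names are the author's own, not prescribed by the patent:

```python
class OverlayTracker:
    """Keeps a map overlay in step with the map component layer."""

    def __init__(self):
        self.overlay_offset = 0.0     # distance the overlay has moved so far
        self.component_offset = 0.0   # distance the component layer has moved

    def on_finger_move(self, first_offset: float):
        # Step 206: before any poll result is available, follow the finger.
        self.overlay_offset += first_offset

    def on_poll(self, second_offset: float):
        # Steps 207-208: the poll reports the component layer's true movement;
        # correct the overlay so that both layers end up at the same offset.
        self.component_offset += second_offset
        self.overlay_offset = self.component_offset
```

Between polls the overlay may drift from the component layer (because the layer ignores the first few pixels of the gesture), but every poll pulls the two offsets back to equality, which is the synchronization the method aims for.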
Fig. 5 illustrates the process of controlling the movement of a map overlay. Referring to fig. 5, when a gesture event is detected, the map component layer moves along with the gesture event; during this movement, the terminal acquires the offset of the finger's movement and controls the movement of the map overlay in the transparent container layer. The terminal starts map view polling, obtains the offset of the map component layer's movement through the polling, and then corrects the movement of the map overlay in the container layer according to that offset, so that the map component layer and the container layer move synchronously.
According to the method provided by the embodiment of the present disclosure, while the map component layer is being moved in response to the designated trigger event, the finger and the map component layer do not move synchronously but with a certain deviation. When the offset of the map component layer's movement cannot yet be obtained, the movement of the map overlay is controlled according to the offset of the finger's movement, which guarantees that the map overlay moves; when the offset of the map component layer's movement can be obtained, the error between the finger and the map component layer is corrected according to that offset.
In addition, according to the embodiment of the disclosure, the transparent container layer is added on the map component layer, and the map covering is drawn on the container layer, so that the drawing of the object information and the drawing of the map covering are separated, the drawing of the map covering does not depend on an API provided by a map application, the map style with a complex style can be drawn, and the style of the map covering is enriched.
Referring to fig. 6, an embodiment of the present disclosure provides an apparatus for controlling movement of a map overlay, the apparatus including:
The display module 601 is configured to display a map application interface, where the map application interface includes a map component layer and a container layer, the map component layer displays object information, and the container layer displays a map cover;
the obtaining module 602 is configured to, in response to a specified trigger event on the map application interface, obtain a first offset of finger movement in the specified trigger event in a process of controlling the map component layer to move along with the specified trigger event;
the control module 603 is configured to control the map overlay in the container layer to move according to the type of the specified trigger event and the first offset;
the obtaining module 602 is further configured to obtain a second offset of the movement of the map component layer every preset time period;
and the correcting module 604 is configured to correct the movement of the map cover according to the second offset, so that the map cover moves synchronously with the map component layer.
In another embodiment of the present disclosure, the apparatus further comprises:
the obtaining module 602 is further configured to obtain longitude and latitude of the map overlay;
the conversion module is used for converting the longitude of the map covering into a screen horizontal coordinate and converting the latitude of the map covering into a screen vertical coordinate;
And the drawing module is used for drawing the map covering on the container layer based on the screen coordinate corresponding to the map covering.
In another embodiment of the present disclosure, the conversion module is configured to obtain a maximum longitude, a minimum longitude, and a screen width on a map application interface displayed on a current screen; according to the maximum longitude, the minimum longitude, the screen width and the longitude of the map covering, applying the following formula to convert the longitude of the map covering into the screen abscissa:
x = (λ − λ_min) / (λ_max − λ_min) × w

wherein x represents the screen abscissa corresponding to the map overlay, λ represents the longitude of the map overlay, λ_max represents the maximum longitude, λ_min represents the minimum longitude, and w represents the screen width.
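The conversion module's longitude mapping is a plain linear interpolation across the visible longitude span; a short sketch, with illustrative names not taken from the patent:

```python
def lon_to_screen_x(lon: float, lon_max: float, lon_min: float, w: float) -> float:
    """Map a longitude to a pixel abscissa on a screen of width w:
    lon_min maps to x = 0 (left edge), lon_max maps to x = w (right edge)."""
    return (lon - lon_min) / (lon_max - lon_min) * w
```

Unlike the latitude conversion, no projection step is needed here, because in the Web Mercator family the abscissa is already linear in longitude.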
In another embodiment of the present disclosure, the conversion module is configured to obtain a maximum latitude, a minimum latitude, and a screen height on a map application interface displayed on a current screen; acquiring a maximum screen ordinate corresponding to the maximum latitude, a minimum screen ordinate corresponding to the minimum latitude and a to-be-processed screen ordinate corresponding to the map covering; according to the maximum screen ordinate, the minimum screen ordinate, the screen ordinate to be processed and the screen height, applying the following formula to convert the latitude of the map overlay into the screen ordinate:
y = (y_max − y_m) / (y_max − y_min) × h

wherein y represents the screen ordinate corresponding to the map overlay, y_m represents the to-be-processed screen ordinate, y_max represents the maximum screen ordinate, y_min represents the minimum screen ordinate, and h represents the screen height.
In another embodiment of the present disclosure, a rendering module for obtaining style information of a locally stored map overlay; determining a target display position of the map covering on the container layer based on the screen coordinates corresponding to the map covering; and drawing the map covering on the target display position on the container layer according to the style information.
In another embodiment of the present disclosure, the designated trigger event is a single-finger drag event, and the first offset comprises a designated direction and a first distance of finger movement;
and the control module is used for controlling the map overlay to move the first distance along the designated direction.
In another embodiment of the present disclosure, the designated trigger event comprises a two-finger zoom-out event or a two-finger zoom-in event, and the first offset comprises the second distance of finger movement;
the control module is used for acquiring the initial distance of the double fingers; calculating the scaling of the two fingers according to the initial distance and the second distance; when the designated trigger event is a double-finger zoom-out event, taking the central point of the current screen as a zoom center, and controlling the map covering object to zoom out to the central point according to the zoom scale; and when the designated trigger event is a double-finger amplification event, the central point is used as a zooming center, and the map covering is controlled to deviate from the central point to be amplified according to the zooming proportion.
To sum up, with the apparatus provided by the embodiment of the present disclosure, while the map component layer is being moved in response to the designated trigger event, the finger and the map component layer do not move synchronously but with a certain deviation. When the offset of the map component layer's movement cannot yet be obtained, the movement of the map overlay is controlled according to the offset of the finger's movement, which ensures that the map overlay can move; when the offset of the map component layer's movement can be obtained, the error between the finger and the map component layer is corrected according to that offset.
In addition, according to the embodiment of the disclosure, the transparent container layer is added on the map component layer, and the map covering is drawn on the container layer, so that the drawing of the object information and the drawing of the map covering are separated, the drawing of the map covering does not depend on an API provided by a map application, the map style with a complex style can be drawn, and the style of the map covering is enriched.
Fig. 7 shows a block diagram of a terminal 700 according to an exemplary embodiment of the present disclosure. The terminal 700 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 700 may also be referred to as a user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, the terminal 700 includes: a processor 701 and a memory 702.
The processor 701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 701 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 701 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 701 may be integrated with a GPU (Graphics Processing Unit) which is responsible for rendering and drawing the content required to be displayed by the display screen. In some embodiments, the processor 701 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 702 may include one or more computer-readable storage media, which may be non-transitory. Memory 702 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 702 is used to store at least one instruction for execution by processor 701 to implement a method of controlling movement of a map overlay as provided by method embodiments herein.
In some embodiments, the terminal 700 may further optionally include: a peripheral interface 703 and at least one peripheral. The processor 701, the memory 702, and the peripheral interface 703 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 703 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 704, a display screen 705, a camera assembly 706, an audio circuit 707, a positioning component 708, and a power source 709.
The peripheral interface 703 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 701 and the memory 702. In some embodiments, processor 701, memory 702, and peripheral interface 703 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 701, the memory 702, and the peripheral interface 703 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 704 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 704 communicates with communication networks and other communication devices via electromagnetic signals. The rf circuit 704 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 704 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 704 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 704 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 705 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 705 is a touch display screen, the display screen 705 also has the ability to capture touch signals on or over the surface of the display screen 705. The touch signal may be input to the processor 701 as a control signal for processing. At this point, the display 705 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 705 may be one, providing the front panel of the terminal 700; in other embodiments, the display 705 can be at least two, respectively disposed on different surfaces of the terminal 700 or in a folded design; in other embodiments, the display 705 may be a flexible display disposed on a curved surface or on a folded surface of the terminal 700. Even more, the display 705 may be arranged in a non-rectangular irregular pattern, i.e. a shaped screen. The Display 705 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 706 is used to capture images or video. Optionally, camera assembly 706 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 706 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 707 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 701 for processing or inputting the electric signals to the radio frequency circuit 704 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 700. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 701 or the radio frequency circuit 704 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 707 may also include a headphone jack.
The positioning component 708 is used to locate the current geographic location of the terminal 700 for navigation or LBS (Location Based Service). The positioning component 708 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
A power supply 709 is used to supply power to the various components in terminal 700. The power source 709 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When power source 709 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 700 also includes one or more sensors 710. The one or more sensors 710 include, but are not limited to: acceleration sensor 711, gyro sensor 712, pressure sensor 713, fingerprint sensor 714, optical sensor 715, and proximity sensor 716.
The acceleration sensor 711 can detect the magnitude of acceleration in three coordinate axes of a coordinate system established with the terminal 700. For example, the acceleration sensor 711 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 701 may control the display screen 705 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 711. The acceleration sensor 711 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 712 may detect a body direction and a rotation angle of the terminal 700, and the gyro sensor 712 may cooperate with the acceleration sensor 711 to acquire a 3D motion of the terminal 700 by the user. From the data collected by the gyro sensor 712, the processor 701 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 713 may be disposed on a side frame of terminal 700 and/or underneath display 705. When the pressure sensor 713 is disposed on a side frame of the terminal 700, a user's grip signal on the terminal 700 may be detected, and the processor 701 performs right-left hand recognition or shortcut operation according to the grip signal collected by the pressure sensor 713. When the pressure sensor 713 is disposed at a lower layer of the display screen 705, the processor 701 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 705. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 714 is used for collecting a fingerprint of a user, and the processor 701 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 714, or the fingerprint sensor 714 identifies the identity of the user according to the collected fingerprint. When the user identity is identified as a trusted identity, the processor 701 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 714 may be disposed on the front, back, or side of the terminal 700. When a physical button or a vendor Logo is provided on the terminal 700, the fingerprint sensor 714 may be integrated with the physical button or the vendor Logo.
The optical sensor 715 is used to collect the ambient light intensity. In one embodiment, the processor 701 may control the display brightness of the display screen 705 based on the ambient light intensity collected by the optical sensor 715. Specifically, when the ambient light intensity is high, the display brightness of the display screen 705 is increased; when the ambient light intensity is low, the display brightness of the display screen 705 is adjusted down. In another embodiment, processor 701 may also dynamically adjust the shooting parameters of camera assembly 706 based on the ambient light intensity collected by optical sensor 715.
A proximity sensor 716, also referred to as a distance sensor, is typically disposed on a front panel of the terminal 700. The proximity sensor 716 is used to collect the distance between the user and the front surface of the terminal 700. In one embodiment, when the proximity sensor 716 detects that the distance between the user and the front surface of the terminal 700 gradually decreases, the processor 701 controls the display 705 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 716 detects that the distance gradually increases, the processor 701 controls the display 705 to switch from the dark-screen state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 7 is not intended to be limiting of terminal 700 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
The disclosed embodiments provide a computer-readable storage medium having stored therein at least one program code, the at least one program code being loaded and executed by a processor to implement the above method of controlling movement of a map overlay. The computer-readable storage medium may be non-transitory. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is intended to be exemplary only and not to limit the present disclosure, and any modification, equivalent replacement, or improvement made without departing from the spirit and scope of the present disclosure is to be considered as the same as the present disclosure.

Claims (8)

1. A method of controlling movement of a map covering, the method comprising:
Acquiring longitude and latitude of a map covering;
converting the longitude of the map overlay to a screen abscissa and converting the latitude of the map overlay to a screen ordinate;
obtaining style information of a locally stored map overlay;
determining a target display position of the map covering on the container layer based on the screen coordinate corresponding to the map covering;
drawing the map covering to a target display position on the container layer according to the style information;
displaying a map application interface, wherein the map application interface comprises a map component layer and the container layer, the map component layer is displayed with object information, and the container layer is displayed with the map covering;
responding to a specified trigger event on the map application interface, and acquiring a first offset of finger movement in the specified trigger event in the process of controlling the map component layer to move along with the specified trigger event;
controlling the map covering in the container layer to move according to the type of the specified trigger event and the first offset;
acquiring a second offset of the movement of the map component layer every other preset time period;
and correcting the movement of the map covering according to the second offset so that the map covering moves synchronously with the map component layer.
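The last two steps of claim 1 describe a two-track synchronization: the overlay follows the finger immediately (the first offset), and is periodically re-aligned to the map component layer's actual position (the second offset) so accumulated drift is cancelled. A minimal sketch of that correction loop; the class and method names are illustrative assumptions, not part of the patent:

```python
class OverlaySync:
    """Sketch of the drift-correction scheme in claim 1.

    Names are illustrative: the real map component and container-layer
    APIs are not specified by the patent text.
    """

    def __init__(self):
        self.overlay_offset = (0.0, 0.0)

    def on_gesture(self, dx, dy):
        # first offset: move the overlay immediately with the finger
        ox, oy = self.overlay_offset
        self.overlay_offset = (ox + dx, oy + dy)

    def correct(self, layer_offset):
        # second offset: every preset period, snap the overlay to the
        # map component layer's measured offset to cancel drift
        self.overlay_offset = layer_offset
```

In use, `on_gesture` would run on every touch-move event, while `correct` would run on a timer with the preset period, querying the map component layer for its current offset.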
2. The method of claim 1, wherein converting the longitude of the map overlay to a screen abscissa comprises:
acquiring the maximum longitude, the minimum longitude and the screen width on a map application interface displayed on a current screen;
applying the following formula to convert the longitude of the map overlay to a screen abscissa based on the maximum longitude, the minimum longitude, the screen width, and the longitude of the map overlay:

x = (lng − lng_min) / (lng_max − lng_min) × W

wherein x represents the screen abscissa corresponding to the map overlay, lng represents the longitude of the map overlay, lng_max represents the maximum longitude, lng_min represents the minimum longitude, and W represents the screen width.
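The longitude-to-abscissa conversion in claim 2 is a linear interpolation between the minimum and maximum longitudes visible on screen. A minimal sketch; the function and parameter names are illustrative assumptions:

```python
def lng_to_screen_x(lng, lng_min, lng_max, screen_width):
    """Linearly interpolate a longitude into a screen abscissa.

    Assumes lng_min maps to the left edge (x = 0) and lng_max to the
    right edge (x = screen_width) of the visible map interface.
    """
    return (lng - lng_min) / (lng_max - lng_min) * screen_width
```

For example, a longitude exactly halfway between the visible minimum and maximum lands at half the screen width.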
3. The method of claim 1, wherein the converting the latitude of the map overlay to a screen ordinate comprises:
acquiring the maximum latitude, the minimum latitude and the screen height on the map application interface displayed on the current screen;
acquiring a maximum screen ordinate corresponding to the maximum latitude, a minimum screen ordinate corresponding to the minimum latitude and a to-be-processed screen ordinate corresponding to the map covering;
according to the maximum screen ordinate, the minimum screen ordinate, the to-be-processed screen ordinate, and the screen height, applying the following formula to convert the latitude of the map overlay into the screen ordinate:

y = (y_max − y_p) / (y_max − y_min) × H

wherein y represents the screen ordinate corresponding to the map overlay, y_p represents the to-be-processed screen ordinate, y_max represents the maximum screen ordinate, y_min represents the minimum screen ordinate, and H represents the screen height.
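Claim 3 likewise interpolates linearly, but over the projected ("to-be-processed") ordinates rather than raw latitudes, since latitude-to-pixel mapping is nonlinear under common map projections. A minimal sketch, assuming the maximum latitude maps to the top of the screen (y = 0); names are illustrative:

```python
def lat_to_screen_y(y_pending, y_min, y_max, screen_height):
    """Interpolate a to-be-processed (projected) ordinate into a screen ordinate.

    Assumes y_max corresponds to the maximum latitude (top of screen,
    y = 0) and y_min to the minimum latitude (bottom, y = screen_height).
    """
    return (y_max - y_pending) / (y_max - y_min) * screen_height
```

The direction of the mapping (top versus bottom) is an assumption based on the usual screen convention of the ordinate increasing downward.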
4. The method of claim 1, wherein the specified trigger event is a single-finger drag event, and wherein the first offset comprises a specified direction and a first distance of finger movement;
the controlling the map overlay in the container layer to move according to the type of the specified trigger event and the first offset includes:
and controlling the map covering to move the first distance according to the designated direction.
5. The method of claim 1, wherein the specified triggering event comprises a two-finger zoom-out event or a two-finger zoom-in event, and wherein the first offset comprises a second distance of finger movement;
the controlling the map overlay in the container layer to move according to the type of the specified trigger event and the first offset includes:
acquiring the initial distance between the two fingers;
calculating the zooming ratio of the two fingers according to the initial distance and the second distance;
when the specified trigger event is a two-finger zoom-out event, taking the central point of the current screen as the zooming center, and controlling the map covering to move toward the central point according to the zooming ratio;
and when the specified trigger event is a two-finger zoom-in event, taking the central point as the zooming center, and controlling the map covering to move away from the central point according to the zooming ratio.
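Claim 5 derives a zooming ratio from the initial two-finger distance and the distance moved, then scales overlay positions about the screen center. A minimal sketch; how the ratio is formed from the two distances is an assumption, and all names are illustrative:

```python
def pinch_scale(initial_distance, second_distance, zoom_in):
    """Assumed ratio: current finger spread over initial spread.

    The spread grows by second_distance on a zoom-in gesture and
    shrinks by it on a zoom-out gesture.
    """
    if zoom_in:
        current = initial_distance + second_distance
    else:
        current = initial_distance - second_distance
    return current / initial_distance


def scale_about_center(x, y, cx, cy, scale):
    # scale < 1 moves the overlay toward the center (zoom out);
    # scale > 1 moves it away from the center (zoom in)
    return cx + (x - cx) * scale, cy + (y - cy) * scale
```

With a ratio below 1 the overlay converges on the screen center, matching the zoom-out branch; above 1 it diverges, matching the zoom-in branch.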
6. An apparatus for controlling movement of a map covering, the apparatus comprising:
the acquisition module is used for acquiring the longitude and latitude of the map covering;
the conversion module is used for converting the longitude of the map covering into a screen horizontal coordinate and converting the latitude of the map covering into a screen vertical coordinate;
the drawing module is used for acquiring the style information of the map covering stored locally; determining a target display position of the map covering on the container layer based on the screen coordinate corresponding to the map covering; drawing the map covering to a target display position on the container layer according to the style information;
the display module is used for displaying a map application interface, the map application interface comprises a map component layer and the container layer, the map component layer is displayed with object information, and the container layer is displayed with the map covering;
the obtaining module is used for responding to a specified trigger event on the map application interface, and obtaining a first offset of finger movement in the specified trigger event in the process of controlling the map component layer to move along with the specified trigger event;
the control module is used for controlling the map covering in the container layer to move according to the type of the specified trigger event and the first offset;
the acquisition module is further used for acquiring a second offset of the movement of the map component layer every other preset time period;
and the correction module is used for correcting the movement of the map covering according to the second offset so as to enable the map covering to move synchronously with the map component layer.
7. A terminal, characterized in that the terminal comprises a processor and a memory, in which at least one program code is stored, which is loaded and executed by the processor to implement the method of controlling movement of a map covering as claimed in any one of claims 1 to 5.
8. A computer-readable storage medium, having stored therein at least one program code, which is loaded and executed by a processor, to implement the method of controlling movement of a map cover according to any one of claims 1 to 5.
CN202110777898.3A 2021-07-09 2021-07-09 Method, device, terminal and storage medium for controlling movement of map covering Active CN113467682B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110777898.3A CN113467682B (en) 2021-07-09 2021-07-09 Method, device, terminal and storage medium for controlling movement of map covering


Publications (2)

Publication Number Publication Date
CN113467682A CN113467682A (en) 2021-10-01
CN113467682B true CN113467682B (en) 2022-07-29

Family

ID=77879421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110777898.3A Active CN113467682B (en) 2021-07-09 2021-07-09 Method, device, terminal and storage medium for controlling movement of map covering

Country Status (1)

Country Link
CN (1) CN113467682B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013037935A1 (en) * 2011-09-15 2013-03-21 Pole Star Device and method for collecting information relating to access points
CN103631474A (en) * 2012-08-28 2014-03-12 鸿富锦精密工业(深圳)有限公司 System and method for controlling graph moving
CN105512136A (en) * 2014-09-25 2016-04-20 中兴通讯股份有限公司 Method and device for processing based on layer
CN108681453A (en) * 2018-05-21 2018-10-19 京东方科技集团股份有限公司 The implementation method and device of engine map

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4559699B2 (en) * 2002-11-25 2010-10-13 株式会社ゼンリンデータコム Site guidance system
US8302033B2 (en) * 2007-06-22 2012-10-30 Apple Inc. Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information
RU2010142014A (en) * 2008-03-14 2012-04-20 Томтом Интернэшнл Б.В. (Nl) NAVIGATION DEVICE AND METHOD USING CARTOGRAPHIC DATA CORRECTION FILES
CN108829336B (en) * 2018-06-29 2021-07-16 深圳市理邦精密仪器股份有限公司 Waveform moving method, apparatus and computer readable storage medium
CN109829090A (en) * 2018-11-30 2019-05-31 青岛禧泰房地产数据有限公司 A kind of data display method using more maps
CN111815740B (en) * 2020-07-27 2024-01-30 城云科技(中国)有限公司 Map drawing method, system, terminal and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant