CN110865765A - Terminal and map control method - Google Patents


Info

Publication number
CN110865765A
Authority
CN
China
Prior art keywords
map
terminal
screen
target negative
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911077345.6A
Other languages
Chinese (zh)
Inventor
李静
张军
吴锦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd
Priority to CN201911077345.6A
Publication of CN110865765A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a terminal and a map control method in the technical field of the Internet, and aims to solve the problem that the existing map control mode is cumbersome because a map can only be controlled after the map application has been found among the many application icons spread across multiple screens. The terminal comprises: a processor configured to respond to a map display instruction input by a user through an input unit and display a map corresponding to the instruction in a target negative screen through a display screen; and, after detecting a control operation performed by the user on the map in the target negative screen, to move the map displayed in the target negative screen and/or adjust its size according to the control operation. Embodiments of the invention can display the map in the target negative screen in response to a map display instruction and let the user control the map there directly, thereby simplifying the operation of consulting a map.

Description

Terminal and map control method
Technical Field
The invention relates to the technical field of electronic terminals, in particular to a terminal and a map control method.
Background
Generally, when a user needs to control a map, a map application is first downloaded and installed on the terminal so that its icon is displayed on the desktop. The user taps the icon on the terminal's touch screen; the terminal detects that the tapped position falls within the desktop area corresponding to the icon, starts the map application, and displays the map in it. Only then can the user control the map.
However, icons of other applications are displayed on the desktop alongside the map application's icon. Moreover, an existing terminal includes multiple desktops, commonly called screens, for example a first screen, a second screen, a negative one screen, a negative two screen, and so on, each of which can display many application icons. When the user needs to control a map, the user must first find the map application's icon among the many application icons on the terminal's screens; the map is displayed only after the map application has been started, and only then can the user operate the map.
In summary, the conventional map control method is complicated.
Disclosure of Invention
The invention provides a terminal and a map control method, which are used to solve the problem in the prior art that a map can only be controlled after the map application has been found among many application icons across multiple screens, making the map control mode cumbersome.
In a first aspect, a terminal provided in an embodiment of the present invention includes: the device comprises a processor, an input unit and a display screen;
the input unit is used for receiving a map display instruction input by a user;
the display screen is used for displaying a map;
the processor is used for responding to a map display instruction input by a user through the input unit and displaying a map corresponding to the map display instruction in a target negative screen through the display screen;
and after detecting that the user performs control operation on the map in the target negative screen, moving the map displayed in the target negative screen and/or adjusting the size of the map displayed in the target negative screen according to the control operation.
With the above terminal, in response to a map display instruction input by the user through the input unit, the map corresponding to the instruction is displayed in the target negative screen through the display screen; and after a control operation by the user on that map is detected, the map displayed in the target negative screen is moved and/or resized according to the operation. The map can thus be displayed directly in the target negative screen and viewed without locating and opening the map application among the many application icons on each screen, which simplifies the operation of looking up a map.
In one possible implementation, the processor is specifically configured to:
obtaining context information of a map plug-in through a desktop application;
obtaining a map layout file through the desktop application according to the context information of the map plug-in by using a reflection mechanism;
adding a map identifier corresponding to a map layout file in the map plug-in into a display list of the target negative screen through the desktop application;
responding to a map display instruction of a user, and instantiating a map layout file corresponding to the map identifier in the display list through the desktop application;
initializing the map layout file through the map plug-in to obtain a visual map;
and displaying the map in the map layout file in the target negative screen through the display screen by the desktop application.
The terminal can acquire the context information of the map plug-in through the desktop application, acquire the map layout file by using the reflection mechanism, and add the map identifier corresponding to the map layout file in the map plug-in to the display list of the target negative screen. When responding to the user's map display instruction, it instantiates the map layout file corresponding to the map identifier in the display list through the desktop application, initializes the layout file through the map plug-in to obtain a visual map, and displays that map in the target negative screen. The map can therefore be shown in the target negative screen in response to a map display instruction without downloading and installing a corresponding map application on the terminal, which simplifies the map display procedure.
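The reflection step described above can be sketched in plain Java. This is a minimal illustration only: the class and method names (`MapPluginStub`, `getLayoutName`, `"map_card_layout"`) are hypothetical stand-ins, since the actual plug-in classes are not named in the text.

```java
// Sketch of the reflection step: the launcher (desktop application) obtains a
// layout it was not compiled against by looking the plug-in class up by name.
// MapPluginStub, getLayoutName, and "map_card_layout" are hypothetical.
import java.lang.reflect.Method;

public class PluginLayoutLoader {

    // Stand-in for a layout provider that the map plug-in would supply.
    public static class MapPluginStub {
        public String getLayoutName() { return "map_card_layout"; }
    }

    // Resolve the plug-in class by name and invoke its accessor reflectively,
    // mirroring how a desktop application could obtain the map layout file.
    public static String loadLayoutName(String pluginClassName) {
        try {
            Class<?> cls = Class.forName(pluginClassName);
            Object plugin = cls.getDeclaredConstructor().newInstance();
            Method accessor = cls.getMethod("getLayoutName");
            return (String) accessor.invoke(plugin);
        } catch (ReflectiveOperationException e) {
            return null; // plug-in absent or incompatible
        }
    }

    public static void main(String[] args) {
        System.out.println(loadLayoutName(MapPluginStub.class.getName()));
    }
}
```

The reflective lookup is what lets the desktop application stay decoupled from the plug-in: neither side needs the other at compile time.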
In one possible implementation, the processor is specifically configured to:
after the desktop application detects that the user performs sliding operation, determining a moving track of the user touching the touch screen, and controlling a map displayed in the target negative screen to move according to the moving track through a map plug-in;
and after the desktop application detects that the user performs zooming operation, determining the position information of the touch screen touched by the user, and adjusting the size of the map displayed in the target negative screen according to the position information through a map plug-in.
With the above terminal, after the desktop application detects the user's sliding operation, it determines the movement trajectory of the touch on the touch screen, and the map plug-in moves the map displayed in the target negative screen according to that trajectory. Because the work of handling the user's sliding operation on the map is done by the map plug-in, the desktop application has fewer items to process, which improves its processing speed. Likewise, after the desktop application detects the user's zooming operation, it determines the position information of the user's touches on the touch screen, and the map plug-in adjusts the size of the map displayed in the target negative screen according to that position information.
In one possible implementation, the processor is specifically configured to:
sending the moving track in the sliding operation to a map plug-in through a desktop application;
controlling the map displayed in the target negative screen to move according to the moving track through a map plug-in;
sending the position information in the zooming operation to a map plug-in through a desktop application;
and adjusting the size of the map displayed in the target negative screen according to the position information through a map plug-in.
When processing and responding to the user's sliding operation, the desktop application sends the movement trajectory of the sliding operation to the map plug-in, which can then move the map displayed in the target negative screen according to that trajectory. In the same way, when processing a zooming operation, the desktop application sends the intercepted position information to the map plug-in, so that the plug-in can adjust the size of the map displayed in the target negative screen according to that position information.
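The division of labor described above, where the desktop application only classifies the gesture and relays its data while the plug-in applies the actual map transformation, can be sketched as follows. The interface and method names (`MapPlugin`, `moveBy`, `scaleAt`) are hypothetical; the text does not name the plug-in's API.

```java
// Sketch: the desktop application forwards raw gesture data; the map plug-in
// performs the move/scale. MapPlugin, moveBy, and scaleAt are invented names.
import java.util.ArrayList;
import java.util.List;

public class GestureDelegation {

    // Minimal plug-in contract for receiving forwarded gesture data.
    public interface MapPlugin {
        void moveBy(float dx, float dy);                        // slide trajectory
        void scaleAt(float focusX, float focusY, float factor); // pinch position
    }

    // Test double that records what the desktop application forwarded.
    public static class LoggingMapPlugin implements MapPlugin {
        public final List<String> log = new ArrayList<>();
        public void moveBy(float dx, float dy) { log.add("move " + dx + "," + dy); }
        public void scaleAt(float x, float y, float f) { log.add("scale " + f); }
    }

    // The desktop application only relays; it does not touch the map itself.
    public static void onSlide(MapPlugin plugin, float dx, float dy) {
        plugin.moveBy(dx, dy);
    }

    public static void onPinch(MapPlugin plugin, float x, float y, float factor) {
        plugin.scaleAt(x, y, factor);
    }

    public static void main(String[] args) {
        LoggingMapPlugin plugin = new LoggingMapPlugin();
        onSlide(plugin, 0f, -40f);         // finger moved up 40 px
        onPinch(plugin, 200f, 300f, 1.5f); // spread gesture at (200, 300)
        System.out.println(plugin.log);
    }
}
```

Keeping the desktop application on the relay side of this boundary is what reduces its processing load, as the text notes.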
In one possible implementation, the processor is specifically configured to:
if a place can be parsed from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on that place; or
if no place can be parsed from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on the place where the user is located.
When the terminal determines the map corresponding to the map display instruction, if the instruction includes the place the user wants to view, it uses a map of a preset range centered on that place; if it does not, it uses a map of a preset range centered on the user's current location. The user therefore does not need to search the map for the desired place after it is displayed, which improves map viewing efficiency.
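The fallback rule above (center on a place parsed from the instruction, otherwise on the user's location) can be sketched like this. The toy gazetteer inside `parsePlace` and all coordinates are invented for illustration; real place parsing would use a geocoding service.

```java
// Sketch of the centering rule: prefer a place parsed from the instruction,
// otherwise fall back to the user's current location. The gazetteer and all
// coordinates are invented for illustration.
public class MapCenterResolver {

    public static class LatLng {
        public final double lat, lng;
        public LatLng(double lat, double lng) { this.lat = lat; this.lng = lng; }
    }

    // Centering rule from the text: parsed place first, user's location second.
    public static LatLng resolveCenter(String instruction, LatLng userLocation) {
        LatLng parsed = parsePlace(instruction);
        return parsed != null ? parsed : userLocation;
    }

    // Stand-in for real place parsing / geocoding of the instruction text.
    public static LatLng parsePlace(String instruction) {
        if (instruction != null && instruction.contains("Qingdao")) {
            return new LatLng(36.07, 120.38);
        }
        return null; // no place recognized in the instruction
    }

    public static void main(String[] args) {
        LatLng here = new LatLng(39.90, 116.40);
        System.out.println(resolveCenter("map of Qingdao", here).lat); // place wins
        System.out.println(resolveCenter("map", here).lat);            // fallback
    }
}
```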
In a second aspect, a map control method provided in an embodiment of the present invention is applied to a terminal, and the method includes:
responding to a map display instruction input by a user through an input unit, and displaying a map corresponding to the map display instruction in a target negative screen through a display screen by the terminal;
and after detecting that the user performs control operation on the map in the target negative screen, the terminal moves the map displayed in the target negative screen and/or adjusts the size of the map displayed in the target negative screen according to the control operation.
In a possible implementation manner, before the displaying, in response to a map display instruction input by a user through an input unit, of a map corresponding to the map display instruction in a target negative screen through a display screen, the method further includes:
the terminal acquires the context information of the map plug-in through the desktop application;
the terminal acquires a map layout file by using a reflection mechanism according to the context information of the map plug-in through the desktop application;
the terminal adds a map identifier corresponding to a map layout file in the map plug-in into a display list of the target negative screen through desktop application;
the responding user displays the map corresponding to the map display instruction in the target negative screen through the display screen in response to the map display instruction input by the user through the input unit, and the method comprises the following steps:
responding to a map display instruction input by a user through an input unit, and instantiating a map layout file corresponding to the map identifier in the display list by the terminal through the desktop application;
the terminal initializes the map layout file through the map plug-in to obtain a visual map;
and the terminal displays the map in the map layout file in the target negative screen through a display screen through the desktop application.
In a possible implementation manner, the moving, by the terminal, the map displayed in the target negative screen according to the control operation includes:
after the terminal detects that the user performs sliding operation through desktop application, determining a moving track of the touch screen touched by the user, and controlling a map displayed in the target negative screen to move according to the moving track through a map plug-in;
the terminal adjusts the size of the map displayed in the target negative screen according to the control operation, and the method comprises the following steps:
and after the terminal detects that the user performs zooming operation through the desktop application, determining the position information of the touch screen touched by the user, and adjusting the size of the map displayed in the target negative screen through a map plug-in according to the position information.
In a possible implementation manner, the controlling, by the map plug-in, the map displayed in the target negative screen to move according to the movement trajectory includes:
the terminal sends the moving track in the sliding operation to a map plug-in through a desktop application;
the terminal controls the map displayed in the target negative screen to move according to the moving track through a map plug-in;
the adjusting, by the map plug-in, the size of the map displayed in the target negative screen according to the location information includes:
the terminal sends the position information in the zooming operation to a map plug-in through a desktop application;
and the terminal adjusts the size of the map displayed in the target negative screen according to the position information through a map plug-in.
In a possible implementation manner, if a place can be parsed from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on that place; or
if no place can be parsed from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on the place where the user is located.
In a third aspect, the present application also provides a computer storage medium having a computer program stored thereon which, when executed by a processing unit, performs the steps of the method of the second aspect.
In addition, for technical effects brought by any one implementation manner of the second aspect to the third aspect, reference may be made to technical effects brought by different implementation manners of the first aspect, and details are not described here.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention and are not to be construed as limiting the invention.
FIG. 1 is a diagram of a plurality of desktops in a background art terminal;
FIG. 2 is a flow chart of a map control method provided by an embodiment of the present invention;
fig. 3 is a schematic diagram of a terminal responding to a map display instruction according to an embodiment of the present invention;
fig. 4 is a schematic diagram of another terminal responding to a map display instruction according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating a terminal display when a user slides up/down a map according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a terminal display when a user slides a map left/right according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a terminal display when a user zooms in/out a sliding map according to an embodiment of the present invention;
FIG. 8 is a flow chart of a method of moving a displayed map in a target negative screen provided by an embodiment of the present invention;
FIG. 9 is a flow chart of a method of resizing a displayed map in a target negative screen provided by embodiments of the present invention;
fig. 10 is a schematic diagram of a terminal display when a search bar in a first screen inputs a map as a map display instruction according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a terminal display when a map is input as a map display instruction for a voice application according to an embodiment of the present invention;
FIG. 12 is a flowchart of a process for adding a map display function and a map display to a target negative screen according to an embodiment of the present invention;
fig. 13 is a block diagram of a terminal according to an embodiment of the present invention;
fig. 14 is a block diagram of another terminal according to an embodiment of the present invention;
fig. 15 is a schematic software architecture diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Some of the words that appear in the text are explained below:
1. The term "and/or" in the embodiments of the present invention describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
2. The term "terminal" in the embodiments of the present invention refers to any intelligent electronic device capable of operating according to a program and automatically processing a large amount of data at a high speed, including a mobile phone, a computer, a tablet, an intelligent terminal, a multimedia device, a streaming media device, and the like.
The application scenarios described in the embodiments of the present invention are intended to illustrate the technical solutions of the embodiments more clearly and do not limit them; as a person skilled in the art will appreciate, as new application scenarios emerge, the technical solutions provided in the embodiments of the present invention are equally applicable to similar technical problems. In the description of the present invention, unless otherwise indicated, "a plurality" means two or more.
Currently, a terminal includes multiple desktops, which are commonly referred to as screens, and the icons of the applications downloaded and installed on the terminal are distributed across them. Fig. 1 is a schematic diagram of a first, second, and third screen of a terminal: the first screen contains a dialing application, an information application, a browser application, an application market, a phone manager, a pictures application, and a time application; the second screen contains, besides the dialing, information, and browser applications, a communication application, a calculator, a camera, a video application, a music application, a subway application, a weather application, a shopping application, and an e-mail application; the third screen contains, besides the dialing, information, and browser applications, a map application, a game application, a taxi-hailing application, a housing application, a banking application, and a voice application.
When the user needs to check a map, the user unlocks the terminal, which shows the desktop that was displayed before it was last locked. If that desktop is the first or second screen, the user must slide to the third screen, find the map application, and tap it; if it is already the third screen, the user finds and taps the map application there. In every case the map is displayed only after the map application starts, and only then can the user control it.
In summary, the conventional map control method is complicated.
Therefore, the embodiment of the invention provides a map control method, which can solve the problem that in the prior art, a map is controlled after a map application needs to be found in a plurality of application icons in a plurality of screens, so that the map control mode is complicated.
The following is a detailed description of a map control method. As shown in fig. 2, the method specifically includes the following steps:
s200: and responding to a map display instruction input by the user through the input unit, and displaying a map corresponding to the map display instruction in the target negative screen through the display screen by the terminal.
S201: and after detecting that the user performs control operation on the map in the target negative screen, the terminal moves the map displayed in the target negative screen and/or adjusts the size of the map displayed in the target negative screen according to the control operation.
With this scheme, controlling the map no longer requires searching multiple screens for the map application and starting it. The map can be displayed directly in the target negative screen in response to a map display instruction; the user performs control operations on the map there, and the terminal moves the map displayed in the target negative screen and/or adjusts its size according to those operations, which simplifies the map control process.
For example, while the terminal is running, as shown in the left diagram of fig. 3, the user inputs a map display instruction, and the terminal, responding to it, displays the corresponding map in the target negative screen. As shown in fig. 3, in the target negative screen the map may be displayed in the form of a card; the following examples all use the negative one screen as the target negative screen. Alternatively, the target negative screen may be another negative screen, such as the negative two screen or negative three screen, where, as shown in fig. 4, the map may be displayed full screen. After the map is displayed in the target negative screen, when the user performs a sliding operation within the map's area, the terminal moves the displayed map along the direction of the user's finger. As shown in fig. 5, the user slides downward or upward in the map's area (the dotted line shows the trajectory), and the terminal moves the map downward or upward following the finger. As shown in fig. 6, when the user slides right or left in the map's area, the terminal moves the map right or left following the finger. As shown in fig. 7, when the user performs a zoom-in or zoom-out operation in the map's area, the terminal enlarges or shrinks the map following the movement trajectory of the user's fingers.
The above-described specific implementation manner of the moving process shown in fig. 5 and 6 specifically includes:
after the terminal detects that the user performs sliding operation through the desktop application, the moving track of the touch screen touched by the user is determined, and the map displayed in the target negative screen is controlled to move through the map plug-in unit according to the moving track.
The method for the terminal to detect the user's sliding operation through the desktop application may include: if a preset number of consecutive moving positions of a single finger on the touch screen are detected to be within the map's area (the dotted-line positions in fig. 5 and fig. 6 are both within the map's area), it is determined that the user has performed a sliding operation.
Optionally, whether the user has performed a sliding operation may be determined according to the following steps: detect whether the first movement position of the user's single finger on the touch screen is within the area of the map; if so, detect whether a target number of subsequent consecutive movement positions of the single finger on the touch screen are also within the area of the map, the target number being the preset number minus 1; and if so, determine that the user has performed a sliding operation. If it is detected that the first movement position of the user's single finger on the touch screen is not within the area of the map, the terminal determines, through the desktop application, the region of the target negative screen to which that position corresponds: if the position is not within the region of any control of the target negative screen, the operation ends; if the position is within the region of a control of the target negative screen, the terminal processes the operation through the control corresponding to that position. If it is detected that the target number of consecutive movement positions are not all within the area of the map, it is determined that the user has performed a sliding operation on the target negative screen itself, and the terminal processes the sliding operation through the desktop application.
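The optional step sequence above can be sketched in plain Java as follows. This is a minimal illustration only: the `Rect` stand-in and all class and method names are assumptions for the sketch, not the terminal's actual classes.

```java
// Sketch of the slide-detection rule: a slide on the map is recognized only
// when the first touch position and the following consecutive move positions
// (presetCount samples in total) all fall inside the map's region.
final class SlideDetector {
    static final class Rect {
        final int x, y, width, height;
        Rect(int x, int y, int width, int height) {
            this.x = x; this.y = y; this.width = width; this.height = height;
        }
        boolean contains(int px, int py) {
            return px > x && px < x + width && py > y && py < y + height;
        }
    }

    // points[i] = {x, y} of the i-th touch position, in order of arrival
    static boolean isSlideOnMap(int[][] points, Rect mapRect, int presetCount) {
        if (points.length < presetCount) {
            return false; // not enough samples observed yet
        }
        for (int i = 0; i < presetCount; i++) {
            if (!mapRect.contains(points[i][0], points[i][1])) {
                // a sample left the map area: the desktop application
                // handles the gesture (e.g. as a negative-screen slide)
                return false;
            }
        }
        return true; // forward the track to the map plug-in
    }
}
```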
It should be noted that the manner of determining that the user performs the sliding operation in the embodiment of the present invention is only an example, and any manner of determining that the user performs the sliding operation is applicable to the embodiment of the present invention.
After the terminal detects that the user performs sliding operation through the desktop application, the terminal sends the moving track in the sliding operation to the map plug-in through the desktop application; and the terminal controls the map displayed in the target negative screen to move according to the moving track through the map plug-in.
A flow chart of a method of moving a displayed map in a target negative screen is described below in conjunction with fig. 8:
S800: the terminal determines, through the desktop application, that the onInterceptTouchEvent occurs on the target negative screen, and acquires the positions of the moving track.
Here, onInterceptTouchEvent refers to the callback for intercepting and handling the event of a finger touching the touch screen.
S801: the terminal acquires the area (rect) occupied by the map plug-in on the target negative screen.
S802: if the terminal detects, through the desktop application, that a preset number of consecutive movement positions of the user's single finger on the touch screen are all within the area of the map, it is determined that the user has performed a sliding operation.
For example, when moveX > rect.x + offset, moveX < rect.x + rect.width - offset, moveY > rect.y, and moveY < rect.y + rect.height are all satisfied, the value of isScrollEnabledX() obtained by the desktop application through reflection is true; that is, the preset number of consecutive movement positions are all within the map area, and it is determined that the user has performed a sliding operation. The isScrollEnabledX() function judges whether the user's movement trajectory constitutes a sliding operation within the map area, and it can be called by other plug-ins through reflection.
In detail, moveX > rect.x + offset means that the abscissas of the preset number of consecutive movement positions of the finger are to the right of the left boundary of the map (plus the offset); moveX < rect.x + rect.width - offset means that those abscissas are to the left of the right boundary of the map (minus the offset); moveY > rect.y means that the ordinates of those positions are below the upper boundary of the map; and moveY < rect.y + rect.height means that those ordinates are above the lower boundary of the map.
The offset mentioned above is a customizable variable used so that page turning can still be performed when the user slides at the left or right boundary of the map, avoiding the situation in which the negative screens can no longer be paged once the map is displayed. Specifically, when comparing moveX with the left boundary, the defined offset is added so that abscissas between the left boundary of the map (or of the display screen) and rect.x + offset are treated as outside the map; when comparing moveX with the right boundary, moveX is compared with rect.x + rect.width - offset in the same manner.
S803: the terminal sends the moving track in the sliding operation to the map plug-in through the desktop application.
S804: the terminal controls, through the map plug-in, the map displayed in the target negative screen to move according to the moving track.
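The boundary test and the reflective isScrollEnabledX() call used in S802 can be modeled in pure Java as follows. The field names (rectX, moveX, offset, and so on) mirror the identifiers quoted in the text but are otherwise illustrative, and the reflection pattern stands in for the cross-plug-in call the passage describes.

```java
import java.lang.reflect.Method;

// Illustrative model of S802: the coordinate test against the map's rect,
// exposed as isScrollEnabledX() and invoked reflectively by the desktop
// application, which is not compiled against the map plug-in.
final class MapPluginView {
    int rectX, rectY, rectWidth, rectHeight, offset;
    int moveX, moveY; // last observed touch position

    MapPluginView(int x, int y, int w, int h, int offset) {
        this.rectX = x; this.rectY = y;
        this.rectWidth = w; this.rectHeight = h;
        this.offset = offset;
    }

    // The X range is narrowed by `offset` on both sides so that a swipe
    // starting at the map's left/right edge still pages the negative screens.
    public boolean isScrollEnabledX() {
        return moveX > rectX + offset
                && moveX < rectX + rectWidth - offset
                && moveY > rectY
                && moveY < rectY + rectHeight;
    }
}

final class DesktopApp {
    // The desktop application resolves the method by name and invokes it
    // reflectively, mirroring the "called by reflection" note in the text.
    static boolean queryScrollEnabledX(Object pluginView) {
        try {
            Method m = pluginView.getClass().getMethod("isScrollEnabledX");
            m.setAccessible(true);
            return (Boolean) m.invoke(pluginView);
        } catch (ReflectiveOperationException e) {
            return false; // plug-in absent or incompatible
        }
    }
}
```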
The specific implementation of the map resizing shown in fig. 7 includes: after the terminal detects, through the desktop application, that the user has performed a zooming operation, the terminal determines the position information of the user's touches on the touch screen, and adjusts, through the map plug-in, the size of the map displayed in the target negative screen according to the position information.
The manner of determining that the user has performed a zooming operation specifically includes: if the terminal detects, through the desktop application, that the position where the user presses the touch screen with a first finger is within the area of the map, and further detects, before the user lifts the first finger, that the user presses the touch screen with a second finger, it is determined that the user has performed a zooming operation; or
if the terminal detects, through the desktop application, that the positions where the user presses the touch screen with two fingers simultaneously are both within the area of the map, it is determined that the user has performed a zooming operation.
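The two recognition cases above can be modeled in plain Java as follows. A "pointer event" here is just (action, inMapArea) — a simplified stand-in for Android's pointer events — and every name in the sketch is illustrative.

```java
import java.util.List;

// Illustrative model of zoom recognition: a zoom is recognized once a second
// finger goes down inside the map while the first finger, which also landed
// inside the map, is still down. This covers both the sequential case and
// the simultaneous two-finger press.
final class ZoomDetector {
    enum Action { DOWN, UP }

    static final class PointerEvent {
        final Action action;
        final boolean inMapArea;
        PointerEvent(Action action, boolean inMapArea) {
            this.action = action; this.inMapArea = inMapArea;
        }
    }

    static boolean isZoom(List<PointerEvent> events) {
        int fingersDown = 0;
        for (PointerEvent e : events) {
            if (e.action == Action.DOWN) {
                if (!e.inMapArea) return false; // press outside the map
                fingersDown++;
                if (fingersDown == 2) return true; // second finger before first lift
            } else {
                fingersDown = Math.max(0, fingersDown - 1);
            }
        }
        return false;
    }
}
```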
It should be noted that the manner of determining the zoom operation performed by the user in the embodiment of the present invention is only an example, and any manner of determining the zoom operation performed by the user is applicable to the embodiment of the present invention.
After the terminal detects that the user performs zooming operation through the desktop application, the terminal sends the position information in the zooming operation to the map plug-in through the desktop application; and the terminal adjusts the size of the map displayed in the target negative screen according to the position information through the map plug-in.
A flow chart of a method of resizing a displayed map in a target negative screen is described below in conjunction with fig. 9:
S900: the terminal determines, through the desktop application, that the onInterceptTouchEvent occurs on the target negative screen, and acquires the position information of the fingers in the target negative screen.
S901: the terminal detects whether the positions of two fingers of a user pressing the touch screen within a preset time period are both in the area of the map view through the desktop application; if yes, S902 is executed, and if no, the process ends.
S902: it is determined that the user has performed a zooming operation, and the terminal sends the position information in the zooming operation to the map plug-in through the desktop application.
S903: the terminal adjusts, through the map plug-in, the size of the map displayed in the target negative screen according to the position information.
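The text does not spell out the resize formula used in S903. A common sketch, under the assumption that the map's scale is multiplied by the ratio of the current two-finger distance to the initial one (pinch out enlarges, pinch in shrinks), is:

```java
// Hypothetical pinch-to-zoom arithmetic: pure geometry, no Android types.
final class PinchScale {
    // Distance between the two finger positions.
    static double distance(double x1, double y1, double x2, double y2) {
        return Math.hypot(x2 - x1, y2 - y1);
    }

    // New map scale = old scale * (current finger distance / start distance).
    static double newScale(double oldScale, double startDist, double currentDist) {
        if (startDist <= 0) return oldScale; // guard against degenerate input
        return oldScale * (currentDist / startDist);
    }
}
```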
The map display instruction input by the user through the input unit may be generated in the following cases:
Case one: a map display instruction is generated when an operation of the user sliding to the target negative screen is detected.
Referring to fig. 3, while the terminal is running, the user may slide from the third screen to the negative one screen; after the user slides to the target negative screen, the terminal displays the map on the negative one screen through the desktop application.
Case two: and generating a map display instruction according to the information input by the user through the input unit.
As shown in fig. 10, a user may input a "map" word in a search bar of a desktop in the terminal, and after the terminal receives the "map" word, the terminal turns from a current screen to a target negative screen through a desktop application, and displays a map corresponding to a map display instruction on the target negative screen. Or, with reference to fig. 11, the user may input a "map" word in a search bar of a voice application in the terminal, and after the terminal receives the "map" word, the terminal switches from the voice application to a target negative screen, and displays a map corresponding to the map display instruction on the target negative screen.
It should be noted that the manner of generating the map display instruction recited in the embodiment of the present invention is only an example, and any manner of generating the map display instruction is applicable to the embodiment of the present invention.
In practical use, a user may need to view a map of the user's own location, or a map of some other location. To make the user's way of viewing the map more efficient, the embodiment of the present invention may determine whether a place can be parsed from the map display instruction, so as to decide which location's map to display. Specifically:
Case one: if a place can be parsed from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on that place.
It can be understood that a place can be parsed from the map display instruction when the instruction includes the place the user wants to view; a map of the preset range centered on that place is then obtained. For example, as shown in fig. 10, the words "A-zone map" are input in a search bar of the desktop of the terminal; the terminal receives and parses the words "A-zone map", finds that the map display instruction includes the place "A-zone", and displays a map of the preset range centered on the A-zone on the target negative screen. Alternatively, as shown in fig. 11, the user inputs "A-zone map" in the voice application; the terminal receives and parses the words, switches from the display interface of the voice application to the target negative screen, and displays a map of the preset range centered on the A-zone in the target negative screen.
Case two: if no place can be parsed from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on the user's current location.
It can be understood that no place can be parsed from the map display instruction when the instruction does not include a place the user wants to view; a map of the preset range centered on the user's own location is then obtained. For example, as shown in fig. 10, the word "map" is input in a search bar of the desktop of the terminal; the terminal receives and parses the word "map", finds that the map display instruction includes no place, and displays a map of the preset range centered on the user on the target negative screen. Alternatively, as shown in fig. 11, the user inputs "map" in the voice application; the terminal receives and parses the word, switches from the display interface of the voice application to the target negative screen, and displays a map of the preset range centered on the user in the target negative screen.
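The instruction parsing in cases one and two can be sketched as follows. The rule used here — the place is simply the text preceding the word "map" — is an assumption for illustration only; the actual parser is not specified in the text.

```java
// Hypothetical parser for the map display instruction: extract the place
// to center on, or null when none can be parsed (the caller then falls
// back to a map centered on the user's current location).
final class MapInstructionParser {
    static String parsePlace(String instruction) {
        String s = instruction.trim();
        String suffix = "map";
        if (!s.toLowerCase().endsWith(suffix)) {
            return null; // not a recognized map instruction form
        }
        // Case one: "A-zone map" -> place "A-zone".
        // Case two: bare "map" -> no place, return null.
        String place = s.substring(0, s.length() - suffix.length()).trim();
        return place.isEmpty() ? null : place;
    }
}
```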
In summary, the embodiment of the present invention can determine the map display range directly by parsing the map display instruction, without requiring the user to search the map for the location to be viewed, thereby improving the efficiency of map searching.
An existing negative screen has no map display function, so the present invention adds a map display function to the target negative screen, after which the map can be displayed in the target negative screen in response to a map display instruction. With reference to fig. 12, the specific process of adding the map display function and displaying the map in the target negative screen includes:
S1200: the terminal obtains the context information of the map plug-in through the desktop application.
In detail, with the package name of the map plug-in denoted mPackageName, the context information of the map plug-in may be obtained as follows:
Context mRemoteContext = Utilities.getTargetContext(context, pkg,
        Context.CONTEXT_INCLUDE_CODE | Context.CONTEXT_IGNORE_SECURITY);
S1201: the terminal obtains the map layout file through the desktop application by using a reflection mechanism according to the context information of the map plug-in.
Here, obtaining the map layout file of the map plug-in can be understood as obtaining the map layout file class defined for the desktop application. In detail, the map plug-in adds a map identifier to the map layout file, and the desktop application obtains the map layout file of the map plug-in through the map identifier, namely:
Resources targetResource = mRemoteContext.getResources();
int resId = targetResource.getIdentifier(layoutname, "layout", mPackageName);
S1202: the terminal adds, through the desktop application, the map identifier corresponding to the map layout file in the map plug-in to the display list of the target negative screen.
S1203: in response to a map display instruction input by the user through the input unit, the terminal instantiates, through the desktop application, the map layout file corresponding to the map identifier in the display list.
When responding to a map display instruction, the terminal instantiates, through the desktop application, the map layout file corresponding to the map identifier in the display list of the target negative screen; that is, the terminal generates, according to the defined class, the map layout file corresponding to the map identifier in the display list. Instantiation is the process of turning the abstract concept of the defined class into a concrete object. The map layout file is instantiated as follows:
View remoteView = ((LayoutInflater) mRemoteContext
        .getSystemService("layout_inflater")).inflate(resId, null);
S1204: the terminal initializes the map layout file through the map plug-in to obtain a visible map.
The initialization refers to running logic in a map layout file, for example, detecting a current network state of the terminal, determining a map display area, and the like.
S1205: the terminal displays the map in the map layout file in the target negative screen through the desktop application. Specifically:
linearLayout.addView(remoteView);
With the above method of adding the map display function to the target negative screen, the map in the map plug-in can be displayed in the target negative screen in response to the map display instruction, without downloading and installing a corresponding map application on the terminal, which simplifies the operation process of map display.
Fig. 13 is a block diagram of a terminal 1300 according to an embodiment of the present invention, including: a processor 1310, an input unit 1320, and a display 1330;
the input unit 1320 is configured to receive a map display instruction input by a user;
the display screen 1330 is used for displaying a map;
the processor 1310 is configured to respond to a map display instruction input by a user through the input unit, and display a map corresponding to the map display instruction in a target negative screen through the display screen;
and after detecting that the user performs control operation on the map in the target negative screen, moving the map displayed in the target negative screen and/or adjusting the size of the map displayed in the target negative screen according to the control operation.
The input unit 1320 may be used to receive numeric or character information input by a user and generate key signal inputs related to user settings and function control of the terminal 1300.
Alternatively, the input unit 1320 may include a touch panel and other input terminals.
The touch panel, also called a touch screen, may collect touch operations of a user on or near the touch panel (for example, operations of the user on or near the touch panel using any suitable object or accessory such as a finger, a stylus pen, etc.), and drive the corresponding connection device according to a preset program. Optionally, the touch panel may include two parts, namely a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 1310, and receives and executes commands sent from the processor 1310. In addition, the touch panel may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave.
Optionally, the other input terminals may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display 1330 may be used to display information input by or provided to the user and a Graphical User Interface (GUI) for various menus of the terminal 1300. Display 1330 may include a display disposed on the front side of terminal 1300. The display screen may be configured in the form of a liquid crystal display, a light emitting diode, or the like. Display 1330 may be used to display various graphical user interfaces described herein.
Further, the touch panel may cover the display panel, and when the touch panel detects a touch operation on or near the touch panel, the touch panel transmits the touch operation to the processor 1310 to determine the type of the touch event, and then the processor 1310 provides a corresponding visual output on the display panel according to the type of the touch event.
In general, the touch panel and the display panel serve as two independent components to implement the input and output functions of the terminal 1300, but in some embodiments, the touch panel and the display panel may be integrated to implement those functions; after integration, the integrated panel may be referred to as a touch display screen. The display screen can display the application programs and the corresponding operation steps.
Optionally, the processor 1310 is specifically configured to:
obtaining context information of a map plug-in through a desktop application;
obtaining a map layout file by the desktop application according to the context information of the map plug-in by using a reflection mechanism;
adding a map identifier corresponding to a map layout file in the map plug-in into a display list of the target negative screen through desktop application;
responding to a map display instruction of a user, and instantiating a map layout file corresponding to the map identifier in the display list by the terminal through a desktop application;
initializing the map layout file through the map plug-in to obtain a visual map;
and displaying the map in the map layout file in the target negative screen through the display screen by the desktop application.
Optionally, the processor 1310 is specifically configured to:
after the desktop application detects that the user performs sliding operation, determining a moving track of the user touching the touch screen, and controlling a map displayed in the target negative screen to move according to the moving track through a map plug-in;
and after the desktop application detects that the user performs zooming operation, determining the position information of the touch screen touched by the user, and adjusting the size of the map displayed in the target negative screen according to the position information through a map plug-in.
Optionally, the processor 1310 is specifically configured to:
sending the moving track in the sliding operation to a map plug-in through a desktop application;
controlling the map displayed in the target negative screen to move according to the moving track through a map plug-in;
sending the position information in the zooming operation to a map plug-in through a desktop application;
and adjusting the size of the map displayed in the target negative screen according to the position information through a map plug-in.
Optionally, the processor 1310 is specifically configured to:
if the place can be analyzed through the map display instruction, the map corresponding to the map display instruction is a map of a preset range with the place as a center; or
And if the place cannot be analyzed through the map display instruction, the map corresponding to the map display instruction is a map in a preset range with the place where the user is located as the center.
In an exemplary embodiment, a storage medium comprising instructions is also provided, the instructions being executable by the processor 1310 of the terminal 1300 to perform the above-described method. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In addition to the structure described in fig. 13, the terminal may include other structures. As shown in fig. 14, terminal 1400 further includes: radio frequency (RF) circuitry 1410, memory 1420, display unit 1430, camera 1440, sensor 1450, audio circuitry 1460, wireless fidelity (Wi-Fi) module 1470, processor 1480, bluetooth module 1481, and power supply 1490.
It should be understood that terminal 1400 shown in fig. 14 is merely an example; terminal 1400 may have more or fewer components than shown in fig. 14, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The RF circuit 1410 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then send the downlink data to the processor 1480 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
Memory 1420 may be used to store software programs and data. The processor 1480 performs various functions of the terminal 1400 and data processing by executing software programs or data stored in the memory 1420. The memory 1420 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device. Memory 1420 stores an operating system that enables terminal 1400 to operate. The memory 1420 may store an operating system and various application programs, and may store codes for performing the methods according to the embodiments of the present application.
The display unit 1430 may be used to receive input numeric or character information, generate signal input related to user settings and function control of the terminal 1400, and particularly, the display unit 1430 may include a touch screen 1431 disposed on the front of the terminal 1400, and may collect touch operations of the user thereon or nearby, such as clicking a button, dragging a scroll box, and the like.
The display unit 1430 may also be used to display information input by the user or provided to the user and a graphical user interface (GUI) of various menus of the terminal 1400. Specifically, the display unit 1430 may include a display screen 1432 disposed on the front of the terminal 1400. The display screen 1432 may be configured in the form of a liquid crystal display, a light emitting diode, or the like. The display unit 1430 may be used to display the various graphical user interfaces described herein.
The touch screen 1431 may cover the display screen 1432, or the touch screen 1431 and the display screen 1432 may be integrated to implement the input and output functions of the terminal 1400; after integration, they may be referred to as a touch display screen for short. The display unit 1430 in the present application may display the application programs and the corresponding operation steps.
The camera 1440 may be used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing elements convert the light signals into electrical signals, which are then passed to a processor 1480 for conversion into digital image signals.
Terminal 1400 may also include at least one sensor 1450, such as an acceleration sensor 1451, a distance sensor 1452, a fingerprint sensor 1453, a temperature sensor 1454. Terminal 1400 can also be configured with other sensors such as gyroscopes, barometers, hygrometers, thermometers, infrared sensors, light sensors, motion sensors, and the like.
The audio circuit 1460, speaker 1461, and microphone 1462 can provide an audio interface between a user and the terminal 1400. The audio circuit 1460 may transmit the electrical signal converted from the received audio data to the speaker 1461, and convert the electrical signal into an audio signal by the speaker 1461 for output. The terminal 1400 may be further provided with a volume button for adjusting the volume of the sound signal. On the other hand, the microphone 1462 converts collected sound signals into electrical signals, which are received by the audio circuit 1460 and converted into audio data, which are output to the RF circuit 1410 for transmission to, for example, another terminal or output to the memory 1420 for further processing. The microphone 1462 in this application can capture the voice of the user.
Wi-Fi belongs to short-distance wireless transmission technology, and the terminal 1400 can help a user to send and receive e-mails, browse webpages, access streaming media and the like through a Wi-Fi module 1470, and provides wireless broadband Internet access for the user.
The processor 1480, which is the control center of the terminal 1400, connects the various parts of the overall terminal using various interfaces and lines, and performs various functions of the terminal 1400 and processes data by running or executing software programs stored in the memory 1420 and calling data stored in the memory 1420. In some embodiments, the processor 1480 may include one or more processing units; the processor 1480 may also integrate an application processor, which primarily handles operating systems, user interfaces, and applications, etc., and a baseband processor, which primarily handles wireless communications. It will be appreciated that the baseband processor described above may not be integrated into the processor 1480. The processor 1480 may run an operating system, an application program, a user interface display, and a touch response, and the processing method described in the embodiments of the present application. In addition, the processor 1480 may be capable of performing the steps of the processor 1310 described above, with the processor 1480 being coupled to the input unit and display screen described above.
A bluetooth module 1481 for performing information interaction with other bluetooth devices having a bluetooth module through a bluetooth protocol. For example, the terminal 1400 may establish a bluetooth connection with a wearable electronic device (e.g., a smart watch) also equipped with a bluetooth module through the bluetooth module 1481, so as to perform data interaction.
Terminal 1400 also includes a power supply 1490 (e.g., a battery) that powers the various components. The power supply may be logically coupled to the processor 1480 through a power management system to manage charging, discharging, and power consumption through the power management system. Terminal 1400 may also be configured with power buttons for powering the terminal on and off, and for locking the screen.
Fig. 15 is a block diagram of a software configuration of terminal 1400 according to an embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 15, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 15, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the terminal 1400, such as management of call status (connected, disconnected, and the like).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay, without requiring user interaction; for example, notifying that a download has completed, or providing message alerts. The notification manager may also present notifications in the top status bar of the system in the form of a chart or scrolling text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information may be prompted in the status bar, an alert tone may sound, the terminal may vibrate, or an indicator light may flash.
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part consists of the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. It comprises at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following exemplarily describes the workflow of the software and hardware of the terminal 1400 in connection with a photo-capturing scene.
When the touch screen receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation) and stores the raw input event at the kernel layer. The application framework layer then obtains the raw input event from the kernel layer and identifies the control corresponding to the event. Taking the case where the touch operation is a tap and the corresponding control is the icon of the camera application as an example: the camera application calls the interface of the application framework layer to start the camera application, which in turn starts the camera driver by calling the kernel layer and captures a still image or a video through the camera.
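The kernel-to-framework handoff described above can be sketched in plain Java. The `RawInputEvent` and `FrameworkDispatcher` classes below are simplified, hypothetical stand-ins for the kernel layer's event record and the framework layer's hit-testing, not actual Android classes:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Stand-in for the raw input event the kernel layer produces from a touch:
// touch coordinates plus a timestamp.
class RawInputEvent {
    final int x, y;
    final long timestamp;
    RawInputEvent(int x, int y, long timestamp) {
        this.x = x; this.y = y; this.timestamp = timestamp;
    }
}

// Stand-in for the framework layer: it maps an event's coordinates to the
// control registered at that position.
class FrameworkDispatcher {
    // Registered controls: name -> bounding box {left, top, right, bottom}.
    private final Map<String, int[]> controls = new LinkedHashMap<>();

    void registerControl(String name, int left, int top, int right, int bottom) {
        controls.put(name, new int[] { left, top, right, bottom });
    }

    // Identify which control a raw event falls inside, as the framework
    // layer does after picking up the event from the kernel layer.
    String identifyControl(RawInputEvent e) {
        for (Map.Entry<String, int[]> c : controls.entrySet()) {
            int[] b = c.getValue();
            if (e.x >= b[0] && e.x < b[2] && e.y >= b[1] && e.y < b[3]) {
                return c.getKey();
            }
        }
        return null; // no control at that position
    }
}

public class InputPipelineSketch {
    public static void main(String[] args) {
        FrameworkDispatcher framework = new FrameworkDispatcher();
        framework.registerControl("camera_icon", 0, 0, 100, 100);
        framework.registerControl("map_widget", 0, 100, 100, 300);

        // The kernel layer turns a touch at (50, 150) into a raw input event;
        // the framework layer resolves it to the map widget.
        RawInputEvent event = new RawInputEvent(50, 150, System.currentTimeMillis());
        System.out.println(framework.identifyControl(event)); // map_widget
    }
}
```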
The terminal 1400 in this embodiment may be a mobile phone, a tablet computer, a wearable device, a notebook computer, a television, or the like.
An embodiment of the present invention further provides a computer program product which, when run on an electronic device, enables the electronic device to execute any one of the map control methods described above in the embodiments of the present invention.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (10)

1. A terminal, comprising a processor, an input unit, and a display screen, wherein:
the input unit is configured to receive a map display instruction input by a user;
the display screen is configured to display a map;
the processor is configured to: in response to a map display instruction input by the user through the input unit, display a map corresponding to the map display instruction in a target negative screen through the display screen;
and, after detecting that the user performs a control operation on the map in the target negative screen, move the map displayed in the target negative screen and/or adjust the size of the map displayed in the target negative screen according to the control operation.
2. The terminal of claim 1, wherein the processor is specifically configured to:
obtain context information of a map plug-in through a desktop application;
obtain, through the desktop application, a map layout file by using a reflection mechanism according to the context information of the map plug-in;
add, through the desktop application, a map identifier corresponding to the map layout file in the map plug-in to a display list of the target negative screen;
in response to a map display instruction of the user, instantiate, through the desktop application, the map layout file corresponding to the map identifier in the display list;
initialize the map layout file through the map plug-in to obtain a visualized map;
and display, through the desktop application, the map in the map layout file in the target negative screen via the display screen.
3. The terminal of claim 1, wherein the processor is specifically configured to:
after the desktop application detects that the user performs a sliding operation, determine a movement track of the user's touch on the touch screen, and control, through the map plug-in, the map displayed in the target negative screen to move according to the movement track;
and after the desktop application detects that the user performs a zooming operation, determine position information of the user's touches on the touch screen, and adjust, through the map plug-in, the size of the map displayed in the target negative screen according to the position information.
4. The terminal of claim 3, wherein the processor is specifically configured to:
send the movement track of the sliding operation to the map plug-in through the desktop application;
control, through the map plug-in, the map displayed in the target negative screen to move according to the movement track;
send the position information of the zooming operation to the map plug-in through the desktop application;
and adjust, through the map plug-in, the size of the map displayed in the target negative screen according to the position information.
5. The terminal of any of claims 1 to 4, wherein the processor is specifically configured to:
if a place can be resolved from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on the place; or
if no place can be resolved from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on the user's current location.
6. A map control method, applied to a terminal, the method comprising:
displaying, by the terminal in response to a map display instruction input by a user through an input unit, a map corresponding to the map display instruction in a target negative screen through a display screen;
and after detecting that the user performs a control operation on the map in the target negative screen, moving, by the terminal, the map displayed in the target negative screen and/or adjusting the size of the map displayed in the target negative screen according to the control operation.
7. The map control method according to claim 6, wherein before the terminal displays, in response to the map display instruction input by the user through the input unit, the map corresponding to the map display instruction in the target negative screen through the display screen, the method further comprises:
acquiring, by the terminal, context information of the map plug-in through the desktop application;
acquiring, by the terminal through the desktop application, a map layout file by using a reflection mechanism according to the context information of the map plug-in;
adding, by the terminal through the desktop application, a map identifier corresponding to the map layout file in the map plug-in to a display list of the target negative screen;
and the displaying, in response to the map display instruction input by the user through the input unit, of the map corresponding to the map display instruction in the target negative screen through the display screen comprises:
instantiating, by the terminal through the desktop application in response to the map display instruction input by the user through the input unit, the map layout file corresponding to the map identifier in the display list;
initializing, by the terminal, the map layout file through the map plug-in to obtain a visualized map;
and displaying, by the terminal through the desktop application, the map in the map layout file in the target negative screen via the display screen.
8. The map control method according to claim 6, wherein the moving, by the terminal, of the map displayed in the target negative screen according to the control operation comprises:
after detecting, through the desktop application, that the user performs a sliding operation, determining, by the terminal, a movement track of the user's touch on the touch screen, and controlling, through the map plug-in, the map displayed in the target negative screen to move according to the movement track;
and the adjusting, by the terminal, of the size of the map displayed in the target negative screen according to the control operation comprises:
after detecting, through the desktop application, that the user performs a zooming operation, determining, by the terminal, position information of the user's touches on the touch screen, and adjusting, through the map plug-in, the size of the map displayed in the target negative screen according to the position information.
9. The map control method according to claim 8, wherein the controlling, through the map plug-in, of the map displayed in the target negative screen to move according to the movement track comprises:
sending, by the terminal, the movement track of the sliding operation to the map plug-in through the desktop application;
and controlling, by the terminal through the map plug-in, the map displayed in the target negative screen to move according to the movement track;
and the adjusting, through the map plug-in, of the size of the map displayed in the target negative screen according to the position information comprises:
sending, by the terminal, the position information of the zooming operation to the map plug-in through the desktop application;
and adjusting, by the terminal through the map plug-in, the size of the map displayed in the target negative screen according to the position information.
10. The map control method according to any one of claims 6 to 9, wherein:
if a place can be resolved from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on the place; or
if no place can be resolved from the map display instruction, the map corresponding to the map display instruction is a map of a preset range centered on the user's current location.
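The reflection mechanism recited in claims 2 and 7 can be sketched with standard Java reflection: the desktop application does not link against the map plug-in at compile time, but loads a plug-in class by name and asks it for its layout. `MapPlugin` and `getMapLayout` below are hypothetical names used for illustration only, not part of any real plug-in API:

```java
import java.lang.reflect.Method;

// Stand-in for a class that would normally live inside the map plug-in.
class MapPlugin {
    public String getMapLayout() {
        return "map_layout.xml";
    }
}

public class ReflectionSketch {
    public static void main(String[] args) throws Exception {
        // The desktop application only knows the plug-in class name, which
        // stands in here for the plug-in's context information.
        Class<?> pluginClass = Class.forName("MapPlugin");
        Object plugin = pluginClass.getDeclaredConstructor().newInstance();

        // Reflectively look up and invoke the layout accessor to obtain
        // the map layout file without a compile-time dependency.
        Method getLayout = pluginClass.getMethod("getMapLayout");
        String layoutFile = (String) getLayout.invoke(plugin);
        System.out.println(layoutFile); // map_layout.xml
    }
}
```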
CN201911077345.6A 2019-11-06 2019-11-06 Terminal and map control method Pending CN110865765A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911077345.6A CN110865765A (en) 2019-11-06 2019-11-06 Terminal and map control method

Publications (1)

Publication Number Publication Date
CN110865765A true CN110865765A (en) 2020-03-06

Family

ID=69654444

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911077345.6A Pending CN110865765A (en) 2019-11-06 2019-11-06 Terminal and map control method

Country Status (1)

Country Link
CN (1) CN110865765A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111324255A (en) * 2020-03-17 2020-06-23 海信电子科技(深圳)有限公司 Application processing method based on double-screen terminal and communication terminal
CN113778310A (en) * 2021-08-05 2021-12-10 阿里巴巴新加坡控股有限公司 Cross-device control method and computer program product
CN113835571A (en) * 2021-09-17 2021-12-24 青岛海信移动通信技术股份有限公司 Terminal device, information display method and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104991947A (en) * 2015-07-13 2015-10-21 小米科技有限责任公司 Map display method and apparatus
CN106557319A (en) * 2016-11-17 2017-04-05 腾讯科技(深圳)有限公司 The method and apparatus that negative one screen loads object
CN109059934A (en) * 2018-09-28 2018-12-21 Oppo广东移动通信有限公司 Paths planning method, device, terminal and storage medium
CN109348417A (en) * 2018-09-28 2019-02-15 Oppo广东移动通信有限公司 Display methods, device, terminal and the storage medium of route
CN110019630A (en) * 2017-12-28 2019-07-16 上海擎感智能科技有限公司 The display methods and device of electronic map
CN110019621A (en) * 2017-12-08 2019-07-16 上海博泰悦臻网络技术服务有限公司 Geographical location sharing method, system, terminal and vehicle based on chat tool

Similar Documents

Publication Publication Date Title
US20230325067A1 (en) Cross-device object drag method and device
WO2021244443A1 (en) Split-screen display method, electronic device, and computer readable storage medium
US20170205894A1 (en) Method and device for switching tasks
WO2021083132A1 (en) Icon moving method and electronic device
CN110221885B (en) Interface display method and terminal equipment
US20150077362A1 (en) Terminal with fingerprint reader and method for processing user input through fingerprint reader
CN111597000B (en) Small window management method and terminal
CN109917995B (en) Object processing method and terminal equipment
WO2021129536A1 (en) Icon moving method and electronic device
CN111078076A (en) Application program switching method and electronic equipment
CN112114733B (en) Screen capturing and recording method, mobile terminal and computer storage medium
CN111225108A (en) Communication terminal and card display method of negative screen interface
CN111240546B (en) Split screen processing method and communication terminal
CN111367456A (en) Communication terminal and display method in multi-window mode
CN110703972B (en) File control method and electronic equipment
EP4280058A1 (en) Information display method and electronic device
CN110865765A (en) Terminal and map control method
CN111143299A (en) File management method and electronic equipment
CN111124219A (en) Communication terminal and card display method of negative screen interface
CN111176766A (en) Communication terminal and component display method
CN113741708A (en) Input method and electronic equipment
WO2020000276A1 (en) Method and terminal for controlling shortcut button
CN111324255B (en) Application processing method based on double-screen terminal and communication terminal
CN114546219A (en) Picture list processing method and related device
CN110888571B (en) File selection method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination