CN114579227A - Map rendering method, apparatus, program product, and storage medium - Google Patents


Info

Publication number
CN114579227A
CN114579227A
Authority
CN
China
Prior art keywords
filter
rendering
map
display
filter configuration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210067334.5A
Other languages
Chinese (zh)
Inventor
崔彤
卢景熙
田瀚
Current Assignee
Alibaba Innovation Co
Original Assignee
Alibaba Singapore Holdings Pte Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Singapore Holdings Pte Ltd filed Critical Alibaba Singapore Holdings Pte Ltd
Priority to CN202210067334.5A
Publication of CN114579227A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a map rendering method, a map rendering apparatus, a program product, and a storage medium. In an embodiment of the present application, a display correction entry may be provided so that a user can autonomously configure the filter parameters of a map interface. The terminal may display a filter configuration item in response to a display correction event; through the filter configuration item, the user can autonomously configure its filter parameters; and a map interface may then be rendered on the display screen of the terminal based on the user-configured filter parameters and the map data. With this rendering approach, the user can configure adapted map filter parameters for display screens of differing capabilities and adjust how the map interface appears on the display screen, so that the display effect of the map interface matches the display screen and the map display effect is improved.

Description

Map rendering method, apparatus, program product, and storage medium
Technical Field
The present application relates to the field of map rendering technologies, and in particular, to a map rendering method, apparatus, program product, and storage medium.
Background
With the popularization of intelligent terminals, travel application software (programs) is widely installed and used. By combining positioning technology with map data, travel application software can display a map interface on the display screen of a terminal and provide the user with various location-based services, including map search, navigation, and the like.
As is known, terminal models are numerous and varied, and the performance of terminal display screens is uneven, so a map rendered with the same filter parameters has inconsistent display effects on different display screens. How to achieve a good display effect on display screens of differing capabilities has therefore become a problem to be solved by those skilled in the art.
Disclosure of Invention
Aspects of the present application provide a map rendering method, apparatus, program product, and storage medium, which are used to configure adapted map filter parameters for display screens of differing capabilities, so as to improve the map display effect.
An embodiment of the present application provides a map rendering method, comprising the following steps:
displaying a filter configuration item in response to a display correction event;
in response to a configuration operation on the filter configuration item, acquiring a filter parameter corresponding to the filter configuration item;
and rendering a map interface on a display screen based on the filter parameter and the map data.
An embodiment of the present application further provides a terminal device, including a memory, a processor, and a display screen, wherein the memory is used for storing a computer program;
the processor is coupled to the memory and the display screen, and executes the computer program to perform the steps of the map rendering method described above.
An embodiment of the present application further provides a map rendering apparatus, comprising:
a display module, configured to display a filter configuration item in response to a display correction event;
an acquisition module, configured to acquire, in response to a configuration operation on the filter configuration item, the filter parameter corresponding to the filter configuration item;
and a rendering module, configured to render a map interface on a display screen based on the filter parameter and the map data.
An embodiment of the present application further provides a computer program product, including a computer program that, when executed by a processor, implements the steps of the map rendering method described above.
Embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the map rendering method described above.
In an embodiment of the present application, a display correction entry may be provided so that a user can autonomously configure the filter parameters of a map interface. The terminal may display a filter configuration item in response to a display correction event; through the filter configuration item, the user can autonomously configure its filter parameters; and a map interface may then be rendered on the display screen of the terminal based on the user-configured filter parameters and the map data. With this rendering approach, the user can configure adapted map filter parameters for display screens of differing capabilities and adjust how the map interface appears on the display screen, so that the display effect of the map interface matches the display screen and the map display effect is improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of a map rendering method according to an embodiment of the present application;
fig. 2 to fig. 6 are schematic diagrams of a display correction function call entry and a display correction page according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a map rendering apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only a few embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Terminal models are numerous and varied, and the performance of terminal display screens is uneven, so a map rendered with the same filter parameters has inconsistent display effects on different display screens. How to achieve a good display effect on display screens of differing capabilities has therefore become a problem to be solved by those skilled in the art.
To address the above technical problem, in some embodiments of the present application, a display correction entry may be provided so that a user can autonomously configure the filter parameters of a map interface. The terminal may display a filter configuration item in response to a display correction event; through the filter configuration item, the user can autonomously configure its filter parameters; and a map interface may then be rendered on the display screen of the terminal based on the user-configured filter parameters and the map data. With this rendering approach, the user can configure adapted map filter parameters for display screens of differing capabilities and adjust how the map interface appears on the display screen, so that the display effect of the map interface matches the display screen and the map display effect is improved.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
It should be noted that: like reference numerals refer to like objects in the following figures and embodiments, and thus, once an object is defined in one figure or embodiment, further discussion thereof is not required in subsequent figures and embodiments.
Fig. 1 is a schematic flowchart of a map rendering method according to an embodiment of the present application. As shown in fig. 1, the map rendering method includes:
101. In response to a display correction event, display a filter configuration item.
102. In response to a configuration operation on the filter configuration item, acquire the filter parameter corresponding to the filter configuration item.
103. Render a map interface on the display screen based on the filter parameter and the map data.
The map rendering method provided by this embodiment is applicable to a terminal, i.e., an electronic device that can provide travel services, such as positioning and navigation, to a user. The implementation form of the terminal is not limited in this embodiment: it may be a mobile phone, a tablet computer, a personal computer, a smart wearable device, or a dedicated navigation device such as an in-vehicle navigation apparatus. The terminal may have software installed, such as a travel Application (APP), that provides a map interface to the user. In this embodiment, the map interface may be any map interface, such as a map navigation guidance interface, a map location sharing interface, a route planning interface, or any other map interface in the travel application software. Of course, the terminal may also run installation-free applications, such as applets, which likewise can provide positioning, navigation, and a map interface to the user.
In practical applications, because terminals come in many varieties and models, their display screens also vary in capability. Many properties of a display screen, such as color saturation, color gamut, resolution, pixel density, refresh rate, contrast, and High Dynamic Range (HDR) support, affect its display effect and, in turn, how it is perceived by the human eye. As a result, a map rendered with the same filter parameters looks different on different display screens. In this embodiment, to enable display screens of differing capabilities to achieve a good display effect, a display correction entry may be provided so that the user can autonomously configure the filter parameters of the map interface. Accordingly, in step 101, the terminal may display a filter configuration item in response to a display correction event. A filter configuration item is a configuration item corresponding to a filter parameter. Filter parameters are parameters used for rendering a map interface, including but not limited to one or more of brightness, contrast, saturation, and color tendency. Here, "a plurality" means two or more.
Brightness, also referred to as lightness, describes how light or dark a color is: the higher the brightness, the brighter the display effect; the lower the brightness, the darker it is. Contrast measures the range of luminance levels between the brightest white and the darkest black in an image: a larger range means higher contrast, and a smaller range means lower contrast. Saturation, also referred to as purity, describes the vividness of a color: the higher the saturation, the more vivid the color; the lower the saturation, the duller it is. Color tendency is the color attribute the user prefers, expressed as a color name such as red, orange, yellow, green, cyan, blue, or purple.
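As a rough illustration of how such filter parameters could act on a rendered pixel, the sketch below applies brightness, contrast, and saturation adjustments to a single RGB color. The specific formulas (brightness as a linear offset, contrast scaled around the 128 midpoint, saturation as a mix with the luminance) are common image-processing conventions assumed for illustration and are not taken from the patent.

```python
def apply_filter(rgb, brightness=0.0, contrast=0.0, saturation=0.0):
    """Apply filter parameters in [-1.0, 1.0] to an (r, g, b) tuple in 0..255."""
    def clamp(x):
        return max(0.0, min(255.0, x))

    r, g, b = (float(c) for c in rgb)

    # Brightness: shift all channels up or down.
    r, g, b = (c + 255.0 * brightness for c in (r, g, b))

    # Contrast: widen or narrow the spread around the 128 midpoint.
    factor = 1.0 + contrast
    r, g, b = (128.0 + (c - 128.0) * factor for c in (r, g, b))

    # Saturation: mix each channel with the perceived luminance.
    lum = 0.299 * r + 0.587 * g + 0.114 * b
    s = 1.0 + saturation
    r, g, b = (lum + (c - lum) * s for c in (r, g, b))

    return tuple(int(round(clamp(c))) for c in (r, g, b))
```

For example, `apply_filter((100, 100, 100), brightness=0.2)` lifts a mid gray by 20% of the full range, while negative values darken it; a renderer would apply the same transform to every pixel of the map frame.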
The embodiment of the present application does not limit the specific form of the display correction event. In some embodiments, the terminal has a travel application installed that displays a display correction control. The user can touch the display correction control to enter the page containing the filter configuration item. Accordingly, the terminal may display the filter configuration item in response to a display correction event generated by a touch operation on the display correction control. In this case, the display correction event is implemented as a touch operation on the display correction control.
The embodiment of the present application does not limit the page on which the display correction control appears. For example, in some embodiments, display correction is a newly launched function, and the server may push it to the terminal. As shown in fig. 2, the terminal may receive a display correction function notification pushed by the server and, in response to a startup operation on the travel application, display the notification and a display correction control (such as "immediate optimization" in the upper half of fig. 2) in a floating frame on the home page of the travel application. If the user selects "immediate optimization", a display correction event generated by a touch operation on the display correction control is determined to have occurred. The terminal may then display the filter configuration item in response to that event; the display effect is shown in the lower half of fig. 2.
Alternatively, if the user selects "ignore", the display correction function notification may be stored in a message center. As shown in fig. 3, when the user later wants to perform display correction, the user may enter the message center to invoke the display correction function. Optionally, the terminal may display a display correction control in a floating frame on the message center page in response to an interactive operation on the display correction function notification in the message center ("immediate optimization" in the upper half of fig. 3). If the user selects "immediate optimization", a display correction event generated by a touch operation on the display correction control is determined to have occurred. The terminal may then display the filter configuration item; the display effect is shown in the lower half of fig. 3.
In other embodiments, display correction is a newly launched function that the server may push to the terminal. As shown in fig. 4, the terminal may place the associated information of the display correction function (e.g., its introduction information) on a new-function introduction page, through which the user can invoke the function. Optionally, the terminal may display a display correction control (e.g., "immediate optimization" in the upper half of fig. 4) on the new-function introduction page in response to an interactive operation on that associated information. The terminal may then display the filter configuration item in response to a display correction event generated by a touch operation on the display correction control; the display effect is shown in the lower half of fig. 4.
In still other embodiments, display correction is a newly launched function that the server may push to the terminal. As shown in fig. 5, the terminal may place the associated information of the display correction function (e.g., its introduction information) on a settings page of the travel application, through which the user can invoke the function. Optionally, the terminal may display a display correction page in response to an interactive operation on that associated information in the settings page; the display correction page includes a display correction control (e.g., "immediate optimization" in the upper half of fig. 5). The terminal may then display the filter configuration item in response to a display correction event generated by a touch operation on the display correction control; the display effect is shown in the lower half of fig. 5.
In other embodiments, the travel application may provide a voice interaction function. The user can issue a voice instruction to the terminal, such as "please enter display correction", to invoke the display correction function. Correspondingly, in response to a monitored voice interaction trigger event, the terminal may turn on a microphone to pick up sound; perform voice recognition on the sound data picked up by the microphone to obtain a voice recognition result; and, if the voice recognition result reflects a display correction requirement, determine that a voice instruction to start the display correction function has been received. The terminal may then display the filter configuration item in response to the display correction event generated upon receiving that voice instruction. In this case, the display correction event is implemented as the receipt of a voice instruction to start the display correction function.
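The voice path above can be sketched as a simple intent check on the speech-recognition result: if the recognized text "reflects the display correction requirement", it is treated as a display correction event. The phrase list and all function names below are illustrative assumptions, not details from the patent.

```python
# Hypothetical phrases that would indicate a display correction intent.
DISPLAY_CORRECTION_PHRASES = (
    "display correction",
    "correct the display",
    "optimize the display",
)

def reflects_display_correction(recognition_result: str) -> bool:
    """Return True if the recognized text expresses a display correction intent."""
    text = recognition_result.lower()
    return any(phrase in text for phrase in DISPLAY_CORRECTION_PHRASES)

def on_voice_result(recognition_result: str, show_filter_items) -> bool:
    """If the result reflects the requirement, treat it as a display
    correction event and display the filter configuration items."""
    if reflects_display_correction(recognition_result):
        show_filter_items()  # render the filter configuration item UI
        return True
    return False
```

A production system would likely use a trained intent classifier rather than substring matching; this sketch only shows where the decision sits in the flow.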
Further, after the terminal displays the filter configuration item, the user may autonomously configure its filter parameter through the configuration item. Accordingly, in step 102, the terminal may acquire the filter parameter corresponding to the filter configuration item in response to a configuration operation on the filter configuration item.
In some embodiments, different rendering level options may be set for each filter configuration item. For example, as shown in figs. 2-5, brightness may offer a default option, a brighten option, and a darken option, while contrast may offer a default option, a clear option, and a soft option. Figs. 2-5 illustrate only the two filter configuration items of brightness and contrast and are not limiting. For the rendering level options of a filter configuration item, the filter parameter corresponding to each option can be configured in advance. For example, for brightness, the default option may correspond to a brightness of 0%, the brighten option to +20%, and the darken option to -20%. By selecting a rendering level option of a filter configuration item, the user achieves the corresponding rendering effect. Accordingly, an optional implementation of step 102 is: in response to an interactive operation on a rendering level option of the filter configuration item, the terminal determines the target rendering level selected by that operation, and obtains the filter parameter corresponding to the target rendering level from the preset correspondence between rendering levels and filter parameters, as the filter parameter of the filter configuration item.
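The "correspondence between rendering levels and filter parameters" described above amounts to a preset lookup table. The sketch below uses the brightness values from the text (0%, +20%, -20%); the contrast values and all names are illustrative assumptions.

```python
# Preset correspondence between rendering levels and filter parameters.
# Brightness values follow the example in the text; contrast values are
# assumed for illustration.
RENDER_LEVELS = {
    "brightness": {"default": 0.00, "brighten": 0.20, "darken": -0.20},
    "contrast":   {"default": 0.00, "clear": 0.15, "soft": -0.15},
}

def filter_param_for_level(config_item: str, target_level: str) -> float:
    """Look up the filter parameter pre-configured for the selected level."""
    return RENDER_LEVELS[config_item][target_level]
```

When the user taps a rendering level option, the terminal resolves the target level and then calls something like `filter_param_for_level("brightness", "brighten")` to obtain the parameter used in step 103.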
In the embodiment of the present application, the interactive operation may be a touch interaction, a voice interaction, or a combination of the two. As shown in figs. 2-5, the user may touch a rendering level option of the filter configuration item to select the corresponding target rendering level. In response to the touch interaction on a rendering level option, the terminal determines the rendering level corresponding to the selected option as the target rendering level.
In some embodiments, the interactive operation may also be a voice interaction, through which the user selects the target rendering level of a filter configuration item. For example, the terminal may play a prompt tone, "please correct the brightness", and the user may issue a voice instruction in response, such as "please brighten the brightness". As another example, the terminal may play "please correct the brightness: for default say 1, brighten say 2, darken say 3", and the user may issue a voice instruction containing the number of the selected level, such as "2". Correspondingly, the terminal may turn on the microphone to pick up sound in response to the voice interaction generated for the filter configuration item, and parse the rendering level corresponding to the filter configuration item from the sound data collected by the microphone as the target rendering level.
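Parsing the target rendering level from the recognized voice data, following the numbered prompt described above, can be sketched as below. The digit-to-level mapping mirrors the example prompt; the fallback order (level name first, then digit) is an assumption.

```python
# Mapping from spoken digits to rendering levels, per the example prompt
# ("default say 1, brighten say 2, darken say 3").
LEVEL_BY_DIGIT = {"1": "default", "2": "brighten", "3": "darken"}
LEVEL_NAMES = ("default", "brighten", "darken")

def parse_target_level(transcript: str):
    """Return the rendering level named or numbered in the transcript, else None."""
    text = transcript.lower()
    # Prefer an explicit level name in the utterance.
    for name in LEVEL_NAMES:
        if name in text:
            return name
    # Otherwise accept a digit corresponding to the prompted options.
    for ch in text:
        if ch in LEVEL_BY_DIGIT:
            return LEVEL_BY_DIGIT[ch]
    return None
```

Returning `None` lets the caller re-prompt the user when the utterance matches neither a level name nor a prompted digit.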
In addition to adjusting filter parameters through preset rendering levels for each filter configuration item, in some embodiments the user may also set a specific value for the filter parameter of a filter configuration item; for example, the user may set the brightness value directly. The user can set such specific values either through touch interaction with the terminal or through voice interaction. Exemplary ways for the user to autonomously set specific filter parameter values are described below.
Embodiment 1: the travel application may provide a parameter configuration page through which the user can manually configure filter parameters. Accordingly, as shown in fig. 6, the terminal may display the parameter configuration page in response to a touch interaction on the filter configuration item. The user configures the filter parameter through this page, and the terminal determines the filter parameter provided by the configuration operation on the page as the filter parameter corresponding to the filter configuration item. Fig. 6 illustrates only the case where the filter configuration item is brightness and the user-configured brightness is "-10%", but this is not limiting. Likewise, fig. 6 illustrates entering the display correction page only via the home-page push notification, but this is not limiting either; the display correction page may also be entered in any of the ways described above for figs. 3-5.
Embodiment 2: the travel application may provide a voice-based filter parameter configuration function, so that the user configures filter parameters through voice interaction with the terminal. For example, the terminal may play a prompt tone, "please correct the brightness", and the user may issue a voice instruction in response, such as "please adjust the brightness to -10%". Correspondingly, the terminal may turn on the microphone to pick up sound in response to the voice interaction generated for the filter configuration item, and parse the filter parameter from the sound data picked up by the microphone as the filter parameter corresponding to the filter configuration item.
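One way to "parse the filter parameter from the sound data" in this embodiment is to extract a signed percentage such as "-10%" from the recognized text. Regex extraction is an assumption about the implementation; the patent does not specify how the value is parsed.

```python
import re

# Matches a signed integer or decimal followed by a percent sign, e.g. "-10%".
PERCENT_RE = re.compile(r"(-?\d+(?:\.\d+)?)\s*%")

def parse_filter_value(transcript: str):
    """Return the percentage in the transcript as a fraction, or None."""
    m = PERCENT_RE.search(transcript)
    if m is None:
        return None
    return float(m.group(1)) / 100.0
```

For example, "please adjust the brightness to -10%" yields -0.10, which can then be fed to the renderer as the brightness parameter.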
Further, after the filter parameter corresponding to the filter configuration item is obtained in step 102, a map interface may be rendered on the display screen in step 103 based on that filter parameter and the map data. The map data may include Point of Interest (POI) data, the rendering scale, the rendering camera pitch angle, and the like.
Optionally, the terminal may request the map data of the map interface from the server in response to a trigger operation on the map interface, and receive the map data returned by the server. Then, in step 103, the terminal may render the map to be displayed based on the map data, and render a map interface containing that map on the display screen according to the filter parameter acquired in step 102.
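The end-to-end flow of step 103 as just described can be sketched as a small pipeline: request map data for the interface, render the base map, then apply the user-configured filter parameters to the result. The server call and renderer are stubbed out as callables, and all names are illustrative assumptions.

```python
def render_map_interface(fetch_map_data, render_map, apply_filters,
                         interface_id, filter_params):
    """Render a map interface with user-configured filter parameters.

    fetch_map_data(interface_id) -> map data (POIs, scale, camera pitch, ...)
    render_map(map_data)         -> a frame for the display screen
    apply_filters(frame, params) -> the corrected frame actually displayed
    """
    map_data = fetch_map_data(interface_id)     # request data from the server
    frame = render_map(map_data)                # draw the map to be displayed
    return apply_filters(frame, filter_params)  # apply brightness/contrast/...
```

Injecting the three stages as callables keeps the ordering explicit (data, base map, filter pass) while leaving the actual networking and graphics code to the platform.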
In this embodiment, a display correction entry may be provided so that the user can autonomously configure the filter parameters of the map interface. The terminal may display a filter configuration item in response to a display correction event; through the filter configuration item, the user can autonomously configure its filter parameters; and a map interface may then be rendered on the display screen of the terminal based on the user-configured filter parameters and the map data. With this rendering approach, the user can configure adapted map filter parameters for display screens of differing capabilities and adjust how the map interface appears on the display screen, so that the display effect of the map interface matches the display screen and the map display effect is improved.
It should be noted that the steps of the methods provided in the above embodiments may all be executed by the same device, or different devices may execute different steps. For example, the execution subject of steps 101 and 102 may be device A; alternatively, the execution subject of step 101 may be device A and that of step 102 may be device B; and so on.
In addition, some of the flows described in the above embodiments and drawings include a plurality of operations in a specific order, but it should be clearly understood that these operations may be executed out of the order in which they appear herein or in parallel. Sequence numbers such as 101 and 102 merely distinguish the operations and do not themselves represent any execution order. The flows may also include more or fewer operations, which may be executed sequentially or in parallel.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the map rendering method described above.
An embodiment of the present application further provides a computer program product, including a computer program that, when executed by a processor, implements the steps of the map rendering method described above. The computer program product may be travel application software: for example, map application software, or other application software that integrates map capabilities, such as ride-hailing application software or life-service application software (e.g., take-away delivery).
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 7, the terminal device includes: a memory 70a, a processor 70b, and a display screen 70c. The memory 70a is used for storing a computer program.
The processor 70b is coupled to the memory 70a and the display screen 70c and executes the computer program to: display a filter configuration item on the display screen 70c in response to a display correction event; obtain, in response to a configuration operation for the filter configuration item, the filter parameters corresponding to the filter configuration item; and render a map interface on the display screen 70c based on the filter parameters and the map data.
In some embodiments, when displaying the filter configuration item, the processor 70b is specifically configured to: display the filter configuration item on the display screen 70c in response to a display correction event generated by a touch operation on the display correction control.
In other embodiments, the processor 70b is further configured to: start the microphone 70d to pick up sound in response to a monitored voice interaction triggering event; perform voice recognition on first sound data picked up by the microphone 70d to obtain a voice recognition result; and, if the voice recognition result reflects a display correction requirement, determine that a voice instruction for starting the display correction function has been received. Accordingly, when displaying the filter configuration item, the processor 70b is specifically configured to: display the filter configuration item on the display screen 70c in response to a display correction event generated by receiving the voice instruction.
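One simple way to decide whether a voice recognition result "reflects the display correction requirement" is keyword matching on the recognized text. The patent does not specify the mechanism; the keyword list and function below are purely illustrative assumptions.

```python
# Hypothetical keyword check; the patent does not specify how the recognition
# result is mapped to a display correction requirement.
DISPLAY_CORRECTION_KEYWORDS = ("display correction", "adjust screen", "screen color")

def reflects_display_correction(recognition_result: str) -> bool:
    """Return True if the recognized text suggests a display correction request."""
    text = recognition_result.lower()
    return any(keyword in text for keyword in DISPLAY_CORRECTION_KEYWORDS)
```

A production system would more likely use an intent classifier in the speech pipeline, but the contract is the same: recognized text in, boolean "start display correction" decision out.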
In some embodiments, when obtaining the filter parameters corresponding to the filter configuration item, the processor 70b is specifically configured to: determine, in response to an interactive operation on the rendering level option corresponding to the filter configuration item, the target rendering level selected by the interactive operation; and obtain, from a preset correspondence between rendering levels and filter parameters, the filter parameters corresponding to the target rendering level as the filter parameters corresponding to the filter configuration item.
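The "correspondence between rendering levels and filter parameters" can be modeled as a lookup table keyed by level. A minimal sketch follows; the level names, parameter values, and fallback behavior are assumptions not given in the patent.

```python
# Assumed correspondence table between rendering levels and filter parameters.
LEVEL_TO_FILTER = {
    "low":    {"brightness": 0.9, "saturation": 0.8},
    "medium": {"brightness": 1.0, "saturation": 1.0},
    "high":   {"brightness": 1.1, "saturation": 1.2},
}

def filter_params_for_level(target_level, table=LEVEL_TO_FILTER):
    """Look up the filter parameters for the selected target rendering level."""
    # Falling back to the "medium" defaults for an unknown level is an assumption.
    return table.get(target_level, table["medium"])
```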
Optionally, when determining the target rendering level selected by the interactive operation, the processor 70b is specifically configured to: determine, in response to a touch interaction on a rendering level option of the filter configuration item, the rendering level corresponding to the selected rendering level option as the target rendering level; or, in response to a voice interaction generated for the filter configuration item, start the microphone 70d to pick up sound and parse the rendering level corresponding to the filter configuration item from second sound data picked up by the microphone 70d as the target rendering level.
In still other embodiments, when obtaining the filter parameters corresponding to the filter configuration item, the processor 70b is specifically configured to: display a parameter configuration page on the display screen 70c in response to a touch interaction on the filter configuration item, and determine the filter parameters provided by a configuration operation on the parameter configuration page as the filter parameters corresponding to the filter configuration item; or, in response to a voice interaction generated for the filter configuration item, start the microphone 70d to pick up sound and parse filter parameters from third sound data picked up by the microphone 70d as the filter parameters corresponding to the filter configuration item.
Optionally, when rendering the map interface on the display screen 70c, the processor 70b is specifically configured to: render, based on the map data, the map to be displayed on the display screen 70c; and render, according to the filter parameters, a map interface containing the map on the display screen.
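Applying the configured filter parameters to a rendered map can be pictured as a per-pixel color adjustment. The sketch below applies contrast (pivoted around mid-gray) and brightness to one RGB pixel; the formula and parameter semantics are illustrative assumptions, not taken from the patent.

```python
# Illustrative per-pixel filter; the contrast/brightness formula is an
# assumption, not specified by the patent.
def apply_filter(rgb, brightness=1.0, contrast=1.0):
    """Apply contrast (around mid-gray 128) and brightness to one RGB pixel."""
    def adjust(channel):
        value = (channel - 128) * contrast + 128  # contrast pivot at mid-gray
        value *= brightness
        return max(0, min(255, round(value)))     # clamp to the 8-bit range
    r, g, b = rgb
    return (adjust(r), adjust(g), adjust(b))
```

For a full map interface, the same adjustment would run for every pixel of the rendered map, typically on the GPU as a post-processing (filter) pass after the map itself is drawn.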
In some optional embodiments, as shown in fig. 7, the terminal device may further include: a communication component 70e, a power component 70f, an audio component 70g, and the like. Fig. 7 schematically shows only some of the components; this does not mean that the terminal device must include all of the components shown in fig. 7, nor that it can include only those components.
The terminal device provided by this embodiment can provide a display correction entrance through which the user can autonomously configure the filter parameters of the map interface. On the terminal side, a filter configuration item may be displayed in response to a display correction event; the user can configure filter parameters through the filter configuration item; and a map interface may be rendered on the terminal's display screen based on the user-configured filter parameters and the map data. With this rendering approach, the user can configure adapted map filter parameters for display screens with different characteristics and adjust the display effect of the map interface on the display screen, so that the display effect of the map interface matches the display screen and the map display effect is improved.
Fig. 8 is a schematic structural diagram of a map rendering apparatus according to an embodiment of the present application. The map rendering apparatus may be implemented as any device or software that can execute the map rendering method provided by the above embodiments. As shown in fig. 8, the map rendering apparatus includes: a display module 80a, an acquisition module 80b, and a rendering module 80c.
The display module 80a is configured to display a filter configuration item in response to a display correction event.
The acquisition module 80b is configured to obtain, in response to a configuration operation for the filter configuration item, the filter parameters corresponding to the filter configuration item.
The rendering module 80c is configured to render a map interface on the display screen based on the filter parameters and the map data.
In some embodiments, the display module 80a is specifically configured to: display the filter configuration item in response to a display correction event generated by a touch operation on the display correction control.
In other embodiments, as shown in fig. 8, the map rendering apparatus may further include: an activation module 80d and a speech recognition module 80e. The activation module 80d is configured to: start a microphone to pick up sound in response to a monitored voice interaction triggering event. The speech recognition module 80e is configured to: perform voice recognition on first sound data picked up by the microphone to obtain a voice recognition result; and, if the voice recognition result reflects a display correction requirement, determine that a voice instruction for starting the display correction function has been received. Accordingly, the display module 80a is specifically configured to: display the filter configuration item in response to a display correction event generated by receiving the voice instruction.
In other embodiments, the acquisition module 80b is specifically configured to: determine, in response to an interactive operation on the rendering level option corresponding to the filter configuration item, the target rendering level selected by the interactive operation; and obtain, from a preset correspondence between rendering levels and filter parameters, the filter parameters corresponding to the target rendering level as the filter parameters corresponding to the filter configuration item.
Optionally, when determining the target rendering level selected by the interactive operation, the acquisition module 80b is specifically configured to: determine, in response to a touch interaction on a rendering level option of the filter configuration item, the rendering level corresponding to the selected rendering level option as the target rendering level; or, in response to a voice interaction generated for the filter configuration item, start a microphone to pick up sound and parse the rendering level corresponding to the filter configuration item from second sound data picked up by the microphone as the target rendering level.
In still other embodiments, when obtaining the filter parameters corresponding to the filter configuration item, the acquisition module 80b is specifically configured to: display a parameter configuration page in response to a touch interaction on the filter configuration item, and determine the filter parameters provided by a configuration operation on the parameter configuration page as the filter parameters corresponding to the filter configuration item; or, in response to a voice interaction generated for the filter configuration item, start a microphone to pick up sound and parse filter parameters from third sound data picked up by the microphone as the filter parameters corresponding to the filter configuration item.
In some other embodiments, the rendering module 80c is specifically configured to: render, based on the map data, the map to be displayed on the display screen; and render, according to the filter parameters, a map interface containing the map on the display screen.
The map rendering apparatus provided by this embodiment can provide a display correction entrance through which the user can autonomously configure the filter parameters of the map interface. On the terminal side, a filter configuration item may be displayed in response to a display correction event; the user can configure filter parameters through the filter configuration item; and a map interface may be rendered on the terminal's display screen based on the user-configured filter parameters and the map data. With this rendering approach, the user can configure adapted map filter parameters for display screens with different characteristics and adjust the display effect of the map interface on the display screen, so that the display effect of the map interface matches the display screen and the map display effect is improved.
In the embodiments of the present application, the memory is used to store the computer program and may be configured to store various other data to support operations on the device on which it is located. The processor may execute the computer program stored in the memory to implement the corresponding control logic. The memory may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
In the embodiments of the present application, the processor may be any hardware processing device that can execute the above method logic. Alternatively, the processor may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a Microcontroller Unit (MCU); a programmable device such as a Field-Programmable Gate Array (FPGA), a Programmable Array Logic (PAL) device, a Generic Array Logic (GAL) device, or a Complex Programmable Logic Device (CPLD); an Advanced RISC Machine (ARM) processor; or a System on Chip (SoC); but is not limited thereto.
In the embodiments of the present application, the communication component is configured to facilitate wired or wireless communication between the device in which it is located and other devices. That device can access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may also be implemented based on Near Field Communication (NFC), Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-Wideband (UWB), Bluetooth (BT), or other technologies.
In the embodiments of the present application, the display component may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display component includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. A touch sensor may not only sense the boundary of a touch or slide action but also detect the duration and pressure associated with the touch or slide operation.
In embodiments of the present application, a power supply component is configured to provide power to various components of the device in which it is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
In the embodiments of the present application, the audio component may be configured to output and/or input audio signals. For example, the audio component includes a microphone (MIC) configured to receive external audio signals when the device in which it is located is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode. A received audio signal may further be stored in the memory or transmitted via the communication component. In some embodiments, the audio component further includes a speaker for outputting audio signals. For devices with voice interaction functionality, for example, voice interaction with the user can be enabled through the audio component.
It should be noted that the descriptions of "first", "second", etc. in this document are used to distinguish different messages, devices, modules, and the like; they do not represent a sequential order, nor do they require that "first" and "second" be of different types.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, can implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A map rendering method, comprising:
displaying a filter configuration item in response to a display correction event;
responding to the configuration operation aiming at the filter configuration item, and acquiring filter parameters corresponding to the filter configuration item;
rendering a map interface on a display screen based on the filter parameters and the map data.
2. The method of claim 1, wherein the displaying filter configuration items in response to a display correction event comprises:
and responding to a display correction event generated by touch operation aiming at the display correction control, and displaying the filter configuration item.
3. The method of claim 1, further comprising:
starting a microphone to pick up sound in response to the monitored voice interaction triggering event;
performing voice recognition on first sound data picked up by the microphone to obtain a voice recognition result;
if the voice recognition result reflects the display correction requirement, determining that a voice instruction for starting the display correction function is received;
the displaying filter configuration items in response to a display correction event, comprising:
displaying the filter configuration item in response to receiving a display correction event generated by the voice instruction.
4. The method of claim 1, wherein the obtaining filter parameters corresponding to the filter configuration items in response to the configuration operations for the filter configuration items comprises:
in response to the interactive operation aiming at the rendering level option corresponding to the filter configuration item, determining a target rendering level selected by the interactive operation;
and acquiring filter parameters corresponding to the target rendering level from the corresponding relation between the set rendering level and the filter parameters as the filter parameters corresponding to the filter configuration items.
5. The method of claim 4, wherein the determining a target rendering level selected by the interoperation in response to the interoperation for the rendering level option of the filter configuration item comprises:
in response to the touch interaction operation aiming at the rendering level option of the filter configuration item, determining the rendering level corresponding to the rendering level option selected by the touch operation as the target rendering level;
or,
in response to the voice interaction generated for the filter configuration item, starting a microphone to pick up sound;
and analyzing a rendering grade corresponding to the filter configuration item from the second sound data picked up by the microphone to serve as the target rendering grade.
6. The method of claim 1, wherein the obtaining filter parameters corresponding to the filter configuration items in response to the configuration operations for the filter configuration items comprises:
responding to the touch interactive operation aiming at the filter configuration item, and displaying a parameter configuration page;
determining filter parameters provided by the configuration operation of the parameter configuration page, and setting the filter parameters as filter parameters corresponding to the filter configuration items;
or,
in response to the voice interaction generated for the filter configuration item, starting a microphone to pick up sound;
and analyzing filter parameters from the third sound data picked up by the microphone as the filter parameters corresponding to the filter configuration items.
7. The method of any of claims 1-6, wherein the rendering a map interface on a display screen based on the filter parameters and map data comprises:
rendering a map to be displayed on the display screen based on the map data;
and rendering a map interface containing the map on a display screen according to the filter parameters.
8. A map rendering apparatus, comprising:
a display module for displaying a filter configuration item in response to a display correction event;
the acquisition module is used for responding to the configuration operation aiming at the filter configuration item and acquiring the filter parameter corresponding to the filter configuration item;
and the rendering module is used for rendering a map interface on a display screen based on the filter parameters and the map data.
9. A computer program product, comprising: a computer program; the computer program is executed by a processor to implement the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the method of any one of claims 1-7.
CN202210067334.5A 2022-01-20 2022-01-20 Map rendering method, apparatus, program product, and storage medium Pending CN114579227A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210067334.5A CN114579227A (en) 2022-01-20 2022-01-20 Map rendering method, apparatus, program product, and storage medium

Publications (1)

Publication Number Publication Date
CN114579227A 2022-06-03

Family

ID=81772064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210067334.5A Pending CN114579227A (en) 2022-01-20 2022-01-20 Map rendering method, apparatus, program product, and storage medium

Country Status (1)

Country Link
CN (1) CN114579227A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101576392A (en) * 2009-06-12 2009-11-11 深圳市凯立德计算机系统技术有限公司 Navigation system and display color setting method thereof
CN106503067A (en) * 2016-09-29 2017-03-15 百度在线网络技术(北京)有限公司 The exhibiting method and device of electronic chart
CN109388467A (en) * 2018-09-30 2019-02-26 百度在线网络技术(北京)有限公司 Map information display method, device, computer equipment and storage medium
CN111338726A (en) * 2020-02-18 2020-06-26 北京梧桐车联科技有限责任公司 Display interface adjusting method and device and computer storage medium
CN112492400A (en) * 2019-09-12 2021-03-12 阿里巴巴集团控股有限公司 Interaction method, device, equipment, communication method and shooting method
CN113094142A (en) * 2021-04-23 2021-07-09 海信视像科技股份有限公司 Page display method and display equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240313

Address after: # 03-06, Lai Zan Da Building 1, 51 Belarusian Road, Singapore

Applicant after: Alibaba Innovation Co.

Country or region after: Singapore

Address before: Room 01, 45th Floor, AXA Building, 8 Shanton Road, Singapore

Applicant before: Alibaba Singapore Holdings Ltd.

Country or region before: Singapore