CN112433651B - Region identification method, device, storage medium and apparatus

Region identification method, device, storage medium and apparatus

Info

Publication number
CN112433651B
Authority
CN
China
Prior art keywords
gray value
target
area
mapping table
historical
Prior art date
Legal status
Active
Application number
CN202011274838.1A
Other languages
Chinese (zh)
Other versions
CN112433651A (en)
Inventor
张薇
刘春�
Current Assignee
360 Digital Security Technology Group Co Ltd
Original Assignee
Beijing Hongteng Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Hongteng Intelligent Technology Co ltd
Priority to CN202011274838.1A
Publication of CN112433651A
Application granted
Publication of CN112433651B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/903 - Querying
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/20 - Drawing from basic elements, e.g. lines or circles
    • G06T11/206 - Drawing of charts or graphs

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to the technical field of data processing and discloses a region identification method, device, storage medium, and apparatus. When an interactive instruction input by a user on a preset interface is received, the method determines the user's interaction position on the preset interface from the instruction, determines the corresponding target gray value from that position, searches a preset mapping table for the target area identifier corresponding to the target gray value, and determines the corresponding target area from that identifier. The preset interface contains a plurality of different areas, and the preset mapping table records the correspondence between the area identifier of each area in the preset interface and its gray value. The target area corresponding to the user's interaction position can therefore be obtained through the preset mapping table, and the area the user needs is found without complex boundary-point calculation, which improves the efficiency of region identification.

Description

Region identification method, device, storage medium and apparatus
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a region identification method, device, storage medium, and apparatus.
Background
In map data visualization applications, highlighting the corresponding area or displaying information according to the user's current contact position is a key way of improving the user experience. Because each area in a map is an irregular polygon, determining which area a contact point belongs to is a complicated problem.
In general, ray casting is a common method for testing whether a point lies inside a polygon: a ray is cast from the current interaction point and the number of intersections with the polygon boundary is counted; an odd count means the point is inside the polygon, and an even count means it is outside. However, visualization applications involve real-time user interaction with frequent detection and judgment, and a map may contain a large number of areas, so determining which specific area a point belongs to with this method becomes even more complicated.
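For reference, a minimal sketch of the ray-casting test described above, assuming a simple polygon without holes; the function and type names are illustrative and do not come from the patent.

    // Ray-casting point-in-polygon test: cast a horizontal ray from the point and
    // count boundary crossings; an odd count means the point lies inside.
    type Point = { x: number; y: number };

    function pointInPolygon(p: Point, polygon: Point[]): boolean {
      let inside = false;
      for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
        const a = polygon[i];
        const b = polygon[j];
        // Does edge (a, b) straddle the ray's y level, with the crossing to the right of p?
        const crosses =
          a.y > p.y !== b.y > p.y &&
          p.x < ((b.x - a.x) * (p.y - a.y)) / (b.y - a.y) + a.x;
        if (crosses) inside = !inside;
      }
      return inside;
    }

    // Example: a unit square and a point inside it.
    const square: Point[] = [
      { x: 0, y: 0 },
      { x: 1, y: 0 },
      { x: 1, y: 1 },
      { x: 0, y: 1 },
    ];
    console.log(pointInPolygon({ x: 0.5, y: 0.5 }, square)); // true

Each such test walks every polygon edge, which is what makes frequent interaction over many irregular regions costly.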
Disclosure of Invention
The main object of the invention is to provide a region identification method, device, storage medium, and apparatus, with the aim of improving the efficiency of region identification.
In order to achieve the above object, the present invention provides a region identification method, including the steps of:
when an interactive instruction input by a user based on a preset interface is received, determining an interactive position of the user on the preset interface according to the interactive instruction;
determining a corresponding target gray value according to the interaction position;
searching a target area identifier corresponding to the target gray value in a preset mapping table, and determining a corresponding target area according to the target area identifier, wherein the preset interface has a plurality of different areas, and the preset mapping table comprises the corresponding relation between the area identifier corresponding to each area in the preset interface and the gray value.
Optionally, before the target area identifier corresponding to the target gray value is searched in a preset mapping table and the corresponding target area is determined according to the target area identifier, the method further includes:
defining a history identifier for the history area;
creating a corresponding relation between the history identification and the history gray value;
and generating a preset mapping table according to the corresponding relation.
Optionally, before the target area identifier corresponding to the target gray value is searched in a preset mapping table and the corresponding target area is determined according to the target area identifier, the method further includes:
judging whether the preset mapping table has the target gray value or not;
and when the target gray value is stored in the preset mapping table, the step of searching a target area identifier corresponding to the target gray value in the preset mapping table and determining a corresponding target area according to the target area identifier is executed.
Optionally, after determining whether the preset mapping table has the target grayscale value, the method further includes:
judging whether the preset mapping table has the target gray value or not;
when the target gray value is not stored in the preset mapping table, acquiring a reference area corresponding to the target gray value;
setting a reference area identifier for the reference area, and determining the corresponding relation between the reference area identifier and the target gray value;
updating the preset mapping table according to the corresponding relation between the reference area identifier and the target gray value to obtain an updated preset mapping table;
the searching for the target area identifier corresponding to the target gray value in a preset mapping table and determining the corresponding target area according to the target area identifier includes:
and searching a target area identifier corresponding to the target gray value in the updated preset mapping table, and determining a corresponding target area according to the target area identifier.
Optionally, the determining a corresponding target gray value according to the interaction position includes:
determining an interaction coordinate on the preset interface according to the interaction position;
converting the interaction coordinate into a two-dimensional coordinate on a canvas;
and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas.
Optionally, before converting the interaction coordinate into a two-dimensional coordinate on a canvas, the method further includes:
acquiring a historical two-dimensional coordinate, defining a gray value according to the historical two-dimensional coordinate, and acquiring a gray value corresponding to the historical two-dimensional coordinate;
and making a gray scale map according to the gray scale value corresponding to the historical two-dimensional coordinate, and drawing a canvas according to the gray scale map.
Optionally, before the making of the grayscale map according to the grayscale value corresponding to the historical two-dimensional coordinate and the drawing of the canvas according to the grayscale map, the method further includes:
carrying out similarity verification on the gray value corresponding to the historical two-dimensional coordinate to obtain a similarity verification result;
adjusting the gray value corresponding to the historical two-dimensional coordinate according to the similarity check result to obtain the gray value corresponding to the adjusted historical two-dimensional coordinate;
the making of the gray scale map according to the gray scale value corresponding to the historical two-dimensional coordinate and the drawing of the canvas according to the gray scale map comprise:
and making a gray scale map according to the gray scale value corresponding to the adjusted historical two-dimensional coordinate, and drawing a canvas according to the gray scale map.
Optionally, the performing similarity check on the gray value corresponding to the historical two-dimensional coordinate to obtain a similarity check result includes:
obtaining a historical two-dimensional coordinate with a shorter distance according to the coordinate value of the historical two-dimensional coordinate;
acquiring a first historical two-dimensional coordinate and a second historical two-dimensional coordinate in the historical two-dimensional coordinates which are close to each other;
obtaining a corresponding first gray value according to the first historical two-dimensional coordinate, and obtaining a corresponding second gray value according to the second historical two-dimensional coordinate;
and carrying out similarity verification on the first gray value and the second gray value to obtain a similarity verification result.
Optionally, the performing similarity verification on the first gray value and the second gray value to obtain a similarity verification result includes:
extracting each numerical value in the first gray value and the second gray value and corresponding digital information;
comparing the numerical values corresponding to the same digital information to judge whether the numerical values corresponding to the same digital information are the same;
and obtaining a similarity checking result according to the same quantity.
Optionally, the adjusting the gray value corresponding to the historical two-dimensional coordinate according to the similarity check result to obtain the gray value corresponding to the adjusted historical two-dimensional coordinate includes:
when the same number reaches a preset number, obtaining gray values which belong to the same digital information and have the same numerical value;
and adjusting the gray values which belong to the same digital information and have the same numerical value according to a preset strategy to obtain the gray values corresponding to the adjusted historical two-dimensional coordinates.
Optionally, the determining a corresponding target gray value according to the two-dimensional coordinate on the canvas includes:
calling a canvas image data application program interface;
and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas through the canvas image data application program interface.
Optionally, after searching for the target area identifier corresponding to the target gray value in a preset mapping table and determining the corresponding target area according to the target area identifier, the method further includes:
acquiring current display parameters of the target area;
comparing the current display parameter with a preset display parameter;
and when the current display parameter is inconsistent with a preset display parameter, adjusting the current display parameter to be the preset display parameter.
Optionally, after obtaining the current display parameters of the target region, the method further includes:
acquiring a parameter adjusting instruction, and extracting a target configuration parameter in the parameter adjusting instruction;
and adjusting the current display parameters according to the target configuration parameters to obtain the adjusted current display parameters.
In addition, to achieve the above object, the present invention also provides an area recognition apparatus including:
the acquisition module is used for determining the interaction position of a user on a preset interface according to an interaction instruction when the interaction instruction input by the user based on the preset interface is received;
the acquisition module is further used for determining a corresponding target gray value according to the interaction position;
the searching module is used for searching a target area identifier corresponding to the target gray value in a preset mapping table, and determining a corresponding target area according to the target area identifier, wherein the preset interface has a plurality of different areas, and the preset mapping table comprises the corresponding relation between the area identifier corresponding to each area in the preset interface and the gray value.
Optionally, the area identification apparatus further includes: a generation module;
the generation module is used for defining a history identifier for the history area;
creating a corresponding relation between the history identification and the history gray value;
and generating a preset mapping table according to the corresponding relation.
Optionally, the area identification apparatus further includes: a judgment module;
the judging module is used for judging whether the preset mapping table has the target gray value.
Optionally, the determining module is further configured to determine whether the preset mapping table has the target gray value;
when the target gray value is not stored in the preset mapping table, acquiring a reference area corresponding to the target gray value;
setting a reference area identifier for the reference area, and determining the corresponding relation between the reference area identifier and the target gray value;
and updating the preset mapping table according to the corresponding relation between the reference area identifier and the target gray value to obtain the updated preset mapping table.
Optionally, the obtaining module is further configured to determine an interaction coordinate on the preset interface according to the interaction position;
converting the interaction coordinate into a two-dimensional coordinate on a canvas;
and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas.
Further, to achieve the above object, the present invention also provides an area recognition apparatus including: memory, a processor and a region identification program stored on the memory and running on the processor, the region identification program when executed by the processor implementing the steps of the region identification method as described above.
Furthermore, to achieve the above object, the present invention also proposes a storage medium having stored thereon an area identification program which, when executed by a processor, implements the steps of the area identification method as described above.
According to the technical solution provided by the invention, when an interactive instruction input by a user on a preset interface is received, the user's interaction position on the preset interface is determined from the instruction; the corresponding target gray value is determined from that position; the target area identifier corresponding to the target gray value is searched for in a preset mapping table, and the corresponding target area is determined from that identifier. The preset interface contains a plurality of different areas, and the preset mapping table records the correspondence between the area identifier of each area in the preset interface and its gray value. The target area corresponding to the user's interaction position can therefore be obtained through the preset mapping table, and the area the user needs is found without complex boundary-point calculation, which improves the efficiency of region identification.
Drawings
FIG. 1 is a schematic diagram of a region identification device of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a region identification method according to a first embodiment of the present invention;
FIG. 3 is a schematic overall flowchart of an embodiment of a region identification method according to the present invention;
FIG. 4 is a flowchart illustrating a region identification method according to a second embodiment of the present invention;
FIG. 5 is a flowchart illustrating a third embodiment of a region identification method according to the present invention;
fig. 6 is a block diagram of a first embodiment of the area recognition device according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a region identification device in a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the area recognition apparatus may include: a processor 1001, such as a Central Processing Unit (CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display); optionally, it may also include a standard wired interface and a wireless interface, and in the present invention the wired interface of the user interface 1003 may be a Universal Serial Bus (USB) interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed Random Access Memory (RAM) or a stable non-volatile memory, such as a disk memory. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in fig. 1 does not constitute a limitation of the area identification device, which may include more or fewer components than those shown, a combination of some components, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and an area recognition program.
In the area identification device shown in fig. 1, the network interface 1004 is mainly used for connecting to a background server and performing data communication with the background server; the user interface 1003 is mainly used for connecting peripheral equipment; the area recognition apparatus calls an area recognition program stored in the memory 1005 through the processor 1001 and performs the area recognition method provided by the embodiment of the present invention.
Based on the above hardware structure, an embodiment of the area identification method of the present invention is provided.
Referring to fig. 2, fig. 2 is a flowchart illustrating a region identification method according to a first embodiment of the present invention.
In a first embodiment, the area identification method comprises the steps of:
step S10: when an interactive instruction input by a user based on a preset interface is received, determining the interactive position of the user on the preset interface according to the interactive instruction.
It should be noted that the execution subject in this embodiment may be a terminal device, for example a client provided with an area identification program, or another device that can achieve the same or similar functions.
In this embodiment, the preset interface may be a map interface displayed by a browser on a client, and may also be a map interface displayed by other third-party application programs.
It can be understood that the user's interaction position on the map interface may be the coordinates of a click in the map interface, or a position area selected by the user with a selection frame; this embodiment does not limit the form. When map area identification is performed, the user may click a point on the map in the map display interface, and the interaction position is then the position in the map corresponding to the clicked point, so that the map area can be identified effectively. The same approach can also be applied to other scenarios, such as a particle system, in which it must be detected whether a point lies inside an irregular polygonal area, and the corresponding area can be obtained from the detection point.
Step S20: and determining a corresponding target gray value according to the interaction position.
It should be noted that the target gray value is a value defined in advance to represent a gray level, for example #010101; other forms of representing a gray level may also be used.
In a specific implementation, when user interaction information is acquired, the coordinates of the interaction point are obtained from it, and the corresponding gray value is obtained from those coordinates. For example, when the user clicks the map at interaction coordinate (23, 40), the corresponding gray value is #020202. The gray value is thus determined, which supports its effective identification.
Step S30: searching a target area identifier corresponding to the target gray value in a preset mapping table, and determining a corresponding target area according to the target area identifier, wherein the preset interface has a plurality of different areas, and the preset mapping table comprises the corresponding relation between the area identifier corresponding to each area in the preset interface and the gray value.
In this embodiment, a preset mapping table is created that records the correspondence between the area identifier of each area in the preset interface and its gray value, so that the correct area identifier can be obtained from a gray value and the area can then be identified from that identifier. For example, a map interface may contain Beijing, Tianjin, Shanghai, and so on, each area corresponding to an area identifier from which the area can be obtained. When the interaction coordinate is (23, 40), the corresponding gray value is determined to be #020202, the area identifier corresponding to #020202 is obtained by querying the preset mapping table, and the area is identified as Tianjin from that identifier. The area identifier may be an ID number or an identifier in another form, which this embodiment does not limit.
As shown in the overall flow diagram of fig. 3, a grayscale map and a mapping table of area names are first defined; for example, the gray value corresponding to Shanghai is #030303. A grayscale map is generated from each area and its gray value, and the resulting grayscale map is drawn into the canvas. The canvas may be an HTML canvas or a canvas of another form, which this embodiment does not limit. The interaction-point coordinates of the interface are converted into two-dimensional coordinates on the canvas, the canvas getImageData() API is then used with these two-dimensional coordinates to obtain the color value of the corresponding pixel, and the gray-value mapping table is searched in reverse with this color value. The corresponding area information is thereby obtained, the area name of the current interaction area is determined, and the identification of the area is completed.
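As a non-authoritative illustration of this setup step, the following TypeScript sketch draws each region into a hidden canvas with its gray value as fill color and builds the reverse lookup table; the region names, gray values, and polygons are assumed examples and do not come from the patent.

    interface Region {
      name: string;
      grayValue: string;           // e.g. "#020202"
      polygon: [number, number][]; // boundary in canvas coordinates
    }

    const regions: Region[] = [
      { name: "Tianjin", grayValue: "#020202", polygon: [[10, 10], [60, 10], [60, 50], [10, 50]] },
      { name: "Shanghai", grayValue: "#030303", polygon: [[70, 20], [120, 20], [120, 80], [70, 80]] },
    ];

    // Reverse mapping table: gray value -> region name.
    const regionByGray = new Map<string, string>();
    for (const r of regions) regionByGray.set(r.grayValue, r.name);

    // Draw each region into the (hidden) canvas, filled with its gray value.
    function drawGrayscaleMap(canvas: HTMLCanvasElement, areas: Region[]): void {
      const ctx = canvas.getContext("2d");
      if (!ctx) return;
      for (const area of areas) {
        ctx.beginPath();
        area.polygon.forEach(([x, y], i) => (i === 0 ? ctx.moveTo(x, y) : ctx.lineTo(x, y)));
        ctx.closePath();
        ctx.fillStyle = area.grayValue; // CSS hex color used as the gray value
        ctx.fill();
      }
    }

In practice the canvas would be created off-screen and drawGrayscaleMap called once during initialization; anti-aliasing at polygon edges can blend neighbouring grays, which is one reason the description later checks and adjusts boundary gray values.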
In this embodiment, when an interactive instruction input by a user on a preset interface is received, the user's interaction position on the preset interface is determined from the instruction; the corresponding target gray value is determined from that position; the target area identifier corresponding to the target gray value is searched for in a preset mapping table, and the corresponding target area is determined from that identifier. The preset interface contains a plurality of different areas, and the preset mapping table records the correspondence between the area identifier of each area in the preset interface and its gray value. The target area corresponding to the user's interaction position can therefore be obtained through the preset mapping table, and the area the user needs is found without complex boundary-point calculation, which improves the efficiency of region identification.
Referring to fig. 4, fig. 4 is a flowchart illustrating a second embodiment of the area identification method according to the present invention, and the second embodiment of the area identification method according to the present invention is proposed based on the first embodiment shown in fig. 2.
In the second embodiment, before the step S30, the method further includes:
in step S301, a history flag is defined for the history area.
It should be noted that the history areas are acquired area samples, such as Beijing, Tianjin, and Shanghai, and may include foreign areas in addition to domestic ones; this embodiment does not limit them. Searching for areas through the map built from these samples enables rapid area identification.
Step S302, creating a corresponding relation between the history identification and the history gray value.
In this embodiment, the correspondence between the history identifier and the history gray value is established; it may be established in the form of a mapping table or in another form.
And step S303, generating a preset mapping table according to the corresponding relation.
It can be understood that after the preset mapping table is generated from the correspondence, a corresponding query interface is generated through which the history identifier can be queried by history gray value. To improve flexibility, the preset mapping table can be updated in real time.
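A minimal sketch of generating such a preset mapping table from historical areas; the identifiers and gray values below are assumed examples, not values from the patent.

    interface HistoryArea {
      id: string;        // history identifier defined for the area
      name: string;      // e.g. "Beijing"
      grayValue: string; // history gray value, e.g. "#010101"
    }

    // Build the preset mapping table: history gray value -> history identifier.
    function buildMappingTable(areas: HistoryArea[]): Map<string, string> {
      const table = new Map<string, string>();
      for (const area of areas) {
        table.set(area.grayValue, area.id);
      }
      return table;
    }

    const presetTable = buildMappingTable([
      { id: "R001", name: "Beijing", grayValue: "#010101" },
      { id: "R002", name: "Tianjin", grayValue: "#020202" },
      { id: "R003", name: "Shanghai", grayValue: "#030303" },
    ]);
    console.log(presetTable.get("#020202")); // "R002"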
Further, before the step S30, the method further includes: judging whether the preset mapping table has the target gray value or not; and executing step S30 when the preset mapping table stores the target gray scale value.
In this embodiment, to improve recognition efficiency, before the target area identifier corresponding to the target gray value is searched for in the preset mapping table, the gray values in the preset mapping table are checked to determine whether the current target gray value is present. For example, when the queried gray value is #020202, it is determined whether the preset mapping table contains #020202; if it does, the identifier query step is performed, and if it does not, the identifier query step cannot be performed.
Further, after determining whether the preset mapping table has the target grayscale value, the method further includes: when the target gray value is not stored in the preset mapping table, acquiring a reference area corresponding to the target gray value; setting a reference area identifier for the reference area, and determining the corresponding relation between the reference area identifier and the target gray value; updating the preset mapping table according to the corresponding relation between the reference area identifier and the target gray value to obtain an updated preset mapping table; the step S30 includes: and searching a target area identifier corresponding to the target gray value in the updated preset mapping table, and determining a corresponding target area according to the target area identifier.
In this embodiment, when the target gray value is not stored in the preset mapping table, the region corresponding to the unstored gray value is obtained, the correspondence between that gray value and the actual region is established, and the preset mapping table is updated according to this correspondence to obtain the updated table. The preset mapping table is thus kept up to date and its query range is expanded.
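A hedged sketch of this fallback, assuming a mapping table of the shape built above; resolveReferenceArea stands in for whatever lookup the implementation uses to obtain the reference area for an unknown gray value.

    // Look up a gray value; if it is missing, register a reference area identifier
    // and extend the table so that later queries hit directly.
    function lookupOrUpdate(
      table: Map<string, string>,
      grayValue: string,
      resolveReferenceArea: (gray: string) => string
    ): string {
      const existing = table.get(grayValue);
      if (existing !== undefined) return existing;

      const referenceId = resolveReferenceArea(grayValue);
      table.set(grayValue, referenceId); // update the preset mapping table
      return referenceId;
    }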
Further, the step S20 includes:
determining an interaction coordinate on the preset interface according to the interaction position; converting the interaction coordinate into a two-dimensional coordinate on a canvas; and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas.
In a specific implementation, the interaction-point coordinates of the interface are converted into two-dimensional coordinates on the canvas, the canvas getImageData() API is called with these two-dimensional coordinates to obtain the color value of the corresponding pixel, and that color value is used as the target gray value.
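The following sketch shows one way this step can look in a browser, assuming a grayscale canvas and a gray-value-to-identifier table like the ones above; getImageData() is the standard canvas API, while the other names are illustrative.

    // Convert an interface (client) coordinate into canvas pixel coordinates.
    function toCanvasCoords(canvas: HTMLCanvasElement, clientX: number, clientY: number) {
      const rect = canvas.getBoundingClientRect();
      return {
        x: Math.floor((clientX - rect.left) * (canvas.width / rect.width)),
        y: Math.floor((clientY - rect.top) * (canvas.height / rect.height)),
      };
    }

    // Read one pixel with getImageData and format it as a "#rrggbb" gray value.
    function grayValueAt(ctx: CanvasRenderingContext2D, x: number, y: number): string {
      const [r, g, b] = ctx.getImageData(x, y, 1, 1).data;
      const hex = (v: number) => v.toString(16).padStart(2, "0");
      return "#" + hex(r) + hex(g) + hex(b);
    }

    // Tie the steps together: interaction position -> canvas coords -> gray value -> area identifier.
    function identifyArea(
      canvas: HTMLCanvasElement,
      table: Map<string, string>,
      event: MouseEvent
    ): string | undefined {
      const ctx = canvas.getContext("2d");
      if (!ctx) return undefined;
      const { x, y } = toCanvasCoords(canvas, event.clientX, event.clientY);
      return table.get(grayValueAt(ctx, x, y));
    }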
Further, before converting the interaction coordinate into a two-dimensional coordinate on a canvas, the method further includes:
acquiring a historical two-dimensional coordinate, defining a gray value according to the historical two-dimensional coordinate, and acquiring a gray value corresponding to the historical two-dimensional coordinate; and making a gray scale map according to the gray scale value corresponding to the historical two-dimensional coordinate, and drawing a canvas according to the gray scale map.
It should be noted that, before the making of the grayscale map according to the grayscale value corresponding to the historical two-dimensional coordinate and the drawing of the canvas according to the grayscale map, the method further includes:
carrying out similarity verification on the gray value corresponding to the historical two-dimensional coordinate to obtain a similarity verification result; adjusting the gray value corresponding to the historical two-dimensional coordinate according to the similarity check result to obtain the gray value corresponding to the adjusted historical two-dimensional coordinate; the making of the gray scale map according to the gray scale value corresponding to the historical two-dimensional coordinate and the drawing of the canvas according to the gray scale map comprise: and making a gray scale map according to the gray scale value corresponding to the adjusted historical two-dimensional coordinate, and drawing a canvas according to the gray scale map.
In this embodiment, before the canvas is drawn, the gray values at the boundaries need to be checked; when the boundary gray values are defined, they must be clearly distinguishable from one another so that boundary identification remains accurate.
In a specific implementation, a similarity check is carried out on the gray values corresponding to the historical two-dimensional coordinates and the result is evaluated; when the similarity is high, the boundary gray values with high similarity are adjusted to obtain the adjusted gray values for those coordinates, which improves the accuracy of the canvas built from them.
Further, the performing similarity check on the gray value corresponding to the historical two-dimensional coordinate to obtain a similarity check result includes:
obtaining a historical two-dimensional coordinate with a shorter distance according to the coordinate value of the historical two-dimensional coordinate; acquiring a first historical two-dimensional coordinate and a second historical two-dimensional coordinate in the historical two-dimensional coordinates which are close to each other; obtaining a corresponding first gray value according to the first historical two-dimensional coordinate, and obtaining a corresponding second gray value according to the second historical two-dimensional coordinate; and carrying out similarity verification on the first gray value and the second gray value to obtain a similarity verification result.
It should be noted that, for an effective similarity check, the gray values of coordinates that are close to each other can be compared and the check result derived from that comparison, so that the difference between such gray values can be made larger. For example, when the coordinates (30, 42) and (30, 41) lie on the boundary between two regions, their gray values are compared; if the gray value of (30, 42) is #020203 and that of (30, 41) is #020204, the two values are too close, and the gray value of one of these nearby coordinates needs to be adjusted, for example #020203 to #030203, or #020204 to #040204.
In a specific implementation, a preset adjustment strategy is obtained and the gray value is adjusted according to it. The preset adjustment strategy may, for example, obtain the gray values of the nearby coordinates and randomly select one of them to adjust; other adjustment manners are also possible, and this embodiment does not limit them.
Further, the performing similarity verification on the first gray value and the second gray value to obtain a similarity verification result includes:
extracting each numerical value in the first gray value and the second gray value and corresponding digital information; comparing the numerical values corresponding to the same digital information to judge whether the numerical values corresponding to the same digital information are the same; and obtaining a similarity checking result according to the same quantity.
It should be noted that the digit information corresponding to each numerical value may be the position information of that value within the gray value, or the corresponding place (digit) information.
To manage gray values on the basis of their digit information, #020203 and #020204, for example, are compared digit by digit. Each value is split into a first digit, a second digit, a third digit, and so on; #020203 becomes "0, 2, 0, 2, 0, 3", with the digits taken in that order. The first digits of #020203 and #020204 are compared, then the second digits, and so on, which yields the similarity of the values at the same digit positions. This enables more detailed data analysis and improves the accuracy of gray value identification.
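A minimal sketch of the digit-wise comparison, assuming gray values are "#rrggbb" strings; the function name is illustrative.

    // Count how many digit positions of two gray values hold the same value.
    function digitMatchCount(a: string, b: string): number {
      const da = a.replace("#", "");
      const db = b.replace("#", "");
      let same = 0;
      for (let i = 0; i < Math.min(da.length, db.length); i++) {
        if (da[i] === db[i]) same++;
      }
      return same;
    }

    console.log(digitMatchCount("#020203", "#020204")); // 5 (only the last digit differs)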
Further, the adjusting the gray value corresponding to the historical two-dimensional coordinate according to the similarity check result to obtain the gray value corresponding to the adjusted historical two-dimensional coordinate includes:
when the same number reaches a preset number, obtaining gray values which belong to the same digital information and have the same numerical value; and adjusting the gray values which belong to the same digital information and have the same numerical value according to a preset strategy to obtain the gray values corresponding to the adjusted historical two-dimensional coordinates.
In this embodiment, the preset strategy may be the preset adjustment strategy described above: gray values that hold the same value at the same digit positions are adjusted according to it. For example, the matching digits of #020203 and #020204 are adjusted, changing #020203 to #030203 or #020204 to #040204, so that the gray values of nearby coordinates are adjusted, the difference between boundary regions is increased, and the accuracy of gray value identification is improved.
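The patent leaves the concrete preset strategy open, so the sketch below shows only one possible choice: when two values share too many digits, one hex digit of one value is advanced so nearby boundary colors become clearly distinct.

    // Advance the hex digit at the given position by one (wrapping at 15).
    function bumpDigit(gray: string, position: number): string {
      const digits = gray.replace("#", "").split("");
      digits[position] = ((parseInt(digits[position], 16) + 1) % 16).toString(16);
      return "#" + digits.join("");
    }

    // Example: make #020204 differ more clearly from #020203 by changing its second digit.
    console.log(bumpDigit("#020204", 1)); // "#030204"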
Further, the determining a corresponding target gray value according to the two-dimensional coordinates on the canvas includes:
calling a canvas image data application program interface; and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas through the canvas image data application program interface.
It should be noted that the canvas image data application program interface may be the canvas getImageData() API, or another interface capable of reading image data; this embodiment does not limit it. Taking the getImageData() API as an example, the corresponding two-dimensional coordinate is obtained from the interaction coordinate, and the gray value at that coordinate is then read through the canvas getImageData() API. The gray value is thus obtained through the canvas's own data interface, which makes better use of the canvas parameters.
In this embodiment, when the gray values are defined, similarity verification is performed on the gray values of coordinates that are close to each other to ensure that they differ clearly, which improves the accuracy of the gray value definition when boundaries are identified.
Referring to fig. 5, fig. 5 is a flowchart illustrating a third embodiment of the area identification method according to the present invention, and the third embodiment of the area identification method according to the present invention is proposed based on the first embodiment shown in fig. 2.
After the step S30, the method further includes:
step S304, acquiring the current display parameters of the target area.
In this embodiment, the current display parameters include the display color, display brightness, display style, and the like, and may further include other display parameters, which this embodiment does not limit. For example, when the current display color is black, it is the default display parameter.
Step S305, comparing the current display parameter with a preset display parameter.
In a specific implementation, the preset display parameter is a display parameter configured in advance, for example a yellow display color. The current display parameter is compared with the preset display parameter, and when they are inconsistent the current display parameter is adjusted to the preset one; that is, when the current display color is black and the preset display color is yellow, the current color is changed from black to yellow.
Step S306, when the current display parameter is inconsistent with a preset display parameter, adjusting the current display parameter to the preset display parameter.
It can be understood that when the current display parameter is consistent with the preset display parameter, for example when both the current and the preset display color are black, no processing is performed, which improves data processing efficiency.
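A small sketch of this comparison step; the parameter names are assumed, since the patent only lists display color, brightness, and style as examples.

    interface DisplayParams {
      color: string;
      brightness: number;
      style: string;
    }

    // Return the preset parameters when the current ones differ, otherwise leave them untouched.
    function applyPresetIfDifferent(current: DisplayParams, preset: DisplayParams): DisplayParams {
      const same =
        current.color === preset.color &&
        current.brightness === preset.brightness &&
        current.style === preset.style;
      return same ? current : { ...preset };
    }

    console.log(
      applyPresetIfDifferent(
        { color: "black", brightness: 1, style: "solid" },
        { color: "yellow", brightness: 1, style: "solid" }
      ).color
    ); // "yellow"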
In an embodiment, after the step S304, the method further includes: acquiring a parameter adjusting instruction, and extracting a target configuration parameter in the parameter adjusting instruction; and adjusting the current display parameters according to the target configuration parameters to obtain the adjusted current display parameters.
In this embodiment, the display parameters may also be set according to the user's own selection, so that they meet the needs of different users. The parameter adjustment instruction may be preset by the user through a parameter setting interface; the user's identification information is obtained from the interaction information, and the corresponding display parameters are obtained from the user's settings. For example, if user A has set the display area to blue and the current display parameter is yellow, then when user A clicks on the area the current display parameter is adjusted from yellow to blue, which achieves flexible adjustment of the display parameters.
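A sketch of such a per-user override, assuming a simple lookup keyed on a user identifier extracted from the interaction information; all names are illustrative.

    // User-specific display colors; entries would come from the parameter setting interface.
    const userDisplayColor = new Map<string, string>([["userA", "blue"]]);

    // A user-specific setting overrides the preset display color; otherwise fall back to it.
    function resolveDisplayColor(userId: string, presetColor: string): string {
      return userDisplayColor.get(userId) ?? presetColor;
    }

    console.log(resolveDisplayColor("userA", "yellow")); // "blue"
    console.log(resolveDisplayColor("userB", "yellow")); // "yellow"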
In addition, an embodiment of the present invention further provides a storage medium on which an area identification program is stored; when the area identification program is executed by a processor, it implements the steps of the area identification method described above.
Since the storage medium adopts all technical solutions of all the embodiments, at least all the beneficial effects brought by the technical solutions of the embodiments are achieved, and no further description is given here.
In addition, referring to fig. 6, an embodiment of the present invention further provides an area identification apparatus, where the area identification apparatus includes:
the obtaining module 10 is configured to, when an interaction instruction input by a user based on a preset interface is received, determine an interaction position of the user on the preset interface according to the interaction instruction.
In this embodiment, the preset interface may be a map interface displayed by a browser on a client, and may also be a map interface displayed by other third-party application programs.
It can be understood that the user's interaction position on the map interface may be the coordinates of a click in the map interface, or a position area selected by the user with a selection frame; this embodiment does not limit the form. When map area identification is performed, the user may click a point on the map in the map display interface, and the interaction position is then the position in the map corresponding to the clicked point, so that the map area can be identified effectively. The same approach can also be applied to other scenarios, such as a particle system, in which it must be detected whether a point lies inside an irregular polygonal area, and the corresponding area can be obtained from the detection point.
The obtaining module 10 is further configured to determine a corresponding target gray value according to the interaction position.
It should be noted that the target gray value is a value defined in advance to represent a gray level, for example #010101; other forms of representing a gray level may also be used.
In a specific implementation, when user interaction information is acquired, the coordinates of the interaction point are obtained from it, and the corresponding gray value is obtained from those coordinates. For example, when the user clicks the map at interaction coordinate (23, 40), the corresponding gray value is #020202. The gray value is thus determined, which supports its effective identification.
The searching module 20 is configured to search a preset mapping table for a target area identifier corresponding to the target gray-scale value, and determine a corresponding target area according to the target area identifier, where the preset interface has a plurality of different areas, and the preset mapping table includes a correspondence between area identifiers corresponding to the areas in the preset interface and the gray-scale value.
In this embodiment, a preset mapping table is created that records the correspondence between the area identifier of each area in the preset interface and its gray value, so that the correct area identifier can be obtained from a gray value and the area can then be identified from that identifier. For example, a map interface may contain Beijing, Tianjin, Shanghai, and so on, each area corresponding to an area identifier from which the area can be obtained. When the interaction coordinate is (23, 40), the corresponding gray value is determined to be #020202, the area identifier corresponding to #020202 is obtained by querying the preset mapping table, and the area is identified as Tianjin from that identifier. The area identifier may be an ID number or an identifier in another form, which this embodiment does not limit.
As shown in the overall flow diagram of fig. 3, a grayscale map and a mapping table of area names are first defined; for example, the gray value corresponding to Shanghai is #030303. A grayscale map is generated from each area and its gray value, and the resulting grayscale map is drawn into the canvas. The canvas may be an HTML canvas or a canvas of another form, which this embodiment does not limit. The interaction-point coordinates of the interface are converted into two-dimensional coordinates on the canvas, the canvas getImageData() API is then used with these two-dimensional coordinates to obtain the color value of the corresponding pixel, and the gray-value mapping table is searched in reverse with this color value. The corresponding area information is thereby obtained, the area name of the current interaction area is determined, and the identification of the area is completed.
In this embodiment, when an interactive instruction input by a user on a preset interface is received, the user's interaction position on the preset interface is determined from the instruction; the corresponding target gray value is determined from that position; the target area identifier corresponding to the target gray value is searched for in a preset mapping table, and the corresponding target area is determined from that identifier. The preset interface contains a plurality of different areas, and the preset mapping table records the correspondence between the area identifier of each area in the preset interface and its gray value. The target area corresponding to the user's interaction position can therefore be obtained through the preset mapping table, and the area the user needs is found without complex boundary-point calculation, which improves the efficiency of region identification.
In one embodiment, the area identification apparatus further includes: a generation module;
the generation module is used for defining a history identifier for the history area;
creating a corresponding relation between the history identification and the history gray value;
and generating a preset mapping table according to the corresponding relation.
In one embodiment, the area identification apparatus further includes: a judgment module;
the judging module is used for judging whether the preset mapping table has the target gray value.
In an embodiment, the determining module is further configured to determine whether the preset mapping table has the target gray value;
when the target gray value is not stored in the preset mapping table, acquiring a reference area corresponding to the target gray value;
setting a reference area identifier for the reference area, and determining the corresponding relation between the reference area identifier and the target gray value;
and updating the preset mapping table according to the corresponding relation between the reference area identifier and the target gray value to obtain the updated preset mapping table.
In an embodiment, the obtaining module is further configured to determine an interaction coordinate on the preset interface according to the interaction position;
converting the interaction coordinate into a two-dimensional coordinate on a canvas;
and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas.
In an embodiment, the obtaining module is further configured to obtain a historical two-dimensional coordinate, define a gray value according to the historical two-dimensional coordinate, and obtain a gray value corresponding to the historical two-dimensional coordinate;
and making a gray scale map according to the gray scale value corresponding to the historical two-dimensional coordinate, and drawing a canvas according to the gray scale map.
In one embodiment, the area identification apparatus further includes: an adjustment module;
the adjusting module is used for carrying out similarity verification on the gray value corresponding to the historical two-dimensional coordinate to obtain a similarity verification result;
and adjusting the gray value corresponding to the historical two-dimensional coordinate according to the similarity check result to obtain the gray value corresponding to the adjusted historical two-dimensional coordinate.
In an embodiment, the adjusting module is further configured to obtain a historical two-dimensional coordinate closer to the current position according to the coordinate value of the historical two-dimensional coordinate;
acquiring a first historical two-dimensional coordinate and a second historical two-dimensional coordinate in the historical two-dimensional coordinates which are close to each other;
obtaining a corresponding first gray value according to the first historical two-dimensional coordinate, and obtaining a corresponding second gray value according to the second historical two-dimensional coordinate;
and carrying out similarity verification on the first gray value and the second gray value to obtain a similarity verification result.
In an embodiment, the adjusting module is further configured to extract each of the first gray scale value and the second gray scale value and corresponding digital information;
comparing the numerical values corresponding to the same digital information to judge whether the numerical values corresponding to the same digital information are the same;
and obtaining a similarity checking result according to the same quantity.
In an embodiment, the adjusting module is further configured to obtain gray values belonging to the same digital information and having the same numerical value when the same number reaches a preset number;
and adjusting the gray values which belong to the same digital information and have the same numerical value according to a preset strategy to obtain the gray values corresponding to the adjusted historical two-dimensional coordinates.
In an embodiment, the obtaining module is further configured to invoke a canvas image data application program interface;
and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas through the canvas image data application program interface.
In one embodiment, the area identification apparatus further includes: a comparison module;
the comparison module is used for acquiring the current display parameters of the target area;
comparing the current display parameter with a preset display parameter;
and when the current display parameter is inconsistent with a preset display parameter, adjusting the current display parameter to be the preset display parameter.
In an embodiment, the comparison module is further configured to obtain a parameter adjustment instruction, and extract a target configuration parameter in the parameter adjustment instruction;
and adjusting the current display parameters according to the target configuration parameters to obtain the adjusted current display parameters.
The area identification device of the present invention adopts all the technical solutions of all the embodiments described above, so that at least all the beneficial effects brought by the technical solutions of the embodiments described above are achieved, and no further description is given here.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (16)

1. A region identification method, characterized by comprising the steps of:
when an interactive instruction input by a user based on a preset interface is received, determining an interactive position of the user on the preset interface according to the interactive instruction;
determining a corresponding target gray value according to the interaction position;
searching a target area identifier corresponding to the target gray value in a preset mapping table, and determining a corresponding target area according to the target area identifier, wherein the preset interface has a plurality of different areas, and the preset mapping table comprises the corresponding relation between the area identifier corresponding to each area in the preset interface and the gray value;
the determining the corresponding target gray value according to the interaction position includes:
acquiring a historical two-dimensional coordinate, defining a gray value according to the historical two-dimensional coordinate, and acquiring a gray value corresponding to the historical two-dimensional coordinate;
performing a similarity check on the gray value corresponding to the historical two-dimensional coordinate to obtain a similarity check result;
adjusting the gray value corresponding to the historical two-dimensional coordinate according to the similarity check result to obtain an adjusted gray value corresponding to the historical two-dimensional coordinate;
generating a gray scale map according to the adjusted gray value corresponding to the historical two-dimensional coordinate, and drawing a canvas according to the gray scale map;
determining an interaction coordinate on the preset interface according to the interaction position;
converting the interaction coordinate into a two-dimensional coordinate on a canvas;
and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas.
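For orientation, the lookup chain of claim 1 (interaction position, canvas coordinate, target gray value, target area identifier) could be sketched in TypeScript roughly as follows; identifyArea and the Map-based mapping table are assumptions of this sketch, and the canvas is assumed to have been drawn from the adjusted gray scale map beforehand.

// Minimal end-to-end sketch of the lookup in claim 1, under assumed types.
type GrayValue = number;
type AreaId = string;

function identifyArea(
  canvas: HTMLCanvasElement,
  mappingTable: Map<GrayValue, AreaId>,
  event: MouseEvent // the interaction instruction on the preset interface
): AreaId | undefined {
  // Interaction coordinate on the preset interface -> 2D coordinate on the canvas.
  const rect = canvas.getBoundingClientRect();
  const x = Math.floor((event.clientX - rect.left) * (canvas.width / rect.width));
  const y = Math.floor((event.clientY - rect.top) * (canvas.height / rect.height));

  // Target gray value via the canvas image data API.
  const ctx = canvas.getContext("2d");
  if (!ctx) return undefined;
  const targetGray = ctx.getImageData(x, y, 1, 1).data[0];

  // Target area identifier looked up in the preset mapping table.
  return mappingTable.get(targetGray);
}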
2. The area identification method according to claim 1, wherein before searching a preset mapping table for a target area identifier corresponding to the target gray-level value and determining a corresponding target area according to the target area identifier, the method further comprises:
defining a history identifier for the history area;
creating a corresponding relation between the history identification and the history gray value;
and generating a preset mapping table according to the corresponding relation.
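A minimal sketch of generating such a preset mapping table is shown below; assigning one gray value per area identifier by an evenly spaced step is an assumption of this sketch, not a requirement of the claim.

// Build the preset mapping table: one gray value per historical area identifier.
function buildMappingTable(areaIds: string[]): Map<number, string> {
  const table = new Map<number, string>();
  // Spread gray values apart; assumes a modest number of areas (< 128).
  const step = Math.floor(255 / (areaIds.length + 1));
  areaIds.forEach((id, index) => {
    const gray = (index + 1) * step; // distinct gray value for this area
    table.set(gray, id);
  });
  return table;
}

// Usage: buildMappingTable(["area-01", "area-02", "area-03"]);

Spacing the gray values apart makes later gray-value lookups less sensitive to small rendering differences between neighbouring areas.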
3. The area identification method according to claim 1, wherein before searching a preset mapping table for a target area identifier corresponding to the target gray-level value and determining a corresponding target area according to the target area identifier, the method further comprises:
judging whether the preset mapping table has the target gray value or not;
and when the target gray value is stored in the preset mapping table, the step of searching a target area identifier corresponding to the target gray value in the preset mapping table and determining a corresponding target area according to the target area identifier is executed.
4. The method of claim 3, wherein after determining whether the preset mapping table has the target gray level value, the method further comprises:
judging whether the preset mapping table has the target gray value or not;
when the target gray value is not stored in the preset mapping table, acquiring a reference area corresponding to the target gray value;
setting a reference area identifier for the reference area, and determining the corresponding relation between the reference area identifier and the target gray value;
updating the preset mapping table according to the corresponding relation between the reference area identifier and the target gray value to obtain an updated preset mapping table;
the searching for the target area identifier corresponding to the target gray value in a preset mapping table and determining the corresponding target area according to the target area identifier includes:
and searching a target area identifier corresponding to the target gray value in the updated preset mapping table, and determining a corresponding target area according to the target area identifier.
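The fallback of claims 3 and 4 (registering a reference area identifier when the target gray value is absent and then searching the updated table) could be sketched as follows; the identifier format is an assumption.

// Look up the target gray value, updating the mapping table first if needed.
function lookupOrRegister(
  mappingTable: Map<number, string>,
  targetGray: number
): string {
  if (!mappingTable.has(targetGray)) {
    // The target gray value is not stored: set a reference area identifier for
    // the reference area and record its correspondence with the target gray value.
    const referenceId = `reference-area-${targetGray}`; // assumed identifier format
    mappingTable.set(targetGray, referenceId);
  }
  // Search the (possibly updated) preset mapping table for the target area identifier.
  return mappingTable.get(targetGray)!;
}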
5. The area identification method according to claim 1, wherein performing the similarity check on the gray value corresponding to the historical two-dimensional coordinate to obtain a similarity check result comprises:
obtaining, according to the coordinate values of the historical two-dimensional coordinates, historical two-dimensional coordinates that are close to each other;
selecting a first historical two-dimensional coordinate and a second historical two-dimensional coordinate from the historical two-dimensional coordinates that are close to each other;
obtaining a corresponding first gray value according to the first historical two-dimensional coordinate, and obtaining a corresponding second gray value according to the second historical two-dimensional coordinate;
and performing a similarity check on the first gray value and the second gray value to obtain a similarity check result.
6. The area identification method according to claim 5, wherein performing the similarity check on the first gray value and the second gray value to obtain a similarity check result comprises:
extracting each numerical value in the first gray value and the second gray value together with the corresponding digital information;
comparing the numerical values corresponding to the same digital information to determine whether they are identical;
and obtaining a similarity check result according to the number of identical numerical values.
7. The area identification method according to claim 6, wherein adjusting the gray value corresponding to the historical two-dimensional coordinate according to the similarity check result to obtain the adjusted gray value corresponding to the historical two-dimensional coordinate comprises:
obtaining, when the number of identical numerical values reaches a preset number, the gray values that belong to the same digital information and have the same numerical value;
and adjusting those gray values according to a preset strategy to obtain the adjusted gray values corresponding to the historical two-dimensional coordinates.
8. The region identification method of claim 1, wherein said determining a corresponding target grayscale value according to two-dimensional coordinates on the canvas comprises:
calling a canvas image data application program interface;
and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas through the canvas image data application program interface.
9. The area identification method according to any one of claims 1 to 8, wherein after the target area identifier corresponding to the target gray scale value is searched in a preset mapping table and the corresponding target area is determined according to the target area identifier, the method further comprises:
acquiring current display parameters of the target area;
comparing the current display parameter with a preset display parameter;
and when the current display parameter is inconsistent with a preset display parameter, adjusting the current display parameter to be the preset display parameter.
10. The method of claim 9, wherein after obtaining the current display parameters of the target area, the method further comprises:
acquiring a parameter adjusting instruction, and extracting a target configuration parameter in the parameter adjusting instruction;
and adjusting the current display parameters according to the target configuration parameters to obtain the adjusted current display parameters.
11. An area recognition apparatus, characterized in that the area recognition apparatus comprises:
the acquisition module is used for determining the interaction position of a user on a preset interface according to an interaction instruction when the interaction instruction input by the user based on the preset interface is received;
the acquisition module is further used for determining a corresponding target gray value according to the interaction position;
the searching module is used for searching a target area identifier corresponding to the target gray value in a preset mapping table, and determining a corresponding target area according to the target area identifier, wherein the preset interface has a plurality of different areas, and the preset mapping table comprises the corresponding relation between the area identifier corresponding to each area in the preset interface and the gray value;
the acquisition module is further used for acquiring a historical two-dimensional coordinate, defining a gray value according to the historical two-dimensional coordinate, and acquiring a gray value corresponding to the historical two-dimensional coordinate;
performing a similarity check on the gray value corresponding to the historical two-dimensional coordinate to obtain a similarity check result;
adjusting the gray value corresponding to the historical two-dimensional coordinate according to the similarity check result to obtain an adjusted gray value corresponding to the historical two-dimensional coordinate;
generating a gray scale map according to the adjusted gray value corresponding to the historical two-dimensional coordinate, and drawing a canvas according to the gray scale map;
determining an interaction coordinate on the preset interface according to the interaction position;
converting the interaction coordinate into a two-dimensional coordinate on a canvas;
and determining a corresponding target gray value according to the two-dimensional coordinates on the canvas.
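The drawing step recited above (a gray scale map rendered onto a canvas) could, for example, rely on CanvasRenderingContext2D.putImageData; the sketch below assumes a row-major Uint8Array holding one gray value per pixel, and drawGrayMap is a hypothetical helper name.

// Minimal sketch of drawing a canvas from a gray-scale map.
function drawGrayMap(
  canvas: HTMLCanvasElement,
  grayMap: Uint8Array, // length = width * height, row-major
  width: number,
  height: number
): void {
  canvas.width = width;
  canvas.height = height;
  const ctx = canvas.getContext("2d");
  if (!ctx) return;
  const image = ctx.createImageData(width, height);
  for (let i = 0; i < grayMap.length; i++) {
    const g = grayMap[i];
    image.data[4 * i] = g;       // R
    image.data[4 * i + 1] = g;   // G
    image.data[4 * i + 2] = g;   // B
    image.data[4 * i + 3] = 255; // opaque alpha
  }
  ctx.putImageData(image, 0, 0);
}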
12. The area recognition apparatus of claim 11, wherein the area recognition apparatus further comprises: a generation module;
the generation module is used for defining a history identifier for the history area;
creating a corresponding relation between the history identification and the history gray value;
and generating a preset mapping table according to the corresponding relation.
13. The area recognition apparatus of claim 11, wherein the area recognition apparatus further comprises: a judgment module;
the judging module is used for judging whether the preset mapping table has the target gray value.
14. The area identification device of claim 13, wherein the judging module is further configured to judge whether the preset mapping table has the target gray value;
when the target gray value is not stored in the preset mapping table, acquiring a reference area corresponding to the target gray value;
setting a reference area identifier for the reference area, and determining the corresponding relation between the reference area identifier and the target gray value;
and updating the preset mapping table according to the corresponding relation between the reference area identifier and the target gray value to obtain the updated preset mapping table.
15. An area recognition apparatus, characterized in that the area recognition apparatus comprises: a memory, a processor, and a region identification program stored in the memory and executable on the processor, wherein the region identification program, when executed by the processor, implements the steps of the region identification method according to any one of claims 1 to 10.
16. A storage medium, characterized in that the storage medium has stored thereon an area identification program which, when executed by a processor, implements the steps of the area identification method according to any one of claims 1 to 10.
CN202011274838.1A 2020-11-13 2020-11-13 Region identification method, device, storage medium and device Active CN112433651B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011274838.1A CN112433651B (en) 2020-11-13 2020-11-13 Region identification method, device, storage medium and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011274838.1A CN112433651B (en) 2020-11-13 2020-11-13 Region identification method, device, storage medium and device

Publications (2)

Publication Number Publication Date
CN112433651A CN112433651A (en) 2021-03-02
CN112433651B true CN112433651B (en) 2022-03-11

Family

ID=74701016

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011274838.1A Active CN112433651B (en) 2020-11-13 2020-11-13 Region identification method, device, storage medium and device

Country Status (1)

Country Link
CN (1) CN112433651B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114255278A (en) * 2022-02-28 2022-03-29 深圳思谋信息科技有限公司 Graph click detection method and device, computer equipment and storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7113652B2 (en) * 2003-01-09 2006-09-26 Banner Engineering Corp. System and method for using normalized gray scale pattern find
US7296747B2 (en) * 2004-04-20 2007-11-20 Michael Rohs Visual code system for camera-equipped mobile devices and applications thereof
US20100253685A1 (en) * 2009-04-01 2010-10-07 Lightmap Limited Generating Data for Use in Image Based Lighting Rendering
US8787662B2 (en) * 2010-11-10 2014-07-22 Tandent Vision Science, Inc. Method and system for identifying tokens in an image
CN102855132B (en) * 2011-06-30 2016-01-20 大族激光科技产业集团股份有限公司 A kind of choosing method of Drawing Object and system
CN105320709A (en) * 2014-08-05 2016-02-10 阿里巴巴集团控股有限公司 Information reminding method and device on terminal equipment
US20160203379A1 (en) * 2015-01-12 2016-07-14 TigerIT Americas, LLC Systems, methods and devices for the automated verification and quality control and assurance of vehicle identification plates
CN104657458B (en) * 2015-02-06 2018-02-23 腾讯科技(深圳)有限公司 The methods of exhibiting and device of the target information of foreground target in scene image
US9922426B2 (en) * 2016-01-25 2018-03-20 Google Llc Reducing latency in presenting map interfaces at client devices
CN106846495B (en) * 2017-01-17 2022-10-25 腾讯科技(深圳)有限公司 Method and device for realizing augmented reality
CN110555894B (en) * 2019-07-19 2023-03-07 广东智媒云图科技股份有限公司 Intelligent robot painting method, electronic equipment and storage medium
CN110851050B (en) * 2019-10-17 2022-03-01 稿定(厦门)科技有限公司 Method and device for testing clicking of page elements

Also Published As

Publication number Publication date
CN112433651A (en) 2021-03-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 100020 1773, 15 / F, 17 / F, building 3, No.10, Jiuxianqiao Road, Chaoyang District, Beijing

Patentee after: Sanliu0 Digital Security Technology Group Co.,Ltd.

Address before: 100020 1773, 15 / F, 17 / F, building 3, No.10, Jiuxianqiao Road, Chaoyang District, Beijing

Patentee before: Beijing Hongteng Intelligent Technology Co.,Ltd.
