CN113011584A - Coding model training method, coding device and storage medium - Google Patents


Publication number
CN113011584A
CN113011584A (application CN202110293408.2A; granted publication CN113011584B)
Authority
CN
China
Prior art keywords
entity
neural network
trained
data
training
Prior art date
Legal status
Granted
Application number
CN202110293408.2A
Other languages
Chinese (zh)
Other versions
CN113011584B (en)
Inventor
吴龙祥
赖泽云
邹磊
贺正雄
丁先华
倪晓东
范铀
刘茜
樊瑾
吴良华
Current Assignee
South Digital Technology Co ltd
Original Assignee
South Digital Technology Co ltd
Priority date
Filing date
Publication date
Application filed by South Digital Technology Co ltd filed Critical South Digital Technology Co ltd
Priority to CN202110293408.2A priority Critical patent/CN113011584B/en
Publication of CN113011584A publication Critical patent/CN113011584A/en
Application granted granted Critical
Publication of CN113011584B publication Critical patent/CN113011584B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods


Abstract

The embodiments of the present application provide a coding model training method, a coding method, a coding device, and a storage medium. The method includes: acquiring training data, where the training data includes geographic entity vector data labeled with coding information; preprocessing the training data to obtain a preprocessing result, where the preprocessing result includes the corresponding feature attribute values, the number of feature attributes, and the total number of entity categories; generating a neural network model to be trained according to the preprocessing result, where the number of input features of the model's input layer equals the number of feature attributes and the output layer outputs as many recognition results as there are entity categories; and training the model to obtain an entity-category coding model. The neural network model can thus be flexibly configured to perform entity coding.

Description

Coding model training method, coding device and storage medium
Technical Field
The embodiments of the present application relate to the field of geographic information systems, and in particular to a coding model training method, a coding method, a coding device, and a storage medium.
Background
In the related art, coding and assigning values to geographic vector data relies mainly on manual identification: after geographic elements are drawn from remote sensing images or three-dimensional geographic data, the drawn entities are assigned coding attributes. In related deep learning techniques, the input data of a fixed-architecture deep learning network must have a unified number of feature attributes; that is, a single, uniformly constructed neural network is used for identification and coding. When the entity to be coded actually requires fewer feature attributes, such a model with a strictly fixed input size contains a large number of useless parameters; when the entity requires more feature attributes than the fixed input size allows, the data cannot be input at all. The related-art construction and training method therefore imposes strict requirements on the input data and cannot flexibly configure the neural network model that is actually needed.
Therefore, how to flexibly configure a neural network model to code multiple kinds of entities is a problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of the present application provide a coding model training method, a coding method, a coding device, and a storage medium. At least some embodiments can flexibly configure a neural network model to identify entity data and then perform coding according to the identification result.
In a first aspect, a coding model training method includes: acquiring training data, where the training data includes geographic vector data labeled with coding information; preprocessing the training data to obtain a preprocessing result, where the preprocessing result includes the corresponding feature attribute values, the number of feature attributes, and the total number of entity categories; generating a neural network model to be trained according to the preprocessing result, where the number of input features of the model's input layer equals the number of feature attributes and the output layer outputs as many recognition results as there are entity categories; and training the model to obtain an entity-category coding model.
In this way, the embodiments of the present application adapt the input layer and output layer of the neural network model to be trained according to the number of feature attributes in the training data and the total number of entity categories to be coded. The resulting entity-category coding model accepts inputs with varying numbers of feature attributes and identifies entity types accordingly, which improves the degree of automatic identification of entities, effectively prevents omissions, and greatly saves human resources; the advantages are obvious when a large amount of entity data is processed, and data processing efficiency is greatly improved.
With reference to the first aspect, in an embodiment, preprocessing the training data to obtain the preprocessing result includes: extracting at least one feature attribute from the training data, where each feature attribute is a basic element composing the entity; calculating the feature attribute value corresponding to each extracted feature attribute; and taking the resulting feature attribute values as the preprocessing result.
With reference to the first aspect, in an embodiment, before extracting at least one feature attribute from the training data, the method further includes: obtaining at least one self-selected feature attribute chosen in advance by the user according to actual requirements, where the at least one feature attribute includes the self-selected feature attributes. Correspondingly, extracting at least one feature attribute from the training data includes extracting the self-selected feature attributes from the training data; calculating the feature attribute values includes calculating the values corresponding to the self-selected feature attributes; and taking the feature attribute values as the preprocessing result includes taking the self-selected feature attribute values as the preprocessing result.
In this way, by having the user pre-select at least one feature attribute, the training data can be customized and converted into the corresponding self-selected feature attribute values, the model can automatically adapt to data generated from different feature attributes, and different neural network models can be trained on the feature attributes the user has selected.
With reference to the first aspect, in an embodiment, generating the neural network model to be trained according to the preprocessing result includes: inputting the self-selected feature attribute values into the neural network model to be trained and training it.
In this way, by training the neural network model with the self-selected feature attribute values as input, a neural network model matched to the user's selected feature attributes can be obtained, and entity data can be automatically identified and coded according to the identification result.
With reference to the first aspect, in one embodiment, the geographic vector data consists of vertices, line segments, or faces.
In a second aspect, a coding method includes: receiving input entity data to be coded, where the entity data to be coded includes all constituent elements of at least one entity, obtained by analyzing the entity to be coded; and coding the entity data using the entity-category coding model obtained by the method of the first aspect or any of its embodiments.
In this way, using the entity-category coding model, the entity data to be coded can be coded.
In a third aspect, a coding model training apparatus includes: an acquisition unit configured to acquire training data, where the training data includes geographic vector data labeled with coding information; a preprocessing unit configured to preprocess the training data to obtain a preprocessing result, where the preprocessing result includes the feature attribute values corresponding to the training data, the number of feature attributes, and the total number of entity categories; a generating unit configured to generate a neural network model to be trained at least according to the preprocessing result, where the number of input nodes of the model's input layer equals the number of feature attributes and the output layer outputs as many recognition results as there are entity categories; and a training unit configured to train the model to obtain an entity-category coding model.
With reference to the third aspect, in an embodiment, the preprocessing unit is specifically configured to: extract at least one feature attribute from the training data; calculate the feature attribute value corresponding to each extracted feature attribute; and take the resulting feature attribute values as the preprocessing result.
With reference to the third aspect, in an embodiment, the preprocessing unit is further configured to: obtain at least one self-selected feature attribute chosen in advance by the user according to actual requirements, where the at least one feature attribute includes the self-selected feature attributes; extract the self-selected feature attributes from the training data; calculate the values corresponding to the self-selected feature attributes; and take the self-selected feature attribute values as the preprocessing result.
With reference to the third aspect, in an embodiment, the generating unit is specifically configured to input the self-selected feature attribute values into the neural network model to be trained and train it.
With reference to the third aspect, in one embodiment, the geographic vector data consists of vertices, line segments, or faces.
In a fourth aspect, a coding apparatus includes: a receiving unit configured to receive input entity data to be coded, where the entity data to be coded includes all constituent elements of at least one entity, obtained by analyzing the entity to be coded; and an encoding unit configured to code the entity data using the entity-category coding model obtained by the method of any embodiment of the first aspect.
In a fifth aspect, an electronic device includes a processor, a memory, and a bus. The processor is connected to the memory via the bus, and the memory stores computer-readable instructions that, when executed by the processor, implement the method of any embodiment of the first aspect or the second aspect.
In a sixth aspect, a computer-readable storage medium stores a computer program that, when executed by a server, implements the method of any embodiment of the first aspect or the second aspect.
Drawings
FIG. 1 is a flowchart illustrating a method for coding model training according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of a method of encoding according to an embodiment of the present application;
FIG. 3 is a diagram illustrating internal elements of an apparatus for coding model training according to an embodiment of the present application;
FIG. 4 is a diagram illustrating internal elements of an encoding apparatus according to an embodiment of the present application;
FIG. 5 is a structural diagram of internal units of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
The method steps in the embodiments of the present application are described in detail below with reference to the accompanying drawings.
The embodiments of the present application can be applied to various entity identification scenarios, for example scenarios in which geographic entities are identified and coded after being drawn from remote sensing images or three-dimensional geographic data, such as identifying and coding geographic entities like mountains and rivers in a map. The problem with the related art is illustrated here by the coding of geographic entities. In the related art, after geographic entities are drawn into geographic entity vector data from remote sensing images or three-dimensional geographic data, the entities are assigned coding attributes; during deep learning, the input data must have a unified number of feature attributes, and a single neural network is used for identification and coding. This produces a large number of useless parameters when few feature attributes are actually required, and makes data impossible to input when many feature attributes are required, which in turn imposes strict requirements on the input data and prevents flexible configuration of the neural network model.
To solve at least the above problem, some embodiments of the present application preprocess the training data and construct the neural network model to be trained according to the preprocessing result (for example, setting the number of nodes of its input layer and output layer), and then perform entity-category identification and coding of the entities to be coded with the constructed model. For example, in some embodiments, at least one feature attribute self-selected by the user according to actual needs is input into the neural network model to be trained, and after training a model capable of identifying the corresponding entity categories is obtained, so that the neural network model can be flexibly configured to code entity data. It should be understood that the application scenarios of the embodiments are not limited to these.
A method of coding model training and a method of coding performed by an electronic device will be described below.
S110, acquiring training data.
In one embodiment, training data is obtained, wherein the training data includes geographic vector data labeled with encoded information.
It should be noted that each piece of the above-mentioned geographic vector data represents an element constituting an entity, for example: the vertices constituting mountains, the faces constituting houses or the lines constituting roads.
Before the training data is obtained, the entity data needs to be labeled, and the labeled entity data serves as the training data. In one embodiment, the constituent elements (points, lines, faces, etc.) of each different category of entity in the entity data are labeled with corresponding codes. For example, each line segment composing a highway entity is labeled with the code 001, and the face composing a house is labeled with the code 002. As another example, a house has 6 faces, and the category of those 6 faces is labeled 110; a mountain consists of 100 vertices, and the category of those 100 vertices is labeled 120.
In one embodiment, the geographic vector data is formed by vertices, line segments, or faces. For example, if the labeled entity is a house, the vector data corresponding to the house entity is the face, the vertices, and the line segments composing the house. As another example, the vector data corresponding to the house entity is only the face, only the vertices, or only the line segments composing the house.
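As a loose sketch of the labeled training data described above (the field names, coordinate values, and record layout are illustrative assumptions, not taken from the patent):

```python
# Hypothetical representation of labeled geographic entity vector data.
# Field names and code values are illustrative assumptions.
road = {"geometry": "line",
        "segments": [((0, 0), (3, 4)), ((3, 4), (6, 8))],
        "code": "001"}   # each segment of the highway entity labeled 001
house = {"geometry": "face",
         "vertices": [(0, 0), (4, 0), (4, 3), (0, 3)],
         "code": "002"}  # the face of the house entity labeled 002

training_data = [road, house]

def distinct_codes(dataset):
    """Collect the distinct entity codes in a labeled dataset; their
    count later determines the size of the model's output layer."""
    return sorted({entity["code"] for entity in dataset})
```

Counting the distinct codes in such a dataset yields the total number of entity categories used in the preprocessing result.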
An embodiment of S120 is exemplarily set forth below.
S120, preprocessing the training data to obtain a preprocessing result.
In one embodiment, the training data is preprocessed to obtain a preprocessing result, where the preprocessing result includes feature attributes corresponding to the training data, the number of the feature attributes, and the total number of entity classes.
After the training data is obtained, it is preprocessed; the preprocessing result includes the feature attributes in the training data, their number, and the number of entity categories that need to be coded.
It should be noted that the feature attributes corresponding to the training data may include the perimeter, area, number of vertices, and so on of the constituent elements of each entity, and the entity categories may include mountains, streets, and rivers. For example, a mountain may be denoted by 001, a street by 002, and a river by 003; the present embodiment is not limited thereto.
The preprocessing formulates a feature extraction scheme for the geographic entity vector data in a visual manner and executes the scheme to obtain the preprocessing result. The feature extraction scheme is as follows:
in one embodiment, preprocessing the training data to obtain a preprocessing result includes: extracting at least one characteristic attribute in the training data; calculating at least one characteristic attribute value respectively corresponding to the at least one characteristic attribute; and taking the at least one characteristic attribute value as the preprocessing result.
It should be noted that the at least one feature attribute may be all, or only part, of the feature attributes related to the entity types to be labeled.
The process of preprocessing the training data includes extracting at least one feature attribute from the training data, where the feature attributes may include, but are not limited to: the geometric type, closure state, line width, perimeter, area, and number of vertices of the entity. The feature attribute value corresponding to each extracted feature attribute is then calculated, and the calculated values are taken as the preprocessing result.
For example, three feature attributes are extracted from the training data: the perimeter of the edges composing the entity, the area of the faces composing the entity, and the number of vertices composing the entity. Correspondingly, the obtained feature attribute values are the perimeter value, the area value, and the vertex count. The input layer of the neural network model to be trained, constructed for these three feature attributes, contains three input nodes, which receive the perimeter feature, the area feature, and the vertex feature respectively.
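The three feature attribute values named above can be computed from a polygon's vertex list; a minimal sketch (the function name and the use of the shoelace formula for area are my assumptions, not specified by the patent):

```python
import math

def polygon_features(vertices):
    """Compute three feature attribute values for a closed polygon:
    perimeter, area (via the shoelace formula), and vertex count."""
    n = len(vertices)
    perimeter = sum(math.dist(vertices[i], vertices[(i + 1) % n])
                    for i in range(n))
    area = abs(sum(vertices[i][0] * vertices[(i + 1) % n][1]
                   - vertices[(i + 1) % n][0] * vertices[i][1]
                   for i in range(n))) / 2.0
    return [perimeter, area, float(n)]
```

For a unit square, `polygon_features([(0, 0), (1, 0), (1, 1), (0, 1)])` gives a perimeter of 4.0, an area of 1.0, and 4 vertices; these three values would feed the three input nodes of the example model.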
In one embodiment, before extracting at least one feature attribute from the training data, the method further includes: obtaining at least one self-selected feature attribute chosen in advance by the user according to actual requirements, where the at least one feature attribute includes the self-selected feature attributes. Correspondingly, S120 includes: extracting the self-selected feature attributes from the training data; calculating the feature attribute values corresponding to the self-selected feature attributes; and taking the self-selected feature attribute values as the preprocessing result.
It can be understood that, in some embodiments of the present application, before the feature attributes are extracted, visualization software is built that presents the available feature attributes to the user. The user may combine feature attributes according to actual needs, and the set of feature attributes can be extended; extended feature attributes are likewise displayed by the visualization software for selection. After the user pre-selects at least one self-selected feature attribute, the selected attributes are obtained and extracted, the corresponding attribute values are calculated, and those values are taken as the preprocessing result.
In this way, by having the user pre-select at least one feature attribute, the training data can be customized and converted into the corresponding self-selected feature attribute values, the model can automatically adapt to data generated from different feature attributes, and different neural network models can be trained on the feature attributes the user has selected.
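The user-selection step described above can be sketched as a simple validation over an extensible attribute list; the attribute names and validation logic here are illustrative assumptions:

```python
# Hypothetical extensible list of feature attributes offered to the user;
# names mirror the attributes mentioned in the text.
ALL_FEATURES = ["geometry_type", "closure_state", "line_width",
                "perimeter", "area", "vertex_count"]

def select_features(chosen):
    """Validate the user's self-selected feature attributes; the length
    of the returned list later sizes the model's input layer."""
    unknown = [f for f in chosen if f not in ALL_FEATURES]
    if unknown:
        raise ValueError(f"unsupported feature attributes: {unknown}")
    return list(chosen)
```

For instance, selecting perimeter, area, and vertex count yields three attributes, so the generated model would have three input-layer nodes.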
An embodiment of S130 is exemplarily set forth below.
S130, generating a neural network model to be trained according to the preprocessing result.
In one embodiment, a neural network model to be trained is generated according to the preprocessing result, wherein the number of input features of an input layer of the neural network model to be trained is equal to the number of feature attributes, and an output layer of the neural network to be trained is used for outputting the same number of recognition results as the total number of entity classes.
It should be noted that the electronic device does not output multiple entity recognition results on every run of the method; rather, the output layer has the capability of outputting as many recognition results as there are entity categories.
The number of feature attributes in the preprocessing result is used as the number of input-layer nodes of the neural network model to be trained, and the number of labeled entity categories as the number of output-layer nodes. In one embodiment, the number of hidden layers may be reduced to increase the response speed of the model, or increased to improve its accuracy. In another embodiment, the C++ neural network framework LibTorch can be introduced so that machine learning techniques can be applied to complex problems in the mapping field; data can then be trained and analyzed on a single machine at the user end, without networked training through a server, avoiding the possibility of confidential data being leaked.
For example, if the total number of feature attributes in the preprocessing result is n and the total number of labeled entity categories is m, then the neural network generated from the preprocessing result has n input-layer nodes and m output nodes. As an example, the generated network contains 5 fully connected layers: the first (input) layer takes n attributes and outputs n values; the second layer takes n values and outputs 2n values; the third layer takes 2n values and outputs 3n values; the fourth layer takes 3n values and outputs 2n values; and the fifth (output) layer takes 2n values and outputs m values. The embodiments are not limited to this structure; n and m are natural numbers greater than or equal to 1.
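The example architecture above (n → n → 2n → 3n → 2n → m) can be expressed as a list of layer shapes computed from n and m; only the layer widths come from the text, while the helper names and the parameter count (weights plus biases per fully connected layer) are my additions:

```python
def fc_layer_sizes(n, m):
    """(input_width, output_width) pairs for the 5-layer fully connected
    network described in the text: n -> n -> 2n -> 3n -> 2n -> m."""
    return [(n, n), (n, 2 * n), (2 * n, 3 * n), (3 * n, 2 * n), (2 * n, m)]

def parameter_count(n, m):
    """Total trainable parameters, assuming each fully connected layer
    has in*out weights plus out biases."""
    return sum(i * o + o for i, o in fc_layer_sizes(n, m))
```

With n = 3 feature attributes and m = 4 entity categories, the layer shapes are (3, 3), (3, 6), (6, 9), (9, 6), (6, 4), which illustrates how the model scales with the number of user-selected attributes.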
In one embodiment, generating the neural network model to be trained according to the preprocessing result includes: inputting the self-selected feature attribute values into the neural network model to be trained and training it.
The number of self-selected feature attribute values chosen by the user according to requirements is used as the number of input-layer nodes of the neural network model to be trained, and the model is then trained.
In this way, by training the neural network model with the self-selected feature attribute values as input, a neural network model matched to the user's selected feature attributes can be obtained, and entity data can be automatically identified and coded according to the identification result.
An embodiment of S140 is exemplarily set forth below.
S140, training the neural network model to be trained to obtain an entity type coding model.
After the neural network model to be trained is obtained, it is trained to obtain the entity-category coding model.
It should be noted that an entity category is a category of geographic entity object, for example a mountain or a river, and each entity category is characterized by a predefined identifier; that is, the entity is coded.
In this way, the embodiments adaptively generate the input layer and output layer of the neural network model to be trained according to the number of feature attributes in the training data and the number of entity categories, so as to obtain the entity-category identification model. Entity data can thus be identified automatically, omissions are effectively prevented, and human resources are greatly saved; the advantages are obvious when a large amount of entity data is processed, and data processing efficiency is greatly improved.
The above describes a method by which an electronic device trains a coding model; a specific embodiment of an encoding method is described below.
As shown in fig. 2, an encoding method in the present application includes:
S210, receiving input entity data to be encoded.
In one embodiment, input entity data to be encoded is received, wherein the entity data to be encoded includes all constituent elements of at least one entity, and the constituent elements are obtained by parsing the map to be encoded.
Specifically, the entity data to be encoded is first parsed from the map to be encoded and input into the entity class coding model; after the entity class coding model and its parameters have been loaded, the electronic device receives the input entity data to be encoded.
It should be noted that the entity data to be encoded may consist of vertices, line segments, or faces, and the constituent elements are the points, lines, faces, and the like of the geographic entity vector data; the embodiments of the present application are not limited in this respect.
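As a data-structure sketch (Python; the class and field names are illustrative, not from the application), the entity data to be encoded could be represented as follows, with points, lines, and faces as its constituent elements:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class EntityData:
    """One geographic entity parsed from the map to be encoded."""
    vertices: List[Tuple[float, float]] = field(default_factory=list)  # point elements
    segments: List[Tuple[int, int]] = field(default_factory=list)      # line elements, vertex-index pairs
    faces: List[Tuple[int, ...]] = field(default_factory=list)         # face elements, closed vertex rings

# a square house footprint: four vertices, four boundary segments, one face
house = EntityData(vertices=[(0.0, 0.0), (0.0, 5.0), (5.0, 5.0), (5.0, 0.0)],
                   segments=[(0, 1), (1, 2), (2, 3), (3, 0)],
                   faces=[(0, 1, 2, 3)])
assert len(house.vertices) == 4 and len(house.segments) == 4 and len(house.faces) == 1
```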
S220, encoding the entity data to be encoded using the entity class coding model.
After the electronic device receives the entity data to be encoded, it classifies and identifies the entity data using the trained entity class coding model, and encodes the data according to the classification result.
As an example of the encoding method, the entity data to be encoded received by the entity class coding model comprises vertices and line segments; using the entity class coding model, a vertex representing a house is encoded as 001 and a line segment representing a road is encoded as 002.
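That example can be sketched as a classify-then-look-up step (Python; the stand-in classifier and dictionary keys are illustrative assumptions, since in the application the classification is performed by the trained entity class coding model):

```python
def encode_entities(entities, classify):
    """Classify each entity and replace the predicted class with its predefined code."""
    class_to_code = {"house": "001", "road": "002"}  # codes taken from the example above
    return [class_to_code[classify(entity)] for entity in entities]

# stand-in for the trained entity class coding model, for illustration only
def fake_classify(entity):
    return "house" if entity["kind"] == "vertex" else "road"

codes = encode_entities([{"kind": "vertex"}, {"kind": "segment"}], fake_classify)
assert codes == ["001", "002"]
```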
A specific embodiment of the encoding method is described above; an apparatus for training a model for encoding geographic vector data is described below.
As shown in fig. 3, an apparatus 300 for coding model training includes: an acquisition unit 310, a preprocessing unit 320, a generation unit 330, and a training unit 340.
In one embodiment, an apparatus for coding model training comprises: an acquisition unit configured to acquire training data, wherein the training data includes entity data in which each item of geographic vector data on a map is marked with a code; a preprocessing unit configured to preprocess the training data to obtain a preprocessing result, wherein the preprocessing result includes the feature attributes corresponding to the training data, the number of feature attributes, and the total number of entity classes; a generating unit configured to generate a neural network model to be trained according to the preprocessing result, wherein the number of input nodes of the input layer of the neural network model to be trained is equal to the number of feature attributes, and the output layer of the neural network model to be trained outputs a number of recognition results equal to the total number of entity classes; and a training unit configured to train the neural network model to be trained to obtain an entity class coding model.
In an embodiment, the preprocessing unit is specifically configured to: extract at least one feature attribute from the training data; calculate at least one feature attribute value corresponding to the at least one feature attribute; and take the at least one feature attribute value as the preprocessing result.
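For instance (a sketch in Python; the two attribute names and the polygon input format are illustrative assumptions, as the application does not enumerate the feature attributes), attribute values such as a vertex count and a perimeter could be computed per entity as follows:

```python
import math

def feature_attribute_values(vertices, attributes=("vertex_count", "perimeter")):
    """Compute one numeric value for each selected feature attribute of an
    entity given as a list of (x, y) vertices forming a closed ring."""
    def perimeter(vs):
        return sum(math.dist(vs[i], vs[(i + 1) % len(vs)]) for i in range(len(vs)))
    table = {"vertex_count": lambda vs: float(len(vs)),
             "perimeter": perimeter}
    return [table[name](vertices) for name in attributes]

square = [(0.0, 0.0), (0.0, 2.0), (2.0, 2.0), (2.0, 0.0)]
assert feature_attribute_values(square) == [4.0, 8.0]
```

The resulting list of values is exactly what feeds the input layer of the generated model, one value per input node.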
In an embodiment, the preprocessing unit is further configured to obtain at least one self-selected feature attribute chosen by the user in advance, wherein the at least one feature attribute includes the at least one self-selected feature attribute, and the self-selected feature attributes are chosen by the user according to actual requirements. In this case, extracting at least one feature attribute from the training data includes extracting the self-selected feature attributes from the training data; calculating the at least one feature attribute value includes calculating the self-selected feature attribute values corresponding to the self-selected feature attributes; and taking the at least one feature attribute value as the preprocessing result includes taking the self-selected feature attribute values as the preprocessing result.
In one embodiment, the generating unit is specifically configured to input the self-selected feature attribute values into the neural network model to be trained and train the neural network model to be trained.
In this embodiment of the application, the modules shown in fig. 3 implement the corresponding flows in the method embodiment of fig. 1; reference may be made to the description of the method embodiments above, and a detailed description is omitted here to avoid redundancy.
The above describes an apparatus for coding model training, and the following describes an apparatus for coding.
As shown in fig. 4, an apparatus 400 for encoding comprises: a receiving unit 410 and an encoding unit 420.
In one embodiment, an apparatus for encoding comprises: a receiving unit configured to receive input entity data to be encoded, wherein the entity data to be encoded includes all constituent elements of at least one entity, and the constituent elements are obtained by parsing the geographic entity to be encoded; and an encoding unit configured to encode the entity data to be encoded using the entity class coding model obtained by the method of any embodiment of the first aspect.
In this embodiment of the application, the modules shown in fig. 4 implement the corresponding flows in the method embodiment of fig. 2; reference may be made to the description of the method embodiments above, and a detailed description is omitted here to avoid redundancy.
As shown in fig. 5, an embodiment of the present application provides an electronic device 500, including: a processor 510, a memory 520, and a bus 530. The processor is connected to the memory via the bus, and the memory stores computer-readable instructions which, when executed by the processor, implement the method of any of the above embodiments; reference may be made to the description of the method embodiments above, and a detailed description is omitted here to avoid redundancy.
The bus is used to implement direct-connection communication among these components. The processor in the embodiment of the present application may be an integrated circuit chip with signal-processing capability. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory may be, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory stores computer-readable instructions that, when executed by the processor, perform the methods described in the embodiments above.
It will be appreciated that the configuration shown in fig. 5 is merely illustrative and may include more or fewer components than shown in fig. 5 or have a different configuration than shown in fig. 5. The components shown in fig. 5 may be implemented in hardware, software, or a combination thereof.
Embodiments of the present application further provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a server, the method of any of the above embodiments is implemented. Reference may be made to the description of the method embodiments above, and a detailed description is omitted here to avoid repetition.
The above description covers only preferred embodiments of the present application and is not intended to limit the present application; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement, and the like made within the spirit and principle of the present application shall fall within the protection scope of the present application. It should be noted that like reference numbers and letters refer to like items in the figures; once an item is defined in one figure, it need not be further defined or explained in subsequent figures.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any change or substitution that a person skilled in the art could readily conceive within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method of coding model training, the method comprising:
acquiring training data, wherein the training data comprises geographic vector data marked with coding information;
preprocessing the training data to obtain a preprocessing result, wherein the preprocessing result comprises characteristic attribute values corresponding to the training data, the number of characteristic attributes, and the total number of entity categories;
generating a neural network model to be trained according to the preprocessing result, wherein the number of input features of an input layer of the neural network model to be trained is equal to the number of the characteristic attributes, and an output layer of the neural network model to be trained is used for outputting a number of recognition results equal to the total number of the entity categories; and
training the neural network model to be trained to obtain an entity class coding model.
2. The method of claim 1, wherein preprocessing the training data to obtain a preprocessing result comprises:
extracting at least one characteristic attribute in the training data;
calculating at least one characteristic attribute value respectively corresponding to the at least one characteristic attribute;
and taking the at least one characteristic attribute value as the preprocessing result.
3. The method of claim 2, wherein prior to said extracting at least one feature attribute in the training data, the method further comprises:
obtaining at least one self-selected characteristic attribute chosen by a user in advance, wherein the at least one characteristic attribute comprises the at least one self-selected characteristic attribute, and the at least one self-selected characteristic attribute is chosen by the user according to actual requirements;
the extracting at least one characteristic attribute in the training data comprises:
extracting the self-selected characteristic attributes from the training data;
the calculating at least one characteristic attribute value respectively corresponding to the at least one characteristic attribute comprises:
calculating self-selected characteristic attribute values respectively corresponding to the self-selected characteristic attributes; and
the taking the at least one characteristic attribute value as the preprocessing result comprises:
taking the self-selected characteristic attribute values as the preprocessing result.
4. The method of claim 3, wherein the generating a neural network model to be trained according to the preprocessing result comprises:
inputting the self-selected characteristic attribute values into the neural network model to be trained, and training the neural network model to be trained.
5. The method of claim 1, wherein the geographic vector data is comprised of vertices, line segments, or faces.
6. A method of encoding, the method comprising:
receiving input entity data to be encoded, wherein the entity data to be encoded comprises all constituent elements of at least one entity, and the constituent elements are obtained by parsing the entity to be encoded; and
encoding the entity data to be encoded using an entity class coding model obtained by the method of any one of claims 1 to 5.
7. An apparatus for coding model training, the apparatus comprising:
an acquisition unit configured to acquire training data, wherein the training data includes geographical vector data to which coded information has been tagged;
the preprocessing unit is configured to preprocess the training data to obtain a preprocessing result, wherein the preprocessing result comprises a characteristic attribute value, the number of characteristic attributes and the total number of entity categories corresponding to the training data;
a generating unit configured to generate a neural network model to be trained according to the preprocessing result, wherein the number of input features of an input layer of the neural network model to be trained is equal to the number of the characteristic attributes, and an output layer of the neural network model to be trained is used for outputting a number of recognition results equal to the total number of the entity categories; and
and the training unit is configured to train the neural network model to be trained to obtain an entity class coding model.
8. An apparatus for encoding, the apparatus comprising:
a receiving unit configured to receive input entity data to be encoded, wherein the entity data to be encoded comprises all constituent elements of at least one entity, and the constituent elements are obtained by parsing the entity to be encoded; and
an encoding unit configured to encode the entity data to be encoded using the entity class encoding model obtained by the method of any one of claims 1 to 5.
9. An electronic device, comprising: a processor, a memory, and a bus;
the processor is connected to the memory via the bus, the memory storing computer readable instructions for implementing the method of any one of claims 1-6 when the computer readable instructions are executed by the processor.
10. A computer-readable storage medium, having stored thereon a computer program which, when executed by a server, implements the method of any one of claims 1-6.
CN202110293408.2A 2021-03-18 2021-03-18 Coding model training method, coding device and storage medium Active CN113011584B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110293408.2A CN113011584B (en) 2021-03-18 2021-03-18 Coding model training method, coding device and storage medium


Publications (2)

Publication Number Publication Date
CN113011584A true CN113011584A (en) 2021-06-22
CN113011584B CN113011584B (en) 2024-04-16

Family

ID=76402741

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110293408.2A Active CN113011584B (en) 2021-03-18 2021-03-18 Coding model training method, coding device and storage medium

Country Status (1)

Country Link
CN (1) CN113011584B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823845A (en) * 2014-01-28 2014-05-28 浙江大学 Method for automatically annotating remote sensing images on basis of deep learning
CN107273502A (en) * 2017-06-19 2017-10-20 重庆邮电大学 A kind of image geographical marking method learnt based on spatial cognition
CN108764263A (en) * 2018-02-12 2018-11-06 北京佳格天地科技有限公司 The atural object annotation equipment and method of remote sensing image
CN110309856A (en) * 2019-05-30 2019-10-08 华为技术有限公司 Image classification method, the training method of neural network and device
CN110390340A (en) * 2019-07-18 2019-10-29 暗物智能科技(广州)有限公司 The training method and detection method of feature coding model, vision relationship detection model
CN110909768A (en) * 2019-11-04 2020-03-24 北京地平线机器人技术研发有限公司 Method and device for acquiring marked data


Also Published As

Publication number Publication date
CN113011584B (en) 2024-04-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant