CN111857893A - Method and device for generating label graph - Google Patents

Method and device for generating label graph

Info

Publication number
CN111857893A
Authority
CN
China
Prior art keywords
target
graph
data
labeling
drawn
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910277127.0A
Other languages
Chinese (zh)
Inventor
张达铭
周鸿轩
Current Assignee
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910277127.0A priority Critical patent/CN111857893A/en
Publication of CN111857893A publication Critical patent/CN111857893A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Abstract

The invention provides a method and a device for generating an annotation graph, wherein the method comprises: acquiring original annotation data for a target annotation object, and converting the original annotation data into target annotation data in a target data format; determining, according to the target annotation data, target drawing data for the annotation graph to be drawn; and drawing a graph according to the target drawing data to generate the annotation graph, and storing the target drawing data. The target drawing data matching the annotation graph to be drawn can be obtained merely by acquiring the original annotation data of the target annotation object, so the annotation graph can be drawn quickly. Moreover, because the target drawing data is saved after the annotation graph is drawn, the annotation graph can conveniently be displayed again or edited again later. Compared with the related art, which stores annotation graphs in a picture format, storing the target drawing data in a non-picture format greatly reduces the amount of stored data and improves the processing speed of the system.

Description

Method and device for generating label graph
Technical Field
The invention relates to the technical field of graphical annotation, in particular to a method and a device for generating an annotated graph.
Background
Graphical annotation technology uses a series of annotation graphs, such as rectangles, curves, dots, and arrows, to mark a designated area of an annotation object such as a web page, a picture, or an electronic file, and to attach notes to that area. For example, users are supported in adding annotations when reading documents in various document readers, when previewing pictures, and when reading ordinary web pages; graphical annotation can thus improve the reading experience and increase user stickiness.
In the related art, annotation graphs are drawn through the native interface provided by Canvas and stored in a picture format; when a user views the annotation object again, the picture-format annotation graph is fetched and loaded onto the annotation object. Canvas is an HTML5 element that acts like a drawing surface on which diagrams, animations, and the like can be drawn with JavaScript. Storing annotation graphs in a picture format results in a large amount of stored data and reduces the processing speed of the system.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, the first objective of the present invention is to provide a method for generating a label graph.
The second object of the present invention is to provide a label graph generating apparatus.
A third object of the invention is to propose a computer device.
A fourth object of the invention is to propose a computer-readable storage medium.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides a method for generating an annotation graph, comprising:
acquiring original annotation data for a target annotation object, and converting the original annotation data into target annotation data in a target data format, wherein the target annotation data comprises a target annotation shape, a coordinate information set of annotation points, a target brush color, and a target brush width;
determining, according to the target annotation data, target drawing data for an annotation graph to be drawn;
and drawing a graph according to the target drawing data to generate the annotation graph to be drawn, and storing the target drawing data.
In the method for generating an annotation graph according to the embodiment of the present invention, original annotation data for a target annotation object is acquired and converted into target annotation data in a target data format, the target annotation data comprising a target annotation shape, a coordinate information set of annotation points, a target brush color, and a target brush width; target drawing data for an annotation graph to be drawn is determined according to the target annotation data; and a graph is drawn according to the target drawing data to generate the annotation graph, and the target drawing data is stored. The target drawing data matching the annotation graph can be obtained merely by acquiring the original annotation data of the target annotation object, so the annotation graph can be drawn quickly. Because the target drawing data is stored after the annotation graph is drawn, the annotation graph can conveniently be displayed again or edited again later. Compared with the related art, which stores annotation graphs in a picture format, storing the target drawing data in a non-picture format greatly reduces the amount of stored data and improves the processing speed of the system.
To achieve the above object, an embodiment of a second aspect of the present invention provides an annotation graph generating apparatus, comprising:
an acquisition module, configured to acquire original annotation data for a target annotation object and to convert the original annotation data into target annotation data in a target data format, wherein the target annotation data comprises a target annotation shape, a coordinate information set of annotation points, a target brush color, and a target brush width;
a processing module, configured to determine, according to the target annotation data, target drawing data for an annotation graph to be drawn;
the processing module being further configured to draw a graph according to the target drawing data so as to generate the annotation graph to be drawn, and to store the target drawing data.
In the annotation graph generating apparatus according to the embodiment of the present invention, original annotation data for a target annotation object is acquired and converted into target annotation data in a target data format, the target annotation data comprising a target annotation shape, a coordinate information set of annotation points, a target brush color, and a target brush width; target drawing data for an annotation graph to be drawn is determined according to the target annotation data; and a graph is drawn according to the target drawing data to generate the annotation graph, and the target drawing data is stored. The target drawing data matching the annotation graph can be obtained merely by acquiring the original annotation data of the target annotation object, so the annotation graph can be drawn quickly. Because the target drawing data is stored after the annotation graph is drawn, the annotation graph can conveniently be displayed again or edited again later. Compared with the related art, which stores annotation graphs in a picture format, storing the target drawing data in a non-picture format greatly reduces the amount of stored data and improves the processing speed of the system.
To achieve the above object, an embodiment of a third aspect of the present invention provides a computer device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the annotation graph generation method described above when executing the computer program.
To achieve the above object, an embodiment of a fourth aspect of the present invention provides a computer-readable storage medium, wherein when the instructions in the storage medium are executed by a processor, the annotation graph generation method described above is implemented.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart of a method for generating a label graph according to an embodiment of the present invention;
FIG. 2 is an exemplary annotation interface;
FIG. 3 is an exemplary rectangle;
FIG. 4 is an exemplary curve;
FIG. 5 is an exemplary ellipse;
FIG. 6 is an exemplary arrow;
fig. 7 is a schematic structural diagram of a device for generating a label graph according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or to elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are illustrative, are intended to explain the invention, and are not to be construed as limiting the invention.
The following describes a method and an apparatus for generating a label graph according to an embodiment of the present invention with reference to the drawings.
Fig. 1 is a schematic flow chart of a method for generating an annotation graph according to an embodiment of the present invention. The method is executed by an annotation graph generating apparatus, which is implemented in hardware and/or software. The apparatus may specifically be a hardware device, such as a terminal device or a background server, or software or an application installed on a hardware device.
As shown in fig. 1, the method for generating a label graph includes the following steps:
s101, acquiring original annotation data aiming at a target annotation object, and converting the original annotation data into target annotation data in a target data format.
In this embodiment, the target annotation object may be, but is not limited to, a web page, an electronic document, or a photo. FIG. 2 is an exemplary annotation interface. Fig. 2 shows a teacher browsing the teaching courseware "set" on a mobile phone; wanting to record the important content being browsed, the teacher marks it in the courseware through the annotation function controls. Specifically, when the annotation function control is enabled, a canvas is created in the page corresponding to the browsed teaching courseware "set", and the user marks the important content on the canvas by interacting with the annotation function controls. The coordinate origin O of the canvas coordinate system in fig. 2 is the vertex of the upper left corner of the canvas; the positive X axis points rightward from O, and the positive Y axis points downward from O.
The left part of fig. 2 shows the teaching courseware "set", and the annotation function controls in the right part of fig. 2 include function controls corresponding to the annotation shapes (namely a curve control 1, a dot control 2, an ellipse control 3, a rectangle control 4, and an arrow control 5), a brush color control 6, a brush width control 7, a hide control 8, a back-to-top control 9, and an annotation control 10. In the left part of fig. 2, six annotations have been made on the teaching courseware "set": two are rectangles, one is a dot, one is a curve, one is an arrow, and one is an ellipse. In the note list in the right part of fig. 2, the user has entered the note information corresponding to the dot and the note information corresponding to the curve.
When a trigger operation on a function control corresponding to an annotation shape is received, the target annotation shape is determined. When a trigger operation on the brush color control 6 is received, a brush color selection interface pops up, and the target brush color is determined from the user's selection; when a trigger operation on the brush width control 7 is received, a brush width selection interface pops up, and the target brush width is determined from the user's selection; when a trigger operation on the hide control 8 is received, the annotation function controls are hidden; when a trigger operation on the back-to-top control 9 is received, the annotation function controls return to the top area of the current page; and when a trigger operation on the annotation control 10 is received, an annotation page is provided, which receives and displays note information entered by the user so as to record the user's thoughts.
To display the note information more intuitively, it may be displayed at a preset position relative to the annotation graph, for example below it, to its left, or to its right.
In this embodiment, the original annotation data includes, but is not limited to: the source of the annotation data, the unique identifier of the annotation object, the width of the annotation object, the height of the annotation object, the brush color, the brush width, the note information, the annotation shape, and the coordinate information set of the annotation points.
Taking fig. 2 as an example, after the target annotation shape (rectangle), the target brush color (black), and the target brush width (width 1) are determined, the user drags the mouse on the computer to control the brush and annotate the important content of the teaching courseware "set" accordingly. The resulting original annotation data includes: the source of the annotation data (the computer client), the unique identifier of the annotation object (corresponding to the teaching courseware "set"), the width and height of the annotation object (measured in the canvas coordinate system), the brush color (black), the brush width (1), the note information (none added), the annotation shape (rectangle), and the coordinate information set of the annotation points (the vertex coordinates of the upper left corner and the lower right corner of the rectangle).
In this embodiment, to facilitate generating annotation graphs in the target annotation object, the acquired original annotation data undergoes data format conversion so that the annotation data is standardized: the data format of the target annotation data is unified into a target data format. The target data format may be, but is not limited to, the JSON (JavaScript Object Notation) data format or the XML (eXtensible Markup Language) data format.
In this embodiment, the target annotation data at least includes a target annotation shape, a coordinate information set of annotation points, a target brush color, and a target brush width.
The JSON format records data as key-value pairs, so it is very intuitive and simpler than XML. For this reason, the target annotation data encapsulated in the JSON format in this embodiment at least includes the following attribute keys: fr, markId, materialWidth, materialHeight, penColor, penWidth, markType, markMsg, and points.
The attribute fr represents the source of the annotation data; for example, when its value is 1, the annotation data comes from the mobile phone client, and when its value is 2, from the computer client.
The attribute markId represents the identifier of the annotation object; for example, when the annotation object is a certain web page, the value of markId is 1001, and when the annotation object is a certain photograph, the value of markId is 1011.
The attribute materialWidth represents the width of the annotation object, and the attribute materialHeight represents its height.
The attribute penColor represents the brush color: the value #000000 represents black, #FF0000 red, #00FF00 green, #0000FF blue, and #FFFF00 yellow.
The attribute penWidth represents the width of the brush. Its value can be any value from 1 to 10; the value is positively correlated with the brush width, and the larger the value, the wider the brush.
The attribute markMsg is used for representing annotation information. The attribute value of attribute markMsg is the inputted comment information.
The attribute markType represents the annotation shape: the value 1 represents a curve; 2, a dot; 3, a rectangle; 4, an ellipse; and 5, an arrow.
The attribute points represents the coordinate information set of the annotation points. Its value is an array, and the points array stores the coordinate information of the annotation points. The number of array elements in the points array depends on the annotation shape, and the coordinate information of each annotation point is stored in the points array in the order in which the points were annotated.
Fig. 3 is an exemplary rectangle. When the annotation shape is a rectangle, the points array has two elements. The coordinates of vertex 11 at the upper left corner and of vertex 12 at the lower right corner of the annotation graph are recorded; the upper-left vertex coordinates are stored in the array as the first element, and the lower-right vertex coordinates as the second element.
Fig. 4 is an exemplary curve. When the annotation shape is a curve, the points array has multiple elements. During annotation, the coordinate information of each annotation point enclosing the annotation graph (curve) is recorded in real time and stored in the array in order; each array element corresponds to the coordinate information of one annotation point on the curve. The individual black dots in fig. 4 are the individual annotation points.
Fig. 5 is an exemplary ellipse. The dashed box in fig. 5 is the bounding box enclosing the ellipse. When the annotation shape is an ellipse, the points array has two elements. During annotation, the coordinates of vertex 21 at the upper left corner and of vertex 22 at the lower right corner of the bounding box enclosing the ellipse are recorded; the upper-left vertex coordinates are stored in the array as the first element, and the lower-right vertex coordinates as the second element.
When the annotation shape is a dot, the points array has a single element. During annotation, the coordinates of the dot are recorded and stored in the array.
Fig. 6 is an exemplary arrow. When the annotation shape is an arrow, the points array has two elements. During annotation, the coordinates of the starting point 01 on the side away from the arrowhead and of the ending point 02 on the arrowhead side are recorded; the starting point coordinates are stored in the array as the first element, and the ending point coordinates as the second element.
It should be noted that the reference coordinate system of the annotation points is the canvas coordinate system. The curve can cover most irregular annotation graphs, while the rectangle, ellipse, dot, and arrow can cover most regular annotation graphs. Of course, the annotation shapes are not limited to rectangles, ellipses, dots, arrows, and curves; additional shapes can be added according to the actual situation, with the annotation point coordinates to be recorded during annotation specified for each added shape.
The following is target annotation data in an example target data format of JSON format.
(Example JSON annotation data shown as Figure BDA0002020402080000071 in the original publication.)
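Since the sample itself is not reproduced here, the following is a hypothetical reconstruction of target annotation data from the attribute keys described above; all field values are illustrative assumptions, not values from the patent.

```javascript
// Hypothetical target annotation data in JSON format, reconstructed from
// the attribute keys described above. All values are illustrative.
const targetAnnotationData = {
  fr: 2,                 // annotation source: 2 = computer client
  markId: 1001,          // unique identifier of the annotation object
  materialWidth: 1080,   // width of the annotation object (canvas coordinates)
  materialHeight: 1920,  // height of the annotation object
  penColor: "#000000",   // brush color (black)
  penWidth: 1,           // brush width, a value from 1 to 10
  markType: 3,           // annotation shape: 3 = rectangle
  markMsg: "",           // note text (empty: no note added)
  points: [              // rectangle: upper-left then lower-right vertex
    { x: 100, y: 200 },
    { x: 300, y: 260 }
  ]
};
```

Serialized as text, such a record is a few hundred bytes at most, which is the source of the storage saving over a picture-format annotation graph.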
And S102, determining target drawing data of the marked graph to be drawn according to the target marking data.
In this embodiment, in order to draw a labeled graph in a target labeled object according to target labeled data, the target labeled data is converted into target drawing data adapted to the labeled graph to be drawn.
It should be noted that the target drawing data depends on the drawing method employed. SVG (Scalable Vector Graphics) is an image format based on XML syntax; unlike other, pixel-based image formats, SVG describes the shapes of an image. It is essentially a text file, small in size, and does not distort at any magnification. In this embodiment, SVG is therefore used to draw the annotation graphs.
The left part of fig. 2, for example, shows two rectangles 010 and 011, an ellipse 012, a curve 013, a dot 014, and an arrow 015. A rectangular annotation graph is drawn with the rect method of SVG; an elliptical annotation graph with the ellipse method; a curved annotation graph with the polyline method; a dot annotation graph with the circle method; and an arrow annotation graph with the polyline method.
In this embodiment, if the target annotation shape is a rectangle, the rect method of SVG is used for drawing. The drawing data required to draw a rectangle with the rect method at least comprises: the vertex coordinates of the upper left corner of the rectangle, the width of the rectangle, the height of the rectangle, the target brush color, and the target brush width. The specific implementation of step S102 is therefore: when the target annotation shape is a rectangle, take the first value in the coordinate information set as the upper-left vertex coordinates of the annotation graph to be drawn, and the second value as its lower-right vertex coordinates; calculate the width and height of the annotation graph from the upper-left and lower-right vertex coordinates by the geometric relationship; and take the upper-left vertex coordinates, the width, the height, the target brush color, and the target brush width as the target drawing data for the annotation graph to be drawn.
Taking fig. 2 and fig. 3 as an example, when the user triggers the rectangle control 4, the original annotation data recorded during annotation includes the coordinates of vertex 11 at the upper left corner of the rectangle, the coordinates of vertex 12 at the lower right corner, the target brush color, and the target brush width. When the original annotation data is converted into target annotation data in the target data format, the first value in the coordinate information set of the annotation points is the vertex 11 coordinates, and the second value is the vertex 12 coordinates. After the vertex 11 and vertex 12 coordinates are extracted from the target annotation data, the width and height of the rectangle are calculated from the geometric relationship; finally, the vertex 11 coordinates, the width of the rectangle, the height of the rectangle, the target brush color, and the target brush width serve as the target drawing data for drawing rectangle 010 or 011.
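The rectangle conversion described above can be sketched as follows; the function name and the point/object shapes are assumptions for illustration, not from the patent.

```javascript
// Sketch: derive SVG rect drawing data from the two recorded corner
// points, per step S102. Names and object shapes are illustrative.
function rectDrawData(points, penColor, penWidth) {
  const [topLeft, bottomRight] = points;  // first and second array elements
  return {
    x: topLeft.x,                         // upper-left vertex
    y: topLeft.y,
    width: bottomRight.x - topLeft.x,     // width from the geometric relationship
    height: bottomRight.y - topLeft.y,    // height from the geometric relationship
    stroke: penColor,                     // target brush color
    "stroke-width": penWidth              // target brush width
  };
}
```

The returned object maps directly onto the attributes of an SVG `<rect>` element.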
In this embodiment, if the target annotation shape is a curve, the polyline method of SVG is used for drawing. The drawing data required to draw a graph with the polyline method at least comprises: the coordinates of a plurality of points, the target brush color, and the target brush width. The specific implementation of step S102 is therefore: when the target annotation shape is a curve, take each value in the coordinate information set as the coordinates of one of the points enclosing the annotation graph to be drawn; and take the coordinates of these points, the target brush color, and the target brush width as the target drawing data for the annotation graph to be drawn.
Taking fig. 2 and fig. 4 as an example, when the user triggers the curve control 1, in the annotation process, the recorded original annotation data includes coordinates of each point on the curve (coordinates of each black point in fig. 4), a target brush color, and a target brush width. After the original annotation data is converted into target annotation data in a target data format, the coordinates of each point, the target brush color and the target brush width extracted from the target annotation data are used as target drawing data for drawing the curve 013.
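In the curve case, the recorded points pass through unchanged; a minimal sketch (the function name is an assumption) formats them as the value of an SVG `<polyline>` points attribute:

```javascript
// Sketch: format recorded annotation points as the value of an SVG
// <polyline> "points" attribute ("x1,y1 x2,y2 ..."). Name is illustrative.
function polylinePoints(points) {
  return points.map(p => `${p.x},${p.y}`).join(" ");
}
```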
In this embodiment, if the target annotation shape is an ellipse, the ellipse method of SVG is used for drawing. The drawing data required to draw a graph with the ellipse method at least comprises: the coordinates of the center point, the horizontal radius and the vertical radius, the target brush color, and the target brush width. The specific implementation of step S102 is therefore: when the target annotation shape is an ellipse, take the first value in the coordinate information set as the upper-left vertex coordinates of the bounding box enclosing the annotation graph to be drawn, and the second value as the lower-right vertex coordinates of the bounding box; calculate the center point coordinates, horizontal radius, and vertical radius of the annotation graph from the upper-left and lower-right vertex coordinates by the geometric relationship; and take the center point coordinates, the horizontal and vertical radii, the target brush color, and the target brush width as the target drawing data for the annotation graph to be drawn.
Taking fig. 2 and fig. 5 as an example, when the user triggers the ellipse control 3, the original annotation data recorded during annotation includes the coordinates of vertex 21 at the upper left corner of the bounding box corresponding to the ellipse (the dashed box in fig. 5), the coordinates of vertex 22 at its lower right corner, the target brush color, the target brush width, and so on. After the original annotation data is converted into target annotation data in the target data format, the vertex 21 and vertex 22 coordinates are extracted from the target annotation data, and the center point coordinates, horizontal radius, and vertical radius of the ellipse are calculated from the geometric relationship. Finally, the center point coordinates, the horizontal and vertical radii, the target brush color, and the target brush width serve as the target drawing data for drawing ellipse 012.
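The ellipse conversion follows the same pattern; this sketch (names and object shapes are assumptions) derives the center and radii from the bounding-box corners:

```javascript
// Sketch: derive SVG ellipse drawing data from the bounding-box corners,
// per step S102. Names and object shapes are illustrative.
function ellipseDrawData(points, penColor, penWidth) {
  const [topLeft, bottomRight] = points;  // bounding-box corners
  return {
    cx: (topLeft.x + bottomRight.x) / 2,  // center point
    cy: (topLeft.y + bottomRight.y) / 2,
    rx: (bottomRight.x - topLeft.x) / 2,  // horizontal radius
    ry: (bottomRight.y - topLeft.y) / 2,  // vertical radius
    stroke: penColor,
    "stroke-width": penWidth
  };
}
```

The returned object maps directly onto the attributes of an SVG `<ellipse>` element.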
In this embodiment, when the target labeling shape is an arrow, the polyline method in SVG is used for drawing. Because the drawing data required for drawing a graph by the polyline method at least comprises: coordinates of a plurality of points, a target brush color, and a target brush width, the specific implementation manner of step S102 is: when the target labeling shape is an arrow, taking a first numerical value in the coordinate information set as the coordinate of the starting point of the labeling graph to be drawn, on the side away from the arrowhead, and taking a second numerical value in the coordinate information set as the coordinate of the ending point, on the side close to the arrowhead; determining the coordinates of each point enclosing the arrow according to the coordinates of the starting point, the coordinates of the ending point, the preset arrow width and the geometric relationship; and taking the coordinates of each point, the color of the target brush and the width of the target brush as target drawing data of the labeling graph to be drawn.
It should be noted that the direction of annotation runs from the starting point on the side away from the arrowhead to the ending point on the side close to the arrowhead, so the starting point away from the arrowhead can be understood as the first labeling point, and the ending point close to the arrowhead as the last labeling point.
It is noted that the preset arrow width is set according to the actual situation. Taking fig. 6 as an example, the preset arrow width includes a width L2 of the base of the triangle corresponding to the arrow and a width L1 of the tail of the arrow.
Taking fig. 2 and fig. 6 as an example, when the user triggers the arrow control 5, in the annotation process, the recorded original annotation data includes coordinates of a starting point 01 away from the arrow side, coordinates of an ending point 02 close to the arrow side, a target brush color, a target brush width, and the like; after the original annotation data is converted into target annotation data in a target data format, extracting coordinates of a starting point 01 deviating from an arrow side, coordinates of an ending point 02 close to the arrow side, a target brush color and a target brush width from the target annotation data; next, the coordinates of the point 03, the point 04, the point 05, the point 06, the point 07, and the point 08 which enclose the arrow are calculated from the coordinates of the start point 01, the coordinates of the end point 02, the width L2 of the base of the triangle corresponding to the arrow, the width L1 of the end portion of the arrow, and the geometric relationship. The coordinates of the point 02, the coordinates of the point 03, the coordinates of the point 04, the coordinates of the point 05, the coordinates of the point 06, the coordinates of the point 07, the coordinates of the point 08, the target brush color, and the target brush width are taken as target drawing data of the drawing arrow 015.
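The calculation of the tip plus the six points 03–08 enclosing the arrow can be sketched as below. This is an assumed construction, not the patent's exact geometry: the point ordering and proportions in the patent's figures may differ, and the arrowhead length here defaults to its base width L2:

```python
import math

def arrow_points(start, end, head_base=12.0, tail_width=4.0, head_length=None):
    """Compute the vertices of a filled arrow polygon from its start point
    (away from the arrowhead) to its end point (the tip).

    head_base  -- width L2 of the base of the triangle forming the arrowhead
    tail_width -- width L1 of the arrow's tail
    head_length -- length of the arrowhead along the shaft; defaults to head_base
    """
    if head_length is None:
        head_length = head_base
    (x1, y1), (x2, y2) = start, end
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length              # unit vector along the shaft
    px, py = -uy, ux                               # unit vector perpendicular to it
    bx, by = x2 - head_length * ux, y2 - head_length * uy  # head/tail junction
    return [
        (x1 + tail_width / 2 * px, y1 + tail_width / 2 * py),  # tail corner
        (bx + tail_width / 2 * px, by + tail_width / 2 * py),  # shaft meets head
        (bx + head_base / 2 * px, by + head_base / 2 * py),    # head base corner
        (x2, y2),                                              # tip
        (bx - head_base / 2 * px, by - head_base / 2 * py),    # opposite base corner
        (bx - tail_width / 2 * px, by - tail_width / 2 * py),  # shaft meets head
        (x1 - tail_width / 2 * px, y1 - tail_width / 2 * py),  # tail corner
    ]
```

For a horizontal arrow from (0, 0) to (20, 0), the perpendicular offsets fall entirely on the y axis, so the seven points are symmetric about the shaft.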
In this embodiment, when the target labeling shape is a dot, the circle method in SVG is used for drawing. Because the drawing data required for drawing a graph by the circle method at least comprises: the coordinates of the circle center, the radius of the circle, the target brush color and the target brush width, the specific implementation manner of step S102 is: when the target labeling shape is a dot, taking a first numerical value in the coordinate information set as the circle center coordinate of the labeling graph to be drawn; and taking the circle center coordinate, the preset radius of the circle, the color of the target brush and the width of the target brush as target drawing data of the labeling graph to be drawn.
It is to be noted that a point can be understood as a circle with a very small radius. Drawing the dot shape can therefore be realized by setting the preset radius of the circle to a small value.
Taking fig. 2 and fig. 6 as an example, when the user triggers the dot control 2, in the annotation process, the recorded original annotation data includes the coordinates of the point 014, the target brush color, the target brush width, and the like. After the original labeling data is converted into target labeling data in a target data format, the coordinates of the point 014, the target brush color and the target brush width are extracted from the target labeling data; the coordinates of the point 014 are taken as the circle center coordinates, and the circle center coordinates, the preset radius of the circle, the target brush color and the target brush width are taken as target drawing data for drawing the point 014.
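A sketch of the dot case, under the same naming assumptions as before: the dot becomes an SVG circle whose preset radius is a small constant (the default value below is illustrative):

```python
def dot_drawing_data(point, brush_color, brush_width, preset_radius=1.0):
    """Treat the dot as a circle with a very small preset radius, as the
    embodiment describes; only the center coordinate comes from the
    original labeling data."""
    cx, cy = point
    return {"cx": cx, "cy": cy, "r": preset_radius,
            "stroke": brush_color, "stroke-width": brush_width}
```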
S103, drawing a graph according to the target drawing data to generate the marked graph to be drawn, and storing the target drawing data.
In this embodiment, after the target drawing data is obtained, the corresponding drawing method is called to draw the graph, so as to obtain the labeled graph to be drawn. The target drawing data matched with the labeled graph to be drawn can be obtained simply by acquiring the original labeling data of the target labeling object, so the labeled graph can be drawn quickly.
Meanwhile, after the labeled graph is drawn, the target drawing data is stored, so that the labeled graph can conveniently be displayed again or re-edited later. Compared with the related art, in which the labeled graph is stored in a picture format, storing the target drawing data in a non-picture format greatly reduces the amount of data stored and improves the processing speed of the system.
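To illustrate why storing target drawing data is cheaper than storing a picture: only a small attribute record needs to persist, and an SVG element can be regenerated from it on demand. A minimal sketch, in which the serialization format is an assumption:

```python
def to_svg_element(tag, attrs):
    """Render stored target drawing data back into an SVG element string;
    the persisted record is a handful of attributes, not a bitmap."""
    attr_str = " ".join(f'{key}="{value}"' for key, value in attrs.items())
    return f'<{tag} {attr_str} fill="none"/>'
```

For example, a stored ellipse record of four numbers plus brush settings regenerates a complete `<ellipse .../>` element, whereas a rasterized picture of the same annotation would occupy orders of magnitude more storage.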
Further, after the annotation graph is generated, a deformation operation can be performed on the annotation graph. Specifically, a target operation instruction of a user on the labeled graph is received, and corresponding operation is executed on the labeled graph according to the target operation instruction. The target operation instruction comprises at least one of a dragging instruction, a rotating instruction and a zooming instruction; the dragging instruction is used for indicating that the marked graph is dragged to a target position from a current position; the rotation instruction is used for indicating that the marked graph is rotated; the scaling instruction is used for indicating that the marked graph is scaled.
For example, the drag instruction is {type: move, vertical: 50}, where type: move indicates that the operation type is dragging, and vertical: 50 indicates a vertical drag of 50 units.
For example, the rotation instruction is {type: rotate, degree: 50}, where type: rotate indicates that the operation type is rotation, and degree: 50 indicates a rotation of 50 degrees.
For example, the zoom instruction is {type: scale, size: 1.1}, where type: scale indicates that the operation type is scaling, and size: 1.1 indicates scaling by a factor of 1.1.
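Given the instruction formats shown in the three examples above, dispatching an operation could look like the following sketch. Mapping each instruction to an SVG transform string is an assumption, and the vertical-only translation for move follows the sample instruction:

```python
def instruction_to_transform(instr):
    """Map a target operation instruction (in the format of the examples
    above) to an SVG transform fragment."""
    kind = instr["type"]
    if kind == "move":
        # the sample drag instruction carries only a vertical offset
        return f"translate(0, {instr['vertical']})"
    if kind == "rotate":
        return f"rotate({instr['degree']})"
    if kind == "scale":
        return f"scale({instr['size']})"
    raise ValueError(f"unknown instruction type: {kind}")
```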
Further, besides the deformation operation of the marked graph, the color and/or width of the brush of the marked graph can be edited again. Specifically, after performing graph drawing according to the target drawing data to generate the to-be-drawn annotation graph, the method further includes: modifying the color and/or width of the target brush in the target labeling data to obtain updated target labeling data; determining updated target drawing data according to the updated target marking data; and drawing a graph according to the updated target drawing data so as to update the marked graph.
For example, the brush color corresponding to the first-drawn labeled graph is red and the brush width is 1, while the brush color of the re-edited labeled graph is yellow and the brush width is 2.
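The re-editing flow above — modify the brush fields of the target labeling data, then redraw from the updated record — can be sketched as follows; the field names are assumptions:

```python
def update_brush(target_label_data, color=None, width=None):
    """Return updated target labeling data with a new brush color and/or
    width; fields that are not specified are kept unchanged, matching the
    re-edit flow described above."""
    updated = dict(target_label_data)
    if color is not None:
        updated["brush_color"] = color
    if width is not None:
        updated["brush_width"] = width
    return updated
```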
According to the method for generating a labeled graph, original labeling data for a target labeling object is acquired and converted into target labeling data in a target data format, the target labeling data comprising a target labeling shape, a coordinate information set of labeling points, a target brush color and a target brush width; target drawing data of the labeled graph to be drawn is determined according to the target labeling data; and the graph is drawn according to the target drawing data to generate the labeled graph to be drawn, and the target drawing data is stored. The target drawing data matched with the labeled graph to be drawn can be obtained simply by acquiring the original labeling data of the target labeling object, so the labeled graph can be drawn quickly. Meanwhile, after the labeled graph is drawn, the target drawing data is stored, so that the labeled graph can conveniently be displayed again or re-edited later. Compared with the related art, in which the labeled graph is stored in a picture format, storing the target drawing data in a non-picture format greatly reduces the amount of data stored and improves the processing speed of the system.
Fig. 7 is a schematic structural diagram of a device for generating a labeled graph according to an embodiment of the present invention. This embodiment provides a labeled graph generating device, which is the execution body of the labeled graph generating method and is implemented in hardware and/or software. As shown in fig. 7, the labeled graph generating device includes: an acquisition module 101 and a processing module 102.
The acquisition module 101 is configured to acquire original labeling data for a target labeling object and convert the original labeling data into target labeling data in a target data format, wherein the target labeling data comprises a target labeling shape, a coordinate information set of labeling points, a target brush color and a target brush width;
the processing module 102 is configured to determine target drawing data of a to-be-drawn labeled graph according to the target labeling data;
the processing module 102 is further configured to perform graph drawing according to the target drawing data to generate the labeled graph to be drawn, and store the target drawing data.
Further, the processing module 102 is specifically configured to:
when the target labeling shape is a rectangle, taking a first numerical value in the coordinate information set as a vertex coordinate of the upper left corner of the labeling graph to be drawn, and taking a second numerical value in the coordinate information set as a vertex coordinate of the lower right corner of the labeling graph to be drawn;
Calculating the width value and the height value of the marked graph to be drawn according to the vertex coordinate of the upper left corner, the vertex coordinate of the lower right corner and the geometric relationship;
and taking the vertex coordinate of the upper left corner, the width value, the height value, the color of the target brush and the width of the target brush as target drawing data of the marked graph to be drawn.
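The rectangle branch above can be sketched in the same illustrative style (function name and data layout are assumptions): the upper-left vertex is kept, and the width and height are derived from the two corner coordinates.

```python
def rect_drawing_data(top_left, bottom_right, brush_color, brush_width):
    """Derive SVG rect parameters: keep the upper-left vertex and compute
    width and height from the two corner coordinates."""
    (x1, y1), (x2, y2) = top_left, bottom_right
    return {"x": x1, "y": y1,
            "width": abs(x2 - x1), "height": abs(y2 - y1),
            "stroke": brush_color, "stroke-width": brush_width}
```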
Further, the processing module 102 is specifically configured to:
when the target labeling shape is a curve, taking each numerical value in the coordinate information set as the coordinate of each point which is enclosed into the labeling graph to be drawn;
and taking the coordinates of each point, the color of the target painting brush and the width of the target painting brush as target drawing data of the marked graph to be drawn.
Further, the processing module 102 is specifically configured to:
when the target labeling shape is an ellipse, taking a first numerical value in the coordinate information set as a vertex coordinate of the upper left corner of an external frame surrounding the labeling graph to be drawn, and taking a second numerical value in the coordinate information set as a vertex coordinate of the lower right corner of the external frame;
calculating the coordinate, the horizontal radius and the vertical radius of the central point of the labeled graph according to the vertex coordinate of the upper left corner, the vertex coordinate of the lower right corner and the geometric relationship;
And taking the coordinate of the central point, the horizontal radius and the vertical radius, the color of the target brush and the width of the target brush as target drawing data of the marked graph to be drawn.
Further, the processing module 102 is specifically configured to:
when the target labeling shape is an arrow, taking a first numerical value in the coordinate information set as a coordinate of a starting point of the to-be-drawn labeling graph, which is far away from the arrow side, and taking a second numerical value in the coordinate information set as a coordinate of an ending point of the to-be-drawn labeling graph, which is close to the arrow side;
determining the coordinates of each point which encloses the arrow according to the coordinates of the starting point, the coordinates of the ending point, the preset arrow width and the geometric relation;
and taking the coordinates of each point, the color of the target painting brush and the width of the target painting brush as target drawing data of the marked graph to be drawn.
Further, the processing module 102 is specifically configured to:
when the target marking shape is a point shape, taking a first numerical value in the coordinate information set as a circle center coordinate of the marking graph to be drawn;
and taking the circle center coordinate, the preset radius of the circle, the color of the target painting brush and the width of the target painting brush as target drawing data of the marked graph to be drawn.
Further, the processing module 102 is further configured to: and after the graph is drawn according to the target drawing data to generate the marked graph to be drawn, receiving a target operation instruction of a user on the marked graph, and executing corresponding operation on the marked graph according to the target operation instruction.
Further, the processing module 102 is further configured to: after the graph is drawn according to the target drawing data to generate the marked graph to be drawn, modifying the color and/or the width of the target brush in the target marking data to obtain updated target marking data;
determining updated target drawing data according to the updated target marking data;
and drawing a graph according to the updated target drawing data so as to update the marked graph.
It should be noted that the foregoing explanation on the embodiment of the method for generating a label graph is also applicable to the label graph generating apparatus of this embodiment, and is not repeated here.
The labeled graph generating device provided by the embodiment of the present invention acquires original labeling data of a target labeling object and converts it into target labeling data in a target data format, the target labeling data comprising a target labeling shape, a coordinate information set of labeling points, a target brush color and a target brush width; determines target drawing data of the labeled graph to be drawn according to the target labeling data; and draws the graph according to the target drawing data to generate the labeled graph to be drawn, and stores the target drawing data. The target drawing data matched with the labeled graph to be drawn can be obtained simply by acquiring the original labeling data of the target labeling object, so the labeled graph can be drawn quickly. Meanwhile, after the labeled graph is drawn, the target drawing data is stored, so that the labeled graph can conveniently be displayed again or re-edited later. Compared with the related art, in which the labeled graph is stored in a picture format, storing the target drawing data in a non-picture format greatly reduces the amount of data stored and improves the processing speed of the system.
Fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present invention. The computer device includes:
memory 1001, processor 1002, and computer programs stored on memory 1001 and executable on processor 1002.
The processor 1002 executes the program to implement the annotation graph generation method provided in the above-described embodiments.
Further, the computer device further comprises:
a communication interface 1003 for communicating between the memory 1001 and the processor 1002.
A memory 1001 for storing computer programs that may be run on the processor 1002.
Memory 1001 may include high-speed RAM memory and may also include non-volatile memory (e.g., at least one disk memory).
The processor 1002 is configured to implement the annotation graph generation method according to the foregoing embodiment when executing the program.
If the memory 1001, the processor 1002, and the communication interface 1003 are implemented independently, the communication interface 1003, the memory 1001, and the processor 1002 may be connected to each other through a bus and perform communication with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 8, but this is not intended to represent only one bus or type of bus.
Optionally, in a specific implementation, if the memory 1001, the processor 1002, and the communication interface 1003 are integrated on one chip, the memory 1001, the processor 1002, and the communication interface 1003 may complete communication with each other through an internal interface.
The processor 1002 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement embodiments of the present invention.
The present embodiment also provides a computer-readable storage medium on which a computer program is stored, wherein the program is configured to implement the annotation figure generation method described above when executed by a processor.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.

Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (11)

1. A method for generating a label graph is characterized by comprising the following steps:
acquiring original labeling data aiming at a target labeling object, and converting the original labeling data into target labeling data in a target data format, wherein the target labeling data comprises a target labeling shape, a coordinate information set of labeling points, a target brush color and a target brush width;
determining target drawing data of a marked graph to be drawn according to the target marking data;
and drawing a graph according to the target drawing data to generate the marked graph to be drawn, and storing the target drawing data.
2. The method of claim 1, wherein determining target drawing data for a marked graphic to be drawn from the target marking data comprises:
When the target labeling shape is a rectangle, taking a first numerical value in the coordinate information set as a vertex coordinate of the upper left corner of the labeling graph to be drawn, and taking a second numerical value in the coordinate information set as a vertex coordinate of the lower right corner of the labeling graph to be drawn;
calculating the width value and the height value of the marked graph to be drawn according to the vertex coordinate of the upper left corner, the vertex coordinate of the lower right corner and the geometric relationship;
and taking the vertex coordinate of the upper left corner, the width value, the height value, the color of the target brush and the width of the target brush as target drawing data of the marked graph to be drawn.
3. The method of claim 1, wherein determining target drawing data for a marked graphic to be drawn from the target marking data comprises:
when the target labeling shape is a curve, taking each numerical value in the coordinate information set as the coordinate of each point which is enclosed into the labeling graph to be drawn;
and taking the coordinates of each point, the color of the target painting brush and the width of the target painting brush as target drawing data of the marked graph to be drawn.
4. The method of claim 1, wherein determining target drawing data for a marked graphic to be drawn from the target marking data comprises:
when the target labeling shape is an ellipse, taking a first numerical value in the coordinate information set as a vertex coordinate of the upper left corner of an external frame surrounding the labeling graph to be drawn, and taking a second numerical value in the coordinate information set as a vertex coordinate of the lower right corner of the external frame;
calculating the coordinate, the horizontal radius and the vertical radius of the central point of the labeled graph according to the vertex coordinate of the upper left corner, the vertex coordinate of the lower right corner and the geometric relationship;
and taking the coordinate of the central point, the horizontal radius and the vertical radius, the color of the target brush and the width of the target brush as target drawing data of the marked graph to be drawn.
5. The method of claim 1, wherein determining target drawing data for a marked graphic to be drawn from the target marking data comprises:
when the target labeling shape is an arrow, taking a first numerical value in the coordinate information set as a coordinate of a starting point of the to-be-drawn labeling graph, which is far away from the arrow side, and taking a second numerical value in the coordinate information set as a coordinate of an ending point of the to-be-drawn labeling graph, which is close to the arrow side;
Determining the coordinates of each point which encloses the arrow according to the coordinates of the starting point, the coordinates of the ending point, the preset arrow width and the geometric relation;
and taking the coordinates of each point, the color of the target painting brush and the width of the target painting brush as target drawing data of the marked graph to be drawn.
6. The method of claim 1, wherein determining target drawing data for a marked graphic to be drawn from the target marking data comprises:
when the target marking shape is a point shape, taking a first numerical value in the coordinate information set as a circle center coordinate of the marking graph to be drawn;
and taking the circle center coordinate, the preset radius of the circle, the color of the target painting brush and the width of the target painting brush as target drawing data of the marked graph to be drawn.
7. The method as claimed in claim 1, wherein after performing graph drawing according to the target drawing data to generate the annotation graph to be drawn, further comprising:
and receiving a target operation instruction of the user on the labeled graph, and executing corresponding operation on the labeled graph according to the target operation instruction.
8. The method as claimed in claim 1, wherein after performing graph drawing according to the target drawing data to generate the annotation graph to be drawn, further comprising:
modifying the color and/or width of the target brush in the target labeling data to obtain updated target labeling data;
determining updated target drawing data according to the updated target marking data;
and drawing a graph according to the updated target drawing data so as to update the marked graph.
9. An annotation pattern generation device, comprising:
an acquisition module, configured to acquire original labeling data for a target labeling object and convert the original labeling data into target labeling data in a target data format, wherein the target labeling data comprises a target labeling shape, a coordinate information set of labeling points, a target brush color and a target brush width;
the processing module is used for determining target drawing data of a to-be-drawn marked graph according to the target marking data;
the processing module is further configured to perform graph drawing according to the target drawing data to generate the labeled graph to be drawn, and store the target drawing data.
10. A computer device, comprising:
memory, processor and computer program stored on the memory and executable on the processor, characterized in that the processor implements the annotation graph generation method according to any one of claims 1 to 8 when executing the program.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the annotation figure generation method according to any one of claims 1 to 8.
CN201910277127.0A 2019-04-08 2019-04-08 Method and device for generating label graph Pending CN111857893A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910277127.0A CN111857893A (en) 2019-04-08 2019-04-08 Method and device for generating label graph

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910277127.0A CN111857893A (en) 2019-04-08 2019-04-08 Method and device for generating label graph

Publications (1)

Publication Number Publication Date
CN111857893A true CN111857893A (en) 2020-10-30

Family

ID=72951903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910277127.0A Pending CN111857893A (en) 2019-04-08 2019-04-08 Method and device for generating label graph

Country Status (1)

Country Link
CN (1) CN111857893A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040143796A1 (en) * 2000-03-07 2004-07-22 Microsoft Corporation System and method for annotating web-based document
CN101206640A (en) * 2006-12-22 2008-06-25 深圳市学之友教学仪器有限公司 Method and system for annotations and commentaries of electric data in portable electronic equipment
CN101727674A (en) * 2008-10-30 2010-06-09 北大方正集团有限公司 Method for marking picture in file, and method for reproducing mark of picture in file
CN104156163A (en) * 2013-05-15 2014-11-19 福建福昕软件开发股份有限公司北京分公司 Method for displaying handwriting in PDF file
US20150205398A1 (en) * 2013-12-30 2015-07-23 Skribb.it Inc. Graphical drawing object management methods and apparatus
CN105608319A (en) * 2015-12-21 2016-05-25 江苏康克移软软件有限公司 Digital pathological section labeling method and device
CN107450906A (zh) * 2017-06-12 2017-12-08 积成电子股份有限公司 A drawing method for distribution wiring diagrams in an energy information acquisition system
CN108573279A (en) * 2018-03-19 2018-09-25 精锐视觉智能科技(深圳)有限公司 Image labeling method and terminal device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Nan; Feng Lei: "Browser-based vector data visualization system", 计算机与现代化 (Computer and Modernization), no. 01 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113178079A (en) * 2021-04-06 2021-07-27 青岛以萨数据技术有限公司 Marking system, method and storage medium for signal lamp and lane line
CN113066174A (en) * 2021-04-27 2021-07-02 深圳市商汤科技有限公司 Point cloud data processing method and device, computer equipment and storage medium
CN113157166A (en) * 2021-05-20 2021-07-23 读书郎教育科技有限公司 Method for realizing self-adaptive note taking by intelligent terminal, storage medium and electronic equipment
CN113157166B (en) * 2021-05-20 2022-03-29 读书郎教育科技有限公司 Method for realizing self-adaptive note taking by intelligent terminal, storage medium and electronic equipment
CN113343636A (en) * 2021-06-02 2021-09-03 北京百度网讯科技有限公司 Method and device for setting width of marking line, electronic equipment and storage medium
CN113343636B (en) * 2021-06-02 2023-11-03 北京百度网讯科技有限公司 Method and device for setting marking line width, electronic equipment and storage medium
CN113407083A (en) * 2021-06-24 2021-09-17 上海商汤科技开发有限公司 Data labeling method and device, electronic equipment and storage medium
CN113934876A (en) * 2021-12-21 2022-01-14 成都泰盟软件有限公司 Web-based job approval method, device and system

Similar Documents

Publication Publication Date Title
CN111857893A (en) Method and device for generating label graph
KR101334483B1 (en) Apparatus and method for digitizing a document, and computer-readable recording medium
US9355486B2 (en) Image compositing device and image compositing method
US11455502B2 (en) Learning device, classification device, learning method, classification method, learning program, and classification program
CN102693425A (en) Image processing apparatus and image processing method
JP4945813B2 (en) Print structured documents
CN111752557A (en) Display method and device
CN105517681A (en) Chart conversion system using metadata and method therefor
KR101890831B1 (en) Method for Providing E-Book Service and Computer Program Therefore
EP3472807B1 (en) Automatically identifying and displaying object of interest in a graphic novel
US20170329502A1 (en) Method and device for processing image
US9286668B1 (en) Generating a panel view for comics
US20230196008A1 (en) Semantically-guided template generation from image content
CN116402020A (en) Signature imaging processing method, system and storage medium based on OFD document
CN115988170A (en) Method and device for clearly displaying Chinese and English characters in real-time video screen combination in cloud conference
CN105956133B (en) Method and device for displaying file on intelligent terminal
JP5067882B2 (en) Image processing apparatus, image processing method, and program
CN112685998A (en) Automatic labeling method, device, equipment and readable storage medium
CN107608733A (en) Image display method, device and terminal device
CN116245052A (en) Drawing migration method, device, equipment and storage medium
WO2023272495A1 (en) Badging method and apparatus, badge detection model update method and system, and storage medium
JP6414475B2 (en) Computer program and control device
CN113936187A (en) Text image synthesis method and device, storage medium and electronic equipment
CN113378526A (en) PDF paragraph processing method, device, storage medium and equipment
CN117079084B (en) Sample image generation method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination