CN117197212A - Graphics processing method, system, device and medium
Graphics processing method, system, device and medium
- Publication number: CN117197212A (application number CN202311050958.7A)
- Authority: CN (China)
- Prior art keywords: data, CAD file, processing, processing unit, scene template
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Abstract
The embodiment of the application provides a graphic processing method, a system, equipment and a medium, wherein the method specifically comprises the following steps: receiving a CAD file; determining a scene template corresponding to the CAD file; processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file; wherein, the scene template includes: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object; and outputting object data corresponding to the data object. The embodiment of the application can save the labor cost of graphic processing, can improve the accuracy of graphic processing, and has the advantages of high flexibility and wide scene application range.
Description
Technical Field
The embodiment of the application relates to the technical field of computer information processing, in particular to a graphic processing method, a graphic processing system, graphic processing equipment and graphic processing media.
Background
CAD (computer-aided design) software is drawing software that assists a user in carrying out design work with a computer and its graphics devices, and has been widely used in fields such as construction, electronics and electrical engineering, mechanical design, the clothing industry, computer art, and logistics. A graphic drawn by CAD software may be referred to as a CAD drawing, which typically includes a number of primitives. Taking the building field as an example, the primitives may represent data objects such as walls, rooms and shelves in the building field.
Extracting the object data corresponding to a data object from a CAD drawing is of great significance for auditing the CAD drawing and for processing the data object.
Currently, data extraction of data objects is typically performed manually. Specifically, the user may perform manual measurement in CAD software to obtain object data corresponding to a data object included in the CAD drawing. However, manual measurement not only consumes a lot of labor cost, but also inevitably involves measurement errors.
Disclosure of Invention
The embodiment of the application provides a graphic processing method, which can save the labor cost of graphic processing, can improve the accuracy of graphic processing, and has the advantages of high flexibility and wide scene application range.
Correspondingly, the embodiment of the application also provides a graphics processing system, an electronic device and a storage medium, which are used to implement and apply the above method.
In order to solve the above problems, an embodiment of the present application discloses a graphics processing method, which includes:
receiving a CAD file;
determining a scene template corresponding to the CAD file;
processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file; wherein, the scene template includes: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object;
and outputting object data corresponding to the data object.
In order to solve the above problems, an embodiment of the present application discloses a graphics processing method, which includes:
determining object data corresponding to the CAD file; the object data is structured data;
performing preset processing according to the object data; the preset processing comprises the following steps: generating a ledger or a visual model, or executing warehouse operation;
The process for determining the object data corresponding to the CAD file comprises the following steps: determining a scene template corresponding to the CAD file; processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file; wherein, the scene template includes: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object.
To solve the above problems, an embodiment of the present application discloses a graphics processing system, including: a graphics processing device and data objects in a repository;
the graphics processing device is configured to execute the foregoing method, determine, for a CAD file containing the data object, object data corresponding to the data object, output the object data corresponding to the data object, and perform a preset process according to the object data.
In order to solve the above problems, an embodiment of the present application discloses an electronic device, including: a processing unit; and a memory having executable code stored thereon that, when executed, causes the processing unit to perform the method of any of the above embodiments.
To address the above issues, embodiments of the present application disclose one or more machine-readable media having stored thereon executable code which, when executed, causes a processing unit to perform a method as in any of the above embodiments.
The embodiment of the application has the following advantages:
in the technical scheme of the embodiment of the application, the CAD file is processed by utilizing the scene template corresponding to the CAD file so as to obtain the object data corresponding to the data object. Because the scene template can process the elements contained in the CAD file by adopting a computer technology, the embodiment of the application can save the cost of manual measurement in CAD software and avoid measurement errors; in other words, the embodiment of the application can save the labor cost of graphic processing and can improve the accuracy of the graphic processing.
Moreover, the scene template of the embodiment of the application may include: a parsing unit and a processing unit. Different processing results can be realized by different combinations of processing units, or by different combinations of processing units and parsing units. For example, the same parsing unit matched with different processing units can yield different processing results; alternatively, the same processing unit matched with different parsing units can yield different processing results. Therefore, the structure of the scene template provided by the embodiment of the application has the advantages of high flexibility and a wide range of applicable scenes.
Furthermore, one parsing unit or one processing unit may be applied to a variety of scene templates, in other words, different scene template options may use the same parsing unit or processing unit. The analysis unit or the processing unit has reusability, so the embodiment of the application can further save the cost of graphic processing.
Drawings
FIG. 1 is a flow chart of the steps of a graphics processing method of one embodiment of the present application;
FIG. 2 is a schematic diagram of a CAD data structure according to one embodiment of the present application;
FIG. 3 is a schematic diagram of a scene template according to one embodiment of the application;
FIG. 4 (a) is a schematic diagram of a polygon corresponding to a line element;
FIG. 4 (b) is a schematic view of a hollow portion in a polygon;
FIG. 4 (c) is a schematic diagram of the merging result after merging neighboring graphics in the polygon;
FIG. 5 is a flow chart of steps of a graphics processing method of one embodiment of the present application;
FIG. 6 is a schematic diagram of a graphics processing system in accordance with one embodiment of the application;
fig. 7 is a schematic diagram of an exemplary apparatus provided in one embodiment of the application.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will become more readily apparent, a more particular description of the application will be rendered by reference to the appended drawings and appended detailed description.
The embodiment of the application can be used for processing the CAD file to obtain the object data corresponding to the data objects such as the wall, the room, the goods shelf and the like contained in the CAD file.
The embodiments of the present application may involve the use of user data. In practical applications, user-specific personal data may be used in the schemes described herein only within the scope allowed by the applicable laws and regulations of the relevant country and with the user's explicit consent (for example, after the user has been practically notified).
In the related art, data extraction of a data object is generally performed manually. Specifically, the user may perform manual measurement in CAD software to obtain object data corresponding to a data object included in the CAD drawing. However, manual measurement not only consumes a lot of labor cost, but also inevitably involves measurement errors.
Aiming at the technical problems that manual measurement in the related art not only consumes a great deal of labor cost, but also is difficult to avoid measurement errors, the embodiment of the application provides a graph processing method, which specifically comprises the following steps: receiving a CAD file; determining a scene template corresponding to the CAD file; processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file; the scene template specifically comprises: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object; and outputting object data corresponding to the data object.
According to the embodiment of the application, the elements contained in the CAD file are processed by utilizing the scene template corresponding to the CAD file so as to obtain the object data corresponding to the data object. Because the scene template can process the CAD file by adopting a computer technology, the embodiment of the application can save the cost of manual measurement in CAD software and avoid measurement errors; in other words, the embodiment of the application can save the labor cost of graphic processing and can improve the accuracy of the graphic processing.
Method embodiment one
Referring to fig. 1, a flowchart illustrating steps of a graphics processing method according to an embodiment of the present application may specifically include the following steps:
step 101, receiving a CAD file;
step 102, determining a scene template corresponding to the CAD file; the scene template specifically comprises: a processing unit;
step 103, processing the CAD file by using the scene template to obtain object data corresponding to the data object contained in the CAD file; the scene template specifically comprises: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object;
Step 104, outputting the object data corresponding to the data object.
The embodiment of the method shown in fig. 1 may be used to parse a CAD file to obtain object data corresponding to a data object included in the CAD file. At least one step included in the method embodiment shown in fig. 1 may be performed by a graphics processing device, which may be running on a client or server. It will be appreciated that embodiments of the present application are not limited to the specific implementation of the method shown in fig. 1.
In step 101, the client may receive a CAD file uploaded by the user. The CAD file may be a file output by the drawing software, and it is understood that the embodiment of the application does not limit the specific format of the CAD file.
The CAD file which can be processed by the embodiment of the application can accord with the preset specification. Examples of preset specifications may include: drawing specifications, layer specifications, frame specifications, etc.
For example, the drawing specification requires that the CAD file contain: block, line, polygon, etc. The layer specification may constrain the line color, linearity, etc. characteristics of the layer, e.g., requiring the line color of the fire-blocking partition to be blue, etc. The frame specification may constrain the line shape of the frame, e.g., the line shape of the frame is a thick solid line, etc. It will be appreciated that, according to practical application requirements, a person skilled in the art may determine the preset specification, and the embodiment of the present application is not limited to the specific preset specification.
In step 102, a scene corresponding to the scene template may characterize an parsed scene of the CAD file. The parsed scenes may be associated with scenes corresponding to CAD files to enable processing of the CAD files.
In the embodiment of the application, the scene corresponding to the CAD file can represent the environment or the object in the environment where the element in the CAD file is located, and the object in the environment can comprise: building, road, campus object, etc., the campus object may include: garden, greenbelt, appliance, etc.
The analysis scene of the CAD file can be preset by a person skilled in the art according to the actual application requirement. For example, in the field of logistics technology, examples of parsing a scene may include: logistics parks, warehouse floors, warehouse interiors, warehouse corollary buildings, automated assembly lines, industrial parks and the like.
The embodiment of the application can provide the scene template corresponding to the analysis scene. Correspondingly, the scene template acquisition process specifically comprises the following steps: creating scene template options; determining an analysis unit and a processing unit corresponding to the scene template options; and saving the mapping relation between the analysis unit and the processing unit and the scene template options.
Wherein the scene template options may correspond to the parsed scene. The embodiment of the application can determine a scene template option corresponding to a scene aiming at an analysis scene.
The processing unit corresponding to the scene template option may be matched with the parsing scene corresponding to the scene template option, in other words, the processing unit may be capable of processing the CAD file corresponding to the parsing scene.
The processing unit corresponding to the scene template option can be an existing processing unit or a newly built processing unit. In the embodiment of the application, different scene template options can use the same processing unit.
The processing unit may correspond to processing logic or code to which the processing logic corresponds. In case of using an existing processing unit, the existing code may be multiplexed to save the writing cost of the code. In the case of using a new processing unit, writing of code may be performed for the new processing unit.
The processing unit corresponding to the scene template option may be one or more. Different processing units may be used in combination, and processing units may also be combined with other classes of units. Other classes of units may include: a parsing unit and/or a filtering unit.
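Purely for illustration (this sketch is not part of the disclosed method), the mapping between scene template options and their reusable units described above can be pictured as a small registry. The names SceneTemplate, register_template and the unit names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class SceneTemplate:
    # A scene template option groups reusable unit names (hypothetical structure).
    name: str
    parsing_units: list = field(default_factory=list)
    processing_units: list = field(default_factory=list)
    filtering_units: list = field(default_factory=list)

# Registry saving the mapping between scene template options and their units.
TEMPLATE_REGISTRY = {}

def register_template(template: SceneTemplate) -> None:
    TEMPLATE_REGISTRY[template.name] = template

# Different template options may reuse the same parsing/processing units.
register_template(SceneTemplate(
    name="warehouse_floor",
    parsing_units=["line_parser", "text_parser"],
    processing_units=["line_to_polygon", "room_attribute"],
    filtering_units=["layer_filter"],
))
register_template(SceneTemplate(
    name="logistics_park",
    parsing_units=["line_parser"],          # reused parsing unit
    processing_units=["line_to_polygon"],   # reused processing unit
))
```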
The analysis unit is used for analyzing the CAD file to obtain elements such as lines, circles, polygons and text contained in the CAD file. In one example, the parsing unit may convert the CAD file into DXF (Drawing Exchange Format) data; then, the data of the elements is extracted from the DXF data according to the CAD format specification.
Referring to table 1, an example of attribute information of a parsing unit according to an embodiment of the present application is shown. The attribute information of the parsing unit may specifically include: resolving a unit name, CAD shape, output element format and description, and the like.
TABLE 1
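As a non-authoritative sketch of what a parsing unit might do, the snippet below reads simple elements out of a DXF file. The ezdxf package and the function name parse_elements are assumptions; the application does not name any concrete library.

```python
import ezdxf  # assumed third-party DXF reader; the application itself names no library

def parse_elements(dxf_path: str) -> dict:
    """Extract simple elements (lines, circles, text) from a DXF file."""
    doc = ezdxf.readfile(dxf_path)
    elements = {"lines": [], "circles": [], "texts": []}
    for entity in doc.modelspace():
        kind = entity.dxftype()
        if kind == "LINE":
            start, end = entity.dxf.start, entity.dxf.end
            elements["lines"].append(((start.x, start.y), (end.x, end.y)))
        elif kind == "CIRCLE":
            center = entity.dxf.center
            elements["circles"].append(((center.x, center.y), entity.dxf.radius))
        elif kind == "TEXT":
            elements["texts"].append(entity.dxf.text)
        elif kind == "MTEXT":
            elements["texts"].append(entity.text)
    return elements
```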
Referring to FIG. 2, a schematic diagram of a CAD data structure according to one embodiment of the present application is shown. Wherein, the CAD file may include: one or more layers; one layer may include: one or more object entities. A block is a named group of object entities, which may include: one or more object entities. "block nesting" may refer to one block also referring to another block. An object entity may refer to an object having a graphical representation, examples of which may include: lines, circles, arcs, text, ellipses, etc.
In practical applications, different CAD shapes may be handled by different parsing units to obtain different elements, and the same CAD shape may also be handled by different parsing units to obtain different elements. For example, a polygon shape can be parsed by a polygon parsing unit to obtain a polygon, or parsed by a line parsing unit to obtain lines.
The filtering unit can be used for determining a preset range and/or target graphic data corresponding to a preset layer from the CAD file.
The filtering unit may include: a range filtering unit and/or a layer filtering unit. The range filtering unit may determine target graphic data corresponding to a preset range from the CAD file. The preset range may be determined via a graphic name; for example, a range filtering unit may be used to determine the target graphic data corresponding to the preset range "No. 1 library elevation view". The layer filtering unit may determine target graphic data corresponding to a preset layer from the CAD file. The preset layer may be determined via a layer name; for example, a layer filtering unit may be used to determine the target graphic data (elevation mark text) corresponding to the preset layer "elevation mark". Of course, the range filtering unit and the layer filtering unit can also be used together to determine the elevation annotation text that corresponds to both "No. 1 library elevation view" and "elevation mark".
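A minimal sketch of range filtering and layer filtering, assuming each parsed entity is represented as a dictionary carrying illustrative fields named layer and graphic_name:

```python
def layer_filter(entities, preset_layer: str):
    """Keep only entities whose layer matches the preset layer name."""
    return [e for e in entities if e.get("layer") == preset_layer]

def range_filter(entities, preset_range: str):
    """Keep only entities that belong to the named graphic range (e.g. a frame)."""
    return [e for e in entities if e.get("graphic_name") == preset_range]

# Both filters can be chained, e.g. elevation-mark texts inside "No. 1 library elevation view".
def filter_target(entities):
    return layer_filter(range_filter(entities, "No. 1 library elevation view"),
                        "elevation mark")
```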
The processing unit in the embodiment of the application can be a unit for processing elements contained in the CAD file in the scene model. The type of processing unit may be various. For example, the processing units may be divided into: a graphics processing unit, a functional processing unit and a data object processing unit.
And the graphic processing unit is used for processing the graphic elements contained in the CAD file. The function processing unit is used for realizing a preset function. The data object processing unit is used for determining object data corresponding to the data object according to the data provided by the other processing units.
Referring to table 2, an example of attribute information of a processing unit of one embodiment of the present application is shown. The attribute information of the processing unit may specifically include: the category of the processing unit, the name of the processing unit, the parsing units it is collocated with, a description, and the like. The categories of processing units may include: graphics processing unit, functional processing unit, data object processing unit, and the like. Each category may further include corresponding concrete processing units. For example, the graphics processing units may include: a line-to-polygon processing unit, and the like. For example, in a scene template corresponding to a logistics park or an industrial park scene, the line-to-polygon processing unit may be used for processing data objects such as roads, grounds or bicycle sheds.
TABLE 2
In summary, the scene template of the embodiment of the present application may include: a plurality of processing units. One processing unit may use the output data of another processing unit. In the case where the processing unit is a graphics processing unit, the graphics processing unit may use output data of the parsing unit and the filtering unit. Different processing results can be realized by different combinations of processing units, or different combinations of processing units and analyzing units, or different combinations of processing units, analyzing units and filtering units.
The parsing unit may correspond to processing logic or code corresponding to the processing logic. In the case of using an existing parsing unit, existing codes can be multiplexed to save the writing cost of the codes. In the case of using a new parsing unit, writing of code may be performed for the new parsing unit.
Similarly, the filtering unit may correspond to processing logic or code corresponding to processing logic. In case of using an existing filtering unit, the existing code can be multiplexed to save the writing cost of the code. In the case of using a new filter unit, the writing of code may be performed for the new filter unit.
The embodiment of the application can combine different processing units, or processing units and analyzing units in a code combination mode. Alternatively, embodiments of the present application may utilize a visualization manner to combine different processing units, or processing units and parsing units.
An implementation of the visualization manner is provided here. The implementation process specifically includes the following steps: providing a first region and a second region in an interface; the first region may be a scene template region, and the second region may be a unit region; the selected units contained in the scene template can be presented in the scene template region, and the second region can present the units to be selected, such as processing units to be selected, parsing units to be selected, filtering units to be selected, and the like. The embodiment of the application can receive a selection operation for a unit to be selected in the second region and add the unit selected by the user to the first region.
The first region may present the selected units and the connection relationships between the selected units. For example, the first region may include: a selected processing unit area, a selected parsing unit area, and a selected filtering unit area. The embodiment of the application can support moving a unit to be selected into any one of the selected processing unit area, the selected parsing unit area and the selected filtering unit area through a drag operation. The embodiment of the application can also support editing operations, such as a delete operation or a move operation, for any selected unit in the selected processing unit area, the selected parsing unit area and the selected filtering unit area. The embodiment of the application can also support connection operations between different selected units, and the like.
Referring to FIG. 3, a schematic diagram of a scene template of one embodiment of the application is shown, which may specifically include: a processing unit 301, a parsing unit 302 and a filtering unit 303.
The processing unit 301 is configured to process a graphic element included in the CAD file. The processing unit 301 may specifically include: graphics processing unit 311, functional processing unit 312, and data object processing unit 313.
The graphic processing unit 311 is configured to obtain a graphic element from the parsing unit, and determine a graphic corresponding to the graphic element.
The function processing unit 312 is used for performing processing such as coordinate transformation on the graphics corresponding to the graphics element.
The data object processing unit 313 is configured to process the graphics corresponding to the graphic element to obtain object graphics data corresponding to the data object. The object graphic data may be graphic data corresponding to a data object such as a shelf.
The parsing unit 302 is configured to obtain elements such as lines, circles, polygons, text, etc. from a CAD file.
The filtering unit 303 may be used to filter the graphics data contained in the CAD file. The filtering unit 303 may include: a range filtering unit and/or a layer filtering unit. The range filtering unit may determine target graphic data corresponding to the preset range from the CAD file. Alternatively, the preset range may be determined via a graphic name. The layer unit may determine target graphic data corresponding to the preset layer from the CAD file. Alternatively, the preset layer may be determined via a layer name or a block name.
It will be appreciated that the scene template shown in fig. 3 is only an example of a scene template according to an embodiment of the present application, and is not intended to limit the scene template according to an embodiment of the present application. Indeed, the scene template of the embodiment of the present application may include: a processing unit, or a processing unit and a parsing unit, or a processing unit, a parsing unit and a filtering unit.
The scene template obtaining process in the embodiment of the application can further comprise the following steps: determining a filter unit corresponding to the analysis unit; and saving the mapping relation between the filtering unit and the scene template options.
In practical application, the step 102 of determining the scene template corresponding to the CAD file may specifically include: presenting at least one scene template option; receiving a target scene template option selected by a user; and determining the scene template corresponding to the CAD file according to the scene template corresponding to the target scene template option. For example, scene template options such as a logistics park, a warehouse floor, a warehouse interior, a warehouse matching building, an automated assembly line, an industrial park and the like can be displayed in the interface for selection by a user. The user may select the target scene template option by clicking or the like. The embodiment of the application can take the scene template corresponding to the target scene template option as the scene template corresponding to the CAD file.
In step 103, the data object may be a composite information representation understood by the software. The data objects may correspond to entities, which may be objectively existing and distinguishable from one another. For example, in a resolved scenario of warehouse floors, the data objects may include: shelves, etc. It will be appreciated that the above-described shelves are merely examples of data objects, and embodiments of the present application are not limited to particular data objects.
The embodiment of the application can provide the following technical scheme for processing the CAD file:
technical solution 1
In technical solution 1, the elements may include: a graphic element; the processing unit may include: a graphic processing unit and a data object processing unit;
the process of processing the CAD file may specifically include: determining a graph corresponding to the graph element by utilizing the graph processing unit; and processing the graph corresponding to the graph element by using the data object processing unit so as to obtain object graph data corresponding to the data object.
In practical applications, the graphic elements may include: line elements, point elements, text elements, polygon elements, etc.
In one example, the graphical element includes: in the case of line elements, the graphic corresponding to the graphic element may include a polygon; the process of processing the graphics corresponding to the graphic element may specifically include: and identifying the hollow part in the polygon by using the data object processing unit, and merging adjacent graphs in the polygon to obtain corresponding object graph data.
Referring to fig. 4 (a) to 4 (c), there are illustrated schematic diagrams of a process of processing polygons according to an embodiment of the present application, wherein fig. 4 (a) illustrates polygons corresponding to line elements, fig. 4 (b) illustrates hollow portions in the polygons, and fig. 4 (c) illustrates merging results of neighboring graphics in the polygons, the merging results characterizing object graphics data corresponding to walls. The hollow portion in the polygon may be a hollow portion to which a plurality of polygons commonly correspond.
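One possible way to realize the merging of adjacent wall polygons and the identification of hollow (room) parts described above is a geometric union; the sketch below uses the shapely library as an assumed tool, which the application itself does not prescribe.

```python
from shapely.geometry import Polygon
from shapely.ops import unary_union

def merge_wall_polygons(polygons):
    """Merge adjacent polygons (walls) and return the merged outline plus hollows."""
    merged = unary_union([Polygon(p) for p in polygons])
    # A hollow part shared by several wall polygons shows up as an interior ring,
    # which can be read as the room outline.
    hollows = [list(ring.coords) for ring in getattr(merged, "interiors", [])]
    return merged, hollows

# Example: four rectangles forming a square ring of walls with one hollow room inside.
walls = [
    [(0, 0), (10, 0), (10, 1), (0, 1)],
    [(0, 9), (10, 9), (10, 10), (0, 10)],
    [(0, 0), (1, 0), (1, 10), (0, 10)],
    [(9, 0), (10, 0), (10, 10), (9, 10)],
]
merged_walls, rooms = merge_wall_polygons(walls)  # one interior ring = one room
```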
According to the embodiment of the application, the object graph data corresponding to the first data object can be obtained according to the graph corresponding to the first data object; and object graph data corresponding to the second data object can be obtained according to the graph corresponding to the first data object.
The first data object may be a wall and the second data object may be a room. As shown in fig. 4 (b), data corresponding to the hollow portion in the polygon may be used as object graphics data corresponding to the room.
The first data object may be a wall and a window and the second data object may be a room. In this case, object graphic data corresponding to a room surrounded by walls and windows may be determined from polygons corresponding to the walls and windows.
The object graphics data may characterize graphically corresponding data of the data object, which may specifically include: coordinate data of points included in the graph of the data object, and the like.
In an alternative implementation manner of the present application, the processing unit may include: an attribute processing unit; the process of processing the CAD file may further include: and determining object attribute data corresponding to the data object by utilizing the attribute processing unit.
The object attribute data may characterize object attributes of the data object. The object attribute data may include, but is not limited to: name, space, code, type, model, remark, rotation angle, floor height and height, etc.
The attribute processing unit can determine object attribute data such as names, codes, types, models, remarks and the like corresponding to the data objects according to the text corresponding to the data objects. The attribute processing unit may obtain object attribute data from other processing units, such as a text processing unit or a table processing unit.
The text corresponding to the data object may include: text contained in the layer name, and/or text contained in a table of data objects, and/or text contained within a graphical scope of data objects, and the like. Taking the data object as a door or window as an example, the table of data objects may comprise a door window table. Taking the data object as a gate as an example, the text contained within the graphical scope of the data object may be text that appears on the edges of the gate. The text may include: text strings, the types of which may include, but are not limited to: chinese characters, letters, numbers, etc.
The text contained within the scope of the data object may also include: text contained in the elevation view. Taking the data object as an example of the door and window, the default height of the door and window and the default height of the floor can be obtained according to the text contained in the vertical view of the door and window.
In one example, the name of the data object may be first determined from text contained in the layer name; object attribute data such as type, size, description and the like of the data object is determined according to text contained in a table of the data object in the layer. The model of the data object may also be determined based on text contained within the graphical scope of the data object. The embodiment of the application can also generate the codes of the data objects according to the information such as the positions of the data objects in the graph.
The attribute processing unit or the text processing unit can perform semantic analysis on the text corresponding to the data object to obtain object attribute data. The semantic analysis can obtain standardized object attribute data.
The embodiment of the application can adopt the regular expression to carry out semantic analysis on the text corresponding to the data object. For example, when the text of the data object in the warehouse floor scene is semantically parsed, the spatial data included in the object attribute data such as the warehouse name, the warehouse number, the number of floors, the matched building and the like can be identified by using the regular expression. Taking the text "library No. 2 layer 1" as an example, a semantic parsing result can be obtained: warehouse name: no. 2 library, warehouse No.: 2. floor number: 1. taking the "number 19 matched floor 5" as an example, the semantic analysis result can be obtained: name of matched building: 19 number match, warehouse number: 19. floor number: 5.
the embodiment of the application can adopt a classification method to determine the object attribute data corresponding to the data object. The classification method may include: a classification dictionary method, a machine classification method, or the like. The classification dictionary method may include: and a double-array dictionary tree method, wherein the double-array dictionary tree method can utilize the common prefix of the character strings to reduce the expenditure of the query time so as to achieve the aim of improving the efficiency.
Referring to Table 3, an illustration of determining a room type from a room type description is shown in accordance with one embodiment of the present application. The room type description may be the type description text extracted for the room, and the room type may be a standard type obtained by mapping that type description.
TABLE 3 Table 3
The object attribute data of the embodiment of the application can further comprise: building area, sleeve area, yield of rooms, etc. Specifically, the embodiment of the application can calculate the building area, the sleeve area, the room yield and other data of the room according to the room and the object graphic data of the corresponding wall of the room.
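As an illustrative computation that the application does not spell out, the building area of a room can be derived from the coordinate points in its object graphics data with the shoelace formula, and the room yield can then be approximated as a ratio of areas:

```python
def polygon_area(points) -> float:
    """Shoelace formula: area of a simple polygon given its vertex coordinates."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Room yield could then be approximated as inner (usable) area over building area.
room = [(1, 1), (9, 1), (9, 9), (1, 9)]
building_outline = [(0, 0), (10, 0), (10, 10), (0, 10)]
yield_ratio = polygon_area(room) / polygon_area(building_outline)  # 0.64
```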
Technical solution 2
In technical solution 2, the elements may include: a line element; the processing unit may include: a table processing unit and a text processing unit; the above process of processing the CAD file may specifically include: determining a table corresponding to the line element by using the table processing unit; and analyzing the text contained in the table by using the text processing unit to obtain object attribute data corresponding to the data object.
Taking a data object as a door and window as an example, the table processing unit can determine a door and window table corresponding to the line element, and the text processing unit can perform semantic analysis on texts contained in the table to obtain object attribute data such as width, height, model and the like of the door and window. The semantic analysis can obtain standardized object attribute data.
In the embodiment of the application, the analysis unit can be utilized to acquire the elements contained in the CAD file.
In an embodiment of the present application, the scene template may further include: a filtering unit; the method may further comprise, prior to processing the CAD file: determining target graphic data corresponding to a preset range and/or a preset layer from the CAD file by utilizing the filtering unit; the parsing unit is further configured to obtain an element included in the target graphics data.
In summary, the embodiment of the application processes the CAD file by using the scene model, and can obtain the object data corresponding to the data object. The object data may include: object graphics data and object attribute data.
The object graphic data may be graphic data corresponding to a data object such as a shelf. Object graphics data may be used for presentation of data objects. In other words, presentation of data objects may be achieved from object graphics data.
The object attribute data may characterize object attributes of the data object. The object attribute data may include, but is not limited to: name, code, type, model, remark, rotation angle, floor height and height, etc. Object attribute data may be used for analysis of data objects. In other words, from the object attribute data, analysis of the data object may be achieved. The bottom surface height may be the height of the bottom surface of the data object relative to the ground, or the bottom surface height may be the height of the floor where the data object is located.
Referring to Table 4, there is shown an example of object data of a data object of one embodiment of the present application, wherein the data object may be a window, the object data may be structured data, and the fields of the structured data may include: name, code, type, model, remark, object graphic data, rotation angle, bottom surface height, object height, and the like.
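A minimal sketch of such a structured record for a window; the field names follow Table 4, while the concrete values and the class name ObjectData are invented for illustration:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ObjectData:
    """Structured object data for one data object, e.g. a window."""
    name: str
    code: str
    type: str
    model: str
    remark: str
    graphics: List[Tuple[float, float]]   # object graphic data: polygon points
    rotation: float                       # rotation angle in degrees
    bottom_height: float                  # bottom surface height
    height: float                         # object height

window = ObjectData(
    name="Window", code="W-01-001", type="window", model="C1215",
    remark="", graphics=[(0.0, 0.0), (1.2, 0.0), (1.2, 1.5), (0.0, 1.5)],
    rotation=0.0, bottom_height=0.9, height=1.5,
)
```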
A process for processing a CAD file in the event that a door or window is contained in the CAD file is provided herein. Assuming that the analysis scene corresponding to the CAD file is a matched building of the warehouse, the CAD file can be processed by utilizing a scene template corresponding to the matched building of the warehouse.
In one example, the scene templates corresponding to the warehouse-matched building may include: the processing unit may include: graphics processing unit, data object processing unit, table processing unit, elevation view processing unit, attribute processing unit, etc.
Taking a data object as an example of a door and window, the parsing unit is used for obtaining elements such as line elements from the CAD file. The graphic processing unit is used for acquiring the line elements from the parsing unit and converting the line elements into polygons. The data object processing unit is used for acquiring object graphic data of the doors and windows according to the polygons. The table processing unit is used for determining a door and window table corresponding to the line elements, and carrying out semantic analysis on texts contained in the table by utilizing the text processing unit so as to obtain the width, the height, the model and the like of the door and window. The elevation view processing unit can be used for processing the elevation view of the door and window to obtain the default height of the door and window and the default height of the floor. The attribute processing unit is used for acquiring object graphic data of the doors and windows from the data object processing unit and acquiring object attribute data of the doors and windows such as width, height, model and the like from the table processing unit. In the case where the table processing unit does not provide the width, height, etc. data of the door and window, the attribute processing unit may acquire the default height of the door and window from the elevation view processing unit as the door and window height.
In step 104, object data corresponding to the data object may be output for use by other devices or other systems. The object data may be structured data, and a database may be used to store and output the structured data corresponding to the object data.
In practical applications, a plurality of data objects may be contained in a CAD file. According to the embodiment of the application, the object data corresponding to each of the plurality of data objects can be determined according to the steps 102 and 103.
In summary, according to the graphic processing method of the embodiment of the application, the scene template is utilized to process the elements contained in the CAD file so as to obtain the object data corresponding to the data object. Because the scene template can process the elements contained in the CAD file by adopting a computer technology, the embodiment of the application can save the cost of manual measurement in CAD software and avoid measurement errors; in other words, the embodiment of the application can save the labor cost of graphic processing and can improve the accuracy of the graphic processing.
Moreover, the scene template of the embodiment of the application may include: a parsing unit and a processing unit. Different processing results can be realized by different combinations of processing units, or by different combinations of processing units and parsing units. For example, the same processing unit can be matched with different parsing units to obtain different processing results. Therefore, the structure of the scene template provided by the embodiment of the application has the advantages of high flexibility and a wide range of applicable scenes.
Moreover, the scene template of the embodiment of the application may include: a processing unit, a parsing unit and a filtering unit. Different processing results can be realized by different combinations of processing units, different combinations of processing units and parsing units, or different combinations of processing units, parsing units and filtering units. For example, the same processing unit may be matched with different parsing units or filtering units to obtain different processing results. Therefore, the structure of the scene template provided by the embodiment of the application has the advantages of high flexibility and a wide range of applicable scenes.
Furthermore, one parsing unit or one processing unit may be applied to a variety of scene templates; in other words, different scene template options may use the same parsing unit or processing unit. Because the parsing unit and the processing unit are reusable, the embodiment of the application can further save the cost of graphics processing.
In addition, the embodiment of the application can provide the configuration items corresponding to the processing units for the user to configure. For example, the first configuration item is used to configure whether to generate a code corresponding to the data object. As another example, the second configuration item is used to configure whether to merge adjacent graphics, and so on. The configuration result of the configuration item can control the processing logic corresponding to the processing unit, so as to control the processing result corresponding to the processing unit.
Method embodiment II
Referring to fig. 5, a flowchart illustrating steps of a graphics processing method according to an embodiment of the present application may specifically include the following steps:
step 501, determining object data corresponding to a CAD file; the object data may be structured data;
step 502, performing preset processing according to the object data; the preset processing specifically comprises the following steps: generating a ledger or a visual model, or executing warehouse operation;
the process for determining the object data corresponding to the CAD file comprises the following steps: determining a scene template corresponding to the CAD file; processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file; wherein, the scene template includes: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object.
The embodiment of the method shown in fig. 5 may be used to illustrate a specific application of the object data corresponding to the CAD file. The object data can be structured data, so that the embodiment of the application can play a role in digitizing the data objects such as building units, equipment and the like contained in the CAD file. The object data can provide basic data for applications such as asset accounting, operation management, layout planning, data analysis and the like.
The embodiment of the application can perform preset processing according to the object data. The above-mentioned preset process can be determined by those skilled in the art according to the actual application requirements. For example, the preset process may specifically include: a ledger or visualization model is generated or a warehouse job is performed.
The ledger refers to an account book or a spreadsheet file for recording the content of a fixed asset, and is used for recording detailed information such as the purchase time, price, sponsor or department of the fixed asset.
In a specific implementation, the embodiment of the application can generate the corresponding standing book aiming at the data objects such as the window, the lifting door, the vertical hinged door and the like. Fields of the ledger may originate from object data of the data object. In other words, a data record of the ledger may be generated from object data of a data object.
Referring to table 5, an example of a ledger for a window of one embodiment of the present application is shown, where fields of the ledger may include: belonging to space, code, name, type, remark, etc.
TABLE 5
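A hedged sketch of generating ledger rows from object data records; the CSV layout mirrors the fields named above but is otherwise an assumption:

```python
import csv

LEDGER_FIELDS = ["space", "code", "name", "type", "remark"]

def write_ledger(objects, path="window_ledger.csv"):
    """Write one ledger record per data object; field values come from its object data."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=LEDGER_FIELDS)
        writer.writeheader()
        for obj in objects:
            writer.writerow({k: obj.get(k, "") for k in LEDGER_FIELDS})

write_ledger([
    {"space": "No. 2 library, floor 1", "code": "W-01-001",
     "name": "Window", "type": "window", "remark": ""},
])
```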
The visualization model may be a two-dimensional model or a three-dimensional model. The generated two-dimensional model or three-dimensional model can be used for carrying out visual display on the building corresponding to the CAD file, and can be applied to application scenes such as visual renting of houses and presentation of Internet of things equipment.
In one example, the two-dimensional model or the three-dimensional model may be a map model. Map components corresponding to the data objects may be included in the map model. The map component may include: a standard map component. The information of the standard map component may include: name, coding, rendering mode, drawing mode, spatial attribute, etc.
The map component may further include: and customizing the map component. Custom map components add custom properties relative to standard map components. For example, a standard shelf corresponds to a standard map component, while a user-created shelf corresponds to a custom map component.
In a specific implementation, a three-dimensional model may be presented. For example, corresponding map elements may be generated from map components. The map elements can be examples corresponding to the map components, tree-shaped relations can be formed between the map elements, and a user can configure attribute values of attribute information such as names, codes, rendering modes, spatial attributes, drawing modes, custom attributes and the like on the map elements. In this way, map elements can be rendered and displayed according to the rendering mode corresponding to the map component.
The warehouse operation may involve any operation link from the receiving of the warehouse object such as materials, commodities and the like to the delivering of the warehouse object. Accordingly, the warehouse operations may include: receiving, warehousing, supplementing, selecting, collecting, packaging, sorting, checking and the like.
The picking operation can be a process of conveying storage objects such as target commodities in a goods shelf to a picking workstation according to order requirements.
Examples of determination of picking information are provided herein, including in particular:
Step A1, the user places an order with the transaction system, and the transaction system generates the order;
step A2, the transaction system sends the order to the order system;
the order may include: SKU (stock keeping unit ) of the commodity and the required quantity for SKU.
Step A3, the order system establishes a wave order task for the order;
the order system establishes a wave order task for orders of goods, gathers a plurality of orders to perform sorting operation for one job, and the batch of the job is generally called as the wave order task in the industry.
The embodiment of the application can combine and classify the received at least one order according to different dimensions to obtain at least one wave task, wherein the dimensions can comprise at least one of the following: commodity type, commodity name, receiver, warehouse area, warehouse-out type, carrier, time of cut and order priority.
For example, multiple orders over a period of time may be aggregated into one wave order task. As another example, multiple orders with the same order structure may be aggregated into one wave order task; the information of the order structure may include: commodity category, commodity name, or receiver, or warehouse area, or shipment type, or carrier, or time of receipt, or order priority, etc. The summarizing process is equivalent to aggregating orders of a certain batch, so that subsequent inventory occupying actions can be performed according to the dimension of the wave order task.
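As an illustration only, aggregating orders into wave order tasks by one or more of the dimensions listed above could be implemented as a simple grouping; the dimension keys and field names are assumptions:

```python
from collections import defaultdict

def build_wave_tasks(orders, dims=("carrier", "warehouse_area")):
    """Group orders sharing the same values on the chosen dimensions into wave tasks."""
    waves = defaultdict(list)
    for order in orders:
        key = tuple(order.get(d) for d in dims)
        waves[key].append(order)
    return list(waves.values())

orders = [
    {"id": 1, "carrier": "X", "warehouse_area": "A", "skus": {"SKU1": 10}},
    {"id": 2, "carrier": "X", "warehouse_area": "A", "skus": {"SKU2": 3}},
    {"id": 3, "carrier": "Y", "warehouse_area": "B", "skus": {"SKU1": 5}},
]
wave_tasks = build_wave_tasks(orders)  # orders 1 and 2 form one wave order task
```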
Step A4, the order system sends an inventory occupation command according to a preset picking strategy;
after the wave order task is established, the order system can determine from which storage location in the warehouse system each SKU in the order is to be picked, so as to obtain first storage location information; or, according to an inventory cleaning rule, the warehouse system is called to perform occupation for the SKUs in the order, and it is determined at which storage location in the warehouse system each SKU is occupied, so as to obtain second storage location information; this process is called inventory occupation. For example, if 10 units of SKU1 are required in the order, the order system will inform the warehouse system that 10 units of SKU1 are occupied for the order at the storage location corresponding to SKU1.
The picking strategy is a classification strategy defined according to the picking operation mode or the package structure of the warehouse system. For example: the goods are classified according to the number and the number of the goods, such as single goods, multiple goods and multiple goods, and the like, and the goods can be classified according to the wrapping structure attributes of the goods, such as fragile goods, upward placement on the front surface and the like.
Step A5, the warehouse system performs the storage location occupation of the SKU.
The warehouse system can carry out inventory occupation for the corresponding stock keeping units according to the first storage location information or the second storage location information generated by the order system. After the warehouse system occupies the storage location successfully, the first storage location information or the second storage location information is sent to the order system.
Step A6, the order system generates a picking order or a picking task according to the storage location occupation information returned by the warehouse system.
The picking order or picking task may include: the name, quantity and occupied storage location information of the target commodity.
And A7, the order system sends the picking order or the picking task to the warehouse system.
And A8, the warehouse system picks the target commodity according to the picking order or the picking task.
According to the embodiment of the application, warehouse operations such as picking operations are executed according to the object data, and the execution efficiency of warehouse operations such as picking operations can be improved.
The process of executing the picking job according to the embodiment of the present application may include: determining the shelf distance between any two shelves according to the object data corresponding to the two shelves; and determining a target picking path corresponding to the picking task according to the shelf distances between shelves. The target picking path may be a picking path between the task start point and the task end point that meets a predetermined condition; the task end point may be a picking workstation.
For example, if the picking order requires carrying the storage devices (such as bins) corresponding to six commodities A, B, C, D, E and F, and these storage devices are located on different shelves, the embodiment of the application can perform path planning according to the shelf distances between shelves to obtain the carrying sequence of the six commodities, and thus the path sequence corresponding to the target picking path.
In practical applications, the task starting point may be a location point corresponding to a shelf where the commodity is located. For example, the first sorting may be performed in order of increasing distance between the shelf where the commodity is located and the task end point, and the position point corresponding to the shelf in the first X-bit row in the first sorting result may be used as the task start point, where X may be a positive integer such as 1. Of course, the task starting point may be determined in other manners, and it is understood that embodiments of the present application are not limited to specific task starting points.
The predetermined condition is preset and may be used to constrain the path length. In this way, the embodiment of the present application can determine the carrying order among the plurality of commodities contained in the picking task according to the path length constrained by the predetermined condition.
In the embodiment of the present application, the passing points may correspond to the shelf location points other than the task start point. For example, when the shelf location point corresponding to commodity B, C, D, E or F is the task start point, the shelf location point corresponding to commodity A may be a passing point.
The predetermined condition may be used to constrain the path length. For example, the predetermined condition may be that the path length is less than a length threshold; alternatively, where the path lengths of a plurality of picking paths are sorted from small to large, the path length of the target picking path may be ranked in the top Y positions of the second sorting result, where Y may be a positive integer such as 1.
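As an illustration of the two forms of predetermined condition just described, the following sketch selects a target picking path either by a length threshold or by keeping the top Y paths of the sorted result; the function name, parameter names and sample lengths are assumptions for this example.

```python
def select_target_paths(paths, length_threshold=None, top_y=1):
    """Pick the target picking path(s) under the predetermined condition.

    `paths` maps a picking-path id to its path length. Either keep the paths
    whose length is below `length_threshold`, or sort the lengths from small
    to large (the "second sorting result") and keep the first `top_y` paths.
    """
    if length_threshold is not None:
        return [path for path, length in paths.items() if length < length_threshold]
    ranked = sorted(paths, key=paths.get)  # ascending path length
    return ranked[:top_y]

# Illustrative path lengths (values are assumptions).
paths = {"path_1": 42.0, "path_2": 37.5, "path_3": 51.2}
print(select_target_paths(paths))                        # ['path_2']
print(select_target_paths(paths, length_threshold=45.0)) # ['path_1', 'path_2']
```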
The embodiment of the present application does not limit the specific path planning method. For example, a graph search method, a rapidly-exploring random tree (RRT) method, or the like may be used. The graph search method may include the visibility graph method, the Dijkstra method, and the like. The Dijkstra method may perform multiple rounds of searching: in each round, the unvisited target position point closest to the task start point is searched among the position points (including the passing points and the task end point), the target position point is taken as an intermediate position point, and the distances from the task start point to the other position points are updated through it, until all the position points have been taken as intermediate position points.
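A minimal sketch of this Dijkstra-style search over a shelf graph is given below; the graph, node names and edge lengths are illustrative assumptions, not data from the embodiment.

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from `start` to every other node.

    `graph` maps a node to a dict of {neighbour: edge_length}; here the nodes
    stand for the shelf location points plus the task start/end points.
    """
    dist = {node: float("inf") for node in graph}
    dist[start] = 0.0
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)  # this node becomes the next intermediate position point
        for neighbour, length in graph[node].items():
            nd = d + length
            if nd < dist[neighbour]:  # relax: update the distance via the intermediate point
                dist[neighbour] = nd
                heapq.heappush(heap, (nd, neighbour))
    return dist

# Illustrative shelf graph (distances are assumptions).
graph = {
    "start": {"shelf_A": 4.0, "shelf_B": 7.0},
    "shelf_A": {"start": 4.0, "shelf_B": 2.0, "end": 8.0},
    "shelf_B": {"start": 7.0, "shelf_A": 2.0, "end": 3.0},
    "end": {"shelf_A": 8.0, "shelf_B": 3.0},
}
print(dijkstra(graph, "start"))  # {'start': 0.0, 'shelf_A': 4.0, 'shelf_B': 6.0, 'end': 9.0}
```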
In the path planning process, the embodiment of the present application can determine the path length of each picking path according to the shelf distance between any two shelves, so that the target picking path meeting the predetermined condition can be determined from a plurality of picking paths.
For example, where a plurality of candidate picking paths exist, the target picking path meeting the predetermined condition can be determined from these picking paths according to their path lengths.
In practical applications, the shelf distance between any two shelves, such as shelf A and shelf B, may be determined according to the point coordinates included in the object data corresponding to the two shelves. A shelf distance matrix can then be generated from the shelf distances between all pairs of shelves, where each element of the matrix represents the shelf distance between two shelves.
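The following sketch builds such a shelf distance matrix from shelf point coordinates; the use of Euclidean distance and the coordinate values are assumptions made for this example, since the embodiment does not fix a particular metric.

```python
import math

def shelf_distance_matrix(shelf_points):
    """Pairwise shelf distances from the point coordinates in the object data.

    `shelf_points` maps a shelf identifier to an (x, y) coordinate; Euclidean
    distance is used here as an assumption.
    """
    names = list(shelf_points)
    return {
        a: {b: math.dist(shelf_points[a], shelf_points[b]) for b in names}
        for a in names
    }

# Illustrative coordinates extracted from object data (values are assumptions).
points = {"shelf_A": (0.0, 0.0), "shelf_B": (3.0, 4.0), "shelf_C": (6.0, 0.0)}
matrix = shelf_distance_matrix(points)
print(matrix["shelf_A"]["shelf_B"])  # 5.0
```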
In summary, the graphics processing method of the embodiment of the present application can output corresponding object data for a CAD file conforming to the preset specification, so that one-click importing and one-click parsing of the CAD file can be realized.
Moreover, the output object data corresponding to the data object can be used in a two-dimensional scene and/or a three-dimensional scene.
In addition, the scene template of the embodiment of the present application may comprise: a processing unit, an analysis unit and a filtering unit. Different processing results can be achieved by different combinations of processing units; of processing units and analysis units; or of processing units, analysis units and filtering units. For example, the same processing unit may be matched with different analysis units or filtering units to obtain different processing results. Therefore, the structure of the scene template provided by the embodiment of the present application has the advantages of high flexibility and a wide range of applicable scenes.
Furthermore, in the embodiment of the present application, the object data can be expressed as structured data, so that it can be used directly, which facilitates secondary development.
Furthermore, in the embodiment of the present application, a ledger can be generated for data objects such as rooms and devices, and the fields of the ledger may include: codes, names, attributes and the like, which facilitates asset management.
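A minimal sketch of generating such ledger rows from structured object data is shown below; the dictionary keys of the input object data are assumptions made for illustration.

```python
def build_ledger(object_data):
    """Turn structured object data (rooms, devices, ...) into ledger rows.

    Each row carries the fields mentioned above: code, name and attributes.
    The keys of the `object_data` entries are assumptions for this example.
    """
    return [
        {
            "code": obj.get("code", ""),
            "name": obj.get("name", ""),
            "attributes": obj.get("attributes", {}),
        }
        for obj in object_data
    ]

objects = [
    {"code": "RM-101", "name": "server room", "attributes": {"area": 42.0}},
    {"code": "SH-001", "name": "shelf", "attributes": {"levels": 4}},
]
print(build_ledger(objects))
```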
In addition, the embodiment of the present application can combine the processing unit, the analysis unit and the filtering unit to quickly obtain the scene model corresponding to an analysis scene.
Since the structure of the scene template of the embodiment of the present application has the advantages of high flexibility and a wide range of applicable scenes, the embodiment of the present application is applicable to many technical fields that use CAD.
In addition, the embodiment of the present application can implement customization by using a custom processing module.
The embodiment of the present application can also reduce the cost of three-dimensional modeling. Specifically, the elevation processing unit can be used to acquire the object height and the bottom-surface height of data objects such as doors, windows, floors and mezzanines, so that the three-dimensional data of a data object can be determined from its two-dimensional data (such as the point coordinates of the graphic) and its height data (the object height and the bottom-surface height).
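The sketch below illustrates one way such three-dimensional data could be derived by combining a two-dimensional footprint with the elevation information; the bottom-ring/top-ring representation and the sample values are assumptions for illustration only.

```python
def extrude_to_3d(polygon_2d, bottom_height, object_height):
    """Lift a 2D footprint to simple 3D data using the elevation information.

    `polygon_2d` is the list of (x, y) point coordinates of the graphic,
    `bottom_height` the bottom-surface height and `object_height` the object
    height obtained by the elevation processing unit. The returned structure
    (bottom ring + top ring) is one possible representation, chosen here as
    an assumption.
    """
    bottom = [(x, y, bottom_height) for x, y in polygon_2d]
    top = [(x, y, bottom_height + object_height) for x, y in polygon_2d]
    return {"bottom": bottom, "top": top}

# A door opening as a 2D rectangle with a bottom height of 0 and a height of 2.1 m.
door = extrude_to_3d([(0, 0), (0.9, 0), (0.9, 0.2), (0, 0.2)], 0.0, 2.1)
print(door["top"])
```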
The embodiment of the present application does not limit the graphic elements in the CAD file. Specifically, it can process graphic elements such as lines, points and polygons, as well as blocks, nested blocks, stretch blocks, visibility blocks, ellipses, circular arcs and the like.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the application.
System embodiment
Referring to FIG. 6, a block diagram of a graphics processing system according to an embodiment of the present application is shown. The system comprises: a graphics processing apparatus 601 and data objects 602 in a repository.
The graphics processing apparatus 601 is configured to perform the foregoing method: for a CAD file containing a data object, determine the object data corresponding to the data object 602, output the object data, and perform preset processing according to the object data.
In actual use, the data objects 602 in the repository may contain: rooms and equipment, etc. Examples of devices may include: shelves, etc.
In a particular implementation, graphics processing device 601 may receive CAD files; determining a scene template corresponding to the CAD file; processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file; wherein, the scene template includes: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object.
In a specific implementation, the elements include: a graphic element; the processing unit includes: a graphic processing unit and a data object processing unit;
processing the CAD file includes: determining the graphic corresponding to the graphic element by using the graphic processing unit; and processing the graphic corresponding to the graphic element by using the data object processing unit to obtain object graphic data corresponding to the data object.
In a specific implementation, the elements include: a line element; the processing unit includes: a table processing unit and a text processing unit;
processing the CAD file includes: determining the table corresponding to the line element by using the table processing unit; and analyzing the text contained in the table by using the text processing unit to obtain object attribute data corresponding to the data object.
In a specific implementation, the scene template may further include: a filtering unit; the filtering unit is used for determining, from the CAD file, target graphic data corresponding to a preset range and/or a preset layer, and the analysis unit is further used for acquiring the elements contained in the target graphic data.
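As a rough illustration of such a filtering unit, the sketch below keeps only the entities that lie on preset layers and/or inside a preset range; the entity structure, field names and values are assumptions for this example.

```python
def filter_entities(entities, layers=None, bounds=None):
    """Select the target graphic data before parsing.

    `entities` is assumed to be an iterable of dicts carrying a `layer` name
    and a `point` (x, y); `layers` is the preset layer set and `bounds` the
    preset range as (xmin, ymin, xmax, ymax). Both filters are optional.
    """
    def inside(point):
        x, y = point
        xmin, ymin, xmax, ymax = bounds
        return xmin <= x <= xmax and ymin <= y <= ymax

    selected = []
    for entity in entities:
        if layers is not None and entity["layer"] not in layers:
            continue
        if bounds is not None and not inside(entity["point"]):
            continue
        selected.append(entity)
    return selected

entities = [
    {"layer": "WALL", "point": (1.0, 1.0)},
    {"layer": "FURNITURE", "point": (5.0, 5.0)},
]
print(filter_entities(entities, layers={"WALL"}, bounds=(0, 0, 10, 10)))
```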
In a specific implementation, the process of determining the scene template corresponding to the CAD file may specifically include: presenting at least one scene template option; receiving a target scene template option selected by a user; and determining the scene template corresponding to the CAD file according to the scene template corresponding to the target scene template option.
In a specific implementation, the scene template acquisition process specifically includes: creating a scene template option; determining the analysis unit and the processing unit corresponding to the scene template option; and saving the mapping relationship between the scene template option and the corresponding analysis unit and processing unit.
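The following sketch shows one way such a mapping could be registered and later applied to a CAD file; all function names, the registry structure and the placeholder units are assumptions for illustration and not part of the embodiment.

```python
# A minimal sketch of registering scene template options and applying them.
SCENE_TEMPLATES = {}

def register_template(option_name, analysis_unit, processing_units, filtering_unit=None):
    """Save the mapping between a scene template option and its units."""
    SCENE_TEMPLATES[option_name] = {
        "filter": filtering_unit,
        "analyze": analysis_unit,
        "process": processing_units,
    }

def run_template(option_name, cad_file):
    """Apply the selected scene template to a CAD file."""
    template = SCENE_TEMPLATES[option_name]
    data = cad_file
    if template["filter"]:
        data = template["filter"](data)       # keep only the target graphic data
    elements = template["analyze"](data)      # elements contained in the CAD file
    object_data = {}
    for unit in template["process"]:          # e.g. graphic / table / text units
        object_data.update(unit(elements))
    return object_data

# Register a hypothetical "warehouse" template built from simple placeholder units.
register_template(
    "warehouse",
    analysis_unit=lambda cad: cad.get("entities", []),
    processing_units=[lambda elements: {"shelves": [e for e in elements if e.get("type") == "shelf"]}],
)
print(run_template("warehouse", {"entities": [{"type": "shelf", "id": 1}]}))
```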
In a specific implementation, the object data may be structured data; the graphics processing apparatus 601 may further perform preset processing according to the object data; the preset processing includes: generating a ledger or a visualization model, or performing a warehouse job.
In summary, the graphics processing system of the embodiment of the present application can output corresponding object data for a CAD file conforming to the preset specification, so that one-click importing and one-click parsing of the CAD file can be realized.
Moreover, the output object data corresponding to the data object can be used in a two-dimensional scene and/or a three-dimensional scene.
In addition, the scene template of the embodiment of the present application may comprise: a processing unit, an analysis unit and a filtering unit. Different processing results can be achieved by different combinations of processing units; of processing units and analysis units; or of processing units, analysis units and filtering units. For example, the same processing unit may be matched with different analysis units or filtering units to obtain different processing results. Therefore, the structure of the scene template provided by the embodiment of the present application has the advantages of high flexibility and a wide range of applicable scenes.
Furthermore, in the embodiment of the present application, the object data can be expressed as structured data, so that it can be used directly, which facilitates secondary development.
Furthermore, in the embodiment of the present application, a ledger can be generated for data objects such as rooms and devices, and the fields of the ledger may include: codes, names, attributes and the like, which facilitates asset management.
In addition, the embodiment of the present application can combine the processing unit, the analysis unit and the filtering unit to quickly obtain the scene model corresponding to an analysis scene.
Since the structure of the scene template of the embodiment of the present application has the advantages of high flexibility and a wide range of applicable scenes, the embodiment of the present application is applicable to many technical fields that use CAD.
In addition, the embodiment of the present application can implement customization by using a custom processing module.
The embodiment of the present application can also reduce the cost of three-dimensional modeling. Specifically, the elevation processing unit can be used to acquire the object height and the bottom-surface height of data objects such as doors, windows, floors and mezzanines, so that the three-dimensional data of a data object can be determined from its two-dimensional data (such as the point coordinates of the graphic) and its height data (the object height and the bottom-surface height).
The embodiment of the present application does not limit the graphic elements in the CAD file. Specifically, it can process graphic elements such as lines, points and polygons, as well as blocks, nested blocks, stretch blocks, visibility blocks, ellipses, circular arcs and the like.
The embodiment of the present application also provides a non-volatile readable storage medium in which one or more modules (programs) are stored; when the one or more modules are applied to a device, the device can be caused to execute the instructions of the method steps in the embodiments of the present application.
Embodiments of the present application provide one or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause an electronic device to perform a method as described in one or more of the above embodiments. In the embodiments of the present application, the electronic device includes a server, a terminal device and the like.
Embodiments of the present disclosure may be implemented as an apparatus using any suitable hardware, firmware, software, or any combination thereof to perform a desired configuration; the apparatus may include a server (or server cluster), a terminal, or the like. Fig. 7 schematically illustrates an example apparatus 1700 that may be used to implement various embodiments described in the present disclosure.
For one embodiment, FIG. 7 illustrates an example apparatus 1700 having one or more processors 1702, a control module (chipset) 1704 coupled to at least one of the processor(s) 1702, a memory 1706 coupled to the control module 1704, a non-volatile memory (NVM)/storage device 1708 coupled to the control module 1704, one or more input/output devices 1710 coupled to the control module 1704, and a network interface 1712 coupled to the control module 1704.
The processor 1702 may include one or more single-core or multi-core processors, and the processor 1702 may include any combination of general-purpose or special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). In some embodiments, the apparatus 1700 can be used as a server, a terminal, or the like in the embodiments of the present application.
In some embodiments, the apparatus 1700 may include one or more computer-readable media (e.g., memory 1706 or NVM/storage 1708) having instructions 1714 and one or more processors 1702 combined with the one or more computer-readable media configured to execute the instructions 1714 to implement the modules to perform the actions described in this disclosure.
For one embodiment, the control module 1704 may include any suitable interface controller to provide any suitable interface to at least one of the processor(s) 1702 and/or any suitable device or component in communication with the control module 1704.
The control module 1704 may include a memory controller module to provide an interface to the memory 1706. The memory controller modules may be hardware modules, software modules, and/or firmware modules.
Memory 1706 may be used, for example, to load and store data and/or instructions 1714 for the apparatus 1700. For one embodiment, memory 1706 may include any suitable volatile memory, such as a suitable DRAM. In some embodiments, memory 1706 may comprise double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, the control module 1704 may include one or more input/output controllers to provide interfaces to the NVM/storage 1708 and the input/output device(s) 1710.
For example, NVM/storage 1708 may be used to store data and/or instructions 1714. NVM/storage 1708 may include any suitable nonvolatile memory (e.g., flash memory) and/or may include any suitable nonvolatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 1708 may include a storage resource as part of the device on which apparatus 1700 is installed or may be accessible by the device without necessarily being part of the device. For example, NVM/storage 1708 may be accessed over a network via input/output device(s) 1710.
The input/output device(s) 1710 may provide an interface for the apparatus 1700 to communicate with any other suitable device; the input/output device(s) 1710 may include a communication component, an audio component, a sensor component, and the like. The network interface 1712 may provide the apparatus 1700 with an interface to communicate over one or more networks; the apparatus 1700 may communicate wirelessly with one or more components of a wireless network according to one or more wireless network standards and/or protocols, for example by accessing a wireless network based on a communication standard such as WiFi, 2G, 3G, 4G or 5G, or a combination thereof.
For one embodiment, at least one of the processor(s) 1702 may be packaged together with logic of one or more controllers (e.g., memory controller modules) of the control module 1704. For one embodiment, at least one of the processor(s) 1702 may be packaged together with logic of one or more controllers of the control module 1704 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 1702 may be integrated on the same die as logic of one or more controllers of the control module 1704. For one embodiment, at least one of the processor(s) 1702 may be integrated on the same die as logic of one or more controllers of the control module 1704 to form a system on a chip (SoC).
In various embodiments, the apparatus 1700 may be, but is not limited to being: a server, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.), among other terminal devices. In various embodiments, the device 1700 may have more or fewer components and/or different architectures. For example, in some embodiments, the apparatus 1700 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and a speaker.
The apparatus 1700 may employ a main control chip as the processor or the control module; data such as sensor data and location information may be stored in the memory or the NVM/storage device; a sensor group may serve as an input/output device; and the communication interface may include the network interface.
The embodiment of the application also provides electronic equipment, which comprises: a processor; and a memory having executable code stored thereon that, when executed, causes the processor to perform a method as described in one or more of the embodiments of the application.
Embodiments of the application also provide one or more machine-readable media having stored thereon executable code that, when executed, causes a processor to perform a method as described in one or more of the embodiments of the application.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described by differences from other embodiments, and identical and similar parts between the embodiments are all enough to be referred to each other.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable graphics processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable graphics processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable graphics processing terminal apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable graphics processing terminal apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the application.
Finally, it should also be noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between these entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or terminal device. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article or terminal device comprising the element.
The foregoing describes in detail the graphics processing method, system, device and medium provided by the present application. Specific examples have been used herein to illustrate the principles and embodiments of the present application, and the description of the above embodiments is only intended to help understand the method and its core idea. Meanwhile, those of ordinary skill in the art may make changes to the specific embodiments and the application scope in accordance with the ideas of the present application. In summary, the contents of this specification should not be construed as limiting the present application.
Claims (10)
1. A method of graphics processing, the method comprising:
receiving a CAD file;
determining a scene template corresponding to the CAD file;
processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file; wherein, the scene template includes: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object;
and outputting object data corresponding to the data object.
2. The method of claim 1, wherein the element comprises: a graphic element; the processing unit includes: a graphic processing unit and a data object processing unit;
the processing of the CAD file comprises the following steps:
determining a graphic corresponding to the graphic element by using the graphic processing unit;
and processing the graphic corresponding to the graphic element by using the data object processing unit to obtain object graphic data corresponding to the data object.
3. The method of claim 1, wherein the element comprises: a line element; the processing unit includes: a table processing unit and a text processing unit;
the processing of the CAD file comprises the following steps:
determining a table corresponding to the line element by using a table processing unit;
and analyzing the text contained in the table by using a text processing unit to obtain object attribute data corresponding to the data object.
4. The method of claim 1, wherein the scene template further comprises: a filtering unit; the filtering unit is used for determining, from the CAD file, target graphic data corresponding to a preset range and/or a preset layer, and the analysis unit is further used for acquiring the elements contained in the target graphic data.
5. The method according to any one of claims 1 to 4, wherein the determining a scene template corresponding to the CAD file includes:
presenting at least one scene template option;
receiving a target scene template option selected by a user;
and determining the scene template corresponding to the CAD file according to the scene template corresponding to the target scene template option.
6. The method according to any one of claims 1 to 4, wherein the scene template acquisition process comprises:
creating scene template options;
determining an analysis unit and a processing unit corresponding to the scene template options;
and storing the mapping relation among the analysis unit, the processing unit and the scene template options.
7. A method of graphics processing, the method comprising:
determining object data corresponding to the CAD file; the object data is structured data;
performing preset processing according to the object data; the preset processing comprises the following steps: generating a ledger or a visual model, or executing warehouse operation;
the process for determining the object data corresponding to the CAD file comprises the following steps: determining a scene template corresponding to the CAD file; processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file; wherein, the scene template includes: an analysis unit and a processing unit; the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file; the processing unit is used for processing the elements contained in the CAD file to obtain object data corresponding to the data object.
8. A graphics processing system, the system comprising: a graphics processing device and data objects in a repository;
the graphics processing apparatus is configured to perform the method according to any one of claims 1 to 7, determine object data corresponding to the data object for a CAD file containing the data object, output the object data corresponding to the data object, and perform a preset process according to the object data.
9. An electronic device, comprising: a processor; and
a memory having executable code stored thereon that, when executed, causes the processor to perform the method of any of claims 1-7.
10. One or more machine readable media having executable code stored thereon that, when executed, causes a processor to perform the method of any of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311050958.7A CN117197212A (en) | 2023-08-18 | 2023-08-18 | Graphics processing method, system, device and medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117197212A true CN117197212A (en) | 2023-12-08 |
Family
ID=88995218
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311050958.7A (publication CN117197212A, pending) | Graphics processing method, system, device and medium | 2023-08-18 | 2023-08-18
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117197212A (en) |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |