CN116339336A - Electric agricultural machinery cluster collaborative operation method, device and system - Google Patents

Electric agricultural machinery cluster collaborative operation method, device and system

Info

Publication number
CN116339336A
Authority
CN
China
Prior art keywords
agricultural
agricultural machine
panoramic image
cluster
machines
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310322177.2A
Other languages
Chinese (zh)
Inventor
刘宁
胡可君
赵文江
李连鹏
刘福朝
李羚
袁超杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Information Science and Technology University
Original Assignee
Beijing Information Science and Technology University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Information Science and Technology University filed Critical Beijing Information Science and Technology University
Priority to CN202310322177.2A
Publication of CN116339336A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0219 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/047 Optimisation of routes or paths, e.g. travelling salesman problem
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Multimedia (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Game Theory and Decision Science (AREA)
  • Development Economics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Primary Health Care (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a method, a device and a system for collaborative operation of an electric agricultural machine cluster. The method comprises the following steps: acquiring a panoramic image of an operation plot, and preprocessing the panoramic image; planning the operation route of each agricultural machine on the operation plot based on the preprocessed panoramic image to obtain the corresponding shortest path of each agricultural machine; and acquiring the positions of the agricultural machines in real time and, based on the acquired positions and the corresponding shortest paths, controlling each agricultural machine in the cluster to operate along its corresponding shortest path. The application thereby solves the technical problem of low collaborative operation efficiency of agricultural machine clusters in the prior art.

Description

Electric agricultural machinery cluster collaborative operation method, device and system
Technical Field
The application relates to the field of agricultural machine control, in particular to a method, a device and a system for cooperative operation of an electric agricultural machine cluster.
Background
Since the advent of the Agriculture 4.0 era, the allocation of agricultural machinery operation tasks has become increasingly refined. Under the Agriculture 4.0 framework, the cooperative work of clustered agricultural machines can improve operation efficiency and safety more comprehensively and flexibly, reduce operation cost and time, and, with the continuing development of technologies such as sensors and information systems, facilitate centralized arrangement and management.
At present, there are various methods of agricultural machinery cooperation; representative ones are as follows. (1) Point-to-point cooperation: the machines involved are usually similar or identical, and the whole task is split into several similar or identical parallel tasks, each machine completing a part of the operation, such as a group of weeding robots. This style of cooperation has obvious drawbacks: since the machine types and tasks are similar, only one link in the agricultural workflow (weeding, tilling, and so on) can be completed, and the plot information of the work area cannot be obtained before operation, so manual setup or remote operation is required. (2) Master-slave cooperation: a main agricultural machine works with accompanying machines, such as a combined harvester-thresher, and one operation task is split into different parts solved by the master and the slaves respectively. Master-slave cooperative machines are mostly large combined machines suited to large-area plots, their purchase and operating costs are high, and they require manual operation or plot information entered in advance. (3) Team-type cooperation: the cooperating machines are closely linked and operate simultaneously, analogous to a group of small robots carrying a large object together. This style of cooperation also depends to a great extent on a preliminary remote-sensing survey to obtain plot information, and the required cooperation precision is high.
When identical or similar clustered agricultural machines use a single cooperation method, the problems of low efficiency, excessive operator involvement and high labor intensity are difficult to solve.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the application provides a method, a device and a system for collaborative operation of an electric agricultural machine cluster, which are used for at least solving the technical problem of low collaborative operation efficiency of the agricultural machine cluster in the prior art.
According to an aspect of the embodiments of the present application, there is provided a method for collaborative operation of an electric farm machine cluster, including: acquiring a panoramic image of an operation land block, and preprocessing the panoramic image; planning the operation route of each agricultural machine on the operation land based on the preprocessed panoramic image to obtain the corresponding shortest path of each agricultural machine; and acquiring the positions of the agricultural machines in real time, and controlling the agricultural machines in the agricultural machine cluster to respectively operate according to the corresponding shortest paths based on the acquired positions and the corresponding shortest paths.
According to another aspect of the embodiments of the present application, there is also provided an electric agricultural machinery cluster collaborative operation apparatus, including: the image processing module is configured to acquire a panoramic image of the operation land block and preprocess the panoramic image; the route planning module is configured to plan the operation route of each agricultural machine on the operation land block in the agricultural machine cluster based on the preprocessed panoramic image to obtain a corresponding shortest path of each agricultural machine; and the agricultural machine control module is configured to acquire the positions of the agricultural machines in real time, and control the agricultural machines in the agricultural machine cluster to respectively operate according to the corresponding shortest paths based on the acquired positions and the corresponding shortest paths.
According to still another aspect of the embodiments of the present application, there is further provided an electric agricultural machinery cluster collaborative operation system, comprising: an unmanned aerial vehicle configured to scan an operation plot by photogrammetry and obtain a panoramic image of the operation plot; a Beidou navigation system configured to provide the positions of all the agricultural machines in the agricultural machine cluster in real time; and an electric agricultural machinery cluster collaborative operation device, which includes: an image processing module configured to acquire the panoramic image of the operation plot and preprocess the panoramic image; a route planning module configured to plan the operation route of each agricultural machine in the cluster on the operation plot based on the preprocessed panoramic image, obtaining a corresponding shortest path for each agricultural machine; and an agricultural machine control module configured to acquire the positions of the agricultural machines in real time and, based on the acquired positions and the corresponding shortest paths, control each agricultural machine in the cluster to operate along its corresponding shortest path.
In the embodiment of the application, the operation route of each agricultural machine on the operation plot is planned based on the preprocessed panoramic image to obtain the corresponding shortest path of each agricultural machine; the position of each agricultural machine is acquired in real time; and, based on the acquired positions and the corresponding shortest paths, each agricultural machine in the cluster is controlled to operate along its corresponding shortest path. This greatly improves operation efficiency and solves the technical problem of low collaborative operation efficiency of agricultural machine clusters in the prior art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a flow chart of a method of electric agricultural machine cluster collaborative operation according to an embodiment of the present application;
FIG. 2 is a flow chart of another method of electric agricultural machine cluster collaborative operation according to an embodiment of the present application;
FIG. 3 is a schematic diagram of centralized formation based on the Leader-Follower method according to an embodiment of the present application;
FIG. 4 is a flow chart of yet another method of electric agricultural machine cluster collaborative operation according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an electric agricultural machine cluster collaborative operation system according to an embodiment of the present application.
Detailed Description
To make the solution of the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described below in detail with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by one of ordinary skill in the art based on the embodiments herein, without inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
According to an embodiment of the present application, there is provided a method for collaborative operation of an electric farm machine cluster, as shown in fig. 1, including:
step S102, obtaining a panoramic image of the operation land block, and preprocessing the panoramic image.
For example, noise in the panoramic image is removed by filtering, and distortion correction is performed on the denoised panoramic image; the lens coordinates of the distortion-corrected panoramic image are then spatially converted into a two-dimensional image coordinate system. Key information is extracted from the distortion-corrected panoramic images, and the association between the different corrected images is calculated; based on this association, pose estimation is performed, and based on the estimated poses, a two-dimensional grid map is constructed.
For example, two adjacent frames are sequentially extracted from the distortion-corrected panoramic images; the relative pose between the two frames is calculated using the 2D projection, in one frame, of the 3D positions of feature points from the other frame; and the pose estimation is performed based on these relative poses.
Then, based on the estimated pose, depth recovery is performed on the projection points of landmarks in the panoramic image using a camera model, obtaining the three-dimensional points of the landmarks in space; the three-dimensional points of all landmarks, expressed in the same coordinate system, form a sparse point cloud map. The sparse point cloud map is converted into an octree map, and the octree map is converted into the two-dimensional grid map.
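The projection-to-grid step can be illustrated with a minimal sketch: each 3D landmark point is dropped onto the ground plane and its (x, y) cell is marked occupied. This is a simplified stand-in for the octree-based conversion described above; the cell size, the grid extent, and the idea of discarding points above the vehicle's height are illustrative assumptions, not parameters from the patent.

```python
def cloud_to_grid(points, cell=0.5, max_h=2.0, size=10):
    """Project 3D landmark points (x, y, z) onto a 2D occupancy grid.

    Points higher than max_h (e.g. canopy overhead) are discarded; every
    remaining point marks its (x, y) cell as occupied (1). Free cells stay 0.
    """
    grid = [[0] * size for _ in range(size)]
    for x, y, z in points:
        if z > max_h:
            continue  # overhead structure is irrelevant to ground travel
        i, j = int(x / cell), int(y / cell)
        if 0 <= i < size and 0 <= j < size:
            grid[j][i] = 1
    return grid
```

The resulting binary grid is exactly the input format a grid-based shortest-path planner expects.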
Step S104, planning the operation route of each agricultural machine on the operation land block in the agricultural machine cluster based on the preprocessed panoramic image to obtain the corresponding shortest path of each agricultural machine.
And step S106, acquiring the positions of the agricultural machines in real time, and controlling the agricultural machines in the agricultural machine cluster to respectively operate according to the corresponding shortest paths based on the acquired positions and the corresponding shortest paths.
Firstly, dividing each agricultural machine into a pilot agricultural machine and a following agricultural machine by adopting a centralized formation method, wherein the precision of navigation equipment arranged on the pilot agricultural machine is greater than that of navigation equipment arranged on the following agricultural machine.
Secondly, based on the acquired positions and the corresponding shortest paths, utilizing an inertial sensor array collaborative navigation system to navigate each agricultural machine so as to control the speed, the inter-agricultural machine spacing and the positions of each agricultural machine; controlling the agricultural machines to cooperatively and parallelly perform different operations while navigating the agricultural machines, wherein the operations comprise at least one of the following: ditching, sowing, covering soil and watering.
This embodiment provides a multistage collaborative operation method for a cluster of small electric agricultural machines. First, an unmanned rotorcraft carrying a 3D sensor (such as a LiDAR or a depth camera) scans the operation plot by photogrammetry, and the panoramic image of the plot is preprocessed; after processing, the plot information is obtained without manual measurement. A route planning module then plans the operation routes of the agricultural machine cluster. Finally, the Beidou navigation system obtains the positions of the machines, and the shortest path is transmitted to the control module of each machine to control the cluster's operation, greatly improving operation efficiency.
Example 2
According to an embodiment of the present application, there is provided another electric agricultural machine cluster collaborative operation method, as shown in fig. 2, including:
step S202, acquiring an heaven and earth panoramic image through the unmanned aerial vehicle.
Panoramic images of the work area are acquired by an unmanned gyroplane that carries a 3D sensor (such as a LiDAR or depth camera).
Step S204, preprocessing the panoramic image through an image processing module.
And preprocessing the panoramic image of the land parcels through an image processing module. Because the obtained panoramic image has noise and distortion to a certain extent, the panoramic image needs to be input into an image processing module for denoising, correction, segmentation and other processing.
The image processing module includes a smoothing unit, a correction unit, and a segmentation unit. The smoothing unit removes noise in the panoramic image by filtering, and the correction unit corrects the distortion of the panoramic image of the work area. The lens coordinates are then spatially converted into a two-dimensional image coordinate system.
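The smoothing step can be sketched as a simple spatial filter. Below is a minimal illustration on a grayscale image stored as a nested list; the 3x3 mean kernel is an illustrative choice (a real pipeline would typically use a library Gaussian blur plus the camera's calibrated distortion model for the correction unit).

```python
def mean_filter_3x3(img):
    """Denoise a grayscale image (list of lists of numbers) with a 3x3 mean filter.

    Border pixels are copied unchanged; each interior pixel is replaced by
    the average of its 3x3 neighborhood, which suppresses isolated noise.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1)
                            for dx in (-1, 0, 1)) / 9.0
    return out
```

An isolated bright pixel (a noise spike) is spread over its neighborhood and strongly attenuated, which is the behavior the smoothing unit relies on.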
After the depth camera carried by the unmanned aerial vehicle acquires the panoramic image of the operation land, key information such as edges, angles and the like in the image is extracted, and the association between different images is calculated, so that the subsequent pose estimation and map construction are facilitated.
The pose estimation method adopts a PnP algorithm, which calculates the relative pose between two frames from the 2D projection, in one frame, of the 3D positions of feature points observed in the other frame. For a monocular camera, the 3D positions of the feature points can be obtained by triangulation; for binocular and RGB-D cameras, they are obtained directly from the depth information.
The pose equation is:
$$s_1 \begin{pmatrix} u_1 \\ v_1 \\ 1 \end{pmatrix} = \begin{bmatrix} R & t \end{bmatrix} \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}$$

where $(X\ Y\ Z\ 1)^T$ is the homogeneous coordinate of the spatial point P, $(u_1\ v_1\ 1)^T$ is the homogeneous coordinate of its projection in the image, $s_1$ is a scale factor, and the 3 x 4 matrix on the right is the augmented matrix of the camera pose R, t to be solved. Three-dimensional point cloud maps are mainly divided into sparse point cloud maps and dense point cloud maps.
In the embodiment of the application, a sparse point cloud map is used. After the camera pose is obtained, depth recovery is performed on the projection points of the landmarks with the camera model according to the current camera pose, yielding the three-dimensional points of the landmarks in space; the three-dimensional points of all landmarks, in the same coordinate system, form the sparse point cloud map. The camera model is as follows:
$$Z P_{uv} = K T P_W$$

where $P_{uv}$ is the projection of the landmark in the image, $K$ is the camera intrinsic matrix, $T$ is the camera pose, $Z$ is the depth, and $P_W$ is the three-dimensional landmark point. The three-dimensional sparse point cloud map is converted into a two-dimensional grid map by means of an octree model: the sparse point cloud is first converted into an octree map (an octree can describe a three-dimensional space), and the octree map is then converted into a two-dimensional grid map. Through coordinate transformation and data elimination, the three-dimensional map is reduced to a navigable two-dimensional grid map.
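The camera model Z P_uv = K T P_W can be checked numerically. The sketch below projects a world point through a pose (R, t) and intrinsics K; the intrinsic values and the identity pose are illustrative assumptions, not calibration data from the patent.

```python
def project(K, R, t, Pw):
    """Project a 3D world point Pw through pose (R, t) and intrinsics K.

    Returns pixel coordinates (u, v) and depth Z, i.e. it evaluates
    Z * (u, v, 1)^T = K * (R * Pw + t).
    """
    # camera-frame point Pc = R * Pw + t
    Pc = [sum(R[i][j] * Pw[j] for j in range(3)) + t[i] for i in range(3)]
    # homogeneous image point = K * Pc
    uvw = [sum(K[i][j] * Pc[j] for j in range(3)) for i in range(3)]
    Z = uvw[2]
    return uvw[0] / Z, uvw[1] / Z, Z

K = [[500, 0, 320], [0, 500, 240], [0, 0, 1]]   # assumed intrinsics
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]           # identity rotation
t = [0, 0, 0]                                   # zero translation
```

Depth recovery is the inverse direction: given (u, v) and an estimated Z, the landmark's camera-frame point is recovered by inverting K and T.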
Step S206, planning the operation route of the farm machine group through a route planning module based on the preprocessed panoramic image information, and generating the shortest route.
First, a start point and an end point are determined. In a given two-dimensional map, explicit starting and ending points are required. The start point is the node from where to start finding the shortest path, and the end point is the target node of the shortest path to find.
Next, the distances and predecessor nodes are initialized. Before the shortest-path search starts, the distance of each node is initialized to infinity and the predecessor of each node to null. The distance of the start point is set to 0, since the distance from the start point to itself is 0.
A queue is then created, and a queue is used to store nodes that have not yet been processed. The starting point is added to the queue.
Finally, find the shortest path. The first node is taken from the queue, and all nodes reachable from it are examined. For each reachable node, the distance from the start point through the current node is calculated; if it is shorter than the currently stored distance, the node's distance and predecessor are updated and the node is added to the queue. This step is repeated until the queue is empty or the end point is found. Once the end point is found, the shortest path can be traced back from it: starting at the end point, follow the predecessor links back to the start point. The resulting path is the shortest path from the start point to the end point.
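The steps above describe a grid-based Dijkstra search. A minimal sketch on a binary occupancy grid follows; the unit cost per move and the 4-connected neighborhood are illustrative choices, not requirements stated in the patent.

```python
import heapq

def shortest_path(grid, start, goal):
    """Dijkstra's algorithm on a 2D occupancy grid (0 = free, 1 = obstacle).

    Distances start at infinity (absent from `dist`), the start node at 0;
    nodes are relaxed from a priority queue; predecessor links are stored
    and walked back from the goal to recover the path, as described above.
    """
    h, w = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, (y, x) = heapq.heappop(pq)
        if (y, x) == goal:
            break
        if d > dist.get((y, x), float("inf")):
            continue  # stale queue entry
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and grid[ny][nx] == 0:
                nd = d + 1
                if nd < dist.get((ny, nx), float("inf")):
                    dist[(ny, nx)] = nd
                    prev[(ny, nx)] = (y, x)
                    heapq.heappush(pq, (nd, (ny, nx)))
    if goal not in dist:
        return None  # end point unreachable
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

On the grid produced by the image processing module, `start` and `goal` are the machine's entry cell and the far end of the plot.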
The process goes to step S210.
Step S208, acquiring the positions of all the agricultural machinery through the Beidou navigation system.
The Beidou navigation system comprises an antenna, a transmitting unit, a processing unit and a positioning engine. The transmitting unit amplifies and converts satellite signals received by the antenna, the processing unit tracks and captures the converted signals, and the signals are sent to the positioning engine to obtain the position of the agricultural machinery.
Each Beidou-navigated agricultural machine in the application mainly uses a differential positioning system. A differential reference station using RTK technology broadcasts differential signals through a radio station; the automatic driving control system, receiving both the differential data and the satellite navigation signals, computes heading, position, speed and other information to achieve high-precision automatic navigation.
Step S210, an agricultural machine control module is utilized to control the operation of the agricultural machine.
And inputting the shortest path and the farm machinery group position into a control module to control the farm machinery group operation. The control module comprises a receiving unit and a servo motor, wherein the receiving unit receives the position and the optimal path of the agricultural machinery, and controls the speed, the inter-agricultural machinery spacing and the position of the agricultural machinery.
The cooperation mode of the ground agricultural machines is a centralized formation method based on the Leader-Follower approach. Compared with distributed formation, pilot-based formation allows navigation equipment of different precision to be fitted on the pilot machine and the following machines; initial alignment, time correction and the like can be completed through GPS or Beidou, greatly improving the cooperative control capability of the machine cluster.
As shown in figure 3, when the small electric agricultural machines work cooperatively, a single or multiple single-file columns are adopted. The machine at the head of each column is responsible for furrowing, so that seeds can be planted at a proper depth. The second machine group is responsible for scattering a set number of crop seeds into the furrows dug by the first machine at a fixed interval. The third machine group is responsible for the soil-covering operation, covering the seeds sown by the second group with a measured amount of soil, then compacting and watering. The machine cluster keeps the inter-machine spacing fixed during operation and proceeds along the shortest path planned by the path planning module.
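The Leader-Follower formation can be sketched as each follower steering toward a fixed offset behind the pilot machine, which is what keeps the inter-machine spacing constant. The following is a minimal kinematic illustration; the proportional gain and the offset values are illustrative assumptions, not controller parameters from the patent.

```python
def follower_step(leader_pos, offset, follower_pos, gain=0.5):
    """One control step of a follower holding a fixed offset from the leader.

    The follower's desired slot is the leader position plus a fixed offset;
    each step the follower moves a fraction `gain` of the way toward it,
    so the formation converges and the spacing stays fixed as the leader
    advances along the planned path.
    """
    target = (leader_pos[0] + offset[0], leader_pos[1] + offset[1])
    return (follower_pos[0] + gain * (target[0] - follower_pos[0]),
            follower_pos[1] + gain * (target[1] - follower_pos[1]))
```

Run repeatedly, the follower converges to its slot: starting far away, after a few dozen steps it sits at the leader's position minus the column spacing.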
During operation, the agricultural machines use a collaborative navigation technique based on multiple micro-electromechanical system inertial measurement units (MEMS-IMUs), which improves the observability of inertial navigation element errors and thus the positioning accuracy. The inertial sensor array collaborative navigation system is typical of this application: a MEMS-IMU array can directly estimate several uncertain factors of the inertial sensors, thereby improving the precision, reliability and robustness of the system.
The agricultural machinery control module performs collaborative navigation according to the inertial sensor array set on the agricultural machinery, the positioning information of the Beidou satellite and the shortest path planned by the path planning module.
The application uses a low-cost inertial sensor array collaborative navigation system consisting of 4 MEMS-IMUs, which can achieve high-precision seamless indoor and outdoor navigation and positioning without GNSS. In agricultural machinery collaborative navigation, complex situations such as sudden shaking, sideslip and uneven road surfaces are inevitable, so the working environment contains many highly random self-motion uncertainties in unstructured and dynamic scenes, and filtering is required to suppress the noise and improve system accuracy. To fuse the information of all subsystems in the 4-MEMS-IMU array collaborative navigation system efficiently and accurately, a robust federated filtering algorithm based on DoO (degree of observability) and DoA (degree of abnormality) is used under a Kalman filtering framework. For each subsystem of the collaborative navigation system, the state and measurement equations in discrete-time form are:
$$x_k^i = \Phi_{k,k-1}^i x_{k-1}^i + \Gamma_{k-1}^i w_{k-1}^i, \qquad z_k^i = H_k^i x_k^i + v_k^i$$

where $\Phi_{k,k-1}^i$, $\Gamma_{k-1}^i$ and $H_k^i$ are the discrete state transition matrix, input matrix and measurement matrix of the i-th sub-filter at time k; $x_k^i$ is the error state vector of the i-th sub-filter, with covariance matrix $P_k^i$; $w_{k-1}^i$ is the process noise of the i-th sub-filter, with covariance matrix $Q_{k-1}^i$; $z_k^i$ is the measurement vector of the i-th sub-filter; and $v_k^i$ is the measurement noise of the i-th sub-filter, with covariance matrix $R_k^i$.
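Each sub-filter runs the standard Kalman recursion on this state-space model. A minimal scalar sketch of one predict/update cycle follows; the scalar values of Phi, H, Q and R are chosen only for illustration and do not come from the patent.

```python
def kalman_step(x, P, z, phi=1.0, H=1.0, Q=0.01, R=0.25):
    """One predict/update cycle of a scalar Kalman filter.

    Predict:  x = phi*x,             P = phi*P*phi + Q
    Update:   K = P*H/(H*P*H + R),   x += K*(z - H*x),   P = (1 - K*H)*P
    The innovation z - H*x is the same quantity the text later uses to
    monitor the filter's working state.
    """
    # time update (prediction)
    x = phi * x
    P = phi * P * phi + Q
    # measurement update
    S = H * P * H + R          # innovation covariance
    K = P * H / S              # Kalman gain
    x = x + K * (z - H * x)
    P = (1 - K * H) * P
    return x, P
```

Fed a constant measurement, the state estimate converges toward it while the covariance shrinks, which is the behavior each sub-filter contributes to the federated fusion.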
The robust federated filter algorithm proceeds as follows (the sub-filter-specific formulas for steps 3 to 8, including the DoA-level cases of step 4, were presented as images in the original and are not reproduced here):

1) Initialize the i-th sub-filter:

X̂_0^(i) = E[X_0],  P_0^(i) = E[(X_0 − X̂_0^(i))(X_0 − X̂_0^(i))^T]

2) Time update of the i-th sub-filter:

X̂_{k,k-1}^(i) = Φ_{k,k-1}^(i) X̂_{k-1}^(i)
P_{k,k-1}^(i) = Φ_{k,k-1}^(i) P_{k-1}^(i) (Φ_{k,k-1}^(i))^T + Γ_{k-1}^(i) Q_{k-1}^(i) (Γ_{k-1}^(i))^T

3) Calculate the DoA of the i-th sub-filter.

4) Carry out adaptive filtering in the i-th sub-filter according to the DoA; the measurement update is adjusted differently when the DoA is at different levels.

5) Obtain the observability matrix of the i-th sub-filter.

6) Calculate the DoO of the error state in the i-th subsystem.

7) Solve the output vector of each subsystem.

8) Fuse the sub-filter information based on the DoO analysis results.
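As a hedged illustration of the master fusion step of such a federated filter (step 8 above), the sketch below fuses sub-filter estimates by inverse-covariance (information) weighting, i.e. P_g = (Σ P_i^{-1})^{-1} and X_g = P_g Σ P_i^{-1} X_i. Scalar states stand in for the full error-state vectors, and all numbers are invented:

```python
# Minimal sketch: information-weighted fusion of sub-filter estimates,
# as in a federated Kalman filter's master fusion step. Scalar states
# for brevity; the 4-IMU array would use full vectors and matrices.

def federated_fusion(estimates, variances):
    """Fuse sub-filter estimates x_i with covariances P_i:
    P_g = (sum P_i^-1)^-1,  x_g = P_g * sum(P_i^-1 * x_i)."""
    info = [1.0 / p for p in variances]          # information = inverse covariance
    p_global = 1.0 / sum(info)                   # fused covariance
    x_global = p_global * sum(w * x for w, x in zip(info, estimates))
    return x_global, p_global

# Four sub-filters (e.g. one per MEMS-IMU) reporting position estimates:
x_g, p_g = federated_fusion([1.0, 1.2, 0.9, 1.1], [0.04, 0.09, 0.04, 0.09])
```

Note that the fused covariance is always tighter than any single sub-filter's, which is the motivation for fusing the array rather than trusting one IMU.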
In this algorithm, the coordinate transformation formula was presented as an image in the original and is not reproduced here.
Considering the characteristics of agricultural machinery navigation, where complex conditions such as uneven road surfaces are inevitably encountered, an abnormality-degree calculation method is provided to quantitatively analyze the current working state of the system and thereby improve the positioning and orientation accuracy of the agricultural machinery. Under the Kalman filtering framework, the true covariance matrix is estimated from the innovation sequence (the exact expression was presented as an image in the original), where δZ_k is the innovation calculated by comparing the measured value with the estimated value output by the current system. Acting as a correction matrix, it improves the accuracy of the resulting covariance matrix and greatly suppresses the error accumulation caused by random noise. According to the Kalman filtering algorithm, the predicted innovation covariance can be obtained recursively as:

C_k = H_k P_{k,k-1} H_k^T + R_k

where H_k is the measurement matrix at time k, P_{k,k-1} is the state covariance matrix after the time update at time k, and R_k is the measurement noise covariance matrix. Adding the correction matrix greatly suppresses the error accumulation caused by random noise, so the working state of the system can be better estimated in real time.
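A minimal sketch of the innovation-based check described above, assuming scalar quantities: the predicted innovation covariance H_k P_{k,k-1} H_k^T + R_k is compared against the observed innovation to score abnormality. The function names and the interpretation of "low" versus "high" scores are illustrative, not taken from the patent:

```python
# Hedged sketch of an innovation-based abnormality check. A well-modelled
# filter produces innovations whose normalized square is near 1; a sudden
# jolt or sideslip inflates it, flagging an abnormal working state.

def innovation_covariance(H, P_pred, R):
    """Predicted innovation covariance C_k = H * P_{k,k-1} * H^T + R (scalar case)."""
    return H * P_pred * H + R

def degree_of_abnormality(innovation, C_k):
    """Normalized squared innovation: near or below 1 when the model fits."""
    return innovation ** 2 / C_k

C = innovation_covariance(H=1.0, P_pred=0.05, R=0.02)  # C_k = 0.07
normal = degree_of_abnormality(0.1, C)   # small innovation -> low score
bumpy = degree_of_abnormality(1.0, C)    # sudden jolt -> large score
```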
The embodiment of the application provides a multistage collaborative operation method for a small electric agricultural machinery cluster. First, an unmanned rotorcraft carrying a 3D sensor (such as LiDAR or a depth camera) scans the work parcel by photogrammetry. The panoramic image of the parcel is then preprocessed, yielding the parcel information without manual measurement. A route planning module plans the working routes of the agricultural machinery cluster, and finally a Beidou module obtains the position of each agricultural machine, transmits the shortest path to each machine's control module, and controls the cluster operation.
The application solves the problem that a single agricultural machine cannot perform ditching, sowing, soil covering and watering simultaneously during sowing: cooperation among several machines of different work types completes these tasks in parallel and greatly improves working efficiency. It also provides a Beidou-navigation-based collaborative operation control method for small agricultural machinery clusters, addressing the inaccurate route planning, complex preparation work and low working efficiency of small agricultural machines in the prior art.
Example 3
The embodiment of the application also provides a cooperative operation method of the electric agricultural machinery cluster, as shown in fig. 4, the method comprises the following steps:
step S402, acquiring operation land parcel information.
If the information of the work parcel is obtained and recorded before the small electric agricultural machines begin operation, the parcel does not need to be scanned in advance by an unmanned gyroplane carrying a 3D camera, and the system can be simplified.
The system comprises a ground small-sized motor farm machine group, a path planning module, an agricultural machine control module and a Beidou navigation system.
In this embodiment, the parcel information is acquired and recorded in advance, so the unmanned gyroplane carrying a 3D camera is not required to scan the parcel beforehand, which simplifies the system. The parcel information is first processed, and the path planning module plans the working routes of the agricultural machinery cluster based on the pre-entered parcel information, generating the shortest paths. The path optimization method used by the system is a real-time optimization method for the cooperative operation of a heterogeneous agricultural machinery cluster, aimed at the problem of multiple machines cooperating with different working widths and working speeds. According to the characteristics of heterogeneous cluster operation, the work is divided into a side-by-side stage and a remaining-farmland stage after the side-by-side work is finished. During the side-by-side stage, the machines are arranged in order of working speed from fastest to slowest as the optimization target; after the side-by-side stage, the remaining farmland is worked with the lowest operation cost as the optimization target.
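The side-by-side ordering rule described above (fastest working speed first) can be sketched in a few lines; the machine names and speeds below are invented for illustration:

```python
# Hedged sketch: arrange a heterogeneous cluster for the side-by-side stage,
# fastest working speed first, as described in the text. Values are invented.

machines = [
    {"name": "seeder", "speed_m_s": 1.2},
    {"name": "ditcher", "speed_m_s": 1.8},
    {"name": "waterer", "speed_m_s": 0.9},
]

# Sort descending by working speed to get the side-by-side order.
ordered = sorted(machines, key=lambda m: m["speed_m_s"], reverse=True)
names = [m["name"] for m in ordered]
```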
In the embodiment of the application, each Beidou-navigated agricultural machine mainly uses a differential positioning system. A differential reference station using RTK technology broadcasts differential signals through a radio station; the automatic driving control system receives the differential data and satellite navigation signals and computes heading, position, speed and other information to realize high-precision automatic navigation.
And inputting the shortest path and the farm machinery group position into the farm machinery control module through the Beidou navigation system to control the farm machinery group operation. The Beidou navigation system comprises an antenna, a transmitting unit, a processing unit and a positioning engine. The transmitting unit amplifies and converts satellite signals received by the antenna, the processing unit tracks and captures the converted signals, and the signals are sent to the positioning engine to obtain the position of the agricultural machinery. The control module comprises a receiving unit and a servo motor, wherein the receiving unit receives the position and the optimal path of the agricultural machinery, and controls the speed, the inter-agricultural machinery spacing and the position of the agricultural machinery.
The cooperation mode of the ground agricultural machines is a centralized formation method based on the leader-follower approach. Compared with distributed formation, leader-follower formation allows navigation equipment of different precision to be mounted on the pilot and following machines; initial alignment, time correction and similar steps can be completed via GPS or Beidou, greatly improving the cooperative control capability of the cluster.
As shown in figure 3, when the small electric agricultural machines work cooperatively, one or more in-line queues are adopted. The machine at the head of each queue is responsible for ditching so that seeds can be planted at a suitable depth. The second machine group scatters a set quantity of crop seeds at fixed intervals into the furrows dug by the first machine. The third machine group performs the soil-covering operation, covering the sown seeds with a measured amount of soil, then compacting and watering. The cluster keeps the inter-machine spacing fixed during operation and proceeds along the shortest paths planned by the path planning module.
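A hedged sketch of keeping the inter-machine spacing fixed in such a queue: a simple proportional controller on the gap error, which is one common way to realize fixed spacing. The gain and desired gap are invented; the patent does not specify the control law:

```python
# Hedged sketch: a follower adjusts its speed proportionally to the gap
# error so the inter-machine distance converges to a fixed value.
# Gain and desired gap are illustrative, not from the patent.

def follower_speed(leader_pos, follower_pos, leader_speed,
                   desired_gap=5.0, gain=0.5):
    """Speed command for a follower tracking a fixed gap behind its leader."""
    gap = leader_pos - follower_pos
    return leader_speed + gain * (gap - desired_gap)

# Follower 7 m behind (gap > desired 5 m): commanded faster than leader's 1.5 m/s.
v = follower_speed(leader_pos=20.0, follower_pos=13.0, leader_speed=1.5)
```

When the gap equals the desired gap, the follower simply matches the leader's speed, so the formation is an equilibrium of the controller.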
In the embodiment of the application, collaborative navigation technology is adopted for operation among the agricultural machines; it can very effectively compensate the errors of micro-electro-mechanical-system inertial measurement units (MEMS-IMUs). The embodiments of the present application use a typical inertial-sensor-array collaborative navigation system. A significant advantage of MEMS-IMU array collaborative navigation is the ability to directly estimate many of the uncertainty factors of the inertial sensors, thereby improving the accuracy, reliability and robustness of the system.
The embodiment of the application uses a low-cost inertial-sensor-array collaborative navigation system consisting of 4 MEMS-IMUs, which can realize high-precision seamless indoor and outdoor navigation and positioning without GNSS. In agricultural machinery collaborative navigation, complex situations such as sudden shaking, sideslip and uneven road surfaces are inevitably encountered. The working environment therefore contains many highly random self-motion uncertainties in unstructured and dynamic scenes, and filtering is required to suppress noise and improve system accuracy. To fuse the information of all subsystems in the 4-MEMS-IMU array collaborative navigation system efficiently and accurately, a robust federated filtering algorithm based on DoO and DoA is used under the Kalman filtering framework.
In the embodiment of the application, the small cluster machines use Internet-of-Vehicles technology: each machine transmits its own motion state to the other machines in the queue while receiving the motion state information sent by the other members. Compared with obtaining another vehicle's information through sensor measurement and processing, directly transmitting the queue's motion state information over the vehicular network greatly shortens the required time. The communication topology of the queue defines the information transmission network among the queue members; commonly used topologies include predecessor following (PF), bidirectional predecessor following (BPF), leader-predecessor following (LPF) and two-predecessor following (TPF).
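The PF and LPF topologies mentioned above can be written down as adjacency lists mapping each follower to the vehicles whose state it receives. This is a generic sketch with vehicle 0 as the pilot, not a structure taken from the patent:

```python
# Hedged sketch: queue communication topologies as adjacency lists.
# Keys are follower indices; values list the vehicles they listen to.
# Vehicle 0 is the pilot (lead) vehicle.

def predecessor_following(n):
    """PF: each vehicle listens only to the one directly ahead of it."""
    return {i: [i - 1] for i in range(1, n)}

def leader_predecessor_following(n):
    """LPF: each vehicle listens to both the pilot and the one ahead."""
    return {i: sorted({0, i - 1}) for i in range(1, n)}

pf = predecessor_following(4)
lpf = leader_predecessor_following(4)
```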
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, but of course also by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method described in the embodiments of the present application.
The unmanned gyroplane with a depth camera obtains the work parcel information: after the drone captures a panoramic photograph of the parcel, the image processing module processes it to obtain the parcel information, so no manual measurement is needed before operation.
In addition, coordinated operation of small electric agricultural machines of different work types together with the drone improves the operating efficiency of small and medium-sized fields; the machines run different work tasks in parallel, reducing the operators' burden and shortening the working time.
Finally, with the Beidou-navigation-based small electric agricultural machinery cluster, machines of different work types can perform automatic navigated operation in a fixed queue, fully exploiting the characteristics of each machine type and improving the efficiency of cooperative operation.
Example 4
According to an embodiment of the present application, there is further provided an electric agricultural machinery cluster collaborative operation system. As shown in fig. 5, the system includes: an unmanned aerial vehicle 52, an electric agricultural machinery cluster collaborative operation device 54, a Beidou navigation system 56, and a plurality of agricultural machines 58.
The drone 52 is configured to scan the work area by photogrammetry, resulting in a panoramic image of the work area.
Beidou navigation system 56 is configured to provide the location of each agricultural machine in the agricultural machine cluster in real time.
The electric agricultural cluster collaborative work device 54 includes an image processing module 542, a route planning module 544, and an agricultural control module 546.
The image processing module 542 is configured to acquire a panoramic image of the work area and to pre-process the panoramic image.
The route planning module 544 is configured to plan a working route of each agricultural machine in the agricultural machine cluster on the working parcel based on the preprocessed panoramic image, so as to obtain a corresponding shortest path of each agricultural machine.
The agricultural machinery control module 546 is configured to obtain the positions of the respective agricultural machinery in real time, and control the respective agricultural machinery in the agricultural machinery cluster to perform the operations according to the respective shortest paths based on the obtained positions and the respective shortest paths.
Alternatively, specific examples in this embodiment may refer to examples described in embodiments 1 to 3 above, and this embodiment will not be described here again.
Example 5
Embodiments of the present application also provide a storage medium. Alternatively, in the present embodiment, the above-described storage medium may be configured to store program codes for executing the methods in embodiments 1 to 3 above.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: a U-disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing embodiment numbers of the present application are merely for describing, and do not represent advantages or disadvantages of the embodiments.
The integrated units in the above embodiments may be stored in the above-described computer-readable storage medium if implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause one or more computer devices (which may be personal computers, servers or network devices, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present application.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, such as the division of the units, is merely a logical function division, and may be implemented in another manner, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application and are intended to be comprehended within the scope of the present application.

Claims (10)

1. The electric agricultural machinery cluster collaborative operation method is characterized by comprising the following steps of:
acquiring a panoramic image of an operation land block, and preprocessing the panoramic image;
planning the operation route of each agricultural machine on the operation land based on the preprocessed panoramic image to obtain the corresponding shortest path of each agricultural machine;
and acquiring the positions of the agricultural machines in real time, and controlling the agricultural machines in the agricultural machine cluster to respectively operate according to the corresponding shortest paths based on the acquired positions and the corresponding shortest paths.
2. The method of claim 1, wherein preprocessing the panoramic image comprises:
removing noise in the panoramic image through filtering, and carrying out distortion correction on the panoramic image after removing the noise;
and performing space conversion on lens coordinates of the panoramic image after distortion correction, and converting the panoramic image into a two-dimensional image coordinate system.
3. The method of claim 2, wherein after converting the panoramic image into a two-dimensional image coordinate system, the method further comprises:
extracting key information in the distortion corrected panoramic image, and calculating the association between different distortion corrected panoramic images;
based on the correlation, pose estimation is performed, and based on the estimated pose, a two-dimensional grid map construction is constructed.
4. A method according to claim 3, wherein performing pose estimation comprises:
sequentially extracting two adjacent frames of images from the panoramic images after different distortion correction;
calculating the relative pose between two adjacent frame images by utilizing the 2D projection position of the 3D position of the characteristic point in one frame image in the other frame image;
the pose estimation is performed based on the relative pose.
5. A method according to claim 3, wherein constructing a two-dimensional grid map construction based on the estimated pose comprises:
based on the estimated pose, carrying out depth recovery on projection points of road signs in the panoramic image by adopting a camera model to obtain three-dimensional points of the road signs in space, wherein the three-dimensional points of all the road signs form a sparse point cloud map under the same coordinate system;
and converting the sparse point cloud map into an octree map, and converting the octree map into the two-dimensional grid map.
6. The method of claim 1, wherein controlling the individual agricultural machines in the cluster of agricultural machines to operate according to the respective shortest paths based on the acquired locations and the respective shortest paths comprises:
based on the acquired positions and the corresponding shortest paths, utilizing an inertial sensor array collaborative navigation system to navigate each agricultural machine so as to control the speed, the inter-agricultural machine spacing and the positions of each agricultural machine;
controlling the agricultural machines to cooperatively and parallelly perform different operations while navigating the agricultural machines, wherein the operations comprise at least one of the following: ditching, sowing, covering soil and watering.
7. The method of claim 6, wherein navigating the respective agricultural machine with an inertial sensor array co-navigation system comprises:
calculating an innovation δZ_k by comparing measured values and estimated values output by the inertial sensor array collaborative navigation system, and determining a true covariance matrix with a correction matrix ζ;
recursively calculating an innovation covariance based on the real covariance matrix;
calculating the abnormality degree of each agricultural machine based on the innovation covariance so as to quantitatively analyze the current working state of the inertial sensor array collaborative navigation system;
and navigating each agricultural machine based on the current working state.
8. An electric agricultural machinery cluster collaborative operation device, characterized by comprising:
the image processing module is configured to acquire a panoramic image of the operation land block and preprocess the panoramic image;
the route planning module is configured to plan the operation route of each agricultural machine on the operation land block in the agricultural machine cluster based on the preprocessed panoramic image to obtain a corresponding shortest path of each agricultural machine;
and the agricultural machine control module is configured to acquire the positions of the agricultural machines in real time, and control the agricultural machines in the agricultural machine cluster to respectively operate according to the corresponding shortest paths based on the acquired positions and the corresponding shortest paths.
9. An electric agricultural machinery cluster collaborative operation system, characterized by comprising:
the unmanned aerial vehicle is configured to scan the operation land block through a photogrammetry method to obtain a panoramic image of the operation land block;
the Beidou navigation system is configured to provide the positions of all the agricultural machines in the agricultural machine cluster in real time;
electric agricultural machinery cluster collaborative operation device includes:
the image processing module is configured to acquire a panoramic image of the operation land block and preprocess the panoramic image;
the route planning module is configured to plan the operation route of each agricultural machine on the operation land block in the agricultural machine cluster based on the preprocessed panoramic image to obtain a corresponding shortest path of each agricultural machine;
and the agricultural machine control module is configured to acquire the positions of the agricultural machines in real time, and control the agricultural machines in the agricultural machine cluster to respectively operate according to the corresponding shortest paths based on the acquired positions and the corresponding shortest paths.
10. A computer-readable storage medium, on which a program is stored, characterized in that the program, when run, causes a computer to perform the method of any one of claims 1 to 7.
CN202310322177.2A 2023-03-29 2023-03-29 Electric agricultural machinery cluster collaborative operation method, device and system Pending CN116339336A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310322177.2A CN116339336A (en) 2023-03-29 2023-03-29 Electric agricultural machinery cluster collaborative operation method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310322177.2A CN116339336A (en) 2023-03-29 2023-03-29 Electric agricultural machinery cluster collaborative operation method, device and system

Publications (1)

Publication Number Publication Date
CN116339336A true CN116339336A (en) 2023-06-27

Family

ID=86892698

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310322177.2A Pending CN116339336A (en) 2023-03-29 2023-03-29 Electric agricultural machinery cluster collaborative operation method, device and system

Country Status (1)

Country Link
CN (1) CN116339336A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117217693A (en) * 2023-09-13 2023-12-12 上海联适导航技术股份有限公司 Multi-machine collaborative operation method, device, equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103389506A (en) * 2013-07-24 2013-11-13 哈尔滨工程大学 Adaptive filtering method for strapdown inertia/Beidou satellite integrated navigation system
CN106052686A (en) * 2016-07-10 2016-10-26 北京工业大学 Full-autonomous strapdown inertial navigation system based on DSPTMS 320F28335
CN106123921A (en) * 2016-07-10 2016-11-16 北京工业大学 Latitude the unknown Alignment Method of SINS under the conditions of dynamic disturbance
CN111462135A (en) * 2020-03-31 2020-07-28 华东理工大学 Semantic mapping method based on visual S L AM and two-dimensional semantic segmentation
CN112381841A (en) * 2020-11-27 2021-02-19 广东电网有限责任公司肇庆供电局 Semantic SLAM method based on GMS feature matching in dynamic scene
CN112419409A (en) * 2020-11-18 2021-02-26 合肥湛达智能科技有限公司 Pose estimation method based on real-time video
CN112650255A (en) * 2020-12-29 2021-04-13 杭州电子科技大学 Robot indoor and outdoor positioning navigation system method based on vision and laser radar information fusion
CN113744337A (en) * 2021-09-07 2021-12-03 江苏科技大学 Synchronous positioning and mapping method integrating vision, IMU and sonar
CN115328114A (en) * 2022-07-05 2022-11-11 扬州大学 Beidou navigation agricultural machinery operation control method and system
CN115639823A (en) * 2022-10-27 2023-01-24 山东大学 Terrain sensing and movement control method and system for robot under rugged and undulating terrain


Similar Documents

Publication Publication Date Title
EP2503510B1 (en) Wide baseline feature matching using collaborative navigation and digital terrain elevation data constraints
CN110178048B (en) Method and system for generating and updating vehicle environment map
Rovira-Más et al. Stereo vision three-dimensional terrain maps for precision agriculture
US8712144B2 (en) System and method for detecting crop rows in an agricultural field
US8855405B2 (en) System and method for detecting and analyzing features in an agricultural field for vehicle guidance
Rovira-Más et al. Creation of three-dimensional crop maps based on aerial stereoimages
KR20180079428A (en) Apparatus and method for automatic localization
US20040264762A1 (en) System and method for detecting and analyzing features in an agricultural field
CN113899375B (en) Vehicle positioning method and device, storage medium and electronic equipment
Kunz et al. Map building fusing acoustic and visual information using autonomous underwater vehicles
CN116908810B (en) Method and system for measuring earthwork of building by carrying laser radar on unmanned aerial vehicle
CN116339336A (en) Electric agricultural machinery cluster collaborative operation method, device and system
CN116892944B (en) Agricultural machinery navigation line generation method and device, and navigation method and device
Li et al. UAV-based SLAM and 3D reconstruction system
CN111025364B (en) Machine vision positioning system and method based on satellite assistance
Rovira-Más Global 3D terrain maps for agricultural applications
Contreras et al. Efficient decentralized collaborative mapping for outdoor environments
CN116380039A (en) Mobile robot navigation system based on solid-state laser radar and point cloud map
CN113538579B (en) Mobile robot positioning method based on unmanned aerial vehicle map and ground binocular information
Rana et al. A pose estimation algorithm for agricultural mobile robots using an rgb-d camera
Madjidi et al. Vision-based positioning and terrain mapping by global alignment for UAVs
CN113744398B (en) Map reconstruction fusion method based on laser and microwave cooperation
CN113450411B (en) Real-time self-pose calculation method based on variance component estimation theory
Park et al. Localization of an unmanned ground vehicle using 3D registration of laser range data and DSM
CN118031951A (en) Multisource fusion positioning method, multisource fusion positioning device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination