CN112270751A - Agriculture and forestry operation scene display method and device, terminal and storage medium - Google Patents

Agriculture and forestry operation scene display method and device, terminal and storage medium

Info

Publication number
CN112270751A
CN112270751A
Authority
CN
China
Prior art keywords
target
simulation model
forestry
agriculture
terrain simulation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011176833.5A
Other languages
Chinese (zh)
Inventor
尤勇敏
Other inventors have requested that their names not be disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiuling Shanghai Intelligent Technology Co ltd
Original Assignee
Jiuling Shanghai Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiuling Shanghai Intelligent Technology Co ltd filed Critical Jiuling Shanghai Intelligent Technology Co ltd
Priority to CN202011176833.5A
Publication of CN112270751A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/047 Optimisation of routes or paths, e.g. travelling salesman problem
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Databases & Information Systems (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Quality & Reliability (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Development Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a method, a device, a terminal and a storage medium for displaying agriculture and forestry operation scenes. The method comprises the following steps: acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene, the terrain simulation model comprising space geometric information and agriculture and forestry digital asset information of the target agriculture and forestry operation scene; determining a corresponding space state of the target operation equipment in the terrain simulation model; selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model within the visual field range of the observation point; acquiring a user operation triggered in the displayed local area; and updating and displaying the object operated by the user operation according to the user operation. By adopting the method, the reach rate of operation information can be improved.

Description

Agriculture and forestry operation scene display method and device, terminal and storage medium
Technical Field
The application relates to the technical field of agriculture and forestry, in particular to a method, a device, a terminal and a storage medium for displaying an agriculture and forestry operation scene.
Background
With the development of agricultural mechanization and digitalization, agricultural and forestry robots have gradually emerged. These robots mostly operate by remote control or autonomous travel, and are increasingly applied to scenarios such as disaster relief, irrigation and fertilization.
However, during field operation of an agricultural and forestry robot, the user cannot intuitively learn its operation condition, so the reach rate of operation information is low while the robot executes an agriculture and forestry operation task.
Disclosure of Invention
In view of the above, it is necessary to provide a method, an apparatus, a terminal and a storage medium for displaying an agriculture and forestry operation scene, which can improve the reach rate of operation information.
An agriculture and forestry operation scene display method comprises the following steps:
acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene; the terrain simulation model comprises space geometric information and agriculture and forestry digital asset information of the target agriculture and forestry operation scene;
determining a corresponding space state of the target operation equipment in the terrain simulation model;
selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point;
acquiring user operation triggered in the displayed local area;
and updating and displaying the object operated by the user operation according to the user operation.
In one embodiment, the spatial state includes a spatial position and orientation; selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point, wherein the method comprises the following steps:
taking the spatial position of the target operation equipment as an observation point, and taking the orientation of the target operation equipment as the observation direction of the observation point;
determining the visual field range of the observation point in the observation direction;
and displaying a local area of the terrain simulation model in the visual field range.
In one embodiment, the user operation is a drag operation; performing update display on the object operated by the user operation according to the user operation, including:
and in the action process of the dragging operation, updating the observation direction of the observation point in real time so as to display at least a local area of the terrain simulation model, which is positioned in the real-time visual field range of the observation point.
In one embodiment, the method further comprises:
displaying the target operation equipment in the local area;
acquiring a trigger operation acting on the target operation equipment;
and displaying the equipment parameters of the target operation equipment in a first preset area according to the triggering operation.
In one embodiment, before determining the corresponding spatial state of the target work device in the terrain simulation model, the method further comprises:
displaying at least two candidate working devices;
acquiring a target operating device selected from the candidate operating devices through a selection operation;
displaying the terrain simulation model according to an overlooking angle;
acquiring operation parameters selected from the terrain simulation model through selection operation;
and generating an agriculture and forestry operation task according to the operation parameters and the target operation equipment, sending the agriculture and forestry operation task to a server, so as to instruct the server to plan an operation path according to the operation parameters, and informing the target operation equipment to execute the agriculture and forestry operation task according to the operation path.
In one embodiment, the method further comprises:
displaying the target operation equipment in the local area;
displaying a two-dimensional thumbnail corresponding to the terrain simulation model in a second preset area; the operation path and the current operation position of the target operation equipment are displayed in the two-dimensional thumbnail;
and when the space state of the target operation equipment changes, adjusting the observation point in real time to update the local area and the two-dimensional thumbnail.
In one embodiment, the target work equipment includes an agricultural and forestry robot and an unmanned aerial vehicle; when the agriculture and forestry robot is selected as the currently displayed target operation equipment through user operation, the determining of the corresponding space state of the target operation equipment in the terrain simulation model comprises the following steps:
determining a corresponding space state of the agricultural and forestry robot in the terrain simulation model;
selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point, wherein the method comprises the following steps:
selecting an observation point according to the space state of the agriculture and forestry robot, and displaying a local area of the terrain simulation model in the visual field range of the observation point and the agriculture and forestry robot;
the method further comprises the following steps:
acquiring a target operation equipment switching instruction;
determining a corresponding space state of the unmanned aerial vehicle in the terrain simulation model;
and adjusting the observation point according to the space state of the unmanned aerial vehicle, and switching to a local area showing that the terrain simulation model is located in the adjusted view range of the observation point and the unmanned aerial vehicle.
An agricultural and forestry operations scene display device, the device includes:
the acquisition module is used for acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene; the terrain simulation model comprises space geometric information and agriculture and forestry digital asset information of the target agriculture and forestry operation scene;
the determining module is used for determining the corresponding space state of the target operation equipment in the terrain simulation model;
the display module is used for selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point;
the acquisition module is further used for acquiring user operation triggered in the displayed local area;
and the updating module is used for updating and displaying the object operated by the user operation according to the user operation.
A terminal comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the following steps:
acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene; the terrain simulation model comprises space geometric information and agriculture and forestry digital asset information of the target agriculture and forestry operation scene;
determining a corresponding space state of the target operation equipment in the terrain simulation model;
selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point;
acquiring user operation triggered in the displayed local area;
and updating and displaying the object operated by the user operation according to the user operation.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene; the terrain simulation model comprises space geometric information and agriculture and forestry digital asset information of the target agriculture and forestry operation scene;
determining a corresponding space state of the target operation equipment in the terrain simulation model;
selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point;
acquiring user operation triggered in the displayed local area;
and updating and displaying the object operated by the user operation according to the user operation.
According to the agriculture and forestry operation scene display method, device, terminal and storage medium, after the terrain simulation model corresponding to the target agriculture and forestry operation scene is obtained, at least a local area of the terrain simulation model located within the visual field range of the observation point can be displayed according to the corresponding space state of the target operation equipment in the terrain simulation model, so that the operation condition of the target operation equipment is visually displayed, the user can learn the operation information in real time, and the reach rate of the operation information is improved; moreover, user operation is supported during display, and after the user operation triggered by the user in the displayed local area is obtained, the object operated by the user operation is updated and displayed, which further improves the reach rate of the operation information.
Drawings
FIG. 1 is a diagram of an application environment of the agriculture and forestry operation scene display method in one embodiment;
FIG. 2 is a schematic flow chart illustrating a method for displaying agricultural and forestry operation scenes in one embodiment;
FIG. 3 is an interface diagram illustrating an agriculture and forestry operation scenario according to an embodiment;
FIG. 4 is a schematic view of an interface showing an agriculture and forestry operation scene in another embodiment;
FIG. 5 is a schematic view of an interface showing an agriculture and forestry operation scene in another embodiment;
FIG. 6 is a schematic view of an interface showing an agriculture and forestry operation scene in another embodiment;
FIG. 7 is a block diagram of an agricultural/forestry operation scene display apparatus according to an embodiment;
FIG. 8 is an internal structural view of a terminal in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The agriculture and forestry operation scene display method can be applied to the application environment shown in FIG. 1. The terminal 102 communicates with the server 104 through a network, the server 104 communicates with the target operation device 106 through the network, and the target operation device 106 includes an agricultural and forestry robot and an unmanned aerial vehicle. The terminal 102 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers and portable wearable devices, and the server 104 may be implemented by an independent server or a server cluster formed by a plurality of servers. The terminal 102 is configured to execute the agriculture and forestry operation scene display method. Specifically, the terminal 102 obtains a terrain simulation model corresponding to a target agriculture and forestry operation scene; determines a corresponding space state of the target operation equipment in the terrain simulation model; selects an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model within the visual field range of the observation point; acquires a user operation triggered in the displayed local area; and updates and displays the object operated by the user operation according to the user operation. The terrain simulation model is provided by the server 104 and includes space geometric information of the target agriculture and forestry operation scene, agriculture and forestry digital asset information and operation equipment information.
In further embodiments, target work device 106 may also communicate with terminal 102 over a network.
In an embodiment, as shown in FIG. 2, an agriculture and forestry operation scene display method is provided. The method is described below by taking its application to the terminal in FIG. 1 as an example, and includes the following steps:
step 202, acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene; the terrain simulation model comprises space geometric information of a target agriculture and forestry operation scene and agriculture and forestry digital asset information.
The terrain simulation model is a virtual model simulated by a computer technology according to an actual agriculture and forestry scene. The terrain simulation model is specifically constructed according to the digital terrain model and the agriculture and forestry digital asset information.
It can be understood that the digital terrain model expresses the terrain morphology in terms of the coordinates X, Y, Z of dense terrain model points. The agriculture and forestry digital asset information is the digitized result of agriculture and forestry assets. Agriculture and forestry assets include agricultural and/or forestry resources such as terrain, vegetation types, environment and soil. Relying on the digital terrain model and combining the agriculture and forestry digital asset information, the computer device can perform simulation so as to construct a relatively accurate agriculture and forestry digital twin space, namely the terrain simulation model, which is used for simulating the real agriculture and forestry scene.
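As a concrete illustration of this data organization (not part of the original disclosure), the following is a minimal Python sketch; the class names, fields and the scene lookup are hypothetical:

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class TerrainPoint:
        # One densely sampled terrain model point (terrain morphology as X, Y, Z).
        x: float
        y: float
        z: float

    @dataclass
    class TerrainSimulationModel:
        # Digital twin of one agriculture and forestry scene: geometry plus digital assets.
        scene_id: str
        points: List[TerrainPoint] = field(default_factory=list)  # space geometric information
        assets: Dict[str, dict] = field(default_factory=dict)     # digital asset info, e.g. {"tree_0012": {"type": "pine"}}

    # Hypothetical server-side lookup: each agriculture and forestry scene is stored with its simulated model.
    scene_models: Dict[str, TerrainSimulationModel] = {}

    def query_model(target_scene_id: str) -> TerrainSimulationModel:
        # The server queries the terrain simulation model for the target scene and feeds it back to the terminal.
        return scene_models[target_scene_id]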
The target agriculture and forestry operation scene is an agriculture and forestry scene where the operation equipment is currently operating.
Specifically, different agriculture and forestry scenes usually have different agriculture and forestry digital asset information, so the computer device can simulate a separate terrain simulation model for each agriculture and forestry scene and then store each scene together with its corresponding simulated terrain simulation model. Therefore, after receiving a user instruction for viewing an agriculture and forestry operation scene, the terminal sends the instruction to the server; after receiving the instruction, the server determines the target agriculture and forestry operation scene specified by the instruction, queries the terrain simulation model corresponding to the target agriculture and forestry operation scene, and feeds the terrain simulation model back to the terminal.
And step 204, determining the corresponding space state of the target operation equipment in the terrain simulation model.
Wherein the target operation equipment is operation equipment which operates in a target agriculture and forestry operation scene. The number of target working devices may be one or more than one.
The spatial state is the state of the target operation device in the three-dimensional terrain simulation model, and includes its spatial position, attitude and the like. The attitude may include the orientation and motion of the target operation device, and so on.
Specifically, the terminal may initiate a space state query request corresponding to the target operation device to the server, and after receiving the space state query request, the server queries a space state corresponding to the target operation device according to the space state query request and feeds the space state back to the terminal.
In another embodiment, the target operation device may determine the current position information in real time by using a positioning algorithm, then actively send the positioned position information to the server, and the server maps the position information of the target operation device to a corresponding spatial state in the terrain simulation model and then synchronizes the spatial state to the terminal in real time.
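For illustration only, a rough sketch of the state record and of mapping a positioned device into a spatial state of the model; the local-tangent-plane conversion and all names are assumptions, not the method actually claimed:

    from dataclasses import dataclass

    @dataclass
    class SpatialState:
        # State of a target operation device inside the 3D terrain simulation model.
        x: float
        y: float
        z: float
        yaw_deg: float        # orientation (heading) of the device
        pitch_deg: float = 0.0

    def map_to_model(lat: float, lon: float, alt: float, heading_deg: float,
                     origin_lat: float, origin_lon: float, origin_alt: float,
                     metres_per_deg: float = 111_320.0) -> SpatialState:
        # Very rough local-tangent-plane conversion from positioned geodetic coordinates
        # to model coordinates; a real system would use a calibrated transform between
        # the survey frame and the model frame.
        x = (lon - origin_lon) * metres_per_deg
        y = (lat - origin_lat) * metres_per_deg
        z = alt - origin_alt
        return SpatialState(x, y, z, yaw_deg=heading_deg)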
And step 206, selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point.
Wherein the observation point is a spatial position for observing the terrain simulation model. The terminal can display the picture in the visual field range of the observation point in the interface displayed on the display screen of the terminal. The interface can be a man-machine interaction interface provided by an application, and the application can be an agriculture and forestry operation control application and the like.
It is understood that the presented content may be a local area of the terrain simulation model. The displayed content may be at least a partial picture within the visual field range of the observation point; that is, either a part of the picture within the visual field range or the entire picture within the visual field range may be displayed.
In one embodiment, the spatial state includes spatial position and orientation; step 206, comprising: taking the space position of the target operation equipment as an observation point, and taking the orientation of the target operation equipment as the observation direction of the observation point; determining the visual field range of the observation point in the observation direction; and showing a local area of the terrain simulation model in the visual field range.
Specifically, after acquiring the spatial state of the target operation equipment, the terminal takes the spatial position of the target operation equipment as an observation point, takes the orientation of the target operation equipment as the observation direction of the observation point, and determines the view range of the observation point in the observation direction according to a certain view angle. The terminal can display the picture in the visual field range of the observation point in the interface displayed on the display screen of the terminal, namely, the local area of the terrain simulation model in the visual field range is displayed.
In this embodiment, the view angle of the target operation device is used as the observation view angle, and when the agriculture and forestry operation scene is displayed, the picture in the view angle of the target operation device is directly displayed, so that the user can more intuitively know the current operation state of the target operation device which is currently operating.
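A minimal sketch of this first-person display step, reusing the hypothetical SpatialState and TerrainPoint classes from the sketches above; the horizontal-only field-of-view test is an assumed simplification:

    import math
    from typing import List

    def points_in_view(state: SpatialState, points: List[TerrainPoint],
                       fov_deg: float = 90.0, max_range: float = 200.0) -> List[TerrainPoint]:
        # Take the device position as the observation point and its orientation as the
        # observation direction, and keep the model points that fall inside the field of view.
        half_fov = math.radians(fov_deg) / 2.0
        yaw = math.radians(state.yaw_deg)
        visible = []
        for p in points:
            dx, dy = p.x - state.x, p.y - state.y
            dist = math.hypot(dx, dy)
            if dist == 0.0 or dist > max_range:
                continue
            angle = math.atan2(dy, dx) - yaw
            # Wrap the angular difference into [-pi, pi] before comparing with half the field of view.
            angle = (angle + math.pi) % (2.0 * math.pi) - math.pi
            if abs(angle) <= half_fov:
                visible.append(p)
        return visible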
In one embodiment, the spatial state includes spatial position and orientation; step 206, comprising: and selecting a target position according to the spatial position of the target operation equipment, taking the target position as an observation point, and taking the orientation of the target operation equipment as the observation direction of the observation point, so that the local area of the terrain simulation model in the visual field range of the observation point and the target operation equipment are jointly displayed.
The target position selected according to the spatial position of the target working device may be a position above the spatial position, so that the target working device is within the visual field of the observation point. The target operation equipment displayed on the terminal display screen by the terminal can be displayed partially or completely.
For example, referring to FIG. 3, a local area of the terrain simulation model in which a target working device 301 (an agricultural and forestry robot) is working is displayed in the interface of the terminal, and the target working device 301 is partially displayed. Referring to FIG. 4, a local area of the terrain simulation model in which a target working device 401 (an unmanned aerial vehicle) is working is displayed in the interface of the terminal, and the target working device 401 is partially displayed.
In this embodiment, when an agriculture and forestry operation scene is displayed, the target operation equipment which is executing an agriculture and forestry operation task is displayed at the same time, so that a user can know the target operation equipment which is operating currently more intuitively and know the current operation state of the target operation equipment.
In one embodiment, a switching entry may be provided in the interface of the terminal. The switching entry is used for switching the selection mode of the observation point, i.e. whether the spatial position of the target operation device is directly used as the observation point. If the spatial position of the target operation device is directly used as the observation point, the view angle of the target operation device is used as the observation view angle. If the spatial position of the target operation device is not directly used as the observation point, the global view angle may be used as the observation view angle, or a view angle for observing the target operation device may be used as the observation view angle. When the view angle for observing the target operation device is used as the observation view angle, the local area of the terrain simulation model within the visual field range of the observation point and the target operation device are displayed together; when the global view angle is used as the observation view angle, the terrain simulation model is displayed globally. In this way, information can be displayed in detail through view-angle switching, as shown in the sketch below.
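A sketch of such a switching entry, assuming three hypothetical view modes and reusing the SpatialState class from the earlier sketch:

    from enum import Enum

    class ViewMode(Enum):
        FIRST_PERSON = 1  # the device's spatial position is used directly as the observation point
        FOLLOW = 2        # the observation point is placed above the device so the device stays in view
        GLOBAL = 3        # top-down overview of the whole terrain simulation model

    def select_observation_point(mode: ViewMode, state: SpatialState,
                                 follow_height: float = 15.0, global_height: float = 500.0):
        # Returns (observation point, observation direction in degrees) for the selected mode.
        if mode is ViewMode.FIRST_PERSON:
            return (state.x, state.y, state.z), state.yaw_deg
        if mode is ViewMode.FOLLOW:
            return (state.x, state.y, state.z + follow_height), state.yaw_deg
        return (state.x, state.y, global_height), state.yaw_deg  # GLOBAL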
Step 208, user operations triggered in the displayed local area are acquired.
Specifically, the terminal uses the three-dimensional characteristics of the terrain simulation model in combination with point capture, so that the user can perform operations such as point selection, frame selection, zooming, rotation and isolation on areas and objects of the model.
The terrain simulation model expresses the environment information of actual operation relatively completely. The user can operate the operation equipment and the agriculture and forestry assets more accurately by taking the terrain simulation model as a basis.
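A simplified sketch of point capture: snapping a click to the nearest model object so that later operations (point selection, isolation, parameter display) can act on it; the screen-to-model projection is omitted and all names are hypothetical:

    import math
    from typing import Dict, Optional, Tuple

    def capture_point(click_xy: Tuple[float, float],
                      objects: Dict[str, Tuple[float, float]],
                      snap_radius: float = 2.0) -> Optional[str]:
        # Return the id of the model object nearest to the clicked model-plane position,
        # provided it lies within the snap radius; otherwise nothing is captured.
        cx, cy = click_xy
        best_id, best_dist = None, snap_radius
        for obj_id, (ox, oy) in objects.items():
            dist = math.hypot(ox - cx, oy - cy)
            if dist <= best_dist:
                best_id, best_dist = obj_id, dist
        return best_id

    # Usage: clicking near a tree might return "tree_0012", whose growth information
    # the terminal could then show in the interface (cf. FIG. 5).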
And step 210, updating and displaying the object operated by the user operation according to the user operation.
Specifically, after obtaining the user operation, the terminal updates and displays the object operated by the user operation according to the user operation, so as to update the display content of the terminal interface. The user interacts with the terrain simulation model directly, and the effect of the user operation instruction is visually presented on the model.
For example, as shown in fig. 5, a user may click on a tree 501 on the model, and the terminal may display growth information 502 of the tree in the interface in response to the click.
For another example, the user may select an area on the model for isolation processing, and the terminal may, in response to the selection operation, blur the part of the interface outside the selected area to highlight the selected area.
In one embodiment, the user operation is a drag operation; the method for updating and displaying the object operated by the user operation according to the user operation comprises the following steps: and in the action process of the dragging operation, updating the observation direction of the observation point in real time so as to display at least a local area of the terrain simulation model in the real-time visual field range of the observation point.
In this embodiment, information can be presented in detail by rotation.
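A sketch of this drag-to-rotate behaviour, assuming a simple pixels-to-degrees sensitivity and reusing the hypothetical SpatialState class from above:

    def apply_drag(state: SpatialState, drag_dx_px: float, drag_dy_px: float,
                   deg_per_px: float = 0.2) -> SpatialState:
        # Update the observation direction in real time while the drag is in progress:
        # horizontal movement changes yaw, vertical movement changes pitch (clamped).
        state.yaw_deg = (state.yaw_deg + drag_dx_px * deg_per_px) % 360.0
        state.pitch_deg = max(-89.0, min(89.0, state.pitch_deg - drag_dy_px * deg_per_px))
        return state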
According to the agriculture and forestry operation scene display method, after the terrain simulation model corresponding to the target agriculture and forestry operation scene is obtained, at least a local area of the terrain simulation model, which is located in the visual field range of the observation point, can be displayed according to the corresponding space state of the target operation equipment in the terrain simulation model, the operation condition of the target operation equipment is visually displayed, a user can know operation information in real time, and the reach rate of the operation information is improved; and moreover, user operation is supported during display, and after the user operation triggered by the user in the displayed local area is obtained, the object operated by the user operation is updated and displayed, so that the touch rate of the operation information is further improved.
In one embodiment, the method for displaying agricultural and forestry operation scenes further comprises the following steps: displaying target operation equipment in a local area; acquiring a trigger operation acting on target operation equipment; and displaying the equipment parameters of the target operation equipment in a first preset area according to the triggering operation.
Specifically, the terminal may also display the target operation device in the local area and detect a trigger operation acting on the target operation device; according to the trigger operation, the terminal displays the device parameters of the target operation device in a first preset area. The device parameters include operation speed, operation time, remaining power and the like. The first preset area is, for example, the right-side area of the interface.
By way of example, with continued reference to FIG. 3, the terminal may detect a trigger operation acting on the target working device 301 (agricultural and forestry robot) and display the device parameters 302 of the target working device 301. Referring again to FIG. 4, the terminal may detect a trigger operation acting on the target working device 401 (unmanned aerial vehicle) and display the device parameters 402 of the target working device 401.
In this embodiment, the device parameters of the operating device can be flexibly displayed according to the user operation, so that the user can more intuitively know the current operating state of the currently operating target operating device.
In one embodiment, before determining the corresponding spatial state of the target work device in the terrain simulation model, the method further comprises: displaying at least two candidate working devices; acquiring a target operation device selected from the candidate operation devices through a selection operation; displaying the terrain simulation model according to the overlooking angle; acquiring operation parameters selected from a terrain simulation model through user operation; and generating an agriculture and forestry operation task according to the operation parameters and the target operation equipment, sending the agriculture and forestry operation task to the server, indicating the server to plan an operation path according to the operation parameters, and informing the target operation equipment to execute the agriculture and forestry operation task according to the operation path.
The at least two candidate operation devices include an agriculture and forestry robot and an unmanned aerial vehicle. There may be one or more target operation devices selected from the candidate operation devices.
Specifically, the terminal may display at least two candidate operation devices and acquire the target operation device selected from the candidate operation devices through a selection operation. When there is a single target operation device, the terminal displays the terrain simulation model according to the overlooking angle and then obtains the operation parameters selected from the terrain simulation model through user operation; it then generates an agriculture and forestry operation task according to the operation parameters and the target operation device and sends the task to the server, and the server plans an operation path according to the operation parameters and notifies the target operation device to execute the task along that path. When there are a plurality of target operation devices, such as an agriculture and forestry robot and an unmanned aerial vehicle, the terminal may display the terrain simulation model according to the overlooking angle and then obtain the operation parameters corresponding to each target operation device selected from the terrain simulation model through user operation; it then generates an agriculture and forestry operation task according to the operation parameters and the target operation devices and sends the task to the server, and the server plans an operation path corresponding to each target operation device according to the operation parameters and notifies the plurality of target operation devices to jointly execute the task along their corresponding paths.
The operation parameter may include an operation range, and may further include an operation start point and an operation end point.
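For illustration, a sketch of assembling such an agriculture and forestry operation task from the selected parameters before it is sent to the server; the field names and the JSON payload format are assumptions, and the actual transport and path planning are not shown:

    import json
    from dataclasses import dataclass, asdict
    from typing import List, Tuple

    @dataclass
    class OperationTask:
        device_id: str                        # selected target operation device
        work_area: List[Tuple[float, float]]  # operation range as a polygon in model coordinates
        start: Tuple[float, float]            # operation start point
        end: Tuple[float, float]              # operation end point
        task_type: str = "irrigation"         # e.g. irrigation, fertilization

    def build_task_payload(task: OperationTask) -> str:
        # Serialised task sent to the server, which plans the operation path and
        # notifies the target operation device to execute the task along that path.
        return json.dumps(asdict(task))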
For example, FIG. 6 shows the interface content when the user selects operation parameters from the terrain simulation model for the unmanned aerial vehicle.
It can be understood that, due to the limited amount of information covered by a GPS map itself and the limitation of its 2D form, a GPS map does not provide a precise and informative reference when the user plans the operation range of a device based on it. In this embodiment, the terrain simulation model includes the space geometric information and the agriculture and forestry digital asset information of the target agriculture and forestry operation scene, covers the main elements of the agriculture and forestry operation scene at the model level, and expresses the real environment of the actual operation relatively completely, so the user can plan the operation range of the device more accurately.
In one embodiment, after the user finishes the operation of the terminal, the terminal sends a specific job instruction to the server, and then the server sends the job instruction to the job equipment.
In one embodiment, the method for displaying agricultural and forestry operation scenes further comprises the following steps: displaying target operation equipment in a local area; displaying a two-dimensional thumbnail corresponding to the terrain simulation model in a second preset area; a work path and the current work position of the target work equipment are displayed in the two-dimensional thumbnail; and when the space state of the target operation equipment is changed, adjusting the observation point in real time to update the local area and the two-dimensional thumbnail.
Specifically, when the terminal displays the target operation device in the local area, a two-dimensional thumbnail corresponding to the terrain simulation model may be displayed in the second preset area, and the operation path and the current operation position of the target operation device are displayed in the two-dimensional thumbnail. The terminal can also receive the space state of the target operation equipment sent by the server in real time, and when the space state of the target operation equipment changes, the observation point is adjusted in real time to update the local area and the two-dimensional thumbnail. Wherein, the second preset area is, for example, the upper left corner area of the interface.
In this embodiment, the terrain simulation model serves as the visual medium basis of terminal operation; it superimposes the agriculture and forestry digital asset information and the operation equipment information, and covers the main elements of the agriculture and forestry operation scene at the model data level. During actual operation of the terminal, the server acquires the position information of the actual operation equipment in real time through wireless communication technology and feeds it back to the terminal for real-time updating in the terrain simulation model, so that the user can more intuitively learn the current operation state of the currently operating target operation equipment.
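A sketch of keeping the two-dimensional thumbnail in step with the model: the device's model-plane position is projected to thumbnail pixels whenever a new spatial state arrives; the scaling and names are assumptions:

    from typing import Tuple

    def to_thumbnail(x: float, y: float, model_width: float, model_height: float,
                     thumb_w_px: int = 200, thumb_h_px: int = 150) -> Tuple[int, int]:
        # Map a model-plane position to pixel coordinates inside the two-dimensional
        # thumbnail shown in the second preset area (e.g. the upper-left corner).
        u = int(x / model_width * thumb_w_px)
        v = int(y / model_height * thumb_h_px)
        return u, v

    def on_state_update(state: SpatialState, model_width: float, model_height: float) -> Tuple[int, int]:
        # Called when the server pushes a new spatial state: the observation point is
        # adjusted (see the view sketches above) and the thumbnail marker is moved here.
        return to_thumbnail(state.x, state.y, model_width, model_height)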
In one embodiment, the target work equipment includes an agricultural and forestry robot and an unmanned aerial vehicle; when the agriculture and forestry robot is selected as the currently displayed target operation equipment through user operation, the corresponding space state of the target operation equipment in the terrain simulation model is determined, and the method comprises the following steps: determining a corresponding space state of the agricultural and forestry robot in the terrain simulation model; selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point, wherein the method comprises the following steps: selecting an observation point according to the space state of the agriculture and forestry robot, and displaying a local area of the terrain simulation model in the visual field range of the observation point and the agriculture and forestry robot; the method further comprises the following steps: acquiring a target operation equipment switching instruction; determining a corresponding space state of the unmanned aerial vehicle in a terrain simulation model; and adjusting the observation point according to the space state of the unmanned aerial vehicle, and switching to a local area for displaying the terrain simulation model in the visual field range of the adjusted observation point and the unmanned aerial vehicle.
Specifically, when devices operating on different work planes are at work, the terminal can accurately show the position and state information of each operation device from a three-dimensional view based on the terrain simulation model. For example, when the unmanned aerial vehicle and the agriculture and forestry robot work in the same area, a traditional GPS map cannot accurately depict the height information of the devices. In the embodiment of the present application, however, this information can be displayed by combining the different elevation information with view-angle switching and rotation.
For example, the terminal can display the agriculture and forestry robot and the unmanned aerial vehicle separately, and, by switching the target operation device, switch the display between the view angle of the agriculture and forestry robot and the view angle of the unmanned aerial vehicle; the two view angles correspond to different elevation information. The terminal can also display the agriculture and forestry robot and the unmanned aerial vehicle jointly, presenting the three-dimensional scene through their different elevation information.
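A sketch of switching the displayed target operation device, each device carrying its own elevation; the device registry is hypothetical and the helper functions come from the earlier sketches:

    from typing import Dict

    # Hypothetical registry of the devices currently working in the scene, with different elevations.
    device_states: Dict[str, SpatialState] = {
        "agri_robot_01": SpatialState(x=10.0, y=5.0, z=0.5, yaw_deg=90.0),     # ground level
        "drone_01":      SpatialState(x=40.0, y=20.0, z=35.0, yaw_deg=180.0),  # flying altitude
    }

    def switch_target_device(device_id: str, mode: ViewMode = ViewMode.FOLLOW):
        # On a target operation device switching instruction, look up the new device's
        # spatial state and recompute the observation point so the display follows it.
        state = device_states[device_id]
        return select_observation_point(mode, state)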
In the embodiment of the present application, the terminal is provided with a complete model, so the visualization effect is good. The terrain simulation model is constructed based on a general digital terrain model, agriculture and forestry asset information, equipment information (BOT), and the relative position information (Anchor/BOT Location) between the equipment and the model.
It should be understood that, although the steps in the flowchart of FIG. 2 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of the steps is not strictly limited and they may be performed in other orders. Moreover, at least a portion of the steps in FIG. 2 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
In one embodiment, as shown in FIG. 7, an agriculture and forestry operation scene display device is provided, including: an obtaining module 701, a determining module 702, a display module 703 and an updating module 704, wherein:
an obtaining module 701, configured to obtain a terrain simulation model corresponding to a target agriculture and forestry operation scene; the terrain simulation model comprises space geometric information of a target agriculture and forestry operation scene and agriculture and forestry digital asset information;
a determining module 702, configured to determine a spatial state corresponding to the target operating device in the terrain simulation model;
the display module 703 is configured to select an observation point according to a spatial state of the target operation device, so as to display at least a local area of the terrain simulation model within a visual field range of the observation point;
the obtaining module 701 is further configured to obtain a user operation triggered in the displayed local area;
and the updating module 704 is used for updating and displaying the object operated by the user operation according to the user operation.
In one embodiment, the spatial state includes spatial position and orientation; the display module 703 is further configured to use the spatial position of the target operation device as an observation point, and use the orientation of the target operation device as an observation direction of the observation point; determining the visual field range of the observation point in the observation direction; and showing a local area of the terrain simulation model in the visual field range.
In one embodiment, the user operation is a drag operation; the updating module 704 is further configured to update the observation direction of the observation point in real time during the action of the dragging operation, so as to display at least a partial region of the terrain simulation model located in the real-time visual field range of the observation point.
In one embodiment, the presentation module 703 is further configured to present the target work device in a local area; acquiring a trigger operation acting on target operation equipment; and displaying the equipment parameters of the target operation equipment in a first preset area according to the triggering operation.
In one embodiment, before determining the corresponding spatial state of the target operating device in the terrain simulation model, the presentation module 703 is further configured to present at least two candidate operating devices; acquiring a target operation device selected from the candidate operation devices through a selection operation; displaying the terrain simulation model according to the overlooking angle; acquiring operation parameters selected from a terrain simulation model through user operation; and generating an agriculture and forestry operation task according to the operation parameters and the target operation equipment, sending the agriculture and forestry operation task to the server, indicating the server to plan an operation path according to the operation parameters, and informing the target operation equipment to execute the agriculture and forestry operation task according to the operation path.
In one embodiment, the presentation module 703 is further configured to present the target work device in a local area; displaying a two-dimensional thumbnail corresponding to the terrain simulation model in a second preset area; a work path and the current work position of the target work equipment are displayed in the two-dimensional thumbnail; and when the space state of the target operation equipment is changed, adjusting the observation point in real time to update the local area and the two-dimensional thumbnail.
In one embodiment, the target work equipment includes an agricultural and forestry robot and an unmanned aerial vehicle; the determining module 702 is further configured to determine a corresponding spatial state of the agricultural and forestry robot in the terrain simulation model; the display module 703 is further configured to select an observation point according to the spatial state of the agricultural and forestry robot, and display a local area of the terrain simulation model within the visual field range of the observation point and the agricultural and forestry robot; acquiring a target operation equipment switching instruction; determining a corresponding space state of the unmanned aerial vehicle in a terrain simulation model; and adjusting the observation point according to the space state of the unmanned aerial vehicle, and switching to a local area for displaying the terrain simulation model in the visual field range of the adjusted observation point and the unmanned aerial vehicle.
After the terrain simulation model corresponding to the target agriculture and forestry operation scene is obtained, the agriculture and forestry operation scene display device can display at least a local area of the terrain simulation model located within the visual field range of the observation point according to the corresponding space state of the target operation equipment in the terrain simulation model, so that the operation condition of the target operation equipment is visually displayed, the user can learn the operation information in real time, and the reach rate of the operation information is improved; moreover, user operation is supported during display, and after the user operation triggered by the user in the displayed local area is obtained, the object operated by the user operation is updated and displayed, which further improves the reach rate of the operation information.
For the specific limitation of the agriculture and forestry operation scene display device, reference may be made to the above limitation on the agriculture and forestry operation scene display method, which is not described herein again. All or part of the modules in the agriculture and forestry operation scene display device can be realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a terminal is provided, an internal structure of which may be as shown in FIG. 8. The terminal comprises a processor, a memory, a communication interface, a display screen and an input device which are connected through a system bus. Wherein the processor of the terminal is configured to provide computing and control capabilities. The memory of the terminal comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium. The communication interface of the terminal is used for carrying out wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, an operator network, NFC (near field communication) or other technologies. The computer program is executed by a processor to realize the agriculture and forestry operation scene display method. The display screen of the terminal can be a liquid crystal display screen or an electronic ink display screen, and the input device of the terminal can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on a shell of the terminal, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the structure shown in FIG. 8 is a block diagram of only a portion of the structure related to the solution of the present application and does not constitute a limitation on the terminal to which the solution is applied; a specific terminal may include more or fewer components than those shown, or combine certain components, or have a different arrangement of components.
In one embodiment, there is provided a terminal comprising a memory and a processor, the memory having a computer program stored therein, wherein the processor, when executing the computer program, implements the following steps: acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene; the terrain simulation model comprises space geometric information of a target agriculture and forestry operation scene and agriculture and forestry digital asset information; determining a corresponding space state of the target operation equipment in the terrain simulation model; selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point; acquiring user operation triggered in the displayed local area; and updating and displaying the object operated by the user operation according to the user operation.
In one embodiment, the spatial state includes spatial position and orientation; selecting an observation point according to the space state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point, wherein the method comprises the following steps: taking the space position of the target operation equipment as an observation point, and taking the orientation of the target operation equipment as the observation direction of the observation point; determining the visual field range of the observation point in the observation direction; and showing a local area of the terrain simulation model in the visual field range.
In one embodiment, the user operation is a drag operation; the method for updating and displaying the object operated by the user operation according to the user operation comprises the following steps: and in the action process of the dragging operation, updating the observation direction of the observation point in real time so as to display at least a local area of the terrain simulation model in the real-time visual field range of the observation point.
In one embodiment, the processor, when executing the computer program, further performs the steps of: displaying target operation equipment in a local area; acquiring a trigger operation acting on target operation equipment; and displaying the equipment parameters of the target operation equipment in a first preset area according to the triggering operation.
In one embodiment, before determining the corresponding spatial state of the target work device in the terrain simulation model, the processor, when executing the computer program, further performs the steps of: displaying at least two candidate working devices; acquiring a target operation device selected from the candidate operation devices through a selection operation; displaying the terrain simulation model according to the overlooking angle; acquiring operation parameters selected from a terrain simulation model through user operation; and generating an agriculture and forestry operation task according to the operation parameters and the target operation equipment, sending the agriculture and forestry operation task to the server, indicating the server to plan an operation path according to the operation parameters, and informing the target operation equipment to execute the agriculture and forestry operation task according to the operation path.
In one embodiment, the processor, when executing the computer program, further implements the following steps: displaying the target operation equipment in the local area; displaying a two-dimensional thumbnail corresponding to the terrain simulation model in a second preset area, wherein the operation path and the current operation position of the target operation equipment are displayed in the two-dimensional thumbnail; and when the spatial state of the target operation equipment changes, adjusting the observation point in real time to update the local area and the two-dimensional thumbnail.
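One possible way to keep the two-dimensional thumbnail in step with the device, sketched here under assumed coordinate and naming conventions, is to scale the planned operation path into thumbnail coordinates once and to move a position marker whenever the device's spatial state changes; refreshing the local 3-D area would be driven by the same change notification.

```python
class ThumbnailView:
    """A stand-in for the two-dimensional thumbnail of the terrain model."""

    def __init__(self, terrain_size, thumb_size):
        self.sx = thumb_size[0] / terrain_size[0]   # model-to-thumbnail scale, x
        self.sy = thumb_size[1] / terrain_size[1]   # model-to-thumbnail scale, y
        self.path = []        # planned operation path, in thumbnail coordinates
        self.marker = None    # current operation position, in thumbnail coordinates

    def set_operation_path(self, path_points):
        self.path = [(x * self.sx, y * self.sy) for x, y in path_points]

    def on_position_changed(self, position):
        # Called whenever the device's spatial state changes, so the thumbnail
        # marker (and, in a full implementation, the local area) is refreshed.
        x, y = position
        self.marker = (x * self.sx, y * self.sy)

thumb = ThumbnailView(terrain_size=(200.0, 200.0), thumb_size=(100, 100))
thumb.set_operation_path([(0.0, 0.0), (50.0, 0.0), (50.0, 50.0)])
thumb.on_position_changed((25.0, 0.0))
print(thumb.marker)   # (12.5, 0.0)
```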
In one embodiment, the target operation equipment includes an agriculture and forestry robot and an unmanned aerial vehicle; when the agriculture and forestry robot is selected as the currently displayed target operation equipment through a user operation, determining the corresponding spatial state of the target operation equipment in the terrain simulation model comprises: determining the corresponding spatial state of the agriculture and forestry robot in the terrain simulation model; selecting an observation point according to the spatial state of the target operation equipment so as to display at least a local area of the terrain simulation model within the visual field range of the observation point comprises: selecting an observation point according to the spatial state of the agriculture and forestry robot, and displaying the local area of the terrain simulation model within the visual field range of the observation point, together with the agriculture and forestry robot; the processor, when executing the computer program, further implements the following steps: acquiring a target operation equipment switching instruction; determining the corresponding spatial state of the unmanned aerial vehicle in the terrain simulation model; and adjusting the observation point according to the spatial state of the unmanned aerial vehicle, and switching to display the local area of the terrain simulation model within the visual field range of the adjusted observation point, together with the unmanned aerial vehicle.
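Switching the displayed target operation equipment can be reduced to re-deriving the observation point from whichever device is currently selected, as in the sketch below; the device registry, class names and return value are assumptions used only to make the idea concrete.

```python
from dataclasses import dataclass

@dataclass
class WorkDevice:
    name: str
    position: tuple       # position in the terrain simulation model
    heading_deg: float    # orientation in the terrain simulation model

class DeviceSwitcher:
    def __init__(self, devices):
        self.devices = devices        # e.g. {"robot": ..., "uav": ...}
        self.current = "robot"

    def switch_to(self, key):
        """Handle a target operation equipment switching instruction: record the
        newly selected device and return its spatial state so the caller can
        re-select the observation point and redraw the local area around it."""
        self.current = key
        device = self.devices[key]
        return device.position, device.heading_deg

switcher = DeviceSwitcher({
    "robot": WorkDevice("agriculture and forestry robot", (10.0, 5.0), 90.0),
    "uav": WorkDevice("unmanned aerial vehicle", (40.0, 20.0), 270.0),
})
print(switcher.switch_to("uav"))   # the observation point now follows the UAV
```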
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon which, when executed by a processor, implements the following steps: acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene, the terrain simulation model comprising spatial geometric information and agriculture and forestry digital asset information of the target agriculture and forestry operation scene; determining a corresponding spatial state of the target operation equipment in the terrain simulation model; selecting an observation point according to the spatial state of the target operation equipment so as to display at least a local area of the terrain simulation model within the visual field range of the observation point; acquiring a user operation triggered in the displayed local area; and updating and displaying the object operated on by the user operation according to the user operation.
In one embodiment, the spatial state includes a spatial position and an orientation; selecting an observation point according to the spatial state of the target operation equipment so as to display at least a local area of the terrain simulation model within the visual field range of the observation point comprises: taking the spatial position of the target operation equipment as the observation point, and taking the orientation of the target operation equipment as the observation direction of the observation point; determining the visual field range of the observation point in the observation direction; and showing the local area of the terrain simulation model within the visual field range.
In one embodiment, the user operation is a drag operation; updating and displaying the object operated on by the user operation according to the user operation comprises: during the drag operation, updating the observation direction of the observation point in real time so as to display at least the local area of the terrain simulation model located within the real-time visual field range of the observation point.
In one embodiment, the computer program, when executed by the processor, further implements the following steps: displaying the target operation equipment in the local area; acquiring a trigger operation acting on the target operation equipment; and displaying the equipment parameters of the target operation equipment in a first preset area according to the trigger operation.
In one embodiment, prior to determining the corresponding spatial state of the target operation equipment in the terrain simulation model, the computer program, when executed by the processor, further implements the following steps: displaying at least two candidate operation devices; acquiring the target operation equipment selected from the candidate operation devices through a selection operation; displaying the terrain simulation model from an overlooking angle; acquiring operation parameters selected from the terrain simulation model through a user operation; and generating an agriculture and forestry operation task according to the operation parameters and the target operation equipment, sending the agriculture and forestry operation task to the server so as to instruct the server to plan an operation path according to the operation parameters, and notifying the target operation equipment to execute the agriculture and forestry operation task according to the operation path.
In one embodiment, the computer program, when executed by the processor, further implements the following steps: displaying the target operation equipment in the local area; displaying a two-dimensional thumbnail corresponding to the terrain simulation model in a second preset area, wherein the operation path and the current operation position of the target operation equipment are displayed in the two-dimensional thumbnail; and when the spatial state of the target operation equipment changes, adjusting the observation point in real time to update the local area and the two-dimensional thumbnail.
In one embodiment, the target operation equipment includes an agriculture and forestry robot and an unmanned aerial vehicle; when the agriculture and forestry robot is selected as the currently displayed target operation equipment through a user operation, determining the corresponding spatial state of the target operation equipment in the terrain simulation model comprises: determining the corresponding spatial state of the agriculture and forestry robot in the terrain simulation model; selecting an observation point according to the spatial state of the target operation equipment so as to display at least a local area of the terrain simulation model within the visual field range of the observation point comprises: selecting an observation point according to the spatial state of the agriculture and forestry robot, and displaying the local area of the terrain simulation model within the visual field range of the observation point, together with the agriculture and forestry robot; the computer program, when executed by the processor, further implements the following steps: acquiring a target operation equipment switching instruction; determining the corresponding spatial state of the unmanned aerial vehicle in the terrain simulation model; and adjusting the observation point according to the spatial state of the unmanned aerial vehicle, and switching to display the local area of the terrain simulation model within the visual field range of the adjusted observation point, together with the unmanned aerial vehicle.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database or another medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as Static Random Access Memory (SRAM) and Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered to fall within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An agriculture and forestry operation scene display method is characterized by comprising the following steps:
acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene, wherein the terrain simulation model comprises spatial geometric information and agriculture and forestry digital asset information of the target agriculture and forestry operation scene;
determining a corresponding spatial state of the target operation equipment in the terrain simulation model;
selecting an observation point according to the spatial state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point;
acquiring a user operation triggered in the displayed local area;
and updating and displaying the object operated on by the user operation according to the user operation.
2. The method of claim 1, wherein the spatial state comprises a spatial position and an orientation; selecting an observation point according to the spatial state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point comprises the following steps:
taking the spatial position of the target operation equipment as the observation point, and taking the orientation of the target operation equipment as the observation direction of the observation point;
determining the visual field range of the observation point in the observation direction;
and displaying the local area of the terrain simulation model within the visual field range, together with the target operation equipment.
3. The method of claim 2, wherein the user operation is a drag operation; updating and displaying the object operated on by the user operation according to the user operation comprises:
during the drag operation, updating the observation direction of the observation point in real time so as to display at least the local area of the terrain simulation model located within the real-time visual field range of the observation point.
4. The method of claim 1, further comprising:
displaying the target operation equipment in the local area;
acquiring a trigger operation acting on the target operation equipment;
and displaying the equipment parameters of the target operation equipment in a first preset area according to the trigger operation.
5. The method of claim 1, wherein, prior to determining the corresponding spatial state of the target operation equipment in the terrain simulation model, the method further comprises:
displaying at least two candidate operation devices;
acquiring the target operation equipment selected from the candidate operation devices through a selection operation;
displaying the terrain simulation model from an overlooking angle;
acquiring operation parameters selected from the terrain simulation model through a user operation;
and generating an agriculture and forestry operation task according to the operation parameters and the target operation equipment, sending the agriculture and forestry operation task to a server so as to instruct the server to plan an operation path according to the operation parameters, and notifying the target operation equipment to execute the agriculture and forestry operation task according to the operation path.
6. The method of claim 5, further comprising:
displaying the target operation equipment in the local area;
displaying a two-dimensional thumbnail corresponding to the terrain simulation model in a second preset area; the operation path and the current operation position of the target operation equipment are displayed in the two-dimensional thumbnail;
and when the spatial state of the target operation equipment changes, adjusting the observation point in real time to update the local area and the two-dimensional thumbnail.
7. The method of claim 1, wherein the target operation equipment comprises an agriculture and forestry robot and an unmanned aerial vehicle; when the agriculture and forestry robot is selected as the currently displayed target operation equipment through a user operation, determining the corresponding spatial state of the target operation equipment in the terrain simulation model comprises the following steps:
determining the corresponding spatial state of the agriculture and forestry robot in the terrain simulation model;
selecting an observation point according to the spatial state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point comprises the following steps:
selecting an observation point according to the spatial state of the agriculture and forestry robot, and displaying the local area of the terrain simulation model within the visual field range of the observation point, together with the agriculture and forestry robot;
the method further comprises the following steps:
acquiring a target operation equipment switching instruction;
determining the corresponding spatial state of the unmanned aerial vehicle in the terrain simulation model;
and adjusting the observation point according to the spatial state of the unmanned aerial vehicle, and switching to display the local area of the terrain simulation model within the visual field range of the adjusted observation point, together with the unmanned aerial vehicle.
8. An agriculture and forestry operation scene display device, characterized in that the device comprises:
the acquisition module is used for acquiring a terrain simulation model corresponding to a target agriculture and forestry operation scene; the terrain simulation model comprises spatial geometric information and agriculture and forestry digital asset information of the target agriculture and forestry operation scene;
the determining module is used for determining the corresponding spatial state of the target operation equipment in the terrain simulation model;
the display module is used for selecting an observation point according to the spatial state of the target operation equipment so as to display at least a local area of the terrain simulation model in the visual field range of the observation point;
the acquisition module is further used for acquiring user operation triggered in the displayed local area;
and the updating module is used for updating and displaying the object operated by the user operation according to the user operation.
9. A terminal comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202011176833.5A 2020-10-29 2020-10-29 Agriculture and forestry operation scene display method and device, terminal and storage medium Pending CN112270751A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011176833.5A CN112270751A (en) 2020-10-29 2020-10-29 Agriculture and forestry operation scene display method and device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011176833.5A CN112270751A (en) 2020-10-29 2020-10-29 Agriculture and forestry operation scene display method and device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN112270751A (en) 2021-01-26

Family

ID=74344403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011176833.5A Pending CN112270751A (en) 2020-10-29 2020-10-29 Agriculture and forestry operation scene display method and device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN112270751A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109917813A (en) * 2019-04-19 2019-06-21 成都蔚来空间科技有限公司 Unmanned plane autonomous flight three-dimensional scenic display methods and terminal
CN109947038A (en) * 2019-04-29 2019-06-28 李志海 A kind of wisdom agricultural weather robot operating system
CN110262476A (en) * 2019-05-21 2019-09-20 北京农业信息技术研究中心 Cloud platform is broadcast live in one plant growth
CN110765620A (en) * 2019-10-28 2020-02-07 上海科梁信息工程股份有限公司 Aircraft visual simulation method, system, server and storage medium
CN111443723A (en) * 2020-04-07 2020-07-24 中国航空无线电电子研究所 Program for generating and displaying third visual angle view of unmanned aerial vehicle
CN111652964A (en) * 2020-04-10 2020-09-11 合肥工业大学 Auxiliary positioning method and system for power inspection unmanned aerial vehicle based on digital twinning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
吕文磊 (Lyu Wenlei) et al.: "Research on visualization simulation software for the layout of underwater operation observation systems", 《电子技术与软件工程》 (Electronic Technology & Software Engineering) *
方子岩 (Fang Ziyan) et al.: 《摄影测量学》 (Photogrammetry), 31 December 2012 *

Similar Documents

Publication Publication Date Title
US11175651B2 (en) Method, device and system for presenting operation information of a mobile platform
CA3120725C (en) Surveying and mapping system, surveying and mapping method and device, apparatus and medium
CN109215486B (en) Electronic map marking and displaying method and device, terminal equipment and storage medium
CN110309236B (en) Method, device, computer equipment and storage medium for finding way in map
CN107430686A (en) Mass-rent for the zone profiles of positioning of mobile equipment creates and renewal
US11030808B2 (en) Generating time-delayed augmented reality content
US11042961B2 (en) Spatial processing for map geometry simplification
CN113741698A (en) Method and equipment for determining and presenting target mark information
CN109154503A (en) The planing method and ground end equipment in unmanned machine operation course line
CN108885467B (en) Control method, terminal, management platform, system and storage medium
CN112462756A (en) Agriculture and forestry operation task generation method and device, computer equipment and storage medium
KR20210105345A (en) Surveying and mapping methods, devices and instruments
CN114373047A (en) Method, device and storage medium for monitoring physical world based on digital twin
CN114782646A (en) House model modeling method and device, electronic equipment and readable storage medium
CN110928959B (en) Determination method and device of relationship characteristic information between entities, electronic equipment and storage medium
EP4261789A1 (en) Method for displaying posture of robot in three-dimensional map, apparatus, device, and storage medium
CN115265520A (en) Intelligent operation equipment and mapping method, device and storage medium thereof
CN114385934A (en) System for jointly inquiring multiple AR maps
CA3120722C (en) Method and apparatus for planning sample points for surveying and mapping, control terminal and storage medium
CN112270751A (en) Agriculture and forestry operation scene display method and device, terminal and storage medium
CN115346008A (en) Three-dimensional visualization processing method, device and equipment for engineering investigation and storage medium
WO2019137212A1 (en) Method and apparatus for visualization of public welfare activities
CN114564526A (en) Three-dimensional earth data display method, device, equipment and storage medium
KR20210106422A (en) Job control system, job control method, device and instrument
CN110196638B (en) Mobile terminal augmented reality method and system based on target detection and space projection

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210126)