CN114527896A - Image interaction method, device, equipment and storage medium - Google Patents


Info

Publication number
CN114527896A
Authority
CN
China
Prior art keywords
image, image processing, instruction, state, state information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210165034.0A
Other languages
Chinese (zh)
Inventor
杨鑫
姜凯英
Current Assignee
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202210165034.0A priority Critical patent/CN114527896A/en
Priority to PCT/CN2022/090724 priority patent/WO2023159761A1/en
Publication of CN114527896A publication Critical patent/CN114527896A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques using icons
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; photographic image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an image interaction method, apparatus, device, storage medium, and program product. In the image interaction method, in response to a first instruction for opening an image processing toolbox, state information representing the open state of the image processing toolbox is displayed; in response to a second instruction, the open state of the image processing toolbox and the to-be-selected state of a first operation option of the toolbox are displayed; in response to a third instruction for performing a first operation on the image, the open state of the first operation option and the to-be-selected state of a second operation option of the toolbox are displayed; in response to a fourth instruction for performing a second operation on the image, the open state of the second operation option is displayed. The first operation performs a cloud removal operation on the image, and the second operation performs an image enhancement operation. The method and apparatus can be widely applied in the image field.

Description

Image interaction method, device, equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image interaction method, apparatus, device, and storage medium.
Background
In recent years, with the development of computer technology and remote sensing technology, digital image processing has been widely applied in agriculture. However, remote sensing still faces certain problems in agricultural development and application. For example, cloud layers or shadows appear in remote sensing images because of weather, and in some specialized application fields only part of the information in a remotely sensed image has reference value, while the remaining information can interfere with interpretation. Reducing the influence of weather conditions and other unnecessary information on remote sensing images, selecting suitable image cloud removal and image enhancement techniques, and developing a corresponding product interaction strategy therefore have practical significance for applying satellite remote sensing to agricultural scenarios.
In the related art, the idea of high-frequency emphasis filtering has been introduced into the cloud removal field. Although its effect is significant, the method is only suitable for processing QuickBird imagery and can hardly handle satellite data acquired by other sensors; as a result, users cannot perceive the improvement that new techniques bring to remote sensing images.
In summary, the problems of the related art need to be solved.
Disclosure of Invention
The present application aims to solve, at least to some extent, one of the technical problems in the related art.
Therefore, an object of the embodiments of the present application is to provide an image interaction method that helps a user better experience the improvement a new technique brings to a satellite remote sensing image, so that the user can more easily obtain satellite remote sensing images with cloud removal and enhancement effects applied.
To achieve this technical purpose, the technical solution adopted by the embodiments of the present application includes the following steps:
in one aspect, an embodiment of the present application provides an image interaction method, including:
displaying first state information of an image processing toolbox in response to a first instruction, wherein the first instruction is used to open the image processing toolbox, and the first state information represents the open state of the image processing toolbox;
displaying second state information of the image processing toolbox in response to a second instruction, wherein the second instruction is used to display a first operation option of the image processing toolbox, and the second state information represents the open state of the image processing toolbox and the to-be-selected state of the first operation option;
displaying third state information of the image processing toolbox in response to a third instruction, wherein the third instruction is used to perform the first operation option on the image and to display a second operation option of the image processing toolbox, and the third state information represents the open state of the first operation option and the to-be-selected state of the second operation option;
displaying fourth state information of the image processing toolbox in response to a fourth instruction, wherein the fourth instruction is used to perform the second operation option on the image, and the fourth state information represents the open state of the second operation option;
wherein the first operation option is used to perform a cloud removal operation on the image, and the second operation option is used to perform an image enhancement operation on the image.
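For illustration only, the four instruction-to-state transitions above can be sketched as a minimal event handler. All names below (the `Toolbox` class, the instruction strings, the S1-S4 labels) are hypothetical; the embodiments describe behavior, not an implementation:

```python
# Minimal sketch of the instruction -> state-information mapping described above.
# All identifiers are hypothetical; the application specifies behavior, not code.

class Toolbox:
    def __init__(self):
        self.display = None  # last state information shown to the user

    def handle(self, instruction):
        if instruction == "open_toolbox":          # first instruction
            self.display = "S1: toolbox open"
        elif instruction == "show_first_option":   # second instruction
            self.display = "S2: toolbox open, cloud-removal option selectable"
        elif instruction == "run_first_option":    # third instruction (cloud removal)
            self.display = "S3: cloud removal on, enhancement option selectable"
        elif instruction == "run_second_option":   # fourth instruction (enhancement)
            self.display = "S4: enhancement on"
        return self.display

tb = Toolbox()
for ins in ["open_toolbox", "show_first_option", "run_first_option", "run_second_option"]:
    tb.handle(ins)
```

Driving the handler with the four instructions in order walks the interface through the four state displays the method claims.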
In addition, the image interaction method according to the above embodiment of the present application may further have the following additional technical features:
further, in one embodiment of the present application, the method further comprises the steps of:
displaying the third state information of the image processing toolbox in response to a fifth instruction;
wherein the fifth instruction is used to close the second operation option.
Further, in one embodiment of the present application, the method further comprises the steps of:
displaying the second state information of the image processing toolbox in response to a sixth instruction;
wherein the sixth instruction is used to close the first operation option.
Further, in an embodiment of the present application, the interaction method further includes the following steps:
displaying the first state information of the image processing toolbox in response to a seventh instruction;
wherein the seventh instruction is used to collapse at least one of the first operation option or the second operation option.
Further, in an embodiment of the present application, the interaction method further includes the following steps:
displaying the fourth state information of the image processing toolbox in response to an eighth instruction;
wherein the eighth instruction is used to expand at least one of the first operation option or the second operation option.
Further, in an embodiment of the present application, before the step of displaying the first state information of the image processing toolbox in response to the first instruction, the interaction method further includes:
displaying fifth state information of the image processing toolbox in response to a ninth instruction;
wherein the ninth instruction is used to obtain the image, and the fifth state information represents the to-be-selected state of the image processing toolbox.
Further, in an embodiment of the present application, the interaction method further includes: displaying the fifth state information of the image processing toolbox in response to a tenth instruction;
wherein the tenth instruction is used to close the image processing toolbox.
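Taken together with the first through fourth instructions, the fifth through tenth instructions above describe a small finite state machine over the five kinds of state information. The sketch below encodes it as a transition table, with labels S1-S5 standing for the first through fifth state information; the instruction names, and the source states of the closing and collapsing transitions, are assumptions for illustration, since the text does not fix them:

```python
# Hypothetical finite-state-machine sketch of the full toolbox lifecycle.
# S1..S5 mirror the first..fifth state information; the source states of the
# close/collapse transitions are assumptions, not stated in the application.

TRANSITIONS = {
    (None, "acquire_image"): "S5",      # 9th instruction: obtain image, toolbox selectable
    ("S5", "open_toolbox"): "S1",       # 1st: open the toolbox
    ("S1", "show_option_1"): "S2",      # 2nd: show the cloud-removal option
    ("S2", "run_option_1"): "S3",       # 3rd: run cloud removal, show enhancement option
    ("S3", "run_option_2"): "S4",       # 4th: run enhancement
    ("S4", "close_option_2"): "S3",     # 5th: close enhancement, back to third state
    ("S3", "close_option_1"): "S2",     # 6th: close cloud removal, back to second state
    ("S4", "collapse_options"): "S1",   # 7th: collapse the options, first state again
    ("S1", "expand_options"): "S4",     # 8th: expand the options, fourth state again
    ("S1", "close_toolbox"): "S5",      # 10th: close the toolbox, fifth state
}

def step(state, instruction):
    """Return the state information displayed after an instruction."""
    return TRANSITIONS.get((state, instruction), state)
```

Unknown (state, instruction) pairs leave the display unchanged, which is one reasonable reading of "the order of execution is not restricted unless otherwise specified".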
In another aspect, an embodiment of the present application further provides an image interaction apparatus, including:
a first module, configured to display first state information of an image processing toolbox in response to a first instruction, where the first instruction is used to open the image processing toolbox, and the first state information represents the open state of the image processing toolbox;
a second module, configured to display second state information of the image processing toolbox in response to a second instruction, where the second instruction is used to display a first operation option of the image processing toolbox, and the second state information represents the open state of the image processing toolbox and the to-be-selected state of the first operation option;
a third module, configured to display third state information of the image processing toolbox in response to a third instruction, where the third instruction is used to perform the first operation option on the image and to display a second operation option of the image processing toolbox, and the third state information represents the open state of the first operation option and the to-be-selected state of the second operation option;
a fourth module, configured to display fourth state information of the image processing toolbox in response to a fourth instruction, where the fourth instruction is used to perform the second operation option on the image, and the fourth state information represents the open state of the second operation option;
wherein the first operation option is used to perform a cloud removal operation on the image, and the second operation option is used to perform an image enhancement operation on the image.
In another aspect, an embodiment of the present application provides a computer device, including:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement the image interaction method described above.
In another aspect, an embodiment of the present application further provides a computer-readable storage medium storing a processor-executable program which, when executed by a processor, implements the image interaction method described above.
Advantages and benefits of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application:
in the image interaction method, in response to a first instruction for opening an image processing toolbox, first state information representing the open state of the image processing toolbox is displayed; in response to a second instruction, the open state of the image processing toolbox and the to-be-selected state of a first operation option of the toolbox are displayed; in response to a third instruction for performing the first operation on the image, the open state of the first operation option and the to-be-selected state of a second operation option of the toolbox are displayed; in response to a fourth instruction for performing the second operation on the image, the open state of the second operation option is displayed. The first operation option performs a cloud removal operation on the image, and the second operation option performs an image enhancement operation. Through the first through fourth instructions, the method conveniently displays the corresponding first through fourth state information, so that with simple interactive operations the user can obtain an image that combines cloud removal and enhancement effects, better experience the improvement new techniques bring to satellite remote sensing images, and thereby monitor crops more intuitively and predict them more accurately.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in their description are introduced below. It should be understood that the drawings in the following description illustrate only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of an image interaction method provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of an image interaction method provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of an image interaction method provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a computer device provided in an embodiment of the present application.
Detailed Description
The present application is further described below with reference to the figures and specific embodiments. The described embodiments should not be considered as limiting the present application; all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the scope of protection of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before the embodiments of the present application are described in further detail, the terms and expressions used in them are explained as follows.
1) Blockchain: a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks linked by cryptography, where each block contains a batch of network transactions and the information needed to verify their validity (anti-counterfeiting) and to generate the next block. A blockchain may include a blockchain underlying platform, a platform product service layer, and an application service layer.
The underlying platform can include processing modules such as user management, basic services, smart contracts, and operation monitoring. The user management module is responsible for identity management of all blockchain participants, including generation and maintenance of public/private keys (account management), key management, and maintenance of the correspondence between users' real identities and blockchain addresses (authority management); with authorization, it can supervise and audit the transactions of certain real identities and provide rule configuration for risk control (risk-control audit). The basic service module is deployed on all blockchain node devices to verify the validity of service requests; after consensus is reached on a valid request, it records the request to storage. For a new service request, the basic service first performs interface adaptation, parsing, and authentication (interface adaptation), then encrypts the service information via a consensus algorithm (consensus management), transmits it completely and consistently to the shared ledger (network communication), and records and stores it. The smart contract module is responsible for registering, issuing, triggering, and executing contracts; developers can define contract logic in a programming language, publish it to the blockchain (contract registration), and have execution triggered by keys or other events according to the contract clauses, completing the contract logic while also providing functions for upgrading or canceling contracts. The operation monitoring module is mainly responsible for deployment, configuration modification, contract settings, and cloud adaptation during product release, as well as visual output of real-time status during product operation, such as alarms and monitoring of network conditions and node device health. The platform product service layer provides the basic capabilities and implementation framework of typical applications, on which developers can build the blockchain implementation of their business logic. The application service layer provides blockchain-based application services for business participants to use.
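As a minimal illustration of the hash-linked structure described above (illustrative only; not part of the claimed method), each data block can commit to its predecessor's hash:

```python
# Minimal sketch of a hash-linked chain of data blocks, illustrating the
# "chain of blocks associated by cryptography" described above.
import hashlib
import json

def make_block(transactions, prev_hash):
    """Build a block whose hash commits to its payload and its predecessor."""
    body = {"transactions": transactions, "prev_hash": prev_hash}
    block_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": block_hash}

def verify_chain(chain):
    """Check that every block links to its predecessor's hash."""
    return all(chain[i]["prev_hash"] == chain[i - 1]["hash"] for i in range(1, len(chain)))

genesis = make_block(["tx0"], prev_hash="0" * 64)
chain = [genesis, make_block(["tx1", "tx2"], prev_hash=genesis["hash"])]
```

Because each block's hash covers the previous hash, altering any earlier block breaks every later link, which is the anti-counterfeiting property the definition mentions.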
2) Style: a named collection of character and paragraph formats. When a format recurs, a style is created for it once and then applied wherever needed, so the formatting does not have to be repeated manually.
3) Page: information is organized into information pages; the pages are implemented with markup languages, and hypertext links are established among them for browsing.
4) "In response to": indicates the condition or state on which a performed operation depends. When the condition or state is satisfied, one or more of the operations may be performed in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which the operations are executed.
In the related art, with the development of computer technology and remote sensing technology, digital image processing has been widely applied in agriculture, and satellite remote sensing image processing technology has developed accordingly. However, it still faces certain problems in agricultural applications. For example, cloud layers or shadows caused by weather are difficult to eliminate from satellite remote sensing images; and in some specialized application fields, only part of the information in a satellite remote sensing image has reference value, while the remaining redundant information introduces noise that affects the conclusions drawn from the image. Therefore, reducing the influence of weather and other unnecessary information on satellite remote sensing images, selecting suitable image processing techniques, and developing a corresponding product interaction strategy have practical significance for satellite remote sensing image processing in agricultural scenarios.
At present, image cloud removal and image enhancement techniques have specific applications in many fields, but the research is highly specialized and particular. For example, although introducing the idea of high-frequency emphasis filtering into the cloud removal field yields an obvious cloud removal effect, the approach is only suitable for processing QuickBird imagery and can hardly handle satellite remote sensing data acquired by other sensors. The technical development and practical application of satellite remote sensing image processing in agricultural scenarios are therefore, to some extent, pioneering. In the present application, an interaction strategy for a satellite remote sensing image processing toolbox is studied on the basis of such technical development, so that users can better experience the improvement new techniques bring to satellite remote sensing images, obtain images with better cloud removal and enhancement effects, and benefit from improved display efficiency, making their monitoring of crop planting area, growth, soil moisture, yield, and major natural disasters more intuitive and their predictions more accurate.
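High-frequency emphasis filtering, mentioned above as one cloud removal approach, boosts high spatial frequencies (scene detail) while attenuating the low-frequency component in which thin cloud and haze concentrate. Below is a frequency-domain sketch with NumPy; the filter form H = a + b * H_hp with a Gaussian high-pass H_hp is standard, but the coefficients and cutoff are illustrative assumptions, not values from this application:

```python
# Sketch of high-frequency emphasis filtering for thin-cloud suppression:
# H(u, v) = a + b * H_hp(u, v), where H_hp is a Gaussian high-pass filter.
# The offset a retains some low-frequency content; b > 1 boosts detail.
# Coefficients a, b and the cutoff are illustrative assumptions.
import numpy as np

def high_freq_emphasis(image, a=0.5, b=1.5, cutoff=30.0):
    rows, cols = image.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    d2 = u[:, None] ** 2 + v[None, :] ** 2          # squared distance from DC
    h_hp = 1.0 - np.exp(-d2 / (2.0 * cutoff ** 2))  # Gaussian high-pass
    h = a + b * h_hp                                # high-frequency emphasis filter

    spectrum = np.fft.fftshift(np.fft.fft2(image.astype(float)))
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * h)).real
    return np.clip(filtered, 0.0, 255.0)

# Synthetic "hazy" patch: a bright, slowly varying background plus detail noise.
hazy = 120.0 + 10.0 * np.random.default_rng(0).standard_normal((64, 64))
result = high_freq_emphasis(hazy)
```

The DC gain is a (0.5 here), so the bright low-frequency haze component is halved while detail is amplified by up to a + b; this also illustrates why such a filter, tuned for one sensor's spectral characteristics, does not transfer directly to another's.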
To solve the problems of satellite remote sensing image processing in the related art, embodiments of the present application provide an image interaction method, apparatus, device, and storage medium. First state information of an image processing toolbox is displayed in response to a first instruction, where the first instruction is used to open the image processing toolbox, and the first state information represents the open state of the toolbox; second state information is displayed in response to a second instruction, where the second instruction is used to display a first operation option of the toolbox, and the second state information represents the open state of the toolbox and the to-be-selected state of the first operation option; third state information is displayed in response to a third instruction, where the third instruction is used to perform the first operation option on the image, and the third state information represents the open state of the first operation option and the to-be-selected state of a second operation option of the toolbox; fourth state information is displayed in response to a fourth instruction, where the fourth instruction is used to perform the second operation option on the image, and the fourth state information represents the open state of the second operation option. The first operation option performs a cloud removal operation on the image, and the second operation option performs an image enhancement operation on the image. Using cloud removal and enhancement techniques for remote sensing images in agricultural scenarios, the method applies the two satellite remote sensing image processing operations, cloud removal and image enhancement, independently or in combination through the interaction mode of the image processing toolbox, and better presents the effects of the two operations to the user.
Fig. 1 is a schematic diagram of an implementation environment of an image interaction method according to an embodiment of the present application. Referring to fig. 1, the software and hardware bodies of the implementation environment mainly comprise a terminal 101 and a server 102, which are communicatively connected. The image interaction method may be performed based on interaction between the terminal 101 and the server 102. In addition, the terminal 101 and the server 102 may be nodes in a blockchain, which is not limited in this embodiment.
The embodiment of the application provides an image interaction method for implementing an interactive display task for objects such as satellite remote sensing images based on cloud removal and image enhancement operations; the method may be executed by the terminal 101, applied to the server 102, or applied to an application scenario formed by the terminal 101 and the server 102 together.
Specifically, the image interaction instruction may be triggered by a user by clicking, pulling down a menu, and the like in a main interface of a client executed by the terminal 101.
In a possible implementation manner, the client executed by the terminal 101 may be an image interaction application running independently on the terminal 101; the image interaction application may be an instant messaging application, an electronic payment application, or another application. The terminal 101 installs the image interaction application, logs in to it based on the user identifier, and realizes interaction between the user and the terminal 101 through the image interaction application.
In another possible implementation manner, the image interaction client executed by the terminal 101 may be a sub-application running inside an internet application; the internet application may be an instant messaging application, an electronic payment application, a map application, or another application. The sub-application can be an official account or an applet (mini program) within the internet application. An official account is an interactive form that enables all-round communication with its followers through text, pictures, voice, video, and the like; an applet is an application that can be used without download and installation. The terminal 101 installs the internet application, logs in to it based on the user identifier, and logs in to the sub-application through the user identifier of the logged-in internet application, so that the sub-application runs inside the internet application and interaction between the user and the terminal 101 is realized.
Specifically, the terminal 101 in the present application may include, but is not limited to, any one or more of a smart camera device, a smart watch, a smart phone, a computer, a Personal Digital Assistant (PDA), a smart voice interaction device, a smart appliance, or a vehicle-mounted terminal. The server 102 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN (Content Delivery Network), and big data and artificial intelligence platforms. A communication connection between the terminal 101 and the server 102 may be established through a wireless or wired network using standard communication technologies and/or protocols; the network may be the internet or any other network, such as, but not limited to, a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wired, or wireless network, a private network, or any combination of virtual private networks. Of course, it is understood that the implementation environment in fig. 1 is only an optional application scenario of the image interaction method provided in the embodiment of the present application, and actual applications are not limited to the software and hardware environment shown in fig. 1.
The image interaction method provided in the embodiment of the present application is described in detail below with reference to the implementation environment shown in fig. 1.
Referring to fig. 2, fig. 2 is a schematic flowchart of an image interaction method provided in an embodiment of the present application, and the image interaction method shown in fig. 2 is mainly executed on the server 102 side. The method may be applied to the relevant computer devices in the server 102, but is not limited to the above form. The method of fig. 2 includes, but is not limited to, steps 210 through 230.
Step 210: in response to a first instruction, display first state information of the image processing toolbox, where the first instruction is used to open the image processing toolbox, and the first state information represents the open state of the image processing toolbox.
Before step 210, fifth state information of the image processing toolbox is displayed in response to a ninth instruction, where the ninth instruction is used to obtain an image and the fifth state information represents a candidate state of the image processing toolbox. The image processing toolbox defaults to the closed/candidate state, and its icon is displayed in gray. After the interaction device receives the ninth instruction, triggered by user input, for acquiring an image, the image processing toolbox acquires the image to be processed while remaining in the candidate or closed state.
It should be noted that a prompt such as "click to open the image processing toolbox" may be automatically displayed on the gray image processing toolbox to remind the user to open it by clicking the toolbox or clicking the sample image area. More generally, when the image processing toolbox is in a given state and a next operation is available, a prompt for that operation, such as "click to open the image processing toolbox", "click to start the cloud removal operation", or "click to start the image enhancement operation", may be displayed near the toolbox, for example above, below, or to either side of it.
In this step, when the user issues a first instruction to the image processing toolbox, the toolbox displays its first state information in response. The image interaction application terminal refers to a device that performs image interaction; the specific device types have been described above and are not repeated here. The first instruction in this embodiment may also be the selection of an area of the sample image, so that the image processing toolbox is opened for the area selected by the user; in response to the first instruction, the toolbox displays the first state information, which represents its open state.
In this step, when the image processing toolbox is opened, it may be displayed with a highlighted icon, or in a color different from the original gray icon. From the change of the icon, the user can tell whether the toolbox is closed or open, which makes it easier to judge its working state, guides the user toward the next operation, and helps complete the image interaction task.
In the embodiment of the present application, the display configuration of the open state of the image processing toolbox is not limited to gray, a highlight color, or any particular color different from gray. In other words, the implementation includes both a display configuration for the closed state and one for the open state of the toolbox, and the visual difference between the two can be chosen flexibly as required.
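The closed/candidate and open display states described above can be modeled as a small style lookup. The sketch below is illustrative only: the state names and concrete style values are assumptions, since the patent requires merely that the two states be visually distinguishable.

```python
from enum import Enum

class BoxState(Enum):
    """Display states of the image processing toolbox icon."""
    CANDIDATE = "candidate"  # closed / to-be-selected: grayed-out icon
    OPEN = "open"            # opened: highlighted icon

# Hypothetical style table; the concrete colors are illustrative choices.
ICON_STYLE = {
    BoxState.CANDIDATE: {"color": "gray", "highlight": False},
    BoxState.OPEN:      {"color": "blue", "highlight": True},
}

def icon_style(state: BoxState) -> dict:
    """Return the display style used to render the toolbox icon."""
    return ICON_STYLE[state]
```

Any rendering layer can consult `icon_style` when the toolbox state changes, so the gray/highlight distinction stays in one place.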
Step 220: in response to a second instruction, display second state information of the image processing toolbox, where the second instruction is used to display a first operation option of the toolbox, and the second state information represents the open state of the toolbox and the candidate state of the first operation option.
In this step, when the user issues a second instruction to the image processing toolbox, the toolbox displays its second state information in response. After the user has opened the toolbox, the toolbox responds to the second instruction by displaying the first operation option originally folded inside it, in a gray candidate state. After the toolbox is opened in step 210, clicking its icon reveals the folded first operation option; once the gray candidate option appears, the user can choose to apply the first operation to the image. The first operation realizes the cloud removal operation on the image: with the toolbox open, clicking the first operation option performs cloud removal processing on the image.
In the embodiment of the present application, the first operation option may also appear automatically after the image processing toolbox is opened; more specifically, once the toolbox is opened, the first operation option may switch directly to the candidate state. Generally, the first operation option is disposed adjacent to the toolbox. It can also be understood that the appearance of the first operation option represents the candidate state of the first operation, and its form of presentation may be any common candidate-state representation, which is not limited in this application.
Step 230: in response to a third instruction, display third state information of the image processing toolbox, where the third instruction is used to perform the first operation on the image and to display a second operation option of the toolbox, and the third state information represents the open state of the first operation option and the candidate state of the second operation option.
In this step, when the user issues a third instruction to the image processing toolbox, the toolbox displays its third state information in response. With the toolbox open and the first operation option in the candidate state, the user issues an instruction to perform the first operation on the sample image. This third instruction starts the first operation, that is, the cloud removal operation: the image is then displayed, gradually or immediately, in its cloud-removed state, the first operation option changes from the candidate state to the open state, and a second operation option appears beside the toolbox icon or the first operation option, in a gray candidate state. At this point the cloud removal operation has been performed, and the image is shown cloud-free or with reduced cloud cover. Notably, the second operation is used to realize an image enhancement operation on the image.
It is understood that, in the embodiment of the present application, the second operation option may also appear automatically after the first operation option is turned on; more specifically, once the first operation option is opened, the second operation option may switch directly to the candidate state. Generally, the second operation option is disposed adjacent to the toolbox or to the first operation option. It can also be understood that the appearance of the second operation option represents the candidate state of the second operation, and its form of presentation may be any common candidate-state representation, which is not limited in this application.
Step 240: in response to a fourth instruction, display fourth state information of the image processing toolbox, where the fourth instruction is used to perform the second operation on the image, and the fourth state information represents the open state of the second operation option.
In this step, when the user issues a fourth instruction to the image processing toolbox, the toolbox displays its fourth state information in response. The fourth instruction performs the second operation, that is, the image enhancement operation, on the image. When the user issues it, the toolbox and the first operation option are both already open; the second operation option changes from the candidate state to the open state, the image enhancement operation is carried out, and the image is displayed in its enhanced state following the cloud removal operation.
At this point the image processing toolbox has performed both the first and the second operation on the image and is in the fourth state: the toolbox, the first operation option, and the second operation option are all open. If the user considers the image interaction complete, clicking the toolbox directly, or clicking a blank area beside it, closes all of its functions.
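The instruction flow of steps 210 through 240 amounts to a small state machine: each instruction turns one element on and reveals the next candidate. A minimal Python sketch follows; the class and method names are assumptions introduced for illustration, not part of the disclosed method.

```python
class ImageToolbox:
    """Sketch of the toolbox state machine described in steps 210-240."""

    def __init__(self):
        self.box_open = False       # the toolbox itself
        self.cloud_removal = False  # first operation option
        self.enhancement = False    # second operation option

    def handle(self, instruction: int) -> str:
        """Apply instructions 1, 3, or 4 and return the resulting state info."""
        if instruction == 1:                          # open the toolbox
            self.box_open = True
        elif instruction == 3 and self.box_open:      # perform cloud removal
            self.cloud_removal = True
        elif instruction == 4 and self.cloud_removal: # perform enhancement
            self.enhancement = True
        return self.state_info()

    def state_info(self) -> str:
        flag = lambda b: "open" if b else "candidate"
        return (f"box={flag(self.box_open)}, "
                f"cloud={flag(self.cloud_removal)}, "
                f"enhance={flag(self.enhancement)}")
```

Issuing instructions 1, 3, 4 in order walks the toolbox from the candidate state to the fourth state, matching the progression of the four steps above.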
It should be noted that the first operation may also be a dehazing and defogging process, and the second operation may also be an image resolution enhancement operation using a spatio-temporal super-resolution technique. By dehazing the original cloud-covered satellite remote sensing image and improving its resolution through spatio-temporal super-resolution, remote sensing imagery of relatively high quality can be updated in real time at relatively low cost.
In some embodiments, the image interaction method further comprises the steps of:
displaying third state information of the image processing tool box in response to a fifth instruction;
wherein the fifth instruction is to implement closing the second operation on the image.
In this step, after the image processing toolbox, the first operation option, and the second operation option have been opened in step 240, the user can issue a fifth instruction to view the image with the second operation turned off. Specifically, the fifth instruction closes the second operation on the image and thus the second operation option; the toolbox and the first operation option then remain open while the second operation option is closed.
In some embodiments, the image interaction method further comprises the steps of:
displaying second state information of the image processing toolbox in response to a sixth instruction;
and the sixth instruction is used for closing the first operation on the image.
In this step, after the second operation on the image has been closed in response to the fifth instruction, a sixth instruction may further be issued to the image processing toolbox to close the first operation on the image. By closing the first operation, the user can view the image in its original state, that is, before the cloud removal and image enhancement operations were applied. The toolbox is then open, while the first and second operation options are in the closed or candidate state.
In some embodiments, the image interaction method further comprises the steps of:
displaying first state information of the image processing toolbox in response to a seventh instruction;
wherein the seventh instruction is for implementing the collapse of the first operational option and the second operational option.
In this step, when the image processing toolbox, the first operation option, and the second operation option are all open, the user issues a seventh instruction to collapse the first and second operation options, so that only the open state of the toolbox is displayed. Specifically, the user may collapse both operation options at once with a single click on the toolbox icon, or collapse the first operation option with a first click and the second operation option with a second click.
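Both collapse behaviours just described (all options with one click, or one option per click) can be sketched as a small helper. The function name and parameters below are illustrative assumptions:

```python
def collapse_options(open_options, clicks, collapse_all=True):
    """Collapse expanded operation options per the seventh instruction.

    open_options: options currently expanded, in the order they appeared.
    clicks:       number of clicks on the toolbox icon.
    collapse_all: True  -> a single click retracts every option at once;
                  False -> each click retracts one option, in the order
                           the options were revealed (first option first).
    """
    options = list(open_options)
    if collapse_all:
        return [] if clicks >= 1 else options
    for _ in range(min(clicks, len(options))):
        options.pop(0)  # retract the earliest-revealed remaining option
    return options
```

With `collapse_all=False`, one click leaves only the second (enhancement) option expanded, and a second click collapses it too, matching the per-click behaviour described above.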
In some embodiments, the image interaction method further comprises the steps of:
displaying a fourth state of the image processing toolbox in response to the eighth instruction;
wherein the eighth instruction is for implementing the expanding the first operational option and the second operational option.
In this step, after the seventh instruction has completed, the user may further issue an eighth instruction to the image processing toolbox to expand the collapsed first and second operation options, so that the toolbox returns to the fourth state, that is, the toolbox, the first operation option, and the second operation option are all open. Specifically, the expansion may be realized by single- or double-clicking the toolbox icon.
In some embodiments, the image interaction method further comprises the steps of:
displaying fifth state information of the image processing toolbox in response to the tenth instruction;
wherein the tenth instruction is for closing the image processing tool box.
In this step, regardless of the state the image processing toolbox is in, the user may issue a tenth instruction to close it. Specifically, double-clicking the toolbox may close it, returning it to the fifth state information, that is, the closed or candidate state.
In the embodiment of the application, the image may be a satellite remote sensing image, and the image processing toolbox may be a satellite remote sensing image processing toolbox. On the basis of the technical development, the application studies the interaction strategy of such a toolbox, making it easier for users to experience the improvement that new techniques bring to satellite remote sensing images and to obtain optimal, highly efficient cloud removal and enhancement effects, so that users can monitor crop planting area, growth, soil moisture, yield, and major natural disasters more intuitively and predict them more accurately.
Referring to fig. 3, fig. 3 illustrates the image interaction method according to an embodiment of the present application. Initially, the toolbox (i.e., the image processing toolbox) is in its default closed state and its icon is in the gray candidate state, in which the user can choose whether to open it. If the toolbox is opened, its icon is highlighted and the cloud removal effect icon appears in a gray candidate state, displayed alongside the toolbox icon; in this state the user can choose whether to enable the cloud removal effect on the image. If the cloud removal effect is enabled, its icon changes from the gray candidate state to a highlighted state, the image to be processed changes from its original state to the cloud-removed image, and the image enhancement effect icon appears in a gray candidate state; the user can then choose whether to enable the enhancement effect. If the image enhancement effect is enabled, its icon changes from the gray candidate state to a highlighted state, and the image changes from the cloud-removed image to the cloud-removed image with enhancement applied. At this point, if no further operation is needed, the interactive process can end directly. If the user wishes to view the image with only the cloud removal effect, an instruction to close the enhancement effect can be issued; the enhancement icon changes from highlighted back to the gray candidate state, while the toolbox and cloud removal icons remain highlighted.
At this time, if the user needs to view the original image, an instruction to close the cloud removal effect may be issued; the cloud removal icon changes from highlighted to the gray candidate state, while the toolbox icon remains highlighted.
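The full fig. 3 flow — open the toolbox, enable cloud removal, enable enhancement, then disable the effects in reverse — obeys a simple ordering constraint: each effect can only be enabled once the element before it is on. A hedged sketch follows, with illustrative element and function names:

```python
# Elements in the order they may be enabled (fig. 3 progression).
ORDER = ["toolbox", "cloud_removal", "enhancement"]

def apply_event(state, name, on):
    """Toggle one element, enforcing the fig. 3 enable ordering."""
    idx = ORDER.index(name)
    if on and idx > 0 and not state[ORDER[idx - 1]]:
        raise ValueError(f"cannot enable {name} before {ORDER[idx - 1]}")
    new_state = dict(state)
    new_state[name] = on
    return new_state

def run_flow(events):
    """Replay a sequence of (name, on) events from the initial state."""
    state = {name: False for name in ORDER}
    for name, on in events:
        state = apply_event(state, name, on)
    return state
```

Disabling carries no such constraint, so the user can step back from the enhanced image to the cloud-removed image and then to the original, as the two preceding paragraphs describe.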
Based on the image interaction method, an embodiment of the present application further provides an image interaction apparatus, including a first module configured to display, in response to a first instruction, first state information of an image processing toolbox, where the first instruction is used to open the toolbox and the first state information represents its open state;
a second module configured to display, in response to a second instruction, second state information of the image processing toolbox, where the second instruction is used to display a first operation option of the toolbox, and the second state information represents the open state of the toolbox and the candidate state of the first operation option;
a third module configured to display, in response to a third instruction, third state information of the image processing toolbox, where the third instruction is used to perform the first operation on the image and to display a second operation option of the toolbox, and the third state information represents the open state of the first operation and the candidate state of the second operation option;
and a fourth module configured to display, in response to a fourth instruction, fourth state information of the image processing toolbox, where the fourth instruction is used to perform the second operation on the image, and the fourth state information represents the open state of the second operation option;
the first operation option is used for realizing cloud removal operation on the image, and the second operation option is used for realizing image enhancement operation on the image.
In the embodiment of the application, an image interaction apparatus is provided in which the interaction strategy of a satellite remote sensing image processing toolbox is studied on the basis of the technical development, so that users can better perceive the improvement that new techniques bring to satellite remote sensing images and obtain optimal, highly efficient cloud removal and enhancement effects, enabling them to monitor crop planting area, growth, soil moisture, yield, and major natural disasters more intuitively and predict them more accurately.
Referring to fig. 4, an embodiment of the present application further discloses an electronic device, including:
at least one processor 410;
at least one memory 420 for storing at least one program;
when the at least one program is executed by the at least one processor 410, the at least one processor 410 may be caused to implement the image interaction method embodiment as shown in fig. 2.
It can be understood that the contents of the image interaction method embodiment shown in fig. 2 are all applicable to the electronic device embodiment, the functions implemented in the electronic device embodiment are the same as those of the image interaction method embodiment shown in fig. 2, and the beneficial effects achieved are likewise the same as those of the image interaction method embodiment shown in fig. 2.
The embodiment of the application also discloses a computer readable storage medium, wherein a program executable by a processor is stored, and the program executable by the processor is used for realizing the embodiment of the image interaction method shown in fig. 2 when being executed by the processor.
It is understood that the contents of the image interaction method embodiment shown in fig. 2 are all applicable to the computer-readable storage medium embodiment, the functions implemented by the computer-readable storage medium embodiment are the same as those of the image interaction method embodiment shown in fig. 2, and the beneficial effects achieved by the computer-readable storage medium embodiment are likewise the same as those achieved by the image interaction method embodiment shown in fig. 2.
In alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of the present application are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed and in which sub-operations described as part of larger operations are performed independently.
Furthermore, although the present application is described in the context of functional modules, it should be understood that, unless otherwise stated to the contrary, one or more of the functions and/or features may be integrated in a single physical device and/or software module, or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion regarding the actual implementation of each module is not necessary for an understanding of the present application. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be understood within the ordinary skill of an engineer given the nature, function, and interrelationships of the modules. Accordingly, those skilled in the art can, using ordinary skill, practice the present application as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative of and not intended to limit the scope of the application, which is defined by the appended claims and their full scope of equivalents.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, and various media capable of storing program codes.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the foregoing description of the specification, reference to the description of "one embodiment/example," "another embodiment/example," or "certain embodiments/examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: numerous changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.
While the present application has been described with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An image interaction method, comprising:
displaying first state information of an image processing toolbox in response to a first instruction, wherein the first instruction is used to open the image processing toolbox, and the first state information characterizes an open state of the image processing toolbox;
displaying second state information of the image processing toolbox in response to a second instruction, wherein the second instruction is used to display a first operation option of the image processing toolbox, and the second state information characterizes the open state of the image processing toolbox and a candidate state of the first operation option;
displaying third state information of the image processing toolbox in response to a third instruction, wherein the third instruction is used to perform the first operation option on an image and to display a second operation option of the image processing toolbox, and the third state information characterizes an open state of the first operation option and a candidate state of the second operation option;
displaying fourth state information of the image processing toolbox in response to a fourth instruction, wherein the fourth instruction is used to perform the second operation option on the image, and the fourth state information characterizes an open state of the second operation option;
wherein the first operation option is used to perform a cloud removal operation on the image, and the second operation option is used to perform an image enhancement operation on the image.
2. The image interaction method according to claim 1, further comprising:
displaying the third state information of the image processing toolbox in response to a fifth instruction;
wherein the fifth instruction is used to close the second operation option.
3. The image interaction method according to claim 2, further comprising:
displaying the second state information of the image processing toolbox in response to a sixth instruction;
wherein the sixth instruction is used to close the first operation option.
4. The image interaction method according to claim 1, wherein the interaction method further comprises:
displaying the first state information of the image processing toolbox in response to a seventh instruction;
wherein the seventh instruction is used to collapse at least one of the first operation option or the second operation option.
5. The image interaction method according to claim 4, wherein the interaction method further comprises:
displaying the fourth state information of the image processing toolbox in response to an eighth instruction;
wherein the eighth instruction is used to expand at least one of the first operation option or the second operation option.
6. The image interaction method according to claim 1, wherein before the step of displaying the first state information of the image processing toolbox in response to the first instruction, the interaction method further comprises:
displaying fifth state information of the image processing toolbox in response to a ninth instruction;
wherein the ninth instruction is used to obtain the image, and the fifth state information characterizes a candidate state of the image processing toolbox.
7. The image interaction method according to claim 6, further comprising:
displaying the fifth state information of the image processing toolbox in response to a tenth instruction;
wherein the tenth instruction is used to close the image processing toolbox.
8. An image interaction apparatus, comprising:
a first module, configured to display first state information of an image processing toolbox in response to a first instruction, wherein the first instruction is used to open the image processing toolbox, and the first state information characterizes an open state of the image processing toolbox;
a second module, configured to display second state information of the image processing toolbox in response to a second instruction, wherein the second instruction is used to display a first operation option of the image processing toolbox, and the second state information characterizes the open state of the image processing toolbox and a candidate state of the first operation option;
a third module, configured to display third state information of the image processing toolbox in response to a third instruction, wherein the third instruction is used to perform the first operation option on an image and to display a second operation option of the image processing toolbox, and the third state information characterizes an open state of the first operation option and a candidate state of the second operation option;
a fourth module, configured to display fourth state information of the image processing toolbox in response to a fourth instruction, wherein the fourth instruction is used to perform the second operation option on the image, and the fourth state information characterizes an open state of the second operation option;
wherein the first operation option is used to perform a cloud removal operation on the image, and the second operation option is used to perform an image enhancement operation on the image.
9. An electronic device, comprising: a memory for storing executable instructions; a processor for implementing the image interaction method of any one of claims 1 to 7 when executing executable instructions stored in the memory.
10. A computer-readable storage medium storing executable instructions for implementing the image interaction method of any one of claims 1 to 7 when executed by a processor.
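Taken together, method claims 1–7 recite a small instruction-driven state machine for the toolbox interface: each numbered instruction moves the toolbox between the candidate, open, and option states whose information is then displayed. The patent itself specifies no code; the sketch below is an illustrative model only, with hypothetical state and function names, and simplifies the collapse/expand instructions (claims 4–5) to a single representative transition each.

```python
from enum import Enum, auto

class ToolboxState(Enum):
    """Display states recited in claims 1 and 6 (hypothetical names)."""
    CANDIDATE = auto()          # "fifth state": image obtained, toolbox selectable
    OPEN = auto()               # "first state": toolbox open
    DECLOUD_CANDIDATE = auto()  # "second state": cloud-removal option offered
    DECLOUD_OPEN = auto()       # "third state": cloud removal applied, enhancement offered
    ENHANCE_OPEN = auto()       # "fourth state": image enhancement applied

# (current state, instruction number) -> next state, following claims 1-7.
# Instruction 9 (obtaining the image) is what first produces CANDIDATE.
TRANSITIONS = {
    (ToolboxState.CANDIDATE, 1): ToolboxState.OPEN,                  # open toolbox
    (ToolboxState.OPEN, 2): ToolboxState.DECLOUD_CANDIDATE,          # show de-cloud option
    (ToolboxState.DECLOUD_CANDIDATE, 3): ToolboxState.DECLOUD_OPEN,  # apply de-cloud
    (ToolboxState.DECLOUD_OPEN, 4): ToolboxState.ENHANCE_OPEN,       # apply enhancement
    (ToolboxState.ENHANCE_OPEN, 5): ToolboxState.DECLOUD_OPEN,       # close enhancement (claim 2)
    (ToolboxState.DECLOUD_OPEN, 6): ToolboxState.DECLOUD_CANDIDATE,  # close de-cloud (claim 3)
    (ToolboxState.ENHANCE_OPEN, 7): ToolboxState.OPEN,               # collapse options (claim 4)
    (ToolboxState.OPEN, 8): ToolboxState.ENHANCE_OPEN,               # expand options (claim 5)
    (ToolboxState.OPEN, 10): ToolboxState.CANDIDATE,                 # close toolbox (claim 7)
}

def handle(state: ToolboxState, instruction: int) -> ToolboxState:
    """Return the state whose information is displayed after `instruction`.

    Unrecognized (state, instruction) pairs leave the state unchanged.
    """
    return TRANSITIONS.get((state, instruction), state)
```

In this reading, the second state information of claim 1 is simply the display associated with `DECLOUD_CANDIDATE`, and the reverse instructions of claims 2–3 walk the same chain backwards.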
CN202210165034.0A 2022-02-22 2022-02-22 Image interaction method, device, equipment and storage medium Pending CN114527896A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210165034.0A CN114527896A (en) 2022-02-22 2022-02-22 Image interaction method, device, equipment and storage medium
PCT/CN2022/090724 WO2023159761A1 (en) 2022-02-22 2022-04-29 Image interaction method and apparatus, and device and storage medium


Publications (1)

Publication Number Publication Date
CN114527896A (en) 2022-05-24

Family

ID=81624140



Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110849814A (en) * 2019-11-20 2020-02-28 平衡机器科技(深圳)有限公司 Remote sensing image processing method based on multi-source remote sensing satellite
US20210034871A1 (en) * 2019-07-29 2021-02-04 Wistron Corporation Electronic device, interactive information display method and computer readable recording medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210001555A (en) * 2019-06-28 2021-01-06 엠블레포 주식회사 Method of processing infrared image
CN110617800A (en) * 2019-08-21 2019-12-27 深圳大学 Emergency remote sensing monitoring method, system and storage medium based on civil aircraft
CN113645384A (en) * 2021-07-30 2021-11-12 上海商汤临港智能科技有限公司 Image information processing method, device, equipment and storage medium


Also Published As

Publication number Publication date
WO2023159761A1 (en) 2023-08-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220524