CN105635302B - Object operation method and device


Info

Publication number
CN105635302B
CN105635302B
Authority
CN
China
Prior art keywords
operated
information
server
client
image
Prior art date
Legal status
Active
Application number
CN201610028827.2A
Other languages
Chinese (zh)
Other versions
CN105635302A (en)
Inventor
余韬
黄燕华
陈炜于
徐云峰
徐瑜
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201610028827.2A
Publication of CN105635302A
Application granted
Publication of CN105635302B


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/52: Network services specially adapted for the location of the user terminal

Abstract

The application discloses an object operation method and device. One embodiment of the method comprises: sending position information to a server so that the server acquires natural environment data of the position of an object to be operated according to the position information, wherein the position information is obtained by positioning the object to be operated; receiving an operation instruction returned by the server, wherein the operation instruction is returned by the server when the natural environment data meets a preset condition and instructs execution of a preset operation on the object to be operated, and the preset condition is determined by the server according to object type information of the object to be operated acquired in advance; and executing the preset operation on the object to be operated according to the operation instruction. This embodiment makes it possible to execute the preset operation on the object to be operated in accordance with the real-time natural environment.

Description

Object operation method and device
Technical Field
The present application relates to the field of computer technology, in particular to the field of network technology, and more particularly to a method and an apparatus for operating an object.
Background
Automation refers to the process by which a machine, device, system, or workflow achieves an expected goal through automatic detection, information processing, analysis and judgment, and manipulation control according to human requirements, with little or no direct human participation.
However, in existing automation technology, when an operation is performed on an object to be operated, the device is usually controlled only by a predetermined program, which results in low intelligence and poor adaptability.
Disclosure of Invention
It is an object of the present application to propose an improved object operation method and apparatus to solve the technical problems mentioned in the Background section above.
In a first aspect, the present application provides a method for operating an object, the method comprising: sending position information to a server so that the server acquires natural environment data of the position of an object to be operated according to the position information, wherein the position information is obtained by positioning the object to be operated; receiving an operation instruction returned by the server, wherein the operation instruction is returned by the server when the natural environment data meets a preset condition and instructs execution of a preset operation on the object to be operated, and the preset condition is determined by the server according to object type information of the object to be operated acquired in advance; and executing the preset operation on the object to be operated according to the operation instruction.
In some embodiments, the object type information is determined by the server recognizing an image of the object to be operated, the image having been collected in advance and sent to the server.
In some embodiments, before said sending location information to said server, said method further comprises: receiving a location information request sent by the server, wherein the location information request is sent when the current time is in a time range for executing the preset operation included in operation plan information, and the operation plan information is operation plan information matched with the object type information and acquired by the server in advance.
In some embodiments, the performing of the preset operation on the object to be operated according to the operation instruction includes: acquiring a current image of the object to be operated; uploading the current image to the server; and acquiring operation parameters returned by the server and executing the corresponding preset operation on the object to be operated according to the operation parameters, wherein the operation parameters are calculated by the server according to the state information, the natural environment data and the operation plan information after the server determines the state information of the object to be operated from the current image.
In a second aspect, the present application provides a method for operating an object, the method comprising: receiving position information sent by a client, wherein the position information is obtained by the client positioning an object to be operated; acquiring natural environment data of the position of the object to be operated according to the position information; and when the natural environment data meets a preset condition, sending an operation instruction for executing a preset operation on the object to be operated to the client, wherein the preset condition is determined according to object type information of the object to be operated acquired in advance.
In some embodiments, the object type information of the object to be operated is determined by identifying an image uploaded by the client in advance, where the image is obtained by the client performing image acquisition on the object to be operated.
In some embodiments, before the receiving of the location information sent by the client, the method further comprises: when the current time is in a time range for executing the preset operation included in operation plan information, sending a position information request to the client, wherein the operation plan information is operation plan information which is obtained in advance and matches the object type information.
In some embodiments, after sending the operation instruction for performing the preset operation on the object to be operated to the client, the method further includes: receiving a current image sent by the client, wherein the current image is a current image of the object to be operated acquired by the client; identifying the current image to determine state information of the object to be operated; calculating operation parameters according to the state information, the natural environment data and the operation plan information; and sending the operation parameters to the client so that the client executes the operation instruction according to the operation parameters.
In a third aspect, the present application provides an object operating apparatus, the apparatus comprising: a position information sending unit, configured to send position information to a server so that the server obtains natural environment data of the position where an object to be operated is located according to the position information, where the position information is obtained by positioning the object to be operated; an instruction receiving unit, configured to receive an operation instruction returned by the server, where the operation instruction is an operation instruction for executing a preset operation on the object to be operated, returned by the server when the natural environment data meets a preset condition, and the preset condition is determined by the server according to object type information of the object to be operated acquired in advance; and an operation unit, configured to execute the preset operation on the object to be operated according to the operation instruction.
In some embodiments, the object type information is determined by the server recognizing an image of the object to be operated, the image having been collected in advance and sent to the server.
In some embodiments, the apparatus further comprises: a request receiving unit, configured to receive a location information request sent by the server, where the location information request is sent by the server when a current time is in a time range included in operation plan information for executing the preset operation, and the operation plan information is operation plan information that is matched with the object category information and that is acquired by the server in advance.
In some embodiments, the operation unit includes: an image acquisition subunit, configured to acquire a current image of the object to be operated; an image uploading subunit, configured to upload the current image to the server; and an execution subunit, configured to acquire an operation parameter returned by the server and execute the corresponding preset operation on the object to be operated according to the operation parameter, where the operation parameter is calculated by the server according to the state information, the natural environment data, and the operation plan information after the server determines the state information of the object to be operated from the current image.
In a fourth aspect, the present application provides an object operating apparatus, the apparatus comprising: a position information receiving unit, configured to receive position information sent by a client, where the position information is obtained by the client positioning an object to be operated; an acquisition unit, configured to acquire natural environment data of the position where the object to be operated is located according to the position information; and an instruction sending unit, configured to send an operation instruction for executing a preset operation on the object to be operated to the client when the natural environment data meets a preset condition, where the preset condition is determined according to object type information of the object to be operated acquired in advance.
In some embodiments, the object type information of the object to be operated is determined by identifying an image uploaded by the client in advance, where the image is obtained by the client performing image acquisition on the object to be operated.
In some embodiments, the request sending unit is configured to send a location information request to the client when the current time is within a time range for executing the preset operation included in operation plan information, where the operation plan information is operation plan information that is obtained in advance and matches the object category information.
In some embodiments, the apparatus further comprises: an image receiving unit, configured to receive a current image sent by the client, where the current image is a current image of the object to be operated acquired by the client; an identifying unit, configured to identify the current image to determine the state information of the object to be operated; a calculation unit, configured to calculate operation parameters according to the state information, the natural environment data, and the operation plan information; and a parameter sending unit, configured to send the operation parameters to the client so that the client executes the operation instruction according to the operation parameters.
According to the object operation method and device provided by the present application, the natural environment data of the object to be operated is determined from the position information, and the time for executing the preset operation is determined according to the natural environment data, thereby improving the degree of intelligence of the operation.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method of operation of an object according to the present application;
FIG. 3 is a schematic diagram of yet another embodiment of a method of operating an object according to the present application;
FIG. 4 is a schematic block diagram of one embodiment of an object operating apparatus according to the present application;
FIG. 5 is a schematic block diagram of yet another embodiment of an object operating apparatus according to the present application;
Fig. 6 is a schematic structural diagram of a computer system suitable for implementing the terminal device or the server according to the embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the object operation method or object operating apparatus of the present application may be applied.
As shown in fig. 1, system architecture 100 may include terminal device 101, network 102, and server 103. Network 102 is the medium used to provide communication links between terminal devices 101 and server 103. Network 102 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The terminal apparatus 101 interacts with the server 103 through the network 102 to receive or transmit messages and the like. The terminal device 101 may have an application installed thereon, and the terminal device 101 may control a corresponding execution component (not shown) to perform a corresponding automated operation on the object to be operated, such as a maintenance operation on plants, drugs, concrete, and the like, through the application. The execution unit may be located on the terminal device 101, or may be connected to the terminal device 101 through a network. The terminal apparatus 101 may also be referred to as an operation terminal.
The server 103 may be a server providing various services, such as a background server providing background data support for the terminal device 101. The background server may analyze and perform other processing on the received data, and feed back a processing result (e.g., an operation instruction and an operation parameter) to the terminal device 101. In general, the server 103 may include a group of cluster servers for processing mass data to generate corresponding processing results. The server 103 may also be referred to as a cloud.
It should be noted that the object operation method provided in the corresponding embodiment of fig. 2 is generally executed by the terminal device 101, and accordingly, the object operation apparatus provided in fig. 4 is generally disposed in the terminal device 101. The object operation method provided by the embodiment corresponding to fig. 3 is generally executed by the server 103, and accordingly, the object operation apparatus provided by fig. 5 is generally disposed in the server 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method of operating an object in accordance with the present application is shown. The object operation method comprises the following steps:
Step 201, sending the position information to the server, so that the server obtains the natural environment data of the position of the object to be operated according to the position information.
In this embodiment, an electronic device (for example, the terminal device 101 shown in fig. 1) on which the object operation method is executed may transmit position information of an object to be operated, which is acquired in advance, to a server through a wired connection manner or a wireless connection manner.
The object to be operated may be any of various tangible objects that require automated operation, for example plants, concrete, drugs, or vehicles that need maintenance operations, or other objects to be operated.
The position information, also called geographical location information, may be geographical coordinates (e.g., longitude and latitude) obtained by the electronic device through a specific positioning technology. In this embodiment, the position information may be obtained in advance by positioning the object to be operated. For example, when the object to be operated is a plant, the position information may be obtained by positioning the plant. The positioning technology may include one or more of GPS positioning, base station positioning, Wi-Fi positioning, IP positioning, tag-based positioning such as RFID or two-dimensional codes, Bluetooth positioning, acoustic positioning, and scene recognition positioning. Taking GPS positioning as an example, the electronic device may acquire the position information of the object to be operated from a GPS positioning instrument installed in advance on or near the object to be operated. The electronic device may send the location information to the server through various wired or wireless connections. It should be noted that the wireless connection may include, but is not limited to, a 3G/4G connection, a Wi-Fi connection, a Bluetooth connection, a WiMAX connection, a ZigBee connection, a UWB (ultra wideband) connection, or other wireless connections now known or developed in the future.
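As a concrete illustration of this step, the following minimal Python sketch shows a client reading a previously obtained GPS fix for the object to be operated and reporting it to the server. The endpoint URL, the payload fields, and the read_gps_fix helper are assumptions made for illustration only and are not part of the present application.

    import requests  # generic HTTP client; any transport over the wired or wireless link would do

    SERVER_URL = "https://example-server/api/position"  # hypothetical server endpoint

    def read_gps_fix():
        """Return a (latitude, longitude) pair from a GPS unit mounted on or near the object.

        Hypothetical stand-in for whatever positioning technology a deployment uses.
        """
        return 39.9042, 116.4074

    def send_position(object_id):
        lat, lon = read_gps_fix()
        payload = {"object_id": object_id, "latitude": lat, "longitude": lon}
        # The server uses this position to collect real-time natural environment data.
        requests.post(SERVER_URL, json=payload, timeout=10)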
It should be noted that positioning may be performed at each operation; alternatively, positioning may be performed only once, or only when a certain condition is met, with the position information sent at other times taken from the last positioning operation. In general, the former is suitable for an object to be operated whose position changes frequently, and the latter for an object whose position rarely changes.
Based on the position information sent to the server, the server can acquire the natural environment data of the position where the object to be operated is located. The natural environment is the sum of the natural factors surrounding living things, such as the atmosphere, water, other species, soil, rock and minerals, and solar radiation; the natural environment data is information describing the current state of the natural environment. After receiving the position information, the server can collect the natural environment data of that position in real time, for example from pre-deployed sensors or by other means.
Step 202, receiving an operation instruction returned by the server.
In this embodiment, the server may return an operation instruction to the electronic device when the natural environment data meets the preset condition, and the electronic device may receive the operation instruction. The preset condition may be determined by the server according to the object type information of the object to be operated. When the natural environment data meets the preset conditions, the current natural environment is suitable for executing corresponding operation on the object to be operated, that is, the current time can be determined as the time for executing the preset operation on the object to be operated.
The object type information may be predetermined or preset information used to represent the type to which the object to be operated belongs. For example, when the object to be operated is a plant, the types may include roses, lilies, and the like; when the object to be operated is a medicine, the types may include traditional Chinese medicines and Western medicines; when the object to be operated is concrete, the types may include cement concrete, asphalt concrete, gypsum concrete, polymer concrete, and the like. The preset operation may be any of various operations that are set in advance and are expected to be performed on the object to be operated, such as a cooling operation, a dehumidifying operation, or other operations. For example, when the object to be operated is a plant, the natural environment data may be illumination information, humidity information, and temperature information, and the corresponding preset operation may be watering, fertilizing, and the like. For another example, when the object to be operated is concrete, the natural environment data may be humidity information and temperature information, and the preset operation may be natural curing, steam curing, or the like. For another example, when the object to be operated is a medicine, the natural environment data may be one or more of sunlight information, air composition information, temperature information, humidity information, and microorganism and insect information, and the corresponding preset operation may be cooling, dehumidification, accelerating turnover, and the like.
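The examples above can be thought of as a per-type profile on the server that records which environment metrics matter, which value ranges are suitable, and which preset operations apply. The sketch below is only one assumed encoding of such a mapping; the metric names, thresholds, and operation names are illustrative and are not values taken from the application.

    # Hypothetical per-type profiles: relevant environment metrics with suitable
    # value ranges, and the preset operations associated with each object type.
    OBJECT_TYPE_PROFILES = {
        "rose": {
            "metrics": {"humidity_pct": (40, 60), "temperature_c": (15, 28)},
            "preset_operations": ["watering", "fertilizing"],
        },
        "cement_concrete": {
            "metrics": {"humidity_pct": (90, 100), "temperature_c": (5, 35)},
            "preset_operations": ["natural_curing"],
        },
        "western_medicine": {
            "metrics": {"humidity_pct": (0, 45), "temperature_c": (2, 25)},
            "preset_operations": ["cooling", "dehumidification"],
        },
    }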
Step 203, executing the preset operation on the object to be operated according to the operation instruction.
In this embodiment, after receiving the operation instruction returned by the server, the electronic device may execute the preset operation. The electronic device may execute the corresponding preset operation through an automatic operation component installed on it, or may send an operation command to operation equipment connected to it so that the operation equipment executes the corresponding operation.
In some optional implementations of this embodiment, the object type information of the object to be operated may be determined by the server recognizing an image after the electronic device collects an image of the object to be operated in advance and sends the collected image to the server. This may be implemented as follows. First, the electronic device may capture an image of the object to be operated, for example by photographing it with an image acquisition device such as a camera. Then, the electronic device may upload the image to the server so that the server recognizes the image to determine the object type information of the object to be operated. It should be noted that this process generally only needs to be executed once in advance to determine the object type of the object to be operated and does not need to be repeated in subsequent operations. In this way, the server automatically determines the type of the object to be operated from the uploaded image, which further improves the degree of automation.
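A minimal sketch of this one-time registration step, on the client side, might look as follows. The endpoint, form fields, and response format are assumptions for illustration; the server-side recognition against an image library is only indicated in the comments.

    import requests

    REGISTER_URL = "https://example-server/api/register-object"  # hypothetical endpoint

    def register_object_type(object_id, image_path):
        """Upload an image once so that the server can recognize the object type."""
        with open(image_path, "rb") as image_file:
            response = requests.post(
                REGISTER_URL,
                data={"object_id": object_id},
                files={"image": image_file},
                timeout=30,
            )
        response.raise_for_status()
        # The server matches the image against a pre-established image library
        # and returns the recognized object type, e.g. "rose".
        return response.json()["object_type"]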
In some optional implementations of this embodiment, before step 201, the object operation method further includes: receiving a position information request sent by the server, where the position information request is sent when the current time is within the time range for executing the preset operation included in the operation plan information, and the operation plan information is operation plan information acquired by the server in advance that matches the object type information. The specific process is as follows. First, after determining the object type information, the server may acquire and store in advance operation plan information matching that object type information. The operation plan may include a time range for performing the preset operation on the object to be operated. When the server determines that the current time is within this time range, it can send a position information request to the electronic device, and the electronic device responds to the request. Alternatively, the server may transmit the operation plan information to the electronic device in advance, and the electronic device may determine when to transmit the position information to the server based on the time range in the operation plan information. In this way, the subsequent steps are executed according to the operation plan corresponding to the object type of the object to be operated, which further improves the degree of automation.
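On the server side, the time-range check described above could be sketched as follows. The operation plan structure, the daily time window, and the client.send interface are assumptions made for illustration.

    from datetime import datetime, time

    # Hypothetical operation plan matched to the object type: the preset operation
    # is only triggered inside this daily time window.
    OPERATION_PLAN = {"operation": "watering", "start": time(6, 0), "end": time(9, 0)}

    def should_request_position(now=None):
        """Return True when the current time falls inside the plan's time range."""
        now = now or datetime.now()
        return OPERATION_PLAN["start"] <= now.time() <= OPERATION_PLAN["end"]

    def maybe_request_position(client, object_id):
        # `client` is an assumed handle for pushing messages to the terminal device.
        if should_request_position():
            client.send({"type": "position_request", "object_id": object_id})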
In some optional implementations of this embodiment, executing the preset operation on the object to be operated includes the following process: acquiring a current image of the object to be operated; uploading the current image to the server; and acquiring operation parameters returned by the server and executing the corresponding preset operation on the object to be operated according to the operation parameters. The process is implemented as follows. First, the electronic device may acquire a current image of the object to be operated through the image acquisition device; the acquisition of the current image is as described above and is not repeated here. Then, the electronic device uploads the image to the server so that the server can determine the state information of the object to be operated from the current image. The state information may be used to characterize the current state of the object to be operated, that is, its intrinsic factors. Taking a plant as an example, the state information may be the growth stage of the plant and its health state. The growth stage of the plant may be newly grown, mature, flowering, fruiting, and so on. The health state of the plant may be normal, water-deficient, malnourished, and so on. The state information may also include, but is not limited to, insect damage, mildew, oil bleeding, discoloration, and the like. The state information can be determined by means of a pre-acquired image library containing images of the object to be operated in different states. Then, the electronic device receives the operation parameters returned by the server and executes the corresponding preset operation according to the operation parameters. The operation parameters are used to quantitatively control the preset operation so as to make the operation precise. The operation parameters may be calculated by the server according to the state information of the object to be operated, the natural environment data, and the operation plan information. This implementation determines the operation parameters used to execute the operation on the object to be operated according to the current state of the object, the environment, and the preset operation plan, and can be used to quantitatively control the operation so as to make it precise.
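The client-side part of this optional implementation, uploading a current image and applying the returned operation parameters, could be sketched as below. The endpoint, response fields, and the actuator interface are assumptions for illustration only.

    import requests

    STATE_URL = "https://example-server/api/current-state"  # hypothetical endpoint

    def execute_with_parameters(object_id, image_path, actuator):
        """Upload a current image, receive operation parameters, and drive the actuator."""
        with open(image_path, "rb") as image_file:
            response = requests.post(
                STATE_URL,
                data={"object_id": object_id},
                files={"image": image_file},
                timeout=30,
            )
        response.raise_for_status()
        params = response.json()  # e.g. {"operation": "watering", "arguments": {"water_liters": 1.5}}
        # Quantitative control: the returned parameters bound how much of the
        # preset operation the execution component applies.
        actuator.run(params["operation"], **params.get("arguments", {}))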
The method provided by this embodiment of the application determines the natural environment data of the object to be operated according to the position information, and determines the time for executing the preset operation according to the natural environment data, so that the specific time at which the operation is executed is adapted to the natural environment at that time, thereby improving the degree of intelligence of the operation.
With further reference to FIG. 3, a flow 300 of yet another embodiment of a method of operating an object is illustrated. The process 300 of the object operation method includes the following steps:
Step 301, receiving the location information sent by the client.
In this embodiment, the electronic device (e.g., the server 103 shown in fig. 1) on which the object operation method operates may receive the location information from the client (e.g., the terminal device in fig. 1) through a wired connection manner or a wireless connection manner. The position information is obtained by positioning the object to be operated by the client.
Step 302, acquiring natural environment data of the position of the object to be operated according to the position information.
In this embodiment, the electronic device may obtain the natural environment data of the position where the object to be operated is located according to the position information. The electronic equipment can acquire the natural environment data of the position in real time after receiving the position information. The natural environment data may be collected in real time by pre-deployed sensors or by other means.
Optionally, the natural environment data may include at least one of: illumination information, temperature information, and humidity information. The illumination may be natural or artificial, and the illumination information includes, but is not limited to, the illumination period, the illumination time within the period, the luminous intensity, the luminous flux, and the like. The temperature information, which may be used to record the air temperature and characterize the thermal condition of a location, may be measured in degrees Celsius, Fahrenheit, or other units. The humidity information is a physical quantity indicating the dryness of the atmosphere and may be expressed in various ways such as water vapor pressure, absolute humidity, relative humidity, specific humidity, and dew point.
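For illustration, the natural environment data could be represented and collected on the server roughly as follows; the field names, units, and placeholder values are assumptions, and a real deployment would read them from sensors deployed near the position or from an external weather service.

    from dataclasses import dataclass

    @dataclass
    class NaturalEnvironmentData:
        """Snapshot of the environment at the object's position (assumed fields and units)."""
        illumination_lux: float
        temperature_c: float
        humidity_pct: float

    def collect_environment_data(latitude, longitude):
        # Placeholder values; in practice these would come from pre-deployed sensors
        # or from a weather service queried with the coordinates.
        return NaturalEnvironmentData(illumination_lux=12000.0,
                                      temperature_c=26.5,
                                      humidity_pct=38.0)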
Step 303, when the natural environment data meets the preset condition, sending an operation instruction for executing the preset operation on the object to be operated to the client.
The electronic device determines whether the natural environment data meets the preset condition; when it does, the electronic device can send an operation instruction for executing the preset operation on the object to be operated to the client. The preset condition may be determined according to the type information of the object to be operated acquired in advance. For example, the preset condition may be a value range within which the acquired natural environment data should fall. The value range may be the range in which the natural environment data should lie when the preset operation is executed, determined by the electronic device according to the object type information of the object to be operated acquired in advance. When the natural environment data falls within the value range, the current time can be determined to be the best time for executing the preset operation on the object to be operated, and the electronic device may then send an operation instruction for executing the preset operation to the client.
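A minimal sketch of this check, reusing the hypothetical profiles and environment snapshot sketched earlier, might look as follows; the client.send interface remains an assumption.

    def meets_preset_condition(env, profile):
        """Return True when every relevant metric falls inside its per-type value range.

        `env` is assumed to be a NaturalEnvironmentData snapshot and `profile` an
        entry of the hypothetical OBJECT_TYPE_PROFILES mapping sketched above.
        """
        readings = {
            "humidity_pct": env.humidity_pct,
            "temperature_c": env.temperature_c,
            "illumination_lux": env.illumination_lux,
        }
        for metric, (low, high) in profile["metrics"].items():
            value = readings.get(metric)
            if value is None or not (low <= value <= high):
                return False
        return True

    def maybe_send_instruction(client, object_id, env, profile):
        if meets_preset_condition(env, profile):
            # The client executes the preset operation on receiving this instruction.
            client.send({"type": "operation_instruction", "object_id": object_id})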
In some optional implementation manners of this embodiment, the object type information of the object to be operated may be determined by identifying an image uploaded by the client in advance. The method can be realized by the following steps: first, the electronic device may receive an image previously uploaded by the client device. The image is the image of the object to be operated, which is acquired by the client. The electronic device can determine the object type information of the object through an image recognition technology and a pre-established image library system. For example, the drug to which the received image belongs may be identified based on image recognition techniques and a drug image library system, and the plant to which the received image belongs may be identified based on image recognition techniques and a plant image library system.
In some optional implementations of this embodiment, before step 301, the object operation method further includes: sending a position information request to the client when the current time is within the time range for executing the preset operation included in the operation plan information, where the operation plan information is operation plan information acquired in advance that matches the object type information. The specific process is as follows. First, after determining the object type information, the electronic device may acquire and store in advance operation plan information matching that object type information. An operation plan is a series of plans for performing operations on the object to be operated over some future period, and the operation plan information is information describing such plans. The operation plan information may be set in advance, or may be generated by methods such as data statistics or model training. The operation plan includes a time range for executing the preset operation on the object to be operated. The electronic device then detects whether the current time is within this time range; when it is, the electronic device can send a position information request to the client, so that the client responds by sending its position information.
In some optional implementations of this embodiment, after sending the operation instruction for executing the preset operation on the object to be operated to the client, the object operation method further includes: receiving a current image sent by the client, where the current image is a current image of the object to be operated acquired by the client; identifying the current image to determine the state information of the object to be operated; calculating operation parameters according to the state information, the natural environment data, and the operation plan information; and sending the operation parameters to the client so that the client executes the operation instruction according to the operation parameters. The process is implemented as follows. First, the electronic device may receive from the client a current image of the object to be operated, obtained by the client through real-time image acquisition. Then, the server determines the state information of the object to be operated from the current image. The state information may be used to characterize the current state of the object to be operated, that is, its intrinsic factors. Taking a plant as an example, the state information may include the growth stage of the plant and its health state. The growth stage of the plant may be newly grown, mature, flowering, fruiting, and so on. The health state of the plant may be normal, water-deficient, malnourished, and so on. The state may also be any of various states such as insect damage, mildew, oil bleeding, and discoloration. As another example, for concrete, the state information may include, but is not limited to, states such as setting and hardening. When identifying the current image to determine the state information, the state corresponding to the current object can be determined according to the degree of image matching, by means of a pre-acquired image library containing images of the object to be operated in different states. Then, the electronic device calculates the operation parameters according to the state information of the object to be operated, the natural environment data, and the operation plan information. The operation parameters may be the parameters of the preset operation described above. Taking plant maintenance as an example, when the preset operation is irrigation, the operation parameter may be the amount of irrigation water or the irrigation speed; when the preset operation is fertilization, the operation parameter may be the amount of fertilizer applied; when the preset operation is cooling, the operation parameter may be the temperature reduction. The electronic device may pre-store a calculation strategy for computing the operation parameters from the state information, the natural environment data, and the operation plan information; the strategy may be stored as a data table, a function, or in other forms. Finally, the electronic device can send the operation parameters to the client so that the client executes the operation instruction according to the operation parameters.
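The calculation strategy described above could be sketched, for the plant watering example, as the following toy function; the state labels, field names, and formula are assumptions for illustration and stand in for whatever data table or function a real system stores.

    def compute_operation_parameters(state, env, plan):
        """Map the recognized state, the environment snapshot, and the plan to parameters."""
        if plan.get("operation") == "watering":
            base_liters = 1.0
            if state == "water_deficient":
                base_liters += 0.5
            # Drier air leads to more water, within a simple upper bound.
            dryness = max(0.0, 60.0 - env.humidity_pct) / 60.0
            liters = round(min(3.0, base_liters * (1.0 + dryness)), 2)
            return {"operation": "watering", "arguments": {"water_liters": liters}}
        # Fall back to the planned operation with no quantitative arguments.
        return {"operation": plan.get("operation", "noop"), "arguments": {}}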
This implementation determines the operation parameters used to execute the operation on the object to be operated according to the object's current state, the environment, and the preset operation plan, and can be used to quantitatively control the operation so as to make it precise.
As can be seen from fig. 3, in the process 300 of the object operation method in this embodiment, the natural environment data of the object to be operated is determined according to the position information, and the optimal time for executing the preset operation is determined according to the natural environment data, thereby improving the degree of automation of the operation.
With further reference to fig. 4, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an object operating apparatus, which corresponds to the method embodiment shown in fig. 2, and which can be applied to various terminal devices.
As shown in fig. 4, the object operating apparatus 400 according to the present embodiment includes: a position information sending unit 401, an instruction receiving unit 402, and an operation unit 403. The position information sending unit 401 is configured to send position information to a server so that the server obtains natural environment data of the position where the object to be operated is located according to the position information, where the position information is obtained by positioning the object to be operated; the instruction receiving unit 402 is configured to receive an operation instruction returned by the server, where the operation instruction is an operation instruction for executing a preset operation on the object to be operated, returned by the server when the natural environment data meets a preset condition, and the preset condition is determined by the server according to object type information of the object to be operated acquired in advance; and the operation unit 403 is configured to execute the preset operation on the object to be operated according to the operation instruction.
In this embodiment, the specific processing of the position information sending unit 401, the instruction receiving unit 402, and the operation unit 403 can refer to steps 201, 202, and 203 in the corresponding embodiment of fig. 2.
In some optional implementation manners of this embodiment, the object type information may be determined by acquiring an image of the object to be operated in advance, sending the acquired image to the server, and then identifying the image by the server. The specific processing of this implementation may refer to the corresponding implementation in the corresponding embodiment of fig. 2.
In some optional implementations of the present embodiment, the object operating apparatus 400 further includes: a request receiving unit (not shown) for receiving a location information request transmitted by the server, wherein the location information request is transmitted when the current time is in a time range for performing a preset operation included in the operation plan information, and the operation plan information may be operation plan information matched with the above object category information acquired in advance by the server. The specific processing of this implementation may refer to the corresponding implementation in the corresponding embodiment of fig. 2.
In some optional implementations of this embodiment, the operation unit 403 may include: an image acquisition subunit (not shown) for acquiring a current image of the object to be operated; an image uploading subunit (not shown) for uploading the current image to the server; and an execution subunit (not shown) configured to obtain an operation parameter returned by the server, and execute a corresponding preset operation on the operation object according to the operation parameter, where the operation parameter is obtained by the server through calculation according to the state information, the natural environment data, and the operation plan information after determining the state information of the operation object according to the current image. The specific processing of this implementation may refer to the corresponding implementation in the corresponding embodiment of fig. 2.
With further reference to fig. 5, as an implementation of the methods shown in the above-mentioned figures, the present application provides yet another embodiment of an object operating apparatus, which corresponds to the method embodiment shown in fig. 3, and which can be applied to various servers.
As shown in fig. 5, the object operating apparatus 500 according to the present embodiment includes: a position information receiving unit 501, an acquisition unit 502, and an instruction sending unit 503. The position information receiving unit 501 is configured to receive position information sent by a client, where the position information is obtained by the client positioning the object to be operated; the acquisition unit 502 is configured to acquire natural environment data of the position where the object to be operated is located according to the position information; and the instruction sending unit 503 is configured to send, to the client, an operation instruction for executing a preset operation on the object to be operated when the natural environment data is within a preset value range, where the value range is determined according to the object type information of the object to be operated and is the range within which the natural environment data should lie when the preset operation is executed.
In this embodiment, the specific processing of the position information receiving unit 501, the obtaining unit 502, and the instruction sending unit 503 may refer to steps 301, 302, and 303 in the corresponding embodiment of fig. 3.
In some optional implementation manners of this embodiment, the object type information of the object to be operated is determined by identifying an image uploaded by the client in advance, where the image is obtained by the client performing image acquisition on the object to be operated. The specific processing of this implementation may refer to the corresponding implementation in the corresponding embodiment of fig. 3.
In some optional implementations of the present embodiment, the object operating apparatus 500 further includes: a request transmitting unit (not shown) for transmitting a location information request to the client when the current time is in a time range for performing a preset operation included in operation plan information that is operation plan information matched with the object category information acquired in advance. The specific processing of this implementation may refer to the corresponding implementation in the corresponding embodiment of fig. 3.
In some optional implementations of the present embodiment, the object operating apparatus 500 further includes: an image receiving unit (not shown) for receiving a current image sent by the client, where the current image is a current image of the object to be operated collected by the client; an identifying unit (not shown) for identifying the current image to determine the state information of the object to be operated; a calculation unit (not shown) for calculating the operation parameters based on the state information, the natural environment data, and the operation plan information; and a parameter sending unit (not shown) for sending the operation parameters to the client so that the client executes the operation instruction according to the operation parameters. The specific processing of this implementation may refer to the corresponding implementation in the corresponding embodiment of fig. 3.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing a terminal device or server of an embodiment of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse, and the like; an output section 607 including a display such as a cathode ray tube (CRT) or a liquid crystal display (LCD), a speaker, and the like; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read therefrom can be installed into the storage section 608 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a position information transmitting unit, an instruction receiving unit, and an operation unit. The names of these units do not form a limitation on the unit itself in some cases, and for example, an operation unit may also be described as "a unit that performs the preset operation on the object to be operated according to the operation instruction".
As another aspect, the present application also provides a non-volatile computer storage medium, which may be the non-volatile computer storage medium included in the apparatus in the above-described embodiments; or it may be a non-volatile computer storage medium that exists separately and is not incorporated into the terminal. The non-transitory computer storage medium stores one or more programs that, when executed by a device, cause the device to: sending position information to a server to enable the server to acquire natural environment data of the position of an object to be operated according to the position information, wherein the position information is obtained by positioning the object to be operated; receiving an operation instruction returned by the server, wherein the operation instruction is an operation instruction which is returned by the server when the natural environment data meets a preset condition and is used for executing a preset operation on the object to be operated, and the preset condition is determined by the server according to object type information of the object to be operated, which is acquired in advance; and executing the preset operation on the object to be operated according to the operation instruction. Or receiving position information sent by a client, wherein the position information is obtained by positioning an object to be operated by the client; acquiring natural environment data of the position of the object to be operated according to the position information; and when the natural environment data meet a preset condition, sending an operation instruction for executing a preset operation on the object to be operated to the client, wherein the preset condition is determined according to the pre-acquired object type information of the object to be operated.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by a person skilled in the art that the scope of the invention as referred to in the present application is not limited to the embodiments with a specific combination of the above-mentioned features, but also covers other embodiments with any combination of the above-mentioned features or their equivalents without departing from the inventive concept. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (8)

1. A method of operating an object, the method comprising:
Sending position information to a server to enable the server to acquire natural environment data of the position of an object to be operated according to the position information, wherein the position information is obtained by positioning the object to be operated;
Receiving an operation instruction returned by the server, wherein the operation instruction is an operation instruction which is returned by the server when the natural environment data meets a preset condition and performs a preset operation on the object to be operated, the preset condition is determined by the server according to object type information of the object to be operated, which is obtained in advance, and the object type information is determined by identifying the image by the server after the image of the object to be operated is acquired in advance and the acquired image is sent to the server;
Executing the preset operation on the object to be operated according to the operation instruction;
Before the sending of the location information to the server, the method further comprises:
receiving a location information request sent by the server, wherein the location information request is sent when the current time is in a time range for executing the preset operation included in operation plan information, and the operation plan information is operation plan information matched with the object type information and acquired by the server in advance.
2. The method according to claim 1, wherein the performing the preset operation on the object to be operated according to the operation instruction comprises:
acquiring a current image of the object to be operated;
uploading the current image to a server;
and acquiring operation parameters returned by the server and executing the corresponding preset operation on the object to be operated according to the operation parameters, wherein the operation parameters are calculated by the server according to the state information, the natural environment data and the operation plan information after the server determines the state information of the object to be operated according to the current image.
3. A method of operating an object, the method comprising:
Receiving position information sent by a client, wherein the position information is obtained by positioning an object to be operated by the client;
Acquiring natural environment data of the position of the object to be operated according to the position information;
When the natural environment data meet a preset condition, sending an operation instruction for executing a preset operation on the object to be operated to the client, wherein the preset condition is determined according to object type information of the object to be operated, which is acquired in advance, and the object type information of the object to be operated is determined by identifying an image uploaded by the client in advance, wherein the image is obtained by the client through image acquisition on the object to be operated;
Before the receiving of the location information sent by the client, the method further includes:
when the current time is in a time range for executing the preset operation included in operation plan information, sending a position information request to the client, wherein the operation plan information is operation plan information which is obtained in advance and matches the object type information.
4. The method according to claim 3, wherein, after the sending to the client of the operation instruction for performing the preset operation on the object to be operated, the method further comprises:
receiving a current image sent by the client, wherein the current image is a current image of the object to be operated acquired by the client;
identifying the current image to determine state information of the object to be operated;
calculating operation parameters according to the state information, the natural environment data and the operation plan information; and
sending the operation parameters to the client so that the client executes the operation instruction according to the operation parameters.
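The sketch below is likewise not part of the claims; it shows one possible server-side realization of claims 3 and 4, assuming a hypothetical Flask HTTP service. The environment lookup get_environment_data, the recognition stub recognize_state, the condition and plan tables, and the scheduling helper maybe_request_location are all assumptions introduced here for illustration only.

```python
# Minimal server-side sketch of the flow in claims 3-4 (illustrative only).
# The condition/plan tables, get_environment_data() and recognize_state()
# are hypothetical assumptions, not part of the patent.
from datetime import datetime, time

import requests
from flask import Flask, request, jsonify

app = Flask(__name__)

# Preset condition and operation plan, keyed by the object type information
# previously determined by identifying an image uploaded by the client.
OBJECT_TYPE = "tomato"  # assumed result of the earlier image identification
PRESET_CONDITIONS = {"tomato": {"max_soil_moisture": 0.30}}
OPERATION_PLANS = {"tomato": {"start": time(6, 0), "end": time(8, 0),
                              "base_minutes": 10}}


def get_environment_data(position):
    """Fetch natural environment data for the given position (stubbed here)."""
    return {"soil_moisture": 0.22}


def recognize_state(image_bytes):
    """Determine state information of the object from its current image (stubbed)."""
    return {"growth_stage": "flowering"}


def maybe_request_location(client_url):
    """Run periodically; sends a position information request to the client
    when the current time falls within the plan's time range."""
    plan = OPERATION_PLANS[OBJECT_TYPE]
    if plan["start"] <= datetime.now().time() <= plan["end"]:
        requests.post(f"{client_url}/location_request", timeout=10)


@app.route("/location", methods=["POST"])
def on_location():
    env = get_environment_data(request.get_json())
    condition = PRESET_CONDITIONS[OBJECT_TYPE]
    # Send the operation instruction only when the natural environment data
    # meets the preset condition determined from the object type information.
    perform = env["soil_moisture"] < condition["max_soil_moisture"]
    return jsonify({"perform_preset_operation": perform, "operation": "irrigate"})


@app.route("/image", methods=["POST"])
def on_image():
    state = recognize_state(request.files["image"].read())
    env = get_environment_data(None)
    plan = OPERATION_PLANS[OBJECT_TYPE]
    # Calculate operation parameters from the state information, the natural
    # environment data and the operation plan information.
    factor = 1.5 if state["growth_stage"] == "flowering" else 1.0
    deficit = max(0.0, PRESET_CONDITIONS[OBJECT_TYPE]["max_soil_moisture"]
                  - env["soil_moisture"])
    duration = round(plan["base_minutes"] * factor * (1 + deficit), 1)
    return jsonify({"duration_minutes": duration})


if __name__ == "__main__":
    app.run()
```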
5. An object operating apparatus, characterized in that the apparatus comprises:
a position information sending unit, configured to send position information to a server so that the server acquires natural environment data of the position of an object to be operated according to the position information, wherein the position information is obtained by positioning the object to be operated;
an instruction receiving unit, configured to receive an operation instruction returned by the server, wherein the operation instruction is returned by the server when the natural environment data meets a preset condition and instructs a preset operation to be performed on the object to be operated, the preset condition is determined by the server according to object type information of the object to be operated acquired in advance, and the object type information is determined by the server by identifying an image of the object to be operated that was acquired in advance and sent to the server; and
an operation unit, configured to perform the preset operation on the object to be operated according to the operation instruction;
wherein the apparatus further comprises:
a request receiving unit, configured to receive a position information request sent by the server, wherein the position information request is sent by the server when the current time falls within a time range, included in operation plan information, for performing the preset operation, and the operation plan information is operation plan information matched with the object type information and acquired by the server in advance.
6. The apparatus according to claim 5, wherein the operation unit comprises:
an image acquisition subunit, configured to acquire a current image of the object to be operated;
an image uploading subunit, configured to upload the current image to the server; and
an execution subunit, configured to acquire operation parameters returned by the server and perform the corresponding preset operation on the object to be operated according to the operation parameters, wherein the operation parameters are calculated by the server from the state information, the natural environment data and the operation plan information after the server determines the state information of the object to be operated from the current image.
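For orientation only, the sketch below shows one way the client-side units of claims 5 and 6 could be mapped onto a class structure; the server-side apparatus of claims 7 and 8 would decompose analogously. It is not part of the claims, and all class, method and collaborator names (server_client, positioner, camera, actuator) are assumptions introduced here for illustration.

```python
# Illustrative mapping of the client-side apparatus units of claims 5-6 onto
# a class; every name here is a hypothetical assumption, not patent language.
class ObjectOperatingApparatus:
    def __init__(self, server_client, positioner, camera, actuator):
        self.server = server_client   # wraps the HTTP calls to the server
        self.positioner = positioner  # e.g. a GNSS module
        self.camera = camera
        self.actuator = actuator

    # request receiving unit: reacts to the server's position information request
    def on_position_request(self):
        self.send_position_information()

    # position information sending unit
    def send_position_information(self):
        instruction = self.server.send_position(self.positioner.locate())
        self.on_instruction(instruction)

    # instruction receiving unit
    def on_instruction(self, instruction):
        if instruction.get("perform_preset_operation"):
            self.operate(instruction)

    # operation unit (image acquisition, image uploading and execution subunits)
    def operate(self, instruction):
        image = self.camera.capture()                 # image acquisition subunit
        parameters = self.server.upload_image(image)  # image uploading subunit
        self.actuator.apply(instruction, parameters)  # execution subunit
```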
7. An object operating apparatus, characterized in that the apparatus comprises:
a position information receiving unit, configured to receive position information sent by a client, wherein the position information is obtained by the client by positioning an object to be operated;
an acquisition unit, configured to acquire natural environment data of the position of the object to be operated according to the position information; and
an instruction sending unit, configured to send, to the client, an operation instruction for performing a preset operation on the object to be operated when the natural environment data meets a preset condition, wherein the preset condition is determined according to object type information of the object to be operated acquired in advance, the object type information is determined by identifying an image uploaded by the client in advance, and the image is obtained by the client by performing image acquisition on the object to be operated;
wherein the apparatus further comprises:
a request sending unit, configured to send a position information request to the client when the current time falls within a time range, included in operation plan information, for performing the preset operation, wherein the operation plan information is operation plan information acquired in advance and matched with the object type information.
8. The apparatus of claim 7, further comprising:
an image receiving unit, configured to receive a current image sent by the client, wherein the current image is a current image of the object to be operated acquired by the client;
an identifying unit, configured to identify the current image to determine state information of the object to be operated;
a calculation unit, configured to calculate operation parameters according to the state information, the natural environment data and the operation plan information; and
a parameter sending unit, configured to send the operation parameters to the client so that the client executes the operation instruction according to the operation parameters.
CN201610028827.2A 2016-01-15 2016-01-15 object operation method and device Active CN105635302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610028827.2A CN105635302B (en) 2016-01-15 2016-01-15 object operation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610028827.2A CN105635302B (en) 2016-01-15 2016-01-15 object operation method and device

Publications (2)

Publication Number Publication Date
CN105635302A CN105635302A (en) 2016-06-01
CN105635302B true CN105635302B (en) 2019-12-13

Family

ID=56049773

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610028827.2A Active CN105635302B (en) 2016-01-15 2016-01-15 object operation method and device

Country Status (1)

Country Link
CN (1) CN105635302B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110377961B (en) * 2019-06-25 2023-04-28 北京百度网讯科技有限公司 Crop growth environment control method, device, computer equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4668347B2 (en) * 2009-03-05 2011-04-13 エンパイア テクノロジー ディベロップメント エルエルシー Information service providing system, information service providing apparatus and method
CN202854575U (en) * 2012-09-28 2013-04-03 山东中创软件工程股份有限公司 Agricultural Internet of Things system
CN105158255A (en) * 2015-01-25 2015-12-16 无锡桑尼安科技有限公司 Identification method of crop maturity
CN104916090B (en) * 2015-04-27 2018-06-12 小米科技有限责任公司 Prompt message sending method and device
CN104866970B (en) * 2015-05-26 2018-07-24 徐吉祥 Intelligent Cultivate administration method and intelligent planting equipment
CN105183050B (en) * 2015-09-02 2017-04-05 云南中科物联网科技有限公司 A kind of smart city environmental management technique and system based on Internet of Things

Also Published As

Publication number Publication date
CN105635302A (en) 2016-06-01


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant