CN110290356A - Object processing method and apparatus - Google Patents

Object processing method and apparatus

Info

Publication number
CN110290356A
CN110290356A (application CN201910684982.3A)
Authority
CN
China
Prior art keywords
target
target object
case
detecting
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910684982.3A
Other languages
Chinese (zh)
Inventor
臧云波
鲁邹尧
吴明辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Second Network Technology Co Ltd
Original Assignee
Shanghai Second Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Second Network Technology Co Ltd
Priority to CN201910684982.3A
Publication of CN110290356A
Current legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M: CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M29/00: Scaring or repelling devices, e.g. bird-scaring apparatus
    • A01M29/30: Scaring or repelling devices, e.g. bird-scaring apparatus, preventing or obstructing access or passage, e.g. by means of barriers, spikes, cords, obstacles or sprinkled water
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Birds (AREA)
  • Insects & Arthropods (AREA)
  • Pest Control & Pesticides (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention provides an object processing method and apparatus. The method comprises: obtaining indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area; moving from a predetermined position to the target position in response to the indication information; acquiring a target image at the target position and detecting whether the target object is present in the target image; and, in a case where the target object is detected in the target image, emitting toward the target object a target substance corresponding to the environmental information of the target position and following the target object as it moves, until the target object leaves the target area. The invention solves the problem in the related art that driving away target objects is inefficient, thereby achieving the effect of improving the efficiency with which target objects are driven away.

Description

Object processing method and apparatus
Technical field
The present invention relates to the field of computers, and in particular to an object processing method and apparatus.
Background
In certain scenarios, such as restaurant kitchens, warehouses, storerooms and parking lots, harmful or invading animals (mice, cockroaches, birds, cats, dogs and the like) need to be driven away. At present such animals are driven away by patrol personnel, which is very inefficient.
No effective solution to this problem has yet been proposed.
Summary of the invention
Embodiments of the present invention provide an object processing method and apparatus, so as at least to solve the problem in the related art that driving away a target object is inefficient.
According to one embodiment of the present invention, an object processing method is provided, comprising:
obtaining indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area;
moving from a predetermined position to the target position in response to the indication information;
acquiring a target image at the target position and detecting whether the target object is present in the target image;
and, in a case where the target object is detected in the target image, emitting toward the target object a target substance corresponding to the environmental information of the target position and following the target object as it moves, until the target object leaves the target area.
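For illustration only, the following Python sketch outlines these four steps as one handling routine. All names (handle_instruction, move_to, capture_image, detect_target, choose_substance, emit, follow_step, target_inside_area) are hypothetical helpers assumed for the example; the embodiments below do not prescribe any particular implementation.

import time

def handle_instruction(instruction, robot, standby_position):
    # The indication information carries the coordinate value of the target position.
    target_position = instruction["coordinate"]
    robot.move_to(target_position)        # respond to the indication: leave the predetermined position
    image = robot.capture_image()         # acquire a target image at the target position
    if not robot.detect_target(image):
        robot.move_to(standby_position)   # no target object found: return to standby
        return
    # Target confirmed: emit a substance matched to the local environment and
    # follow the target object until it leaves the target area.
    while robot.target_inside_area():
        substance = robot.choose_substance(robot.sense_environment())
        robot.emit(substance)
        robot.follow_step()
        time.sleep(0.1)
    robot.move_to(standby_position)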
Optionally, in the case where the target object is detected in the target image, emitting the target substance toward the target object and following the target object as it moves until it leaves the target area comprises:
in the case where the target object is detected in the target image, emitting toward the target object a target substance corresponding to the environmental information of the target position;
detecting whether the target object moves;
and, in the case where the target object is detected to move, following the target object and continuing to emit toward it the target substance corresponding to the environmental information of the target position, until the target object leaves the target area.
Optionally, following the target object and continuing to emit toward it the target substance corresponding to the environmental information of the target position, until the target object leaves the target area, comprises:
while following the target object, detecting in real time whether the target object is present in the acquired images;
in the case where the target object is detected in the acquired images, continuing to emit toward it the target substance corresponding to the environmental information of the target position;
in the case where the target object is not detected in the acquired images, determining the duration for which the target object has not been detected;
and, in the case where that duration exceeds a target duration, determining that the target object has left the target area and moving back to the predetermined position.
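A minimal sketch of this follow-and-timeout behaviour is given below, assuming hypothetical robot methods (capture_image, detect_target, choose_substance, sense_environment, emit, follow_step, move_to) and treating the target duration as a simple timeout.

import time

def follow_and_expel(robot, target_duration_s, standby_position):
    last_seen = time.monotonic()
    while True:
        image = robot.capture_image()      # real-time detection while following
        if robot.detect_target(image):
            last_seen = time.monotonic()
            robot.emit(robot.choose_substance(robot.sense_environment()))
            robot.follow_step()
        elif time.monotonic() - last_seen > target_duration_s:
            # Not detected for longer than the target duration: the target
            # object is taken to have left the target area.
            robot.move_to(standby_position)
            return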
Optionally, emitting toward the target object the target substance corresponding to the environmental information of the target position comprises:
detecting the light intensity at the target position, wherein the environmental information includes the light intensity;
in the case where the detected light intensity at the target position is lower than or equal to a first intensity, emitting toward the target object light whose intensity is higher than a second intensity, wherein the second intensity is greater than the first intensity;
and, in the case where the detected light intensity at the target position is higher than the first intensity, firing stored liquid shells at the target object.
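As a sketch of this selection rule only (the threshold values and the returned structure are assumptions made for the example, not part of the claims):

def choose_substance(light_intensity, first_intensity, second_intensity):
    # The second intensity is required to be greater than the first intensity.
    if light_intensity <= first_intensity:
        # Dim environment: emit a beam brighter than the second intensity.
        return {"type": "light", "intensity_above": second_intensity}
    # Bright environment: fall back to the stored liquid shells.
    return {"type": "liquid_shell"}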
Optionally, obtaining the indication information comprises:
capturing pictures of the target area with the monitoring device;
identifying the target object in the pictures of the target area by a server corresponding to the monitoring device;
in the case where the server identifies the target object, determining, by the server, the coordinate value of the target object within the target area, wherein the coordinate value indicates the target position;
generating, by the server, the indication information carrying the coordinate value;
and receiving the indication information sent by the server.
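For illustration, a server-side sketch of this step follows. The detector, the pixel-to-area coordinate transform and the message fields are all assumptions made for the example; the embodiment does not fix a recognition model or a message format.

import json
import time

def build_indication(frame, detect_target_object, pixel_to_area):
    detection = detect_target_object(frame)    # e.g. a bounding box from an image recognizer
    if detection is None:
        return None                            # no target object in this picture
    x, y = pixel_to_area(detection["center"])  # pixel position -> coordinate value in the target area
    return json.dumps({
        "type": "target_detected",
        "coordinate": [x, y],                  # indicates the target position
        "timestamp": time.time(),
    })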
Optionally, acquiring the target image at the target position and detecting whether the target object is present in the target image comprises:
rotating an image acquisition device arranged at the target position at a target angular velocity, and acquiring the target images with the image acquisition device during the rotation;
detecting whether any of the multiple target images acquired within a target time period contains the target object;
wherein, in the case where any of the multiple target images acquired within the target time period is detected to contain the target object, it is determined that the target object is present in the target images; and, in the case where none of the multiple target images acquired within the target time period contains the target object, it is determined that the target object is not present in the target images.
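A sketch of this rotate-and-scan step, under the assumption of hypothetical start_rotation, stop_rotation and capture_image methods:

import time

def scan_for_target(robot, target_angular_velocity, target_time_s, detect_target):
    # Rotate the image acquisition device at the target angular velocity and report
    # whether any frame captured within the target time period contains the target object.
    robot.start_rotation(target_angular_velocity)
    deadline = time.monotonic() + target_time_s
    found = False
    while time.monotonic() < deadline:
        if detect_target(robot.capture_image()):
            found = True
            break
    robot.stop_rotation()
    return found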
According to another embodiment of the present invention, an object processing apparatus is provided, comprising:
an obtaining module, configured to obtain indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area;
a moving module, configured to move from a predetermined position to the target position in response to the indication information;
a first processing module, configured to acquire a target image at the target position and detect whether the target object is present in the target image;
and a second processing module, configured to, in the case where the target object is detected in the target image, emit toward the target object a target substance corresponding to the environmental information of the target position and follow the target object as it moves, until the target object leaves the target area.
Optionally, the second processing module comprises:
a first emitting unit, configured to, in the case where the target object is detected in the target image, emit toward the target object a target substance corresponding to the environmental information of the target position;
a first detection unit, configured to detect whether the target object moves;
and a first processing unit, configured to, in the case where the target object is detected to move, follow the target object and continue to emit toward it the target substance corresponding to the environmental information of the target position, until the target object leaves the target area.
Optionally, the processing apparatus comprises:
a detection sub-unit, configured to, while the target object is being followed, detect in real time whether the target object is present in the acquired images;
an emitting sub-unit, configured to, in the case where the target object is detected in the acquired images, continue to emit toward it the target substance corresponding to the environmental information of the target position;
a first determining sub-unit, configured to, in the case where the target object is not detected in the acquired images, determine the duration for which the target object has not been detected;
and a second determining sub-unit, configured to, in the case where that duration exceeds a target duration, determine that the target object has left the target area and move back to the predetermined position.
Optionally, the second processing module comprises:
a second detection unit, configured to detect the light intensity at the target position, wherein the environmental information includes the light intensity;
a second emitting unit, configured to, in the case where the detected light intensity at the target position is lower than or equal to a first intensity, emit toward the target object light whose intensity is higher than a second intensity, wherein the second intensity is greater than the first intensity;
and a third emitting unit, configured to, in the case where the detected light intensity at the target position is higher than the first intensity, fire the stored liquid shells at the target object.
Optionally, the obtaining module comprises:
a shooting unit, configured to capture pictures of the target area with the monitoring device;
a recognition unit, configured to identify the target object in the pictures of the target area by a server corresponding to the monitoring device;
a determination unit, configured to, in the case where the server identifies the target object, determine, by the server, the coordinate value of the target object within the target area, wherein the coordinate value indicates the target position;
a generation unit, configured to generate, by the server, the indication information carrying the coordinate value;
and a receiving unit, configured to receive the indication information sent by the server.
Optionally, the first processing module comprises:
a second processing unit, configured to rotate an image acquisition device arranged at the target position at a target angular velocity and acquire the target images with the image acquisition device during the rotation;
and a third detection unit, configured to detect whether any of the multiple target images acquired within a target time period contains the target object;
wherein, in the case where any of the multiple target images acquired within the target time period is detected to contain the target object, it is determined that the target object is present in the target images; and, in the case where none of the multiple target images acquired within the target time period contains the target object, it is determined that the target object is not present in the target images.
According to yet another embodiment of the present invention, a storage medium is further provided, in which a computer program is stored, wherein the computer program is configured to perform, when run, the steps of any of the above method embodiments.
According to yet another embodiment of the present invention, an electronic device is further provided, comprising a memory and a processor, wherein a computer program is stored in the memory and the processor is configured to run the computer program to perform the steps of any of the above method embodiments.
With the present invention, indication information is obtained, the indication information indicating that a monitoring device has detected a target object at a target position, the monitoring device monitoring a target area within which the target position lies; the processing device moves from a predetermined position to the target position in response to the indication information, acquires a target image at the target position and detects whether the target object is present in it; and, if the target object is detected, it emits toward the target object a target substance corresponding to the environmental information of the target position and follows the target object as it moves, until the target object leaves the target area. Thus, when the monitoring device detects that a target object has appeared at the target position, the indication information is obtained and, in response, the device moves from the predetermined position to the target position, determines by image detection whether the target object is present there, and, if so, drives the target object away by emitting the appropriate target substance according to the environmental information of the target position, moving with the target object until it leaves the target area, so that the target object is driven away automatically. The problem in the related art of low efficiency in driving away target objects is therefore solved, achieving the effect of improving the efficiency with which target objects are driven away.
Brief description of the drawings
The drawings described herein are provided for a further understanding of the present invention and constitute a part of this application; the illustrative embodiments of the present invention and their descriptions are used to explain the present invention and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a hardware block diagram of a mobile terminal for an object processing method according to an embodiment of the present invention;
Fig. 2 is a flowchart of an object processing method according to an embodiment of the present invention;
Fig. 3 is a structural block diagram of an object processing apparatus according to an embodiment of the present invention.
Specific embodiment
Hereinafter, the present invention is described in detail with reference to the drawings and in combination with the embodiments. It should be noted that, where no conflict arises, the embodiments of this application and the features of the embodiments can be combined with each other.
It should be noted that the terms "first", "second" and the like in the description, the claims and the above drawings are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence.
The method embodiments provided in the embodiments of this application can be executed in a mobile terminal, a computer terminal or a similar computing device. Taking execution on a mobile terminal as an example, Fig. 1 is a hardware block diagram of a mobile terminal for an object processing method according to an embodiment of the present invention. As shown in Fig. 1, the mobile terminal 10 may include one or more processors 102 (only one is shown in Fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microcontroller MCU or a programmable logic device FPGA) and a memory 104 for storing data; optionally, the mobile terminal may further include a transmission device 106 for communication functions and an input/output device 108. A person of ordinary skill in the art will appreciate that the structure shown in Fig. 1 is merely illustrative and does not limit the structure of the mobile terminal; for example, the mobile terminal 10 may include more or fewer components than shown in Fig. 1, or have a configuration different from that shown in Fig. 1.
The memory 104 can be used to store computer programs, for example software programs and modules of application software, such as the computer program corresponding to the object processing method in the embodiments of the present invention. By running the computer programs stored in the memory 104, the processor 102 executes various functional applications and data processing, that is, implements the above method. The memory 104 may include a high-speed random access memory and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memories or other non-volatile solid-state memories. In some examples, the memory 104 may further include memories arranged remotely from the processor 102, and these remote memories can be connected to the mobile terminal 10 through a network. Examples of such a network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network and combinations thereof.
The transmission device 106 is used to receive or send data via a network. Specific examples of the network may include a wireless network provided by the communication provider of the mobile terminal 10. In one example, the transmission device 106 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the Internet. In another example, the transmission device 106 can be a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
This embodiment provides an object processing method. Fig. 2 is a flowchart of an object processing method according to an embodiment of the present invention. As shown in Fig. 2, the process includes the following steps:
Step S202: obtaining indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area;
Step S204: moving from a predetermined position to the target position in response to the indication information;
Step S206: acquiring a target image at the target position and detecting whether the target object is present in the target image;
Step S208: in the case where the target object is detected in the target image, emitting toward the target object a target substance corresponding to the environmental information of the target position and following the target object as it moves, until the target object leaves the target area.
Optionally, in this embodiment, the above object processing method can be applied to, but is not limited to, a device with an expelling function, such as a robot device.
Optionally, in this embodiment, the target area can include, but is not limited to, a kitchen, a warehouse, a parking lot, a shopping mall and so on.
Optionally, in this embodiment, the target object can include, but is not limited to, rodents, birds, pets, insects and other animals.
Optionally, in this embodiment, the target substance can include, but is not limited to, light, liquid shells, sound waves and so on.
Through the above steps, indication information is obtained, the indication information indicating that a monitoring device has detected a target object at a target position, the monitoring device monitoring a target area within which the target position lies; the device moves from a predetermined position to the target position in response to the indication information, acquires a target image at the target position and detects whether the target object is present in it; and, if the target object is detected, it emits toward the target object a target substance corresponding to the environmental information of the target position and follows the target object as it moves until the target object leaves the target area. In this way, when the monitoring device detects a target object at the target position, the indication information is obtained and, in response, the device moves from the predetermined position to the target position, determines by image detection whether the target object is present there, and, if so, drives the target object away by emitting the appropriate target substance according to the environmental information of the target position, moving with it until the target object leaves the target area, so that the target object is driven away automatically. The problem in the related art of low efficiency in driving away target objects can therefore be solved, achieving the effect of improving that efficiency.
Optionally, when a target object is detected in the target image, the robot can enter an expelling mode, in which it drives the target object away by emitting the target substance, and then enter a tracking mode, moving with the target object and continuing to expel it until it leaves the target area. For example, in the above step S208, in the case where the target object is detected in the target image, a target substance corresponding to the environmental information of the target position is emitted toward the target object; it is detected whether the target object moves; and, in the case where the target object is detected to move, the robot follows the target object and continues to emit toward it the target substance corresponding to the environmental information of the target position, until the target object leaves the target area.
Optionally, as long as the target object remains detectable, the target substance keeps being emitted toward it to drive it away. If the target object cannot be detected for a certain length of time, it can be determined that the target object has left the target area, and the robot can move back to the predetermined position and enter a standby mode. For example: while the target object is being followed, whether the target object is present in the acquired images is detected in real time; in the case where the target object is detected in the acquired images, the target substance corresponding to the environmental information of the target position keeps being emitted toward it; in the case where the target object is not detected in the acquired images, the duration for which it has not been detected is determined; and, in the case where that duration exceeds a target duration, it is determined that the target object has left the target area, and the robot moves back to the predetermined position.
Optionally, the emitted target substance can be determined according to the environmental information of the target area: if the light in the target area is weak, a strong light beam can be emitted at the target object to drive it away; if the light in the target area is very strong, the stored liquid shells can be used instead; and, if the liquid shells run short, the light intensity can be adjusted so that the target object is irradiated with light stronger than the ambient light to drive it away. For example, in the above step S208, the light intensity at the target position is detected, the environmental information including the light intensity; in the case where the detected light intensity at the target position is lower than or equal to a first intensity, light whose intensity is higher than a second intensity is emitted toward the target object, the second intensity being greater than the first intensity; and, in the case where the detected light intensity at the target position is higher than the first intensity, the stored liquid shells are fired at the target object.
Optionally, but not exclusively, the monitoring device can photograph the target area, and the server corresponding to the monitoring device can identify and locate the target object and generate the indication information to be sent to the robot device. For example, in the above step S202: pictures of the target area are captured by the monitoring device; the target object is identified in the pictures of the target area by the server corresponding to the monitoring device; in the case where the server identifies the target object, the server determines the coordinate value of the target object within the target area, the coordinate value indicating the target position; the server generates the indication information carrying the coordinate value; and the indication information sent by the server is received.
Optionally, in the above step S206, an image acquisition device arranged at the target position is rotated at a target angular velocity, and the target images are acquired with the image acquisition device during the rotation; it is detected whether any of the multiple target images acquired within a target time period contains the target object; if any of the multiple target images acquired within the target time period contains the target object, it is determined that the target object is present in the target images, whereas, if none of them contains the target object, it is determined that the target object is not present in the target images.
From the description of the above embodiments, a person skilled in the art can clearly understand that the method of the above embodiments can be implemented by means of software plus the necessary general-purpose hardware platform, and can of course also be implemented by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part of it that contributes over the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk or an optical disc) and includes a number of instructions for causing a terminal device (which can be a mobile phone, a computer, a server, a network device or the like) to execute the methods described in the embodiments of the present invention.
This embodiment further provides an object processing apparatus, which is used to implement the above embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" can implement a combination of software and/or hardware with a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and conceivable.
Fig. 3 is a structural block diagram of an object processing apparatus according to an embodiment of the present invention. As shown in Fig. 3, the apparatus includes:
an obtaining module 32, configured to obtain indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area;
a moving module 34, configured to move from a predetermined position to the target position in response to the indication information;
a first processing module 36, configured to acquire a target image at the target position and detect whether the target object is present in the target image;
and a second processing module 38, configured to, in the case where the target object is detected in the target image, emit toward the target object a target substance corresponding to the environmental information of the target position and follow the target object as it moves, until the target object leaves the target area.
Optionally, the second processing module includes:
a first emitting unit, configured to, in the case where the target object is detected in the target image, emit toward the target object a target substance corresponding to the environmental information of the target position;
a first detection unit, configured to detect whether the target object moves;
and a first processing unit, configured to, in the case where the target object is detected to move, follow the target object and continue to emit toward it the target substance corresponding to the environmental information of the target position, until the target object leaves the target area.
Optionally, the processing apparatus includes:
a detection sub-unit, configured to, while the target object is being followed, detect in real time whether the target object is present in the acquired images;
an emitting sub-unit, configured to, in the case where the target object is detected in the acquired images, continue to emit toward it the target substance corresponding to the environmental information of the target position;
a first determining sub-unit, configured to, in the case where the target object is not detected in the acquired images, determine the duration for which the target object has not been detected;
and a second determining sub-unit, configured to, in the case where that duration exceeds a target duration, determine that the target object has left the target area and move back to the predetermined position.
Optionally, the second processing module includes:
a second detection unit, configured to detect the light intensity at the target position, wherein the environmental information includes the light intensity;
a second emitting unit, configured to, in the case where the detected light intensity at the target position is lower than or equal to a first intensity, emit toward the target object light whose intensity is higher than a second intensity, wherein the second intensity is greater than the first intensity;
and a third emitting unit, configured to, in the case where the detected light intensity at the target position is higher than the first intensity, fire the stored liquid shells at the target object.
Optionally, the obtaining module includes:
a shooting unit, configured to capture pictures of the target area with the monitoring device;
a recognition unit, configured to identify the target object in the pictures of the target area by the server corresponding to the monitoring device;
a determination unit, configured to, in the case where the server identifies the target object, determine, by the server, the coordinate value of the target object within the target area, wherein the coordinate value indicates the target position;
a generation unit, configured to generate, by the server, the indication information carrying the coordinate value;
and a receiving unit, configured to receive the indication information sent by the server.
Optionally, the first processing module includes:
a second processing unit, configured to rotate an image acquisition device arranged at the target position at a target angular velocity and acquire the target images with the image acquisition device during the rotation;
and a third detection unit, configured to detect whether any of the multiple target images acquired within a target time period contains the target object;
wherein, in the case where any of the multiple target images acquired within the target time period contains the target object, it is determined that the target object is present in the target images; and, in the case where none of the multiple target images acquired within the target time period contains the target object, it is determined that the target object is not present in the target images.
It should be noted that the above modules can be implemented by software or hardware. In the latter case, this can be achieved in, but is not limited to, the following ways: the above modules are all located in the same processor, or the above modules are located in different processors in any combination.
An optional embodiment of the present invention is described in detail below.
An optional embodiment of the present invention provides a patrol robot device for processing objects. In this optional embodiment, a robot performs automatic patrol and image monitoring and is linked with video monitoring equipment, so that rodent intrusion can be reacted to and repelled in real time, protecting high-cleanliness areas; in particular, rodent control can be carried out in a warehouse at night while it is unattended.
The patrol robot has a camera function, a liquid-shell firing function, a light emitting function and a wireless communication function; the camera can have an infrared function to support patrolling and rodent control at night.
Rodents are sensitive to foreign objects, tend to flee from new things, and fear strong light. In this optional embodiment, the high mobility of the patrol robot, together with its light emitting and liquid-shell firing functions, is used to carry out a "light-beam attack" on mice and drive them away, ensuring that areas with high cleanliness requirements are not invaded by mice.
The patrol robot has a static monitoring function: the camera carried by the patrol robot and the video monitoring equipment already deployed in the target area monitor the key points of the protected area, and real-time image analysis is performed on the video streams returned to the server. When a change occurs in the picture, image recognition technology is applied to analyse and determine whether it is a rodent intrusion. When a rodent intrusion event is determined, the specific location coordinates of the intrusion, together with an image of the intruding rodent, are computed and sent to the patrol robot.
The patrol robot has a rodent-target positioning function. After the patrol robot receives the positioning coordinates, it receives an action instruction, automatically plans a path to the target coordinates, and rotates the camera at a certain angular velocity to capture images of the mouse. The video stream shot by the patrol robot is returned to the server in real time, and it is computed whether a rodent target appears in the images. If a mouse target is captured within a certain time threshold, the "rodent control" function is started; if the time threshold is exceeded, it is considered that the rodent has not invaded a key position, and the patrol robot returns to the standby position and resumes the "static monitoring" function.
The patrol robot also has a mouse-expelling function. When the camera of the patrol robot successfully captures a mouse target, it immediately adjusts the direction of the strong light beam, aims at the mouse target and emits the strong beam. When the mouse target flees, a "tracking" mode is started and the mouse target is followed until it leaves the indoor coordinate range or disappears from view.
Optionally, in this optional embodiment, after the rodent-control process, the time point and duration of the mouse intrusion can also be recorded, so that the person in charge of the site can assess how high the potential rodent-infestation risk is.
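Purely as an illustration of such record keeping (the file name and field layout are assumptions, not part of the embodiment), a single intrusion could be logged as follows:

import csv

def log_intrusion(log_path, started_at, ended_at):
    # Append one row per intrusion: when the mouse appeared (a datetime) and how
    # long it stayed, so the person in charge of the site can gauge the risk.
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([
            started_at.isoformat(),
            (ended_at - started_at).total_seconds(),
        ])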
The present invention uses robot technology to replace manual patrols, giving more comprehensive monitoring of rodent intrusion and the ability to take the initiative against it; compared with the current situation, in which no timely measures can be taken and mice come and go freely in an open environment, the responsiveness to rodent damage is greatly improved. Moreover, because the present invention uses light beams, no physical injury is caused even if the algorithm mistakenly identifies a person as an invading animal.
An embodiment of the present invention further provides a storage medium in which a computer program is stored, wherein the computer program is configured to perform, when run, the steps of any of the above method embodiments.
Optionally, in this embodiment, the storage medium can be configured to store a computer program for executing the following steps:
S1: obtaining indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area;
S2: moving from a predetermined position to the target position in response to the indication information;
S3: acquiring a target image at the target position and detecting whether the target object is present in the target image;
S4: in the case where the target object is detected in the target image, emitting toward the target object a target substance corresponding to the environmental information of the target position and following the target object as it moves, until the target object leaves the target area.
Optionally, in this embodiment, the storage medium can include, but is not limited to, a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc or any other medium capable of storing a computer program.
An embodiment of the present invention further provides an electronic device, comprising a memory and a processor, wherein a computer program is stored in the memory and the processor is configured to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic device can further include a transmission device and an input/output device, wherein the transmission device is connected to the processor and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor can be configured to execute the following steps by means of the computer program:
S1: obtaining indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area;
S2: moving from a predetermined position to the target position in response to the indication information;
S3: acquiring a target image at the target position and detecting whether the target object is present in the target image;
S4: in the case where the target object is detected in the target image, emitting toward the target object a target substance corresponding to the environmental information of the target position and following the target object as it moves, until the target object leaves the target area.
Optionally, specific examples in this embodiment can refer to the examples described in the above embodiments and optional implementations, and are not repeated here.
Obviously, a person skilled in the art should understand that the above modules or steps of the present invention can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network formed by multiple computing devices; optionally, they can be implemented by program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases, the steps shown or described can be executed in an order different from that given here, or they can be made into individual integrated circuit modules, or multiple modules or steps among them can be made into a single integrated circuit module. In this way, the present invention is not limited to any specific combination of hardware and software.
The above are only preferred embodiments of the present invention and are not intended to limit it; for a person skilled in the art, the present invention may have various modifications and variations. Any modification, equivalent replacement, improvement and the like made within the principle of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. An object processing method, characterized by comprising:
obtaining indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area;
moving from a predetermined position to the target position in response to the indication information;
acquiring a target image at the target position and detecting whether the target object is present in the target image;
and, in a case where the target object is detected in the target image, emitting toward the target object a target substance corresponding to the environmental information of the target position and following the target object as it moves, until the target object leaves the target area.
2. The method according to claim 1, characterized in that, in the case where the target object is detected in the target image, emitting the target substance toward the target object and following the target object as it moves until it leaves the target area comprises:
in the case where the target object is detected in the target image, emitting toward the target object a target substance corresponding to the environmental information of the target position;
detecting whether the target object moves;
and, in the case where the target object is detected to move, following the target object and continuing to emit toward it the target substance corresponding to the environmental information of the target position, until the target object leaves the target area.
3. The method according to claim 2, characterized in that following the target object and continuing to emit toward it the target substance corresponding to the environmental information of the target position, until the target object leaves the target area, comprises:
while following the target object, detecting in real time whether the target object is present in the acquired images;
in the case where the target object is detected in the acquired images, continuing to emit toward it the target substance corresponding to the environmental information of the target position;
in the case where the target object is not detected in the acquired images, determining the duration for which the target object has not been detected;
and, in the case where that duration exceeds a target duration, determining that the target object has left the target area and moving back to the predetermined position.
4. The method according to claim 1, characterized in that emitting toward the target object the target substance corresponding to the environmental information of the target position comprises:
detecting the light intensity at the target position, wherein the environmental information includes the light intensity;
in the case where the detected light intensity at the target position is lower than or equal to a first intensity, emitting toward the target object light whose intensity is higher than a second intensity, wherein the second intensity is greater than the first intensity;
and, in the case where the detected light intensity at the target position is higher than the first intensity, firing stored liquid shells at the target object.
5. An object processing apparatus, characterized by comprising:
an obtaining module, configured to obtain indication information, wherein the indication information indicates that a monitoring device has detected a target object at a target position, the monitoring device is configured to monitor a target area, and the target position lies within the target area;
a moving module, configured to move from a predetermined position to the target position in response to the indication information;
a first processing module, configured to acquire a target image at the target position and detect whether the target object is present in the target image;
and a second processing module, configured to, in the case where the target object is detected in the target image, emit toward the target object a target substance corresponding to the environmental information of the target position and follow the target object as it moves, until the target object leaves the target area.
6. The apparatus according to claim 5, characterized in that the second processing module comprises:
a first emitting unit, configured to, in the case where the target object is detected in the target image, emit toward the target object a target substance corresponding to the environmental information of the target position;
a first detection unit, configured to detect whether the target object moves;
and a first processing unit, configured to, in the case where the target object is detected to move, follow the target object and continue to emit toward it the target substance corresponding to the environmental information of the target position, until the target object leaves the target area.
7. The apparatus according to claim 6, characterized in that the processing apparatus comprises:
a detection sub-unit, configured to, while the target object is being followed, detect in real time whether the target object is present in the acquired images;
an emitting sub-unit, configured to, in the case where the target object is detected in the acquired images, continue to emit toward it the target substance corresponding to the environmental information of the target position;
a first determining sub-unit, configured to, in the case where the target object is not detected in the acquired images, determine the duration for which the target object has not been detected;
and a second determining sub-unit, configured to, in the case where that duration exceeds a target duration, determine that the target object has left the target area and move back to the predetermined position.
8. The apparatus according to claim 5, characterized in that the second processing module comprises:
a second detection unit, configured to detect the light intensity at the target position, wherein the environmental information includes the light intensity;
a second emitting unit, configured to, in the case where the detected light intensity at the target position is lower than or equal to a first intensity, emit toward the target object light whose intensity is higher than a second intensity, wherein the second intensity is greater than the first intensity;
and a third emitting unit, configured to, in the case where the detected light intensity at the target position is higher than the first intensity, fire the stored liquid shells at the target object.
9. A storage medium, characterized in that a computer program is stored in the storage medium, wherein the computer program is configured to perform, when run, the method according to any one of claims 1 to 4.
10. An electronic device, comprising a memory and a processor, characterized in that a computer program is stored in the memory and the processor is configured to run the computer program to perform the method according to any one of claims 1 to 4.
CN201910684982.3A 2019-07-26 2019-07-26 Object processing method and apparatus Pending CN110290356A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910684982.3A 2019-07-26 2019-07-26 CN110290356A (en) Object processing method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910684982.3A 2019-07-26 2019-07-26 CN110290356A (en) Object processing method and apparatus

Publications (1)

Publication Number Publication Date
CN110290356A true CN110290356A (en) 2019-09-27

Family

ID=68024019

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910684982.3A Pending CN110290356A (en) Object processing method and apparatus

Country Status (1)

Country Link
CN (1) CN110290356A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110144829A1 (en) * 2009-12-10 2011-06-16 Korea Atomic Energy Research Institute Countermeasure system for birds
US20190141982A1 (en) * 2017-11-16 2019-05-16 Brian Wayne Carnell Methods and systems for directing animals away from an area
CN108553028A (en) * 2018-04-12 2018-09-21 深圳市沃特沃德股份有限公司 Drive mouse method and sweeping robot
CN109168831A (en) * 2018-08-22 2019-01-11 深圳威琳懋生物科技有限公司 Bird-repellent robots
CN109345747A (en) * 2018-09-26 2019-02-15 深圳市敢为特种设备物联网技术有限公司 Anti-intrusion system and its control method and computer readable storage medium
CN109526934A (en) * 2018-11-15 2019-03-29 肖湘江 Scarer and its bird repellent method based on patrol robot
CN109799760A (en) * 2019-01-30 2019-05-24 华通科技有限公司 The bird-repellent robots control system and control method of power industry

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114514918A (en) * 2020-11-19 2022-05-20 苏州宝时得电动工具有限公司 System and method for driving animals away from specific area and autonomous mobile device
CN114650366A (en) * 2020-12-18 2022-06-21 深圳市卫飞科技有限公司 Flying bird defense method, master control module, flying bird defense system and storage medium
CN112913830A (en) * 2021-01-21 2021-06-08 国网黑龙江省电力有限公司哈尔滨供电公司 Small animal expelling device based on infrared imaging mobile detection
TWI815348B (en) * 2021-12-21 2023-09-11 英華達股份有限公司 Method for preventing pet from entering restricted area

Similar Documents

Publication Publication Date Title
CN110290356A (en) Object processing method and apparatus
US11576367B2 (en) System and methods for automated wildlife detection, monitoring and control
US20210027602A1 (en) Enhanced audiovisual analytics
CN103726879B (en) Utilize camera automatic capturing mine ore deposit to shake and cave in and the method for record warning in time
CN100504942C (en) Module set of intelligent video monitoring device, system and monitoring method
Adami et al. Design, development and evaluation of an intelligent animal repelling system for crop protection based on embedded edge-AI
CN109040709A (en) Video monitoring method and device, monitoring server and video monitoring system
CN111753594B (en) Dangerous identification method, device and system
CN109799760A (en) The bird-repellent robots control system and control method of power industry
CN107347145A (en) A kind of video frequency monitoring method and pan-tilt network camera
CN112022000A (en) Sweeping method of sweeping robot and related device
JP6625786B2 (en) Anomaly detection system, anomaly detection method and program
US20220281117A1 (en) Remote physiological data sensing robot
Schiano et al. Autonomous detection and deterrence of pigeons on buildings by drones
JP2020092643A (en) Unmanned flight device, unmanned flight system, and unmanned flight device control system
CN106303420A (en) A kind of monitoring method being applied to moving target and monitoring system
CN117275157A (en) Surrounding intrusion alarm system and method based on radar and video fusion
CN110488829A (en) Go on patrol the control method and device of equipment
CN109894296B (en) Method and device for adjusting water spraying state, computer equipment and storage medium
CN114558267A (en) Industrial scene fire prevention and control system
CN109815921A (en) The prediction technique and device of the class of activity in hydrogenation stations
CN112806339B (en) Laser emission control device, method and computer-readable storage medium
CN108717525A (en) A kind of information processing method, device, computer storage media and terminal
JP7300958B2 (en) IMAGING DEVICE, CONTROL METHOD, AND COMPUTER PROGRAM
WO2021250875A1 (en) Wild animal monitoring system and monitoring control device, method and program thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2019-09-27)