CN110067274A - Apparatus control method and excavator - Google Patents
- Publication number
- CN110067274A (application CN201910358054.8A)
- Authority
- CN
- China
- Prior art keywords
- target
- image data
- coordinate
- acquisition
- equipment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
Abstract
This application provides an apparatus control method and an excavator. The apparatus control method includes: acquiring first image data of a limited area corresponding to a target device; acquiring second image data of the limited area of the target device; when a target object is detected in the first image data, matching the target image region where the target object is located against each point in the second image data, to obtain a target three-dimensional coordinate range of the target image region; obtaining a first distance between the target object and the target device according to the target three-dimensional coordinate range; and, when the first distance is less than a set value, controlling the target device to stop operating or to reduce its operating rate. By recognizing the environment around the device, and stopping or slowing its operation when a target object may be present, the operating safety of the device can be improved.
Description
Technical field
This application relates to the field of mechanical equipment control technology, and in particular to an apparatus control method and an excavator.
Background art
In the construction machinery field, various construction devices and construction workers are often present in the same working environment. While a construction device is operating, there is a potential danger to the construction workers in that environment. In the prior art, specific work-clothing features are recognized in acquired images, and the operation of the device is stopped if the specific clothing of a construction worker is recognized. With this processing mode, the working efficiency of the device can be relatively low.
Summary of the invention
In view of this, embodiments of the present application aim to provide an apparatus control method and an excavator.
In a first aspect, an embodiment of the present application provides an apparatus control method, comprising:
acquiring first image data of a limited area corresponding to a target device;
acquiring second image data of the limited area of the target device;
when a target object is detected in the first image data, matching the target image region where the target object is located against each point in the second image data, to obtain a target three-dimensional coordinate range of the target image region;
obtaining a first distance between the target object and the target device according to the target three-dimensional coordinate range; and
when the first distance is less than a set value, controlling the target device to stop operating or to reduce its operating rate.
In the method provided by the embodiments of the present application, detection is performed first: if a target object is detected in the first image data, the first distance between the target object and the device is then evaluated, and if the first distance is less than the set value, the target object may be in potential danger. Through this two-stage recognition, detection of the dangerous state of the target object can be improved. Further, the working state of the target device is changed only after judging whether the target object is within the danger zone. Therefore, compared with stopping the operation of the device whenever the specific clothing of a construction worker is recognized, this approach improves the safety of the objects in the environment while the target device operates, and also avoids blindly stopping the device, thereby also improving the working efficiency of the target device.
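The first-aspect steps above can be sketched as one iteration of a control loop. This is an illustrative sketch only: the helper names `detect_target` and `match_region_to_cloud` are hypothetical stand-ins for the detection and matching steps, and the device origin at (0, 0, 0) is an assumption.

```python
def control_step(first_image, second_image, detect_target,
                 match_region_to_cloud, set_value):
    """One iteration of the first-aspect control loop (illustrative sketch).

    Returns the action to apply to the target device:
    'run' (no target nearby) or 'stop_or_slow' (target within set_value).
    """
    region = detect_target(first_image)            # 2-D bounding region, or None
    if region is None:
        return "run"
    coord_range = match_region_to_cloud(region, second_image)  # 3-D points
    # First distance: nearest horizontal distance from the device origin,
    # assumed here to be (0, 0, 0).
    first_distance = min((x ** 2 + y ** 2) ** 0.5 for x, y, _z in coord_range)
    return "stop_or_slow" if first_distance < set_value else "run"
```

In use, the two branches correspond to the two outcomes of the first aspect: the device keeps running when nothing is detected or the target is far away, and stops or slows when the target is inside the set-value radius.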
With reference to the first aspect, an embodiment of the present application provides a first possible implementation of the first aspect, in which the second image data includes three-dimensional point cloud data, and the step of matching the target image region where the target object is located against each point in the second image data, to obtain the target three-dimensional coordinate range of the target image region, comprises:
projecting the three-dimensional point cloud data into the coordinate system corresponding to the first image data through coordinate conversion, to obtain a projection point set; and
mapping the sub-projection point set falling within the target image region back into the three-dimensional coordinate system of the three-dimensional point cloud data, to obtain the target three-dimensional coordinate range of the target image region.
Further, the second image data may include three-dimensional point cloud data, and the step of matching the target image region where the target object is located against each point in the second image data, to obtain the target three-dimensional coordinate range of the target image region, may comprise:
performing coordinate conversion on the pixels in the target image region of the first image data, to obtain three-dimensional coordinates of the pixels in the target image region, the three-dimensional coordinates being coordinates in the three-dimensional coordinate system of the points in the second image data; and
matching the three-dimensional coordinates of the pixels in the target image region against the three-dimensional point cloud data in the second image data, to determine the target three-dimensional coordinate range of the pixels in the target image region.
In the above embodiment, performing coordinate conversion on the pixels in the target image region projects the target image region into the second image data, yielding the target three-dimensional coordinate range of the target object. The spatial coverage of the target object is thereby obtained with relatively little computation, realizing a mapping between objects of different dimensions.
With reference to the first possible implementation of the first aspect, an embodiment of the present application provides a second possible implementation of the first aspect, in which the step of performing coordinate conversion on the pixels in the target image region of the first image data, to obtain the three-dimensional coordinates of the pixels in the target image region, comprises:
performing coordinate conversion on the three-dimensional point cloud data using a transition matrix between the images acquired by a first acquisition device and a second acquisition device, to obtain the projection point set, the first acquisition device being the device that acquires the first image data and the second acquisition device being the device that acquires the second image data;
wherein the transition matrix is determined from a first group of calibration points acquired with the first acquisition device and a second group of calibration points acquired with the second acquisition device.
In the above embodiment, performing coordinate conversion on the pixels in the target image region using the transition matrix corresponding to the first and second acquisition devices makes the resulting three-dimensional coordinates better match the coordinate system used by the points in the second image data. Moreover, determining the transition matrix from the first group of calibration points acquired by the first acquisition device and the second group of calibration points acquired by the second acquisition device ties the transition matrix to both acquisition devices.
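One common way to determine such a transition matrix from two groups of corresponding calibration points is a least-squares rigid-transform fit (the classic Kabsch/Umeyama SVD estimation). The application does not specify the estimation method, so this is a sketch under that assumption:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ~= src @ R.T + t.

    src, dst : N x 3 corresponding calibration points from the two
               acquisition devices (N >= 3, not collinear).
    Uses the Kabsch/Umeyama SVD construction; the sign correction keeps
    R a proper rotation (det R = +1).
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = dst.mean(axis=0) - src.mean(axis=0) @ R.T
    return R, t
```

Once fitted, `R` and `t` play the role of the transition matrix between the two devices' coordinate systems.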
With reference to the second possible implementation of the first aspect, an embodiment of the present application provides a third possible implementation of the first aspect, in which the method further comprises:
setting an identical time system for the first acquisition device and the second acquisition device;
configuring a first control thread for the first acquisition device; and
configuring a second control thread for the second acquisition device.
In the above embodiment, the first control thread and the second control thread keep the time interval between two adjacent data acquisitions identical for the first acquisition device and the second acquisition device, so that the first image data acquired by the first acquisition device and the second image data acquired by the second acquisition device are synchronized in time. Controlling the two acquisition devices through threads, and setting the same time system for both, allows the image data acquired by the two devices to be matched in time, which in turn makes the detection result for the target object more accurate.
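A minimal sketch of the two control threads, assuming a shared monotonic clock as the common time system and a fixed acquisition interval. The device APIs are not specified in the application, so `capture_a` and `capture_b` are hypothetical stand-ins for the two devices' capture calls.

```python
import threading
import time

def acquisition_thread(capture, frames, interval, n_frames, start_time):
    """Capture n_frames at fixed intervals measured on a shared clock."""
    for i in range(n_frames):
        # Wait until the i-th tick of the shared time system.
        delay = start_time + i * interval - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        frames.append((time.monotonic(), capture()))

def run_synchronized(capture_a, capture_b, interval=0.01, n_frames=3):
    """Run one control thread per acquisition device on a common start time."""
    frames_a, frames_b = [], []
    start = time.monotonic() + interval          # identical time system
    ta = threading.Thread(target=acquisition_thread,
                          args=(capture_a, frames_a, interval, n_frames, start))
    tb = threading.Thread(target=acquisition_thread,
                          args=(capture_b, frames_b, interval, n_frames, start))
    ta.start(); tb.start(); ta.join(); tb.join()
    return frames_a, frames_b
```

Because both threads schedule their i-th capture against the same tick of the shared clock, frames with the same index can be paired for the matching step.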
With reference to the first aspect, an embodiment of the present application provides a fourth possible implementation of the first aspect, in which the step of obtaining the first distance between the target object and the target device according to the target three-dimensional coordinate range comprises:
computing a center point coordinate of the target object from the target three-dimensional coordinate range, the center point coordinate indicating the coordinate within the target three-dimensional coordinate range that is nearest to the target device; and
computing the first distance between the target object and the target device according to the center point coordinate.
In the above embodiment, the center point of the target object characterizes the position of the target object well; thus, obtaining the distance between the target object and the target device from the center point coordinate characterizes the relative positions of the target object and the target device more accurately.
With reference to the fourth possible implementation of the first aspect, an embodiment of the present application provides a fifth possible implementation of the first aspect, in which the step of computing the first distance between the target object and the target device according to the center point coordinate comprises:
obtaining the two horizontal-direction coordinate values from the center point coordinate; and
computing the horizontal distance between the target object and the target device from the two horizontal-direction coordinate values, as the first distance.
In the above embodiment, since the device's effect on the safety of nearby objects is mainly reflected in the horizontal distance, using the horizontal distance as the distance between the target object and the target device avoids the influence of the vertical component on the distance. Objects within the set-value range are thereby better identified, improving the safety of personnel who share the working environment with the device.
With reference to the fourth possible implementation of the first aspect, an embodiment of the present application provides a sixth possible implementation of the first aspect, in which the step of computing the center point coordinate of the target object from the target three-dimensional coordinate range comprises:
performing bounding-box fitting on the target three-dimensional coordinate range, to obtain a spatial cuboid enclosing the target object; and
computing the center coordinate of the spatial cuboid, and using the center coordinate of the spatial cuboid as the center point coordinate of the target object.
In the above embodiment, bounding-box fitting first maps the target three-dimensional coordinate range into a geometric body whose center point is convenient to compute, and the center coordinate is then computed from the obtained geometric body. The center coordinate obtained from the fitted geometric body also represents the center of the target object well.
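With an axis-aligned cuboid (one simple choice of bounding-box fit; the application does not fix the fitting method), the center is just the midpoint of the per-axis extrema of the coordinate range:

```python
import numpy as np

def cuboid_center(coord_range):
    """Fit an axis-aligned cuboid to the target 3-D coordinate range and
    return its center as the target object's center point coordinate."""
    pts = np.asarray(coord_range, dtype=float)   # N x 3 points of the range
    lo = pts.min(axis=0)                         # cuboid corner (min bounds)
    hi = pts.max(axis=0)                         # cuboid corner (max bounds)
    return (lo + hi) / 2.0
```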
With reference to the first aspect, an embodiment of the present application provides a seventh possible implementation of the first aspect, in which the method further comprises: generating warning information when the first distance is less than the set value.
In the above embodiment, the warning information can remind the operator or the target object of the potential danger that may be present, thereby effectively reducing risk.
With reference to the seventh implementation of the first aspect, an embodiment of the present application provides an eighth possible implementation of the first aspect, in which the target device is an excavator including an excavator arm, and the step of generating warning information when the first distance is less than the set value comprises:
generating warning information when the first distance is less than the longest extension length of the excavator arm.
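The eighth implementation's warning rule can be combined with the first-aspect control rule in one decision step. A sketch, under the assumption that both thresholds are compared against the same first distance:

```python
def decide_action(first_distance, arm_reach, set_value):
    """Combine the warning rule with the stop/slow rule.

    arm_reach : the longest extension length of the excavator arm.
    Returns (action, warn): action is 'run' or 'stop_or_slow'; warn is True
    when the target is within the arm's longest extension length.
    """
    warn = first_distance < arm_reach            # eighth implementation
    action = "stop_or_slow" if first_distance < set_value else "run"
    return action, warn
```

Note that with `arm_reach > set_value` a warning can be raised while the device keeps running, matching the intent of warning before stopping.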
In a second aspect, an embodiment of the present application further provides an apparatus control device, comprising:
a first acquisition module, configured to acquire first image data of a limited area corresponding to a target device;
a second acquisition module, configured to acquire second image data of the limited area of the target device;
a matching module, configured to, when a target object is detected in the first image data, match the target image region where the target object is located against each point in the second image data, to obtain a target three-dimensional coordinate range of the target image region;
an obtaining module, configured to obtain a first distance between the target object and the target device according to the target three-dimensional coordinate range; and
a control module, configured to control the target device to stop operating or to reduce its operating rate when the first distance is less than a set value.
In a third aspect, an embodiment of the present application further provides an excavator, comprising a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor. When the electronic device runs, the processor communicates with the memory via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the method in the first aspect or in any possible implementation of the first aspect.
To make the above objects, features, and advantages of the present application clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the application and are therefore not to be regarded as limiting its scope. For those of ordinary skill in the art, other relevant drawings may be obtained from these drawings without creative effort.
Fig. 1 is a block diagram of the excavator provided by the embodiments of the present application.
Fig. 2 is a flow chart of the apparatus control method provided by the embodiments of the present application.
Fig. 3 is a detailed flow chart of step S203 of the apparatus control method provided by the embodiments of the present application.
Fig. 4 is a functional block diagram of the apparatus control device provided by the embodiments of the present application.
Specific embodiment
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. The components of the embodiments of the present application, as generally described and illustrated in the drawings herein, may be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments of the application provided in the accompanying drawings is not intended to limit the claimed scope of the application, but merely represents selected embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
It should also be noted that similar labels and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined and explained in subsequent drawings. Meanwhile, in the description of the present application, the terms "first", "second" and the like are used only to distinguish descriptions and are not to be understood as indicating or implying relative importance.
In the construction machinery field, equipment tends to be large, and it may contain components with relatively high danger coefficients. For example, an excavator bucket facilitates construction work such as digging, but if it accidentally strikes a person, that person is placed in danger. Some equipment has an operator, and the operator stops the equipment upon seeing personnel around it; however, the equipment may be so large that the operator cannot see the surroundings well. In view of the above problems, the inventors of the present application studied the various devices in the construction machinery field.
First, since construction workers may wear clothing with special features, images of the environment around the equipment can be acquired, and recognition of the collected images realizes recognition of the equipment's surroundings. When a construction worker is recognized, the equipment is stopped, so that the equipment is prevented from colliding with the worker. However, with this approach, even if a construction worker is near the equipment, the worker may be far enough away that the normal operation of the equipment does not threaten the worker's safety; in that case, stopping the equipment greatly reduces its operating efficiency. In addition, personnel who do not wear clothing with the special features may not be recognized, so that the safety of such personnel may be subject to potential danger.
Further, the prior art also proposes that construction workers wear mobile tags, and the distance between a vehicle-mounted terminal and a mobile tag is identified from the propagation speed of wireless signals in the air. However, not every person on site has a mobile tag, so some personnel may not be detected.
In view of the above research, the embodiments of the present application provide an apparatus control method. By acquiring images of the environment around the equipment, it can be determined whether a target object (for example, a construction worker) is present in the images. The distance between the recognized target object and the target device is then calculated, so that the target device can be controlled according to the distance, thereby realizing control of the equipment's operating environment.
To facilitate understanding of the present embodiments, the mechanical equipment that executes the apparatus control method disclosed in the embodiments of the present application is first described in detail.
Embodiment one
As shown in Fig. 1, which is a block diagram of the excavator provided by the embodiments of the present application, the excavator may include: a memory 111, a storage controller 112, a processor 113, a peripheral interface 114, an input-output unit 115, an acquisition device 116, a bucket 117 and a fuselage 118. Those skilled in the art will appreciate that the structure shown in Fig. 1 is merely illustrative and does not limit the structure of the excavator 100. For example, the excavator 100 may also include more or fewer components than shown in Fig. 1, or have a configuration different from that shown in Fig. 1.
The memory 111, storage controller 112, processor 113, peripheral interface 114, input-output unit 115 and acquisition device 116 are directly or indirectly electrically connected to one another, to realize the transmission or interaction of data. For example, these elements may be electrically connected to one another through one or more communication buses or signal lines.
The memory 111 may be, but is not limited to, random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), etc. The memory 111 is configured to store a program, and the processor 113 executes the program after receiving an execution instruction. The method performed by the excavator 100, as defined by the flow disclosed in any embodiment of the present application, may be applied in the processor 113 or realized by the processor 113.
The processor 113 may be an integrated circuit chip having signal processing capability. The processor 113 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. It may implement or execute each method, step and logic diagram disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may also be any conventional processor, etc.
The peripheral interface 114 couples various input/output devices to the processor 113 and the memory 111. In some embodiments, the peripheral interface 114, the processor 113 and the storage controller 112 may be realized in a single chip. In some other embodiments, they may each be realized by an independent chip.
The input-output unit 115 is configured to provide input data to a user. The input-output unit may be, but is not limited to, keys on the device, a mobile remote control device connected to the excavator, etc.
The acquisition device 116 is configured to acquire environmental data around the excavator. The acquisition device 116 may be an industrial camera, a lidar, a binocular camera, etc. In one embodiment, the excavator may include two acquisition devices: a first acquisition device and a second acquisition device. The first acquisition device and the second acquisition device may be mounted on the same horizontal line, or on the same vertical line. A suitable mounting manner may be chosen according to the type of the acquisition devices 116.
Further, the excavator may also include more units, for example, a display unit. The display unit provides an interactive interface (such as a user interface) between the excavator 100 and a user, or is used to display image data to a user. In this embodiment, the display unit may be a liquid crystal display or a touch display. A touch display may be a capacitive or resistive touch screen supporting single-point and multi-point touch operation. Supporting single-point and multi-point touch operation means that the touch display can sense touch operations generated simultaneously at one or more positions on the touch display, and the sensed touch operations are handed to the processor 113 for calculation and processing.
Embodiment two
Referring to Fig. 2, which is a flow chart of the apparatus control method provided by the embodiments of the present application. In some embodiments, the apparatus control method in this embodiment is applied to the excavator shown in Fig. 1. The detailed flow shown in Fig. 2 is described in detail below.
Step S201: acquire first image data of the limited area corresponding to the target device.
The first image data may be two-dimensional image data; it may also be video data.
In an optional embodiment, the first image data may be acquired by a camera, for example, an industrial camera or another general-purpose camera capable of image acquisition.
Step S202: acquire second image data of the limited area of the target device.
The execution order of steps S201 and S202 is not limited to the order shown in Fig. 2, which is merely schematic. For example, step S201 may be executed before step S202, after step S202, or simultaneously with it.
The second image data may be two-dimensional image data; it may also be video data; it may also be three-dimensional point cloud data.
If the second image data is image data or video data, the device that acquires the second image data may be the same kind of device as the one that acquires the first image data. If the second image data is three-dimensional point cloud data, the device that acquires the second image data may be a three-dimensional sensor capable of obtaining the distance between a pixel and the acquisition device; for example, the three-dimensional sensor may be a lidar, a depth camera, etc.
Optionally, both the first image data and the second image data may be two-dimensional image data. In this case, the first acquisition device that acquires the first image data and the second acquisition device that acquires the second image data may be mounted on the same horizontal line, so that the first and second acquisition devices form a binocular camera for acquiring two-dimensional images of the surroundings, and the two-dimensional images are matched to obtain three-dimensional point cloud data. Optionally, the first image data and the second image data may be matched line by line for binocular disparity using the reprojectImageTo3D function in the OpenCV open-source vision library to obtain the three-dimensional point cloud data.
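The depth triangulation behind this disparity-to-point-cloud step can be sketched for an idealized rectified stereo pair. The focal length `f` (in pixels), baseline `B`, and principal point `(cx, cy)` below are assumed example values, not values from the application; OpenCV's reprojectImageTo3D applies the equivalent mapping via the rectification Q matrix.

```python
def disparity_to_point(u, v, d, f, B, cx, cy):
    """Triangulate one pixel of a rectified stereo pair.

    u, v : pixel coordinates in the left image.
    d    : disparity in pixels (left x minus right x), must be > 0.
    f    : focal length in pixels; B : baseline in metres.
    Returns (X, Y, Z) in the left camera frame.
    """
    Z = f * B / d               # depth from disparity
    X = (u - cx) * Z / f        # back-project through the pinhole model
    Y = (v - cy) * Z / f
    return X, Y, Z
```

Applying this per pixel over a disparity map yields the three-dimensional point cloud data referred to in the text.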
Optionally, the first image data may be two-dimensional image data and the second image data may be three-dimensional point cloud data. In this case, the first acquisition device that acquires the first image data and the second acquisition device that acquires the second image data may be mounted on the same vertical line. In an optional example, the first acquisition device and the second acquisition device may be mounted in an industrial waterproof cylinder, for example with the second acquisition device fixed in the upper part and the first acquisition device fixed in the lower part of the cylinder, so that the first and second acquisition devices lie on the same vertical line.
Before step S203, the first image data may be detected by a neural network model, to detect whether a target object is present in the first image data.
The neural network model may be obtained by training on a training data set.
The training data set may be acquired using the first acquisition device. The training data includes pictures of construction workers and images of other personnel. The training data may also include images shot under different conditions, where the different conditions may include different weather, such as sunny, rainy, foggy and snowy days; different time periods, such as morning, noon and evening within a day; and different illumination. Further, the construction-worker positions in the image data collected by the first acquisition device may be annotated using the labelme annotation tool, obtaining coordinate annotation data of the two-dimensional bounding rectangle of the position of objects such as construction workers in each image. The training data set may include a training set, a validation set and a test set. The proportions of the training set, validation set and test set may differ; optionally, the proportion of the training set may be greater than those of the validation set and the test set. For example, the ratio of training set, validation set and test set may be 6:2:2; as another example, it may be 5:3:2.
The neural network model may correspond to a model trained with the FastBox algorithm. Relative to the traditional Yolo algorithm, the FastBox algorithm adds an ROI-Pooling (Region of Interest Pooling) structure.
The FastBox algorithm is divided into an encoder and a decoder. The encoder performs feature extraction on construction workers using VGG16. The decoder first passes the encoded features through a 1x1 convolutional layer with multiple filters (for example, 500 filters), producing a tensor of size 39x12x500, and then outputs 6 channels of size 39x12 through a 1x1 convolutional layer. The first two channels of the tensor generate the bounding box: the output of one channel is the two-dimensional box, and the other channel is the class of the object within that box. The last four channels of the tensor represent the bounding-box coordinates of the two-dimensional box: the maximum value on the first axis, the minimum value on the first axis, the maximum value on the second axis, and the minimum value on the second axis.
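Under the channel layout described above (box confidence and class in the first two channels, axis extrema in the last four), a 39x12x6 output tensor can be decoded into two-dimensional boxes roughly as follows. The confidence threshold and the exact channel ordering are illustrative assumptions, not details fixed by the application:

```python
import numpy as np

def decode_cells(output, conf_threshold=0.5):
    """Decode a 39 x 12 x 6 detection tensor into 2-D boxes.

    Channel 0: box confidence; channel 1: object class score (assumed layout).
    Channels 2-5: max on axis 1, min on axis 1, max on axis 2, min on axis 2.
    Returns a list of (x_min, y_min, x_max, y_max, cls) per confident cell.
    """
    boxes = []
    for cell in output.reshape(-1, output.shape[-1]):
        conf, cls, x_max, x_min, y_max, y_min = cell
        if conf >= conf_threshold:
            boxes.append((x_min, y_min, x_max, y_max, cls))
    return boxes
```

In practice a non-maximum-suppression pass would follow, since neighbouring cells of the 39x12 grid can fire on the same worker.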
The training set, validation set and test set are respectively input to the model to be trained for calculation, and after each calculation the parameters of the model to be trained are determined, so as to form a neural network model that can be used to identify whether a target object is present in the first image data.
Construction-worker model training is carried out with the above FastBox algorithm, using the Adam optimizer and applying dropout of 0.5 to all 1x1 convolutions, obtaining a construction-worker recognition model with an accuracy of 94.8%.
By feeding the first image data collected by the first acquisition device into the neural network model in real time, the two-dimensional bounding-rectangle coordinates of the target object (for example, construction personnel) in the first image data are obtained.
Optionally, the initial model to be trained for the above neural network model may also be one formed by other object-detection algorithms such as Yolo, Faster R-CNN, or SSD (Single Shot MultiBox Detector).
In step S203, when a target object is detected in the first image data, the target image region where the target object is located is matched against each point in the second image data, to obtain the target three-dimensional coordinate range of the target image region.
Whether a target object exists on the periphery of the target device can be judged through the identification of the first image data described above. The target object may be a person, for example construction personnel or other visitors; it may also be an animal, or a fixed structure, a pile of construction material, and so on.
In one embodiment, the second image data may be three-dimensional point cloud data. As shown in Fig. 3, step S203 may include step S2031 and step S2032.
In step S2031, the three-dimensional point cloud data is projected, through a coordinate conversion, into the coordinate system corresponding to the first image data, to obtain a projection point set.
The projection point set is the set of coordinate points obtained by projecting the second image data into the two-dimensional image space.
Step S2031 can be implemented as follows: perform coordinate conversion on the three-dimensional point cloud data using the transition matrix between the images acquired by the first acquisition device and the second acquisition device, to obtain the projection point set.
The transition matrix is determined from a first group of calibration points acquired with the first acquisition device and a second group of calibration points acquired with the second acquisition device.
Specifically, before the transition matrix is determined, the first acquisition device may first be calibrated. Optionally, the first acquisition device can be calibrated with the chessboard calibration method, to obtain the intrinsic matrix and distortion matrix of the first acquisition device.
In order to obtain the three-dimensional coordinate range of the target object by matching the first image data with the second image data, the first acquisition device needs to be fused with the second acquisition device. The fusion of the two acquisition devices may include two parts: fusion in space and matching in time.
Fusion in space means that each pixel within the target image range of the target object recognized in the first image data can be matched to a unique corresponding data point in the three-dimensional scene corresponding to the point cloud of the second image data. The spatial fusion of the two acquisition devices is described below taking a laser radar as the second acquisition device.
Suppose the three-dimensional coordinate of a calibration reference point in the coordinate system of the laser radar is M(X, Y, Z), and its image coordinate in the coordinate system of the first acquisition device is m(u, v). The transformation between the two coordinate systems can be expressed as:
Zc · [u, v, 1]^T = P3×4 · [X, Y, Z, 1]^T
where P3×4 denotes the projective transformation matrix from the laser radar coordinate system to the coordinate system of the first acquisition device, and Zc is an arbitrary scale factor. Eliminating Zc yields a linear system in the entries of
p' = (p11, p12, p13, p14, p21, p22, p23, p24, p31, p32, p33, p34)^T,
so at least 6 calibration points are needed to solve for the projection matrix P3×4. The transformation matrix P3×4 can then be computed from the first group of calibration points acquired with the first acquisition device and the second group of calibration points acquired with the second acquisition device. The first and second groups of calibration points correspond one to one, and the first group contains at least six points.
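Once P3×4 is known, projecting a lidar point into the image is a homogeneous matrix product followed by division by Zc. A sketch with an illustrative matrix (the focal length 100 and principal point (50, 50) are assumptions for the example, not calibrated values):

```python
import numpy as np

def project_lidar_point(P, xyz):
    """Project a lidar point (X, Y, Z) to pixel (u, v) via the 3x4 matrix P."""
    X = np.append(np.asarray(xyz, dtype=float), 1.0)   # homogeneous [X, Y, Z, 1]
    uvw = P @ X                                        # [Zc*u, Zc*v, Zc]
    return uvw[:2] / uvw[2]                            # divide out the scale Zc

# Illustrative P: pinhole intrinsics f=100, principal point (50, 50), no rotation.
P = np.array([[100.0,   0.0, 50.0, 0.0],
              [  0.0, 100.0, 50.0, 0.0],
              [  0.0,   0.0,  1.0, 0.0]])
u, v = project_lidar_point(P, (1.0, 2.0, 10.0))
```

Applying this to every point of the cloud yields the projection point set of step S2031.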
In one example, the calibration process can use the autoware_camera_lidar_calibrator() function in the autoware function set to read the calibration-point data collected by the first acquisition device and the laser radar and display them simultaneously in rviz. A chessboard is placed in front of the laser radar; a pixel in the image for which a corresponding point can be found in the point cloud is clicked, and the corresponding three-dimensional point in the lidar view is picked with the Publish Point tool. This operation is repeated at no fewer than nine different positions. After calibration is completed, the transition matrix between the two-dimensional pixels in the coordinate system of the first acquisition device and the three-dimensional points in the laser radar coordinate system is obtained.
Matching in time means ensuring that the laser radar and the camera are synchronized when acquiring data. A thread is created for the laser radar and for the camera respectively; the two sensors are made to acquire at the same fixed interval and each acquisition is stamped with the same GPS time (alternatively, the time of another navigation system can be used). The laser radar data and camera data are thereby synchronized in time, which is the matching in time.
Specifically, the data of the first acquisition device and the second acquisition device can be matched in time through the following steps: setting the same time system for the first acquisition device and the second acquisition device; configuring a first control thread for the first acquisition device; configuring a second control thread for the second acquisition device; and, through the first control thread and the second control thread, controlling the interval between two adjacent acquisitions of the first acquisition device and the second acquisition device to be identical, so that the first image data collected by the first acquisition device and the second image data collected by the second acquisition device are synchronized in time.
The first control thread and the second control thread acquire image data of the periphery of the target device at the set time interval.
By controlling the two acquisition devices with threads and setting the same time system for them, the image data collected by the two devices can be matched in time, which in turn makes the detection result of the target object more accurate.
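The two control threads can be sketched as below; the `Barrier` stands in for the shared trigger at the common interval, and `time.monotonic()` for the shared GPS time base (both are assumptions for the example):

```python
import threading
import time

INTERVAL = 0.05                       # common acquisition interval
N_FRAMES = 3
stamps = {"camera": [], "lidar": []}
barrier = threading.Barrier(2)

def capture(sensor):
    # Each control thread triggers its sensor at the same interval; the barrier
    # releases both threads together so both frames get (nearly) the same stamp.
    for _ in range(N_FRAMES):
        barrier.wait()
        stamps[sensor].append(time.monotonic())
        time.sleep(INTERVAL)

threads = [threading.Thread(target=capture, args=(s,)) for s in stamps]
for t in threads:
    t.start()
for t in threads:
    t.join()

skews = [abs(a - b) for a, b in zip(stamps["camera"], stamps["lidar"])]
```

The resulting per-frame skew is far below the acquisition interval, so frames can be paired by timestamp.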
In step S2032, the sub-projection point set corresponding to the target image region is mapped back into the three-dimensional coordinate system corresponding to the three-dimensional point cloud data, to obtain the target three-dimensional coordinate range of the target image region.
Specifically, the sub-projection point set can be mapped back into the three-dimensional coordinate system corresponding to the point cloud data using the recovery matrix corresponding to the above transformation matrix.
Optionally, step S203 can also be implemented as: performing coordinate conversion on the pixels in the target image region of the first image data, to obtain the three-dimensional coordinates of the pixels in the target image region; and matching the three-dimensional coordinates of the pixels in the target image region with the three-dimensional point cloud data in the second image data, to determine the target three-dimensional coordinate range of the pixels in the target image region.
Here, the three-dimensional coordinates are coordinates in the three-dimensional coordinate system in which the points of the second image data are located.
By converting the coordinates of the pixels in the target image region, the target image region is projected into the second image data and the target three-dimensional coordinate range of the target object is obtained. The spatial coverage of the target object can thus be obtained with relatively little computation, realizing a mapping between objects of different dimensions.
By using the transition matrix corresponding to the first acquisition device and the second acquisition device to convert the coordinates of the pixels in the target image region, the resulting three-dimensional coordinates better match the coordinate system of the second image data. Moreover, determining the transition matrix from the first group of calibration points acquired with the first acquisition device and the second group acquired with the second acquisition device ties the determined matrix to the two acquisition devices.
In another embodiment, the second image data may be two-dimensional image data. Three-dimensional point cloud data can then be obtained from the first image data and the second image data by binocular depth estimation. Optionally, the binocular depth estimation may use the SAD (Sum of Absolute Differences) algorithm, the BM (Block Matching) algorithm, the SGBM (Semi-Global Block Matching) algorithm, the PSMNet (Pyramid Stereo Matching Network) algorithm, and so on.
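The SAD principle can be illustrated on a single rectified row (the toy data and search range are assumptions; real SAD matching runs over 2-D windows of both images):

```python
import numpy as np

def sad_disparity(left, right, x, window=2, max_disp=4):
    """Disparity at column x of one rectified row: slide a window over the
    right row and keep the shift with the smallest sum of absolute differences."""
    patch = left[x - window:x + window + 1]
    best_sad, best_d = None, 0
    for d in range(min(max_disp, x - window) + 1):
        cand = right[x - d - window:x - d + window + 1]
        sad = float(np.abs(patch - cand).sum())
        if best_sad is None or sad < best_sad:
            best_sad, best_d = sad, d
    return best_d

row = np.array([0, 1, 5, 9, 5, 1, 0, 0, 0, 0], dtype=float)
right = row
left = np.roll(row, 2)    # the feature sits 2 pixels further right in the left view
d = sad_disparity(left, right, x=5)
```

Depth then follows from disparity via the stereo baseline and focal length, giving the point cloud used in the steps above.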
In step S204, the first distance between the target object and the target device is obtained according to the target three-dimensional coordinate range.
The target three-dimensional coordinate range can be expressed in a coordinate system whose origin is the position of the second acquisition device. Optionally, the distance between any point of the target three-dimensional coordinate range of the target object and the coordinate origin can be taken as the first distance.
In an optional embodiment, the distance between the origin and the point of the target three-dimensional coordinate range nearest to the origin can be calculated and taken as the first distance.
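The nearest-point variant just described can be sketched as follows (the point list is illustrative):

```python
import math

def first_distance(target_range):
    """First distance: distance from the sensor origin to the nearest point
    of the target three-dimensional coordinate range."""
    return min(math.dist(p, (0.0, 0.0, 0.0)) for p in target_range)

points = [(3.0, 4.0, 0.0), (6.0, 8.0, 0.0), (1.0, 2.0, 2.0)]
d = first_distance(points)
```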
Optionally, the step of obtaining the first distance between the target object and the target device according to the target three-dimensional coordinate range includes:
calculating the center point coordinate of the target object from the target three-dimensional coordinate range; and calculating the first distance between the target object and the target device from the center point coordinate.
Illustratively, the center point coordinate may denote the coordinate in the target three-dimensional coordinate range that is nearest to the target device.
The center point of the target object better characterizes its position, so obtaining the distance between the target object and the target device through the distance between the center point coordinate and the target device characterizes the relative positions of the target object and the target device more accurately.
Since the objects relevant to the safety of the device mainly affect it through their distance in the horizontal direction, the horizontal distance can be used as the distance between the target object and the target device. Step S204 may then include: obtaining the two coordinate values in the horizontal direction from the center point coordinate; and calculating the horizontal distance between the target object and the target device from those two coordinate values, as the first distance.
In this way the influence of the distance in the vertical direction is avoided, so that objects within the set range are better identified, improving the safety of personnel who share the working environment with the device.
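Dropping the vertical component as described can be sketched as follows (which axis is vertical is an assumption of the example; here z):

```python
import math

def horizontal_first_distance(center, device_xy=(0.0, 0.0)):
    """Keep only the two horizontal coordinates of the centre point (x and y,
    assuming z is the vertical axis) so height differences are ignored."""
    x, y, _z = center
    return math.hypot(x - device_xy[0], y - device_xy[1])

d = horizontal_first_distance((3.0, 4.0, 10.0))
```

A point 10 m overhead but 5 m away horizontally thus yields a first distance of 5, not about 11.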
Optionally, the step of calculating the center point coordinate of the target object from the target three-dimensional coordinate range includes: performing frame fitting on the target three-dimensional coordinate range, to obtain the spatial cuboid in which the target object lies; and calculating the center coordinate of the spatial cuboid, taking the center coordinate of the spatial cuboid as the center point coordinate of the target object.
The frame fitting may use the L-shape and minimum-area-rectangle method. First, the point cloud of the target three-dimensional coordinate range is obtained, and each of its points is traversed to find the two points whose ground projections are farthest apart; the line between these two points is taken as the diagonal of the bounding rectangle. The perpendicular distance of the other points from this diagonal is then computed, and the point with the largest perpendicular distance is taken as a frame point of the bounding rectangle. A projected rectangle is determined from the two diagonal points and the frame point, and the three-dimensional frame is fitted by taking the highest z value of the projected points as the height.
Through frame fitting, the target three-dimensional coordinate range is first mapped into a geometric shape whose center point is easy to compute, and the center coordinate is then obtained from the resulting shape. In addition, the center coordinate computed from the fitted shape better represents the center of the target object.
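A deliberately simplified sketch of the fitting idea (it keeps only the diagonal and height steps, omitting the third frame point, so it is not the full L-shape method):

```python
import itertools
import math

def box_center(points):
    """Simplified frame fitting: the two ground-projected points farthest apart
    give the rectangle diagonal, the largest z gives the box height, and the
    centre is the diagonal midpoint at half that height."""
    ground = [(x, y) for x, y, _ in points]
    a, b = max(itertools.combinations(ground, 2),
               key=lambda pair: math.dist(pair[0], pair[1]))
    height = max(z for _, _, z in points)
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0, height / 2.0)

cloud = [(0, 0, 0), (2, 0, 0), (0, 1, 0), (2, 1, 0), (0, 0, 3), (2, 1, 3)]
center = box_center(cloud)
```

For the axis-aligned test cloud above the centre lands in the middle of the 2x1x3 cuboid.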
Further, the first distance can be calculated directly from the target three-dimensional coordinate range of the target object. Considering that environmental data in the environment of the target device may affect the identification of the target object, some preprocessing can also be applied to the points within the target three-dimensional coordinate range before the first distance is calculated.
First, ground removal can be performed; it distinguishes untrackable objects, such as the terrain background, from the objects of interest. The specific process is as follows: first, a coordinate grid map is established; then, points whose height value exceeds a set value are filtered out on the basis of the grid map; Gaussian filtering is applied, or the gradient between the points of adjacent cells of the same channel is computed and discontinuous points are filtered out by gradient; finally, median filtering and outlier filtering are used to separate out the ground and overhead points.
Then, the points remaining after ground removal are clustered by Euclidean distance. Euclidean-distance clustering clusters the discrete point cloud of the three-dimensional coordinate range corresponding to the target image region, merging neighbouring discrete points that are close to each other into one point. The Euclidean clustering makes the three-dimensional coordinate range corresponding to the target object more cohesive.
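A crude sketch of the two preprocessing stages (a plain height threshold stands in for the grid-map/gradient pipeline, and the clustering is greedy single-link, which can over-merge bridged clusters; both are simplifications):

```python
import math

def remove_ground(points, ground_z=0.2):
    """Stand-in for the grid-map pipeline: drop everything at ground height."""
    return [p for p in points if p[2] > ground_z]

def euclidean_cluster(points, tol=1.0):
    """Greedy single-link Euclidean clustering: a point joins the first
    cluster containing a neighbour within tol, otherwise starts a new one."""
    clusters = []
    for p in points:
        for cluster in clusters:
            if any(math.dist(p, q) <= tol for q in cluster):
                cluster.append(p)
                break
        else:
            clusters.append([p])
    return clusters

cloud = [(0.0, 0.0, 0.0), (5.0, 5.0, 0.1),      # ground points
         (1.0, 1.0, 1.0), (1.2, 1.0, 1.0),      # one object
         (8.0, 8.0, 1.0)]                        # another object
clusters = euclidean_cluster(remove_ground(cloud))
```

The two ground points are discarded and the remaining three points form two clusters, one per object.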
In step S205, when the first distance is less than the set value, the target device is controlled to stop the working state or to reduce the operating rate.
The set value may be a safe distance set by the user, or a safety value set according to the dimensions of the working parts of the target device.
In one example, the target device may be an excavator, and the set value is not less than the distance between the edge of the bucket, with the bucket fully extended, and the second acquisition device. The set value may also take into account the actual tonnage of the excavator, the length of the mechanical arm, and so on.
In one example, the target device may be a multifunctional excavator. The multifunctional excavator may include a decision module and a feedback control module. The decision module can be used to process and recognize the image data collected by the acquisition devices and, according to the recognition result, control whether the excavator stops or reduces operation. The feedback control module can then control the excavator bucket according to the result obtained by the decision module. When the calculated first distance is less than the set value, the decision module of the excavator sends the emergency position code 0x0001 to the feedback control module; the solenoid valve of the excavator is opened, the excavator stops working, and a buzzer sounds a warning. The excavator can also be stopped through a switch valve. When the calculated first distance is greater than the set value, or no object is present in the first image data, the decision module sends the position code 0x0000 to the feedback control module and does not intervene in the normal operation of the excavator. Further, if the excavator was controlled to stop at a first moment, then, when no object appears in the first image data collected at a second moment, or the calculated first distance is greater than the set value, the excavator can be controlled to resume work.
In the manner provided by the embodiments of the present application, detection is performed first; if a target object is detected in the first image data, the first distance between the target object and the device is then judged, and only when the first distance is less than the set value is the target object considered potentially dangerous. Through this two-round identification, the detection of the dangerous state of the target object can be improved. Further, the working state of the target device is changed only when the target object may be in potential danger. This improves the safety of every object in the environment while the target device is running and, compared with stopping the device as soon as the clothing of construction personnel is recognized, reduces unnecessary interruptions of the working state, thereby also improving the working efficiency of the target device.
On the basis of what is shown in Fig. 2, the apparatus control method in this embodiment may further include: generating warning information when the first distance is less than the set value.
The warning information may include a flashing light warning, a buzzer alarm, and the like. Of course, the warning information may also be a voice prompt alarm, for example the audio alert "You are now in a dangerous position."
Illustratively, the target device is an excavator that includes an excavator arm, and the set value may be the longest extension length of the excavator arm. Of course, the set value may also be larger than the longest extension length of the excavator arm. Optionally, the set value may be set according to factors such as the operating state, running track, and operating flexibility of the target device.
The warning information can remind the operator, or the target object, of the potential danger, thereby effectively reducing risk.
Embodiment three
Based on the same inventive concept, the embodiments of the present application further provide an apparatus control device corresponding to the apparatus control method. Since the principle by which the device in the embodiments of the present application solves the problem is similar to that of the above apparatus control method, the implementation of the device may refer to the implementation of the method, and repeated descriptions are omitted.
Referring to Fig. 4, which is a functional block diagram of the apparatus control device provided by the embodiments of the present application. The apparatus control device in this embodiment is used to execute the steps of the method in embodiment two. The apparatus control device in this embodiment includes: a first acquisition module 301, a second acquisition module 302, a matching module 303, an obtaining module 304 and a control module 305, wherein:
the first acquisition module 301 is configured to acquire first image data in the limited area corresponding to the target device;
the second acquisition module 302 is configured to acquire second image data of the limited area of the target device;
the matching module 303 is configured to, when a target object is detected in the first image data, match the target image region where the target object is located with each point in the second image data, to obtain the target three-dimensional coordinate range of the target image region;
the obtaining module 304 is configured to obtain the first distance between the target object and the target device according to the target three-dimensional coordinate range; and
the control module 305 is configured to, when the first distance is less than the set value, control the target device to stop the working state or to reduce the operating rate.
In one possible embodiment, the matching module 303 is further configured to:
perform coordinate conversion on the pixels in the target image region of the first image data, to obtain the three-dimensional coordinates of the pixels in the target image region, the three-dimensional coordinates being coordinates in the three-dimensional coordinate system in which the points of the second image data are located; and
match the three-dimensional coordinates of the pixels in the target image region with the three-dimensional point cloud data in the second image data, to determine the target three-dimensional coordinate range of the pixels in the target image region.
In one possible embodiment, the matching module 303 is further configured to:
perform coordinate conversion on the three-dimensional point cloud data using the transition matrix between the images acquired by the first acquisition device and the second acquisition device, to obtain the projection point set, the first acquisition device being the device that acquires the first image data and the second acquisition device being the device that acquires the second image data;
wherein the transition matrix is determined from a first group of calibration points acquired with the first acquisition device and a second group of calibration points acquired with the second acquisition device.
In one possible embodiment, the apparatus control device further includes a configuration module, configured to:
set the same time system for the first acquisition device and the second acquisition device;
configure a first control thread for the first acquisition device;
configure a second control thread for the second acquisition device; and
through the first control thread and the second control thread, control the interval between two adjacent data acquisitions of the first acquisition device and the second acquisition device to be identical, so that the first image data acquired by the first acquisition device and the second image data acquired by the second acquisition device are synchronized in time.
In one possible embodiment, the obtaining module 304 is further configured to:
calculate the center point coordinate of the target object from the target three-dimensional coordinate range; and
calculate the first distance between the target object and the target device from the center point coordinate.
In one possible embodiment, the obtaining module 304 is further configured to:
obtain the two coordinate values in the horizontal direction from the center point coordinate; and
calculate the horizontal distance between the target object and the target device from those two coordinate values, as the first distance.
In one possible embodiment, the obtaining module 304 is further configured to:
perform frame fitting on the target three-dimensional coordinate range, to obtain the spatial cuboid in which the target object lies; and
calculate the center coordinate of the spatial cuboid, taking the center coordinate of the spatial cuboid as the center point coordinate of the target object.
In one possible embodiment, the apparatus control device further includes an alarm module, configured to generate warning information when the first distance is less than the set value.
In addition, the embodiments of the present application further provide a computer-readable storage medium on which a computer program is stored; when the computer program is run by a processor, the steps of the apparatus control method described in the above method embodiments are executed.
The computer program product of the apparatus control method provided by the embodiments of the present application includes a computer-readable storage medium storing program code; the instructions included in the program code can be used to execute the steps of the apparatus control method described in the above method embodiments, to which reference may be made for details not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed device and method may also be implemented in other ways. The device embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the accompanying drawings show the possible architectures, functions and operations of the devices, methods and computer program products according to multiple embodiments of the present application. In this regard, each box in a flowchart or block diagram may represent a module, a program segment or a part of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the boxes may occur in an order different from that marked in the drawings. For example, two consecutive boxes may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each box in the block diagrams and/or flowcharts, and combinations of boxes in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified function or action, or by a combination of dedicated hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist alone, or two or more modules may be integrated to form an independent part.
If the functions are implemented in the form of software functional modules and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or some of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk or an optical disk.
It should be noted that, herein, relational terms such as first and second are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes that element.
The foregoing are merely preferred embodiments of the present application and are not intended to limit it; for those skilled in the art, various modifications and changes may be made to the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present application shall be included within its scope of protection. It should also be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further defined and explained in subsequent drawings.
The above are only specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any change or replacement that can readily occur to those familiar with the technical field within the technical scope disclosed in the present application shall be covered by the scope of protection of the present application. Therefore, the scope of protection of the present application shall be determined by the scope of the claims.
Claims (10)
1. An apparatus control method, characterized by comprising:
acquiring first image data in a limited area corresponding to a target device;
acquiring second image data of the limited area of the target device;
when a target object is detected in the first image data, matching a target image region where the target object is located with each point in the second image data, to obtain a target three-dimensional coordinate range of the target image region;
obtaining a first distance between the target object and the target device according to the target three-dimensional coordinate range; and
when the first distance is less than a set value, controlling the target device to stop a working state or to reduce an operating rate.
2. The method according to claim 1, characterized in that the second image data comprises three-dimensional point cloud data, and the step of matching the target image region where the target object is located with each point in the second image data, to obtain the target three-dimensional coordinate range of the target image region, comprises:
projecting the three-dimensional point cloud data, through a coordinate conversion, into a coordinate system corresponding to the first image data, to obtain a projection point set; and
mapping a sub-projection point set corresponding to the target image region back into a three-dimensional coordinate system corresponding to the three-dimensional point cloud data, to obtain the target three-dimensional coordinate range of the target image region.
3. The method according to claim 2, wherein the step of performing coordinate conversion on the pixels in the target image region of the first image data to obtain the three-dimensional coordinates of the pixels in the target image region comprises:
performing coordinate conversion on the three-dimensional point cloud data using a transition matrix between the images acquired by a first acquisition device and a second acquisition device, so as to obtain the projection point set, wherein the first acquisition device is the device that acquires the first image data, and the second acquisition device is the device that acquires the second image data;
wherein the transition matrix is determined from a first set of calibration points acquired by the first acquisition device and a second set of calibration points acquired by the second acquisition device.
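The patent does not specify how the transition matrix is solved from the two calibration point sets; a common choice, shown here as an assumption, is a linear least-squares fit over paired calibration points in homogeneous form:

```python
import numpy as np

def estimate_transition_matrix(points_a, points_b):
    """Estimate a transition matrix T such that T @ a ~= b for paired
    calibration points: points_a from the first acquisition device,
    points_b from the second, each row a homogeneous point (x, y, 1).
    Solved by least squares; this is one standard method, not
    necessarily the one used in the patent.
    """
    A = np.asarray(points_a, dtype=float)   # (N, 3)
    B = np.asarray(points_b, dtype=float)   # (N, 3)
    # b_i = T a_i for every pair  <=>  B = A T^T, so solve for T^T.
    T_t, *_ = np.linalg.lstsq(A, B, rcond=None)
    return T_t.T
```

With at least three non-collinear calibration points an affine transition matrix is recovered exactly; more points average out measurement noise.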
4. The method according to claim 3, further comprising:
setting an identical time system for the first acquisition device and the second acquisition device;
configuring a first control thread for the first acquisition device;
configuring a second control thread for the second acquisition device; and
controlling, through the first control thread and the second control thread, the first acquisition device and the second acquisition device so that the time interval between any two adjacent acquisitions is identical for both devices, whereby the first image data acquired by the first acquisition device and the second image data acquired by the second acquisition device are synchronized in time.
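One way to realize claim 4's per-device control threads is to schedule both threads against the same shared time base with a fixed acquisition interval, as in this sketch (device names and the frame list are illustrative):

```python
import threading
import time

def make_acquirer(device_name, interval, frames, n_frames, start_time):
    """Build a control thread for one acquisition device. Both devices'
    threads share the same time system (start_time on the monotonic
    clock) and the same fixed interval, so their acquisitions stay
    time-synchronized instead of drifting apart.
    """
    def run():
        for i in range(n_frames):
            # Sleep until the shared schedule's i-th tick (absolute
            # deadline), not for a relative delay, to avoid drift.
            deadline = start_time + i * interval
            delay = deadline - time.monotonic()
            if delay > 0:
                time.sleep(delay)
            frames.append((device_name, i))   # stand-in for a capture
    return threading.Thread(target=run)
```

Sleeping until an absolute deadline rather than for a relative duration is the detail that keeps the two devices' i-th frames aligned even when one capture takes slightly longer than the other.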
5. The method according to claim 1, wherein the step of obtaining the first distance between the target object and the target device according to the target three-dimensional coordinate range comprises:
calculating a center point coordinate of the target object according to the target three-dimensional coordinate range, the center point coordinate representing the coordinate within the target three-dimensional coordinate range that is closest to the target device; and
calculating the first distance between the target object and the target device according to the center point coordinate.
6. The method according to claim 5, wherein the step of calculating the first distance between the target object and the target device according to the center point coordinate comprises:
obtaining the two horizontal coordinate values of the center point coordinate; and
calculating the horizontal distance between the target object and the target device from the two horizontal coordinate values, the horizontal distance serving as the first distance.
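Claim 6's horizontal distance uses only the two horizontal coordinate values and ignores height; a sketch, where the device's own position (`device_point`) is an assumed input not spelled out in the claim:

```python
import math

def horizontal_distance(center_point, device_point):
    """First distance per claim 6: Euclidean distance between the
    target's center point and the target device computed from the two
    horizontal coordinate values (x, y) only; the vertical coordinate
    (z) is ignored.
    """
    dx = center_point[0] - device_point[0]
    dy = center_point[1] - device_point[1]
    return math.hypot(dx, dy)
```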
7. The method according to claim 5, wherein the step of calculating the center point coordinate of the target object according to the target three-dimensional coordinate range comprises:
performing bounding-box fitting on the target three-dimensional coordinate range to obtain a spatial cuboid enclosing the target object; and
calculating the center coordinate of the spatial cuboid, the center coordinate of the spatial cuboid serving as the center point coordinate of the target object.
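The cuboid fitting in claim 7 can be sketched with an axis-aligned bounding box; the patent does not say the cuboid is axis-aligned, so that is a simplifying assumption here:

```python
import numpy as np

def center_from_range(points_3d):
    """Fit an axis-aligned spatial cuboid around the target's 3D
    coordinate range and return the cuboid's center, used as the
    target object's center point coordinate per claim 7.
    """
    pts = np.asarray(points_3d, dtype=float)
    lo = pts.min(axis=0)        # minimal corner of the cuboid
    hi = pts.max(axis=0)        # maximal corner of the cuboid
    return (lo + hi) / 2.0      # cuboid center
```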
8. The method according to claim 1, further comprising:
generating warning information when the first distance is less than the set value.
9. The method according to claim 8, wherein the target device is an excavator comprising an excavator arm, and the step of generating warning information when the first distance is less than the set value comprises:
generating warning information when the first distance is less than the maximum extension length of the excavator arm.
10. An excavator, characterized by comprising: a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus; and when the machine-readable instructions are executed by the processor, the steps of the method according to any one of claims 1 to 9 are performed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910358054.8A CN110067274B (en) | 2019-04-29 | 2019-04-29 | Equipment control method and excavator |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910358054.8A CN110067274B (en) | 2019-04-29 | 2019-04-29 | Equipment control method and excavator |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110067274A true CN110067274A (en) | 2019-07-30 |
CN110067274B CN110067274B (en) | 2021-08-13 |
Family
ID=67369703
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910358054.8A Active CN110067274B (en) | 2019-04-29 | 2019-04-29 | Equipment control method and excavator |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110067274B (en) |
2019-04-29: Application CN201910358054.8A filed; patent CN110067274B, status Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03228929A (en) * | 1990-02-02 | 1991-10-09 | Yutani Heavy Ind Ltd | Interference avoidance device for working machine |
EP2395764B1 (en) * | 2010-06-14 | 2016-02-17 | Nintendo Co., Ltd. | Storage medium having stored therein stereoscopic image display program, stereoscopic image display device, stereoscopic image display system, and stereoscopic image display method |
CN109252563A (en) * | 2017-07-14 | 2019-01-22 | 神钢建机株式会社 | engineering machinery |
CN109472831A (en) * | 2018-11-19 | 2019-03-15 | 东南大学 | Obstacle recognition range-measurement system and method towards road roller work progress |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110766623A (en) * | 2019-10-12 | 2020-02-07 | 北京工业大学 | Stereo image restoration method based on deep learning |
CN111968102A (en) * | 2020-08-27 | 2020-11-20 | 中冶赛迪重庆信息技术有限公司 | Target equipment detection method, system, medium and electronic terminal |
CN111968102B (en) * | 2020-08-27 | 2023-04-07 | 中冶赛迪信息技术(重庆)有限公司 | Target equipment detection method, system, medium and electronic terminal |
CN113055821A (en) * | 2021-03-15 | 2021-06-29 | 北京京东乾石科技有限公司 | Method and apparatus for transmitting information |
CN113055821B (en) * | 2021-03-15 | 2023-01-31 | 北京京东乾石科技有限公司 | Method and apparatus for transmitting information |
CN113463718A (en) * | 2021-06-30 | 2021-10-01 | 广西柳工机械股份有限公司 | Anti-collision control system and control method for loader |
CN114710228A (en) * | 2022-05-31 | 2022-07-05 | 杭州闪马智擎科技有限公司 | Time synchronization method and device, storage medium and electronic device |
CN114710228B (en) * | 2022-05-31 | 2022-09-09 | 杭州闪马智擎科技有限公司 | Time synchronization method and device, storage medium and electronic device |
Also Published As
Publication number | Publication date |
---|---|
CN110067274B (en) | 2021-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110067274A (en) | Apparatus control method and excavator | |
US10970859B2 (en) | Monitoring method and device for mobile target, monitoring system and mobile robot | |
US10949995B2 (en) | Image capture direction recognition method and server, surveillance method and system and image capture device | |
Son et al. | Real-time vision-based warning system for prevention of collisions between workers and heavy equipment | |
CN108304075B (en) | Method and device for performing man-machine interaction on augmented reality device | |
CN108318043A (en) | Method, apparatus for updating electronic map and computer readable storage medium | |
CN109002039A (en) | Avoidance based reminding method, relevant apparatus and computer readable storage medium | |
CN107145851A (en) | Constructions work area dangerous matter sources intelligent identifying system | |
CN110785774A (en) | Method and system for closed loop sensing in autonomous vehicles | |
WO2019129255A1 (en) | Target tracking method and device | |
KR20130139622A (en) | Convergence security control system and method thereof | |
RU2656711C2 (en) | Method and system for detecting and tracking of moving objects based on three-dimensional sensor data | |
CN103605978A (en) | Urban illegal building identification system and method based on three-dimensional live-action data | |
Fang et al. | A sematic and prior‐knowledge‐aided monocular localization method for construction‐related entities | |
CN104964708B (en) | A kind of road surface pit detection method based on vehicle-mounted binocular vision | |
KR101992662B1 (en) | Gis 3 cctv 3 cctv | |
CN105608417A (en) | Traffic signal lamp detection method and device | |
KR101602471B1 (en) | River water level measurement and warning system. | |
CN107610393A (en) | A kind of intelligent office monitoring system | |
US20220044558A1 (en) | Method and device for generating a digital representation of traffic on a road | |
Madrid et al. | Lane departure warning for mobile devices based on a fuzzy representation of images | |
CN108225334A (en) | A kind of localization method and device based on three-dimensional live-action data | |
CN109448326A (en) | A kind of anti-monitoring system of geological disaster intelligence group based on rapid image identification | |
Sirmacek et al. | Automatic classification of trees from laser scanning point clouds | |
CN108510528A (en) | A kind of method and device of visible light and infrared image registration fusion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||