CN112631312A - Unmanned equipment control method and device, storage medium and electronic equipment

Info

Publication number
CN112631312A
CN112631312A
Authority
CN
China
Prior art keywords
image
determining
unmanned equipment
unmanned
foreground pixels
Prior art date
Legal status
Granted
Application number
CN202110248684.7A
Other languages
Chinese (zh)
Other versions
CN112631312B (en)
Inventor
张涛
董岩
夏华夏
申浩
何祎
Current Assignee
Tsinghua University
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Tsinghua University
Beijing Sankuai Online Technology Co Ltd
Application filed by Tsinghua University and Beijing Sankuai Online Technology Co Ltd
Priority to CN202110248684.7A
Publication of CN112631312A
Application granted
Publication of CN112631312B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of this specification preset different types of targets in a field to guide an unmanned device. Pixels located in the foreground area of an image collected by an image collection device are taken as foreground pixels, and number distribution histograms of the foreground pixels in the horizontal direction and the vertical direction are determined. The type of the target in the image is identified according to these histograms, the current optional driving direction of the unmanned device is determined according to the identified type, and automatic control of the unmanned device is realized according to the optional driving directions and the driving route corresponding to the task currently executed by the unmanned device. Simple, efficient, and accurate automatic control is thereby achieved through the strips preset in the field.

Description

Unmanned equipment control method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of automatic driving technologies, and in particular, to a control method and apparatus for an unmanned device, a storage medium, and an electronic device.
Background
With the development of automation and robotics, unmanned devices play an increasingly important role in various fields.
In the prior art, automatic control of unmanned devices is mainly realized through relatively complex dynamics models or machine learning models, which are difficult to implement, and the results of automatic control are often unsatisfactory.
Therefore, how to realize automatic control of unmanned devices simply, efficiently, and accurately has become an urgent problem to be solved.
Disclosure of Invention
Embodiments of this specification provide a control method and apparatus for an unmanned device, a storage medium, and an electronic device, so as to partially solve the problems in the prior art.
The embodiment of the specification adopts the following technical scheme:
In the control method of an unmanned device provided by this specification, an image capture device is installed on the unmanned device, different types of targets are preset in the field where the unmanned device is located, and the different types of targets are used to guide the unmanned device. The method includes:
the unmanned equipment identifies a foreground area in an image acquired by the image acquisition equipment, and takes pixels in the foreground area as foreground pixels;
determining a number distribution histogram of foreground pixels in different directions in the image; wherein the different directions include a horizontal direction and a vertical direction;
identifying the type of a target in the image according to the quantity distribution histograms of the foreground pixels in different directions;
determining the current optional driving direction of the unmanned equipment according to the recognized type of the target;
and controlling the unmanned equipment according to the determined optional driving directions and the driving route corresponding to the task currently executed by the unmanned equipment.
Optionally, the different types of targets include: horizontal strips extending in the horizontal direction, vertical strips extending in the vertical direction, and cross strips formed by horizontal strips and vertical strips crossing each other.
Optionally, determining a number distribution histogram of the foreground pixels in different directions specifically includes:
for each pixel column in the image, determining the number of foreground pixels in the pixel column; determining a quantity distribution histogram of the foreground pixels in the horizontal direction according to the quantity of the foreground pixels in each pixel column;
for each pixel row in the image, determining the number of foreground pixels in the pixel row; and determining a quantity distribution histogram of the foreground pixels in the vertical direction according to the quantity of the foreground pixels in each pixel row.
Optionally, identifying the type of the target in the image according to the number distribution histograms of the foreground pixels in different directions, specifically including:
determining a pixel column of which the number of foreground pixels is greater than a first threshold value as a designated pixel column according to the number distribution histogram of the foreground pixels in the horizontal direction; determining pixel rows with the number of the foreground pixels larger than a second threshold value as designated pixel rows according to the number distribution histogram of the foreground pixels in the vertical direction;
if the number of the designated pixel columns is greater than a third threshold and the number of the designated pixel rows is not greater than a fourth threshold, determining that the type of the target in the image is a vertical strip;
if the number of the designated pixel columns is not greater than the third threshold and the number of the designated pixel rows is greater than the fourth threshold, determining that the type of the target in the image is a horizontal strip;
and if the number of the designated pixel columns is greater than the third threshold and the number of the designated pixel rows is greater than the fourth threshold, determining that the type of the target in the image is a cross strip.
Optionally, determining the current optional driving direction of the unmanned equipment according to the identified type of the target specifically includes:
when the type of the identified target is a horizontal strip, determining that the current optional driving direction of the unmanned equipment is a horizontal direction;
when the type of the identified target is a vertical strip, determining that the current optional driving direction of the unmanned equipment is a vertical direction;
when the type of the identified target is a cross strip, determining that the current optional driving directions of the unmanned equipment are the horizontal direction and the vertical direction.
Optionally, controlling the unmanned equipment according to the determined optional driving directions and the driving route corresponding to the task currently executed by the unmanned equipment specifically includes:
determining, in the image coordinate system corresponding to the image, the midpoint of the abscissa interval formed by the abscissas of the designated pixel columns; determining the offset of the abscissa of a specified point in the image relative to this midpoint as the position offset of the unmanned equipment relative to the target in the horizontal direction; determining, in the image coordinate system, the midpoint of the ordinate interval formed by the ordinates of the designated pixel rows; and determining the offset of the ordinate of the specified point in the image relative to this midpoint as the position offset of the unmanned equipment relative to the target in the vertical direction;
and controlling the unmanned equipment according to the determined optional driving directions, the driving route corresponding to the task currently executed by the unmanned equipment and the position offset of the unmanned equipment relative to the target in the horizontal direction and/or the vertical direction.
Optionally, the position in the field corresponding to the center point of the image acquired by the image acquisition equipment coincides with the projection of the center point of the unmanned equipment onto the field;
the specified point in the image is a center point of the image.
Optionally, controlling the unmanned equipment according to the determined optional driving directions and the driving route corresponding to the task currently executed by the unmanned equipment specifically includes:
performing straight line detection on the contour of the foreground area to obtain each straight line corresponding to the contour of the foreground area;
screening out, from the obtained straight lines, straight lines whose included angle with the vertical direction has an absolute value smaller than a fifth threshold, and determining the angular offset of the unmanned equipment relative to the vertical direction according to the angles between the screened straight lines and the vertical direction and the lengths of the screened straight lines; screening out, from the obtained straight lines, straight lines whose included angle with the horizontal direction has an absolute value smaller than a sixth threshold, and determining the angular offset of the unmanned equipment relative to the horizontal direction according to the angles between the screened straight lines and the horizontal direction and the lengths of the screened straight lines;
and controlling the unmanned equipment according to the determined optional driving directions, the driving route corresponding to the task currently executed by the unmanned equipment, and the angular offset of the unmanned equipment relative to the horizontal direction and/or the vertical direction.
This specification provides a control apparatus for an unmanned device. An image acquisition device is installed on the unmanned device, different types of targets are preset in the field where the unmanned device is located, and the different types of targets are used to guide the unmanned device. The apparatus includes:
the foreground extraction module is used for identifying a foreground area in the image acquired by the image acquisition equipment and taking pixels in the foreground area as foreground pixels;
the statistical module is used for determining quantity distribution histograms of foreground pixels in different directions in the image; wherein the different directions include a horizontal direction and a vertical direction;
the identification module is used for identifying the type of the target in the image according to the quantity distribution histograms of the foreground pixels in different directions;
the determining module is used for determining the current optional driving direction of the unmanned equipment according to the recognized type of the target;
and the control module is used for controlling the unmanned equipment according to the determined optional driving directions and the driving route corresponding to the task currently executed by the unmanned equipment.
This specification provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above control method of an unmanned device.
The electronic device provided by this specification includes a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the above control method of an unmanned device.
The embodiment of the specification adopts at least one technical scheme which can achieve the following beneficial effects:
the method includes the steps that different types of targets are preset in a field and used for guiding the unmanned equipment, pixels located in a foreground area in an image collected by image collection equipment are used as foreground pixels, quantity distribution histograms of the foreground pixels in the horizontal direction and the vertical direction are determined, the types of the targets in the image are identified according to the quantity distribution histograms of the foreground pixels in the different directions, the current optional driving direction of the unmanned equipment is determined according to the identified type of the target, and automatic control of the unmanned equipment is achieved according to the optional driving directions and a driving route corresponding to a task currently executed by the unmanned equipment, so that simple, efficient and accurate automatic control is achieved through strips preset in the field.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification and are incorporated in and constitute a part of this specification, illustrate embodiments of the specification and together with the description serve to explain the specification; they do not constitute an undue limitation of the specification. In the drawings:
Fig. 1 is a schematic diagram of a control method of an unmanned device provided in an embodiment of this specification;
Fig. 2A is a schematic diagram of a vertical strip appearing in an image, provided in an embodiment of this specification;
Fig. 2B is a schematic diagram of a horizontal strip appearing in an image, provided in an embodiment of this specification;
Fig. 2C is a schematic diagram of a cross strip appearing in an image, provided in an embodiment of this specification;
Fig. 3A is a number distribution histogram of foreground pixels in the vertical direction when a vertical strip exists in the image, provided in an embodiment of this specification;
Fig. 3B is a number distribution histogram of foreground pixels in the horizontal direction when a vertical strip exists in the image, provided in an embodiment of this specification;
Fig. 3C is a number distribution histogram of foreground pixels in the vertical direction when a horizontal strip exists in the image, provided in an embodiment of this specification;
Fig. 3D is a number distribution histogram of foreground pixels in the horizontal direction when a horizontal strip exists in the image, provided in an embodiment of this specification;
Fig. 3E is a number distribution histogram of foreground pixels in the vertical direction when a cross strip exists in the image, provided in an embodiment of this specification;
Fig. 3F is a number distribution histogram of foreground pixels in the horizontal direction when a cross strip exists in the image, provided in an embodiment of this specification;
Fig. 4 is a schematic structural diagram of a control apparatus of an unmanned device provided in an embodiment of this specification;
Fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of this specification.
Detailed Description
In the embodiments of this specification, an image acquisition device is installed on the unmanned device, and different types of targets, used to guide the unmanned device, are preset in the field where the unmanned device travels. Specifically, a number of mutually perpendicular strips may be laid in the field to guide the unmanned device; the strips cross horizontally and vertically so that they are arranged in the field in a grid. The image acquisition device on the unmanned device therefore captures images of the ground of the field, the strips in the images are identified, and the unmanned device controls itself to travel along the extension direction of the identified strips. In order to make the objects, technical solutions, and advantages of this specification clearer, the technical solutions of this specification will be clearly and completely described below with reference to specific embodiments and the accompanying drawings. It is to be understood that the described embodiments are only some, not all, of the embodiments of this specification. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in this specification without creative effort fall within the protection scope of this specification.
The technical solutions provided by the embodiments of the present description are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a control method of an unmanned device provided in an embodiment of the present specification, including:
s100: and the unmanned equipment identifies a foreground area in the image acquired by the image acquisition equipment and takes the pixels in the foreground area as foreground pixels.
In the embodiments of this specification, the image capture device may be mounted on an unmanned device, which then operates in a field; the different types of targets preset in the field for guiding the unmanned device may be a plurality of mutually perpendicular strips. The target in the embodiments of this specification may be a strip, a two-dimensional code, or another object that can be used to guide an unmanned device.
It should be noted that the unmanned device described in the embodiments of this specification may be a transport device used in the warehousing field for carrying goods, or an unmanned vehicle for performing delivery tasks, where the delivery tasks include takeaway delivery and other types of logistics delivery.
The following description takes a strip as the target to be identified as an example.
In this embodiment, an image acquired by an image acquisition device installed on an unmanned device may be binarized according to a preset binarization threshold. Specifically, the binarization threshold may be set as required, and when the image is binarized, for each pixel in the image, if the pixel value of the pixel is greater than the binarization threshold, the pixel value of the pixel is set to be a first pixel value (e.g. 255, i.e. pure white), and if the pixel value of the pixel is not greater than the binarization threshold, the pixel value of the pixel is set to be a second pixel value (e.g. 0, i.e. pure black). The first pixel value is larger than the second pixel value.
After the image is binarized, the foreground area in the binarized image can be identified. Specifically, a region in which a pixel having a first pixel value is located may be determined as a foreground region, or a region in which a pixel having a second pixel value is located may be determined as a foreground region. For example, if the ground of the field is dark and the laid strip is light, the region in which the pixel having the first pixel value is located may be determined as the foreground region, whereas if the ground of the field is light and the laid strip is dark, the region in which the pixel having the second pixel value is located may be determined as the foreground region.
After the foreground area is determined, pixels in the foreground area in the image are the foreground pixels.
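By way of illustration only, the binarization and foreground extraction described above can be sketched as follows. This is a minimal, non-authoritative sketch in Python assuming OpenCV and NumPy are available; the threshold value of 128 and the default light-strips-on-dark-ground polarity are illustrative assumptions rather than values fixed by this specification.

    import cv2
    import numpy as np

    BINARIZATION_THRESHOLD = 128  # assumed preset threshold; set as required

    def extract_foreground(gray_image: np.ndarray, strips_are_light: bool = True) -> np.ndarray:
        """Return a boolean mask in which True marks foreground (strip) pixels."""
        # Pixels whose value exceeds the threshold become the first pixel value
        # (255, pure white); all others become the second pixel value (0, pure black).
        _, binary = cv2.threshold(gray_image, BINARIZATION_THRESHOLD, 255, cv2.THRESH_BINARY)
        # Light strips on dark ground: the 255-valued region is the foreground;
        # dark strips on light ground: the 0-valued region is the foreground.
        return (binary == 255) if strips_are_light else (binary == 0)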
S102: determining a number distribution histogram of foreground pixels in different directions in the image; wherein the different directions include a horizontal direction and a vertical direction.
According to the form in which the mutually perpendicular strips laid in the field appear in the image acquired by the image acquisition device, this specification divides the strips into three types: horizontal strips in the horizontal direction, vertical strips in the vertical direction, and cross strips formed by horizontal strips and vertical strips crossing each other.
The vertical direction refers to the head-to-tail (or tail-to-head) direction of the unmanned device. The form a vertical strip takes in an image acquired by the image acquisition device is shown in fig. 2A; that is, fig. 2A shows the case where a vertical strip appears in an image acquired by the image acquisition device on the unmanned device.
The horizontal direction is perpendicular to the vertical direction, i.e., the left-to-right (or right-to-left) direction of the unmanned device. The form a horizontal strip takes in an acquired image is shown in fig. 2B; that is, fig. 2B shows the case where a horizontal strip appears in an image acquired by the image acquisition device on the unmanned device.
A cross strip is formed by a horizontal strip and a vertical strip crossing each other perpendicularly. The form a cross strip takes in an acquired image is shown in fig. 2C; that is, fig. 2C shows the case where a cross strip appears in an image acquired by the image acquisition device on the unmanned device.
In order to identify the three different types of targets, namely vertical strips, horizontal strips, and cross strips, simply and accurately, after the foreground pixels in an image are determined, the embodiments of this specification determine the number distribution histograms of the foreground pixels in different directions (the horizontal direction and the vertical direction). That is, the number distribution histogram of the foreground pixels in the horizontal direction is determined, and the number distribution histogram of the foreground pixels in the vertical direction is determined.
Specifically, when determining the number distribution histogram of the foreground pixels in the horizontal direction, the number of the foreground pixels in each pixel column in the image may be determined, and the number distribution histogram of the foreground pixels in the horizontal direction may be determined according to the number of the foreground pixels in each pixel column. When determining the histogram of the number distribution of the foreground pixels in the vertical direction, the number of foreground pixels in each pixel row in the image can be determined; and determining a quantity distribution histogram of the foreground pixels in the vertical direction according to the quantity of the foreground pixels in each pixel row.
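A minimal sketch of this counting step, assuming the boolean foreground mask produced by the extract_foreground sketch above (the function and variable names are illustrative):

    def count_histograms(foreground: np.ndarray):
        """Number distribution histograms of foreground pixels.

        Summing the mask down each pixel column gives the histogram in the
        horizontal direction; summing along each pixel row gives the histogram
        in the vertical direction.
        """
        column_counts = foreground.sum(axis=0)  # foreground pixels per pixel column
        row_counts = foreground.sum(axis=1)     # foreground pixels per pixel row
        return column_counts, row_counts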
S104: and identifying the type of the target in the image according to the quantity distribution histograms of the foreground pixels in different directions.
Since vertical strips, horizontal strips, and cross strips appear in images substantially as shown in figs. 2A to 2C, if a vertical strip exists in the image, the number distribution histogram of the foreground pixels in the horizontal direction has a prominent peak, while the number distribution histogram in the vertical direction is relatively uniform, as shown in figs. 3A to 3B. Conversely, if a horizontal strip exists in the image, the histogram in the vertical direction has a prominent peak, while the histogram in the horizontal direction is relatively uniform, as shown in figs. 3C to 3D. Further, if a cross strip exists in the image, the histograms in both the vertical and horizontal directions have prominent peaks, as shown in figs. 3E to 3F.
Thus, whether the target in the image is a vertical strip, a horizontal strip, or a cross strip can be identified from the number distribution histograms of the foreground pixels in the horizontal and vertical directions.
Specifically, the pixel columns with the number of foreground pixels larger than the first threshold may be determined as designated pixel columns according to the number distribution histogram of the foreground pixels in the horizontal direction, and the pixel rows with the number of foreground pixels larger than the second threshold may be determined as designated pixel rows according to the number distribution histogram of the foreground pixels in the vertical direction. The first threshold may be equal to or different from the second threshold, the first threshold may be set as a preset percentage of the total pixel row number in the image, and the second threshold may be set as a preset percentage of the total pixel column number in the image, for example, the first threshold may be set as 50% of the total pixel row number in the image, and the second threshold may be set as 50% of the total pixel column number in the image.
If the number of designated pixel columns is greater than the third threshold and the number of designated pixel rows is not greater than the fourth threshold, i.e., the number distribution histogram of the foreground pixels in the horizontal direction has a prominent peak while the histogram in the vertical direction is relatively uniform, the type of the target in the image is determined to be a vertical strip.
If the number of designated pixel columns is not greater than the third threshold and the number of designated pixel rows is greater than the fourth threshold, i.e., the histogram in the vertical direction has a prominent peak while the histogram in the horizontal direction is relatively uniform, the type of the target in the image is determined to be a horizontal strip.
If the number of designated pixel columns is greater than the third threshold and the number of designated pixel rows is greater than the fourth threshold, i.e., the histograms in both the vertical and horizontal directions have prominent peaks, the type of the target in the image is determined to be a cross strip.
The third threshold and the fourth threshold may be equal or unequal, and for example, both the third threshold and the fourth threshold may be set to 5.
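Putting these thresholds together, a hedged sketch of the classification rule follows, using the example values from the text (50% of the total pixel rows/columns for the first/second thresholds, 5 for the third/fourth thresholds):

    def classify_target(column_counts, row_counts, image_height, image_width):
        """Classify the target as 'vertical', 'horizontal', 'cross', or None."""
        first_threshold = 0.5 * image_height    # per-column foreground count
        second_threshold = 0.5 * image_width    # per-row foreground count
        third_threshold = fourth_threshold = 5  # counts of designated columns/rows

        designated_columns = int((column_counts > first_threshold).sum())
        designated_rows = int((row_counts > second_threshold).sum())

        if designated_columns > third_threshold and designated_rows <= fourth_threshold:
            return "vertical"
        if designated_columns <= third_threshold and designated_rows > fourth_threshold:
            return "horizontal"
        if designated_columns > third_threshold and designated_rows > fourth_threshold:
            return "cross"
        return None  # no strip confidently visible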
S106: and determining the current optional driving direction of the unmanned equipment according to the recognized type of the target.
Since the unmanned device needs to control itself to travel along the extension direction of the identified strip, in the embodiments of this specification, after a strip is identified from the image acquired by the image acquisition device, the extension direction of the strip may be determined as the current optional driving direction of the unmanned device. That is, when the identified type of the target is a horizontal strip, the current optional driving direction of the unmanned device is determined to be the horizontal direction; when the identified type is a vertical strip, the current optional driving direction is determined to be the vertical direction; and when the identified type is a cross strip, the current optional driving directions are determined to be both the horizontal direction and the vertical direction.
S108: and controlling the unmanned equipment according to the determined optional driving directions and the driving route corresponding to the task currently executed by the unmanned equipment.
Since the unmanned device determines the driving route corresponding to a task in advance when executing the task, after the current optional driving directions of the unmanned device are determined, the unmanned device can be controlled according to the optional driving directions and the driving route corresponding to the task. Specifically, the optional driving direction with the smallest included angle with the driving route corresponding to the task may be selected from the optional driving directions, and the unmanned device may be controlled to travel in the selected direction.
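For illustration, if the optional driving directions and the direction of the driving route are represented as headings in radians (a representation assumed here purely for the sketch, not prescribed by this specification), the selection reduces to:

    import math

    def pick_driving_direction(selectable_headings, route_heading):
        """Return the selectable heading with the smallest included angle
        with the route heading (all angles in radians)."""
        def included_angle(heading):
            diff = abs(heading - route_heading) % (2 * math.pi)
            return min(diff, 2 * math.pi - diff)
        return min(selectable_headings, key=included_angle)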
With the above method, the unmanned device can be controlled automatically, simply, and efficiently through the strips laid in the field. In addition, the method for identifying the different types of strips dispenses with machine learning models and instead uses simpler, more accurate number distribution histograms for identification, which effectively improves identification precision and thus the accuracy of automatic control.
Although there are some prior-art methods for identifying strips, they often use different recognition models for different types of strips; that is, one recognition model can identify strips in only one direction, for example, only vertical strips in the head-to-tail direction, and cannot identify the horizontal strips perpendicular to them or cross strips. Therefore, with the prior-art methods, the unmanned device either uses a single recognition model, identifying only one type of strip and unable to switch to another model for other types, or runs multiple recognition models simultaneously, which increases the complexity of identification and produces misjudgments when the results of the models conflict. Neither situation is conducive to the automatic control of the unmanned device.
In addition, the forms of the three types of strips shown in figs. 2A to 2C are only ideal forms. In a practical application scenario, when the unmanned device deviates from a strip, the position of the strip in the image also deviates. The unmanned device therefore needs to determine its position offset and/or angular offset relative to the strip according to the position and/or angle of the strip in the image, and to perform automatic control according to the position offset and/or the angular offset.
The following describes the process of determining the positional offset and the angular offset, respectively.
When determining the position offset of the unmanned device relative to the target, an image coordinate system may be established for the image acquired by the image acquisition device on the unmanned device. In this coordinate system, the midpoint of the abscissa interval formed by the abscissas of the designated pixel columns (the pixel columns in which the number of foreground pixels is greater than the first threshold) is determined, and the offset of the abscissa of the specified point in the image relative to this midpoint is determined as the position offset of the unmanned device relative to the target in the horizontal direction.
Accordingly, when determining the position offset of the unmanned device relative to the target in the vertical direction, the midpoint of the ordinate interval formed by the ordinates of the designated pixel rows (the pixel rows in which the number of foreground pixels is greater than the second threshold) may be determined in the image coordinate system, and the offset of the ordinate of the specified point in the image relative to this midpoint may be determined as the position offset of the unmanned device relative to the target in the vertical direction.
In the embodiments of this specification, the position in the field corresponding to the center point of the image acquired by the image acquisition device coincides with the projection of the center point of the unmanned device onto the field. That is, the lens of the image acquisition device may point vertically downward at the field, with the center point of the unmanned device on the axis of the lens. This means that the center point of the acquired image corresponds to the point on the ground directly below the center point of the unmanned device. The specified point in the image may therefore be the center point of the image.
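A sketch of the position-offset computation under these definitions; the designated-column and designated-row thresholds reuse the example values above, and the sign convention (specified point minus interval midpoint) follows the definition given in the text:

    import numpy as np

    def position_offsets(column_counts, row_counts, image_height, image_width):
        """Position offsets (in pixels) of the image center point, taken as the
        specified point, relative to the strip's interval midpoints."""
        center_x, center_y = image_width / 2.0, image_height / 2.0

        cols = np.flatnonzero(column_counts > 0.5 * image_height)  # designated columns
        rows = np.flatnonzero(row_counts > 0.5 * image_width)      # designated rows

        # Offset of the specified point's abscissa/ordinate from the midpoint of
        # the abscissa/ordinate interval; None when no strip is found on that axis.
        horizontal = center_x - (cols.min() + cols.max()) / 2.0 if cols.size else None
        vertical = center_y - (rows.min() + rows.max()) / 2.0 if rows.size else None
        return horizontal, vertical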
After the position offsets of the unmanned device relative to the target in the horizontal and vertical directions are determined, the unmanned device can be controlled according to the determined optional driving directions, the driving route corresponding to the task currently executed by the unmanned device, and the position offset of the unmanned device relative to the target in the horizontal direction and/or the vertical direction. For example, a compensation direction may be determined from the position offset in the horizontal and/or vertical direction, the optional driving direction with the smallest included angle with the driving route may be chosen from the optional driving directions, the final driving direction may be determined from the compensation direction and the chosen optional driving direction, and the unmanned device may be controlled to travel in the final driving direction.
Further, if it is identified through steps S100 to S104 shown in fig. 1 that a vertical strip exists in the image, only the position offset of the unmanned device relative to the target (i.e., the vertical strip) in the horizontal direction needs to be determined, and automatic control is then performed according to that horizontal position offset alone. If a horizontal strip is identified in the image, only the position offset of the unmanned device relative to the target (i.e., the horizontal strip) in the vertical direction needs to be determined, and automatic control is then performed according to that vertical position offset alone. If a cross strip is identified in the image, the position offsets of the unmanned device relative to the target (i.e., the cross strip) in both the horizontal and vertical directions need to be determined, and automatic control is then performed according to both position offsets.
When determining the angular offset of the unmanned device relative to the target, straight line detection may be performed on the foreground region determined in step S100 shown in fig. 1 to obtain the straight lines in the foreground region. In particular, the straight line detection may be performed via the Hough transform. Because the pixel values of the pixels inside the foreground region are essentially identical, the straight line detection may be carried out on the contour of the foreground region alone, yielding the straight lines corresponding to the contour of the foreground region.
After the straight lines corresponding to the contour of the foreground region are obtained, the straight lines whose included angle with the vertical direction has an absolute value smaller than the fifth threshold can be screened out from the obtained straight lines, and the angular offset of the unmanned device relative to the vertical direction is determined according to the angles between the screened straight lines and the vertical direction and the lengths of the screened straight lines. Specifically, the length-weighted average

θ_vertical = ( Σ_i l_i · θ_i ) / ( Σ_i l_i )

may be used to determine the angular offset of the unmanned device relative to the vertical direction, where i denotes the i-th screened straight line, l_i denotes the length of the i-th screened straight line, and θ_i denotes the angle between the i-th screened straight line and the vertical direction; the θ_i are signed (vector) quantities.
Correspondingly, when determining the angular offset of the unmanned device relative to the horizontal direction, the straight lines whose included angle with the horizontal direction has an absolute value smaller than the sixth threshold can be screened out from the obtained straight lines, and the angular offset of the unmanned device relative to the horizontal direction is determined according to the angles between the screened straight lines and the horizontal direction and the lengths of the screened straight lines. Specifically, the length-weighted average

θ_horizontal = ( Σ_i l_i · φ_i ) / ( Σ_i l_i )

may be used, where i denotes the i-th screened straight line, l_i denotes the length of the i-th screened straight line, and φ_i denotes the angle between the i-th screened straight line and the horizontal direction; the φ_i are signed (vector) quantities.
The fifth threshold and the sixth threshold may be set as needed, for example, both are set to 5 degrees.
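A sketch of the angular-offset computation. The probabilistic Hough transform and its parameters are illustrative choices (the text only requires Hough-based line detection on the foreground contour), the 5-degree tolerance matches the example fifth and sixth thresholds, and the length-weighted mean implements the formulas reconstructed above:

    import math
    import cv2
    import numpy as np

    def _weighted_mean(pairs):
        """Length-weighted mean angle: sum(l_i * theta_i) / sum(l_i)."""
        total_length = sum(length for length, _ in pairs)
        if total_length == 0:
            return None
        return sum(length * angle for length, angle in pairs) / total_length

    def angular_offsets(foreground_mask, tolerance_deg=5.0):
        """Angular offsets (degrees) relative to the vertical and horizontal
        directions, from line segments detected on the foreground contour."""
        edges = cv2.Canny(foreground_mask.astype(np.uint8) * 255, 50, 150)
        segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                                   minLineLength=30, maxLineGap=5)
        near_vertical, near_horizontal = [], []
        if segments is not None:
            for x1, y1, x2, y2 in segments[:, 0]:
                length = math.hypot(x2 - x1, y2 - y1)
                # Signed angle with the horizontal axis, folded into (-90, 90].
                theta_h = math.degrees(math.atan2(y2 - y1, x2 - x1))
                if theta_h > 90:
                    theta_h -= 180
                elif theta_h <= -90:
                    theta_h += 180
                # Corresponding signed angle with the vertical axis.
                theta_v = theta_h - 90 if theta_h > 0 else theta_h + 90
                if abs(theta_v) < tolerance_deg:    # fifth-threshold screen
                    near_vertical.append((length, theta_v))
                if abs(theta_h) < tolerance_deg:    # sixth-threshold screen
                    near_horizontal.append((length, theta_h))
        return _weighted_mean(near_vertical), _weighted_mean(near_horizontal)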
After the angular offsets of the unmanned device relative to the horizontal and vertical directions are determined, the unmanned device can be controlled according to the determined optional driving directions, the driving route corresponding to the task currently executed by the unmanned device, and the angular offset of the unmanned device relative to the horizontal direction and/or the vertical direction. For example, a compensation direction may be determined from the angular offset relative to the horizontal and/or vertical direction, the optional driving direction with the smallest included angle with the driving route may be chosen from the optional driving directions, the final driving direction may be determined from the compensation direction and the chosen optional driving direction, and the unmanned device may be controlled to travel in the final driving direction.
Further, if it is identified through steps S100 to S104 shown in fig. 1 that a vertical strip exists in the image, only the angular offset of the unmanned device relative to the vertical direction needs to be determined, and automatic control is then performed according to that angular offset alone. If a horizontal strip is identified in the image, only the angular offset of the unmanned device relative to the horizontal direction needs to be determined, and automatic control is then performed according to that angular offset alone. If a cross strip is identified in the image, the angular offsets of the unmanned device relative to both the horizontal and vertical directions need to be determined, and automatic control is then performed according to both angular offsets.
Based on the same idea, the present specification further provides a corresponding apparatus, a storage medium, and an electronic device.
Fig. 4 is a schematic structural diagram of a control apparatus of an unmanned device according to an embodiment of the present disclosure. An image acquisition device is installed on the unmanned device, and different types of targets, used to guide the unmanned device, are preset in the field where the unmanned device is located. The apparatus includes:
a foreground extraction module 401, configured to identify a foreground region in an image acquired by the image acquisition device, and use a pixel located in the foreground region as a foreground pixel;
a statistical module 402, configured to determine a histogram of number distribution of foreground pixels in different directions in the image; wherein the different directions include a horizontal direction and a vertical direction;
an identifying module 403, configured to identify a type of a target in the image according to the number distribution histograms of the foreground pixels in different directions;
a determining module 404, configured to determine, according to the identified type of the target, the current optional driving direction of the unmanned device;
and the control module 405 is configured to control the unmanned device according to the determined optional driving directions and the driving route corresponding to the task currently executed by the unmanned device.
Optionally, the different types of targets include: horizontal strips extending in the horizontal direction, vertical strips extending in the vertical direction, and cross strips formed by horizontal strips and vertical strips crossing each other.
Optionally, the statistical module 402 is specifically configured to, for each pixel column in the image, determine the number of foreground pixels in the pixel column; determining a quantity distribution histogram of the foreground pixels in the horizontal direction according to the quantity of the foreground pixels in each pixel column; for each pixel row in the image, determining the number of foreground pixels in the pixel row; and determining a quantity distribution histogram of the foreground pixels in the vertical direction according to the quantity of the foreground pixels in each pixel row.
Optionally, the identifying module 403 is specifically configured to determine, according to the number distribution histogram of the foreground pixels in the horizontal direction, the pixel columns in which the number of foreground pixels is greater than a first threshold as designated pixel columns; determine, according to the number distribution histogram of the foreground pixels in the vertical direction, the pixel rows in which the number of foreground pixels is greater than a second threshold as designated pixel rows; if the number of the designated pixel columns is greater than a third threshold and the number of the designated pixel rows is not greater than a fourth threshold, determine that the type of the target in the image is a vertical strip; if the number of the designated pixel columns is not greater than the third threshold and the number of the designated pixel rows is greater than the fourth threshold, determine that the type of the target in the image is a horizontal strip; and if the number of the designated pixel columns is greater than the third threshold and the number of the designated pixel rows is greater than the fourth threshold, determine that the type of the target in the image is a cross strip.
Optionally, the determining module 404 is specifically configured to: when the identified type of the target is a horizontal strip, determine that the current optional driving direction of the unmanned device is the horizontal direction; when the identified type of the target is a vertical strip, determine that the current optional driving direction of the unmanned device is the vertical direction; and when the identified type of the target is a cross strip, determine that the current optional driving directions of the unmanned device are the horizontal direction and the vertical direction.
Optionally, the control module 405 is specifically configured to determine, in the image coordinate system corresponding to the image, the midpoint of the abscissa interval formed by the abscissas of the designated pixel columns; determine the offset of the abscissa of a specified point in the image relative to this midpoint as the position offset of the unmanned device relative to the target in the horizontal direction; determine, in the image coordinate system, the midpoint of the ordinate interval formed by the ordinates of the designated pixel rows; determine the offset of the ordinate of the specified point in the image relative to this midpoint as the position offset of the unmanned device relative to the target in the vertical direction; and control the unmanned device according to the determined optional driving directions, the driving route corresponding to the task currently executed by the unmanned device, and the position offset of the unmanned device relative to the target in the horizontal direction and/or the vertical direction.
Optionally, the control module 405 is further configured to perform straight line detection on the contour of the foreground region to obtain each straight line corresponding to the contour of the foreground region; screen out, from the obtained straight lines, straight lines whose included angle with the vertical direction has an absolute value smaller than a fifth threshold, and determine the angular offset of the unmanned device relative to the vertical direction according to the angles between the screened straight lines and the vertical direction and the lengths of the screened straight lines; screen out, from the obtained straight lines, straight lines whose included angle with the horizontal direction has an absolute value smaller than a sixth threshold, and determine the angular offset of the unmanned device relative to the horizontal direction according to the angles between the screened straight lines and the horizontal direction and the lengths of the screened straight lines; and control the unmanned device according to the determined optional driving directions, the driving route corresponding to the task currently executed by the unmanned device, and the angular offset of the unmanned device relative to the horizontal direction and/or the vertical direction.
The present specification also provides a computer-readable storage medium storing a computer program which, when executed by a processor, is operable to perform the above-provided control method of an unmanned aerial device.
Based on the control method of the unmanned device provided above, an embodiment of this specification further provides the schematic structural diagram of an electronic device shown in fig. 5. As shown in fig. 5, at the hardware level the electronic device includes a processor, an internal bus, a network interface, memory, and non-volatile storage, and may of course also include hardware required for other services. The processor reads the corresponding computer program from the non-volatile storage into memory and then runs it to implement the control method of the unmanned device described above.
Of course, besides a software implementation, this specification does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the processing flow is not limited to logic units and may also be hardware or logic devices.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement to circuit structures such as diodes, transistors, and switches) or an improvement in software (an improvement to a method flow). However, as technology has developed, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement to a method flow cannot be implemented with hardware entity modules. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming it, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, this kind of programming is now mostly implemented with "logic compiler" software, which is similar to the software compilers used in program development; the original code to be compiled must be written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); at present, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used. It should also be clear to those skilled in the art that a hardware circuit implementing a logical method flow can easily be obtained merely by lightly programming the method flow into an integrated circuit using the above hardware description languages.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing a controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included within it for implementing various functions may also be regarded as structures within the hardware component. Indeed, means for implementing various functions may even be regarded both as software modules implementing a method and as structures within a hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above apparatus is described as being divided into various units by function, each described separately. Of course, when this specification is implemented, the functions of the units may be realized in one or more pieces of software and/or hardware.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The description has been presented with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the description. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," and any other variations thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner; for identical or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiment is described relatively briefly because it is substantially similar to the method embodiment; for relevant details, reference may be made to the corresponding parts of the method embodiment.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.

Claims (10)

1. A control method for unmanned equipment, wherein image acquisition equipment is mounted on the unmanned equipment, targets of different types are preset in the field where the unmanned equipment is located, and the different types of targets are used to guide the unmanned equipment, the method comprising:
identifying, by the unmanned equipment, a foreground area in an image acquired by the image acquisition equipment, and taking the pixels in the foreground area as foreground pixels;
determining number distribution histograms of the foreground pixels in different directions in the image, wherein the different directions include a horizontal direction and a vertical direction;
identifying the type of a target in the image according to the number distribution histograms of the foreground pixels in the different directions;
determining the currently selectable driving directions of the unmanned equipment according to the identified type of the target; and
controlling the unmanned equipment according to the determined selectable driving directions and a driving route corresponding to a task currently executed by the unmanned equipment.
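By way of illustration only (the claim does not prescribe any particular foreground-extraction technique), a minimal sketch of the foreground step in Python, assuming the guide strips are dark marks on a lighter floor and using Otsu thresholding; all function and variable names are hypothetical:

```python
import cv2
import numpy as np

def extract_foreground(image_bgr: np.ndarray) -> np.ndarray:
    """Return a boolean mask marking foreground pixels.

    Assumption: the guide strips contrast strongly with the floor,
    so a global Otsu threshold separates them; background subtraction
    or color segmentation would be equally valid choices.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # THRESH_BINARY_INV marks dark strips on a light floor as foreground.
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return mask > 0
```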
2. The method according to claim 1, wherein the different types of targets comprise: horizontal strips extending in the horizontal direction, vertical strips extending in the vertical direction, and cross strips formed by horizontal and vertical strips intersecting each other.
3. The method according to claim 2, wherein determining the number distribution histograms of the foreground pixels in the different directions comprises:
for each pixel column in the image, determining the number of foreground pixels in that pixel column, and determining the number distribution histogram of the foreground pixels in the horizontal direction according to the number of foreground pixels in each pixel column; and
for each pixel row in the image, determining the number of foreground pixels in that pixel row, and determining the number distribution histogram of the foreground pixels in the vertical direction according to the number of foreground pixels in each pixel row.
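A minimal sketch of this counting step, assuming the boolean foreground mask from the earlier sketch (names are illustrative): the per-column counts form the histogram in the horizontal direction, and the per-row counts form the histogram in the vertical direction.

```python
import numpy as np

def number_distribution_histograms(mask: np.ndarray):
    """Count foreground pixels per pixel column and per pixel row.

    mask: boolean array of shape (rows, cols).
    Returns (column_hist, row_hist), i.e. the number distribution
    histograms in the horizontal and vertical directions.
    """
    column_hist = mask.sum(axis=0)  # one count per pixel column
    row_hist = mask.sum(axis=1)     # one count per pixel row
    return column_hist, row_hist
```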
4. The method according to claim 3, wherein identifying the type of the target in the image according to the number distribution histograms of the foreground pixels in the different directions comprises:
determining each pixel column whose number of foreground pixels is greater than a first threshold as a designated pixel column according to the number distribution histogram of the foreground pixels in the horizontal direction, and determining each pixel row whose number of foreground pixels is greater than a second threshold as a designated pixel row according to the number distribution histogram of the foreground pixels in the vertical direction;
if the number of designated pixel columns is greater than a third threshold and the number of designated pixel rows is not greater than a fourth threshold, determining that the type of the target in the image is a vertical strip;
if the number of designated pixel columns is not greater than the third threshold and the number of designated pixel rows is greater than the fourth threshold, determining that the type of the target in the image is a horizontal strip; and
if the number of designated pixel columns is greater than the third threshold and the number of designated pixel rows is greater than the fourth threshold, determining that the type of the target in the image is a cross strip.
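The claim leaves the four thresholds open; a sketch of the decision logic, with hypothetical default values for the thresholds:

```python
def classify_target(column_hist, row_hist,
                    first_t=50, second_t=50, third_t=10, fourth_t=10):
    """Classify the target as 'vertical', 'horizontal' or 'cross'.

    The default threshold values are illustrative assumptions only.
    """
    designated_columns = int((column_hist > first_t).sum())
    designated_rows = int((row_hist > second_t).sum())
    many_columns = designated_columns > third_t
    many_rows = designated_rows > fourth_t
    if many_columns and not many_rows:
        return "vertical"    # vertical strip
    if many_rows and not many_columns:
        return "horizontal"  # horizontal strip
    if many_columns and many_rows:
        return "cross"       # cross strip
    return None  # no recognizable target in this image
```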
5. The method according to claim 2 or 4, wherein determining the currently selectable driving directions of the unmanned equipment according to the identified type of the target comprises:
when the identified type of the target is a horizontal strip, determining that the currently selectable driving direction of the unmanned equipment is the horizontal direction;
when the identified type of the target is a vertical strip, determining that the currently selectable driving direction of the unmanned equipment is the vertical direction; and
when the identified type of the target is a cross strip, determining that the currently selectable driving directions of the unmanned equipment are the horizontal direction and the vertical direction.
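This step amounts to a fixed lookup from the recognized type to the permitted directions; a tiny sketch (names hypothetical):

```python
# Mapping from recognized target type to the currently selectable
# driving directions of the unmanned equipment.
SELECTABLE_DIRECTIONS = {
    "horizontal": ("horizontal",),
    "vertical": ("vertical",),
    "cross": ("horizontal", "vertical"),
}
```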
6. The method according to claim 4, wherein controlling the unmanned equipment according to the determined selectable driving directions and the driving route corresponding to the task currently executed by the unmanned equipment comprises:
determining the midpoint of the horizontal interval, namely the abscissa interval formed by the abscissas of the designated pixel columns in the image coordinate system of the image, and determining the offset of the abscissa of a designated point in the image relative to that midpoint as the position offset of the unmanned equipment relative to the target in the horizontal direction; determining the midpoint of the vertical interval, namely the ordinate interval formed by the ordinates of the designated pixel rows in the image coordinate system of the image, and determining the offset of the ordinate of the designated point in the image relative to that midpoint as the position offset of the unmanned equipment relative to the target in the vertical direction; and
controlling the unmanned equipment according to the determined selectable driving directions, the driving route corresponding to the task currently executed by the unmanned equipment, and the position offset of the unmanned equipment relative to the target in the horizontal direction and/or the vertical direction.
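A sketch of the offset computation. The claim does not fix which "designated point" is used; taking the image centre is a natural assumption:

```python
import numpy as np

def position_offsets(column_hist, row_hist, first_t, second_t,
                     designated_point):
    """Offsets of the designated point from the target interval midpoints.

    designated_point: (x, y) in image coordinates; using the image
    centre here is an assumption, since the claim leaves the point open.
    Returns (horizontal_offset, vertical_offset); an entry is None when
    no designated columns/rows exist.
    """
    cols = np.flatnonzero(column_hist > first_t)  # designated pixel columns
    rows = np.flatnonzero(row_hist > second_t)    # designated pixel rows
    x, y = designated_point
    dx = x - (cols.min() + cols.max()) / 2.0 if cols.size else None
    dy = y - (rows.min() + rows.max()) / 2.0 if rows.size else None
    return dx, dy
```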
7. The method according to claim 4, wherein controlling the unmanned equipment according to the determined selectable driving directions and the driving route corresponding to the task currently executed by the unmanned equipment comprises:
performing straight-line detection on the contour of the foreground area to obtain the straight lines corresponding to the contour of the foreground area;
selecting, from the obtained straight lines, those whose included angle with the vertical direction has an absolute value smaller than a fifth threshold, and determining the angle deviation of the unmanned equipment relative to the vertical direction according to the angles between the selected straight lines and the vertical direction and the lengths of the selected straight lines; selecting, from the obtained straight lines, those whose included angle with the horizontal direction has an absolute value smaller than a sixth threshold, and determining the angle deviation of the unmanned equipment relative to the horizontal direction according to the angles between the selected straight lines and the horizontal direction and the lengths of the selected straight lines; and
controlling the unmanned equipment according to the determined selectable driving directions, the driving route corresponding to the task currently executed by the unmanned equipment, and the angle deviation of the unmanned equipment relative to the horizontal direction and/or the vertical direction.
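A sketch of this angle-deviation step, using a probabilistic Hough transform as one possible straight-line detector and a length-weighted average of the filtered angles; the threshold defaults and Hough parameters are assumptions:

```python
import cv2
import numpy as np

def angle_deviations(mask, fifth_threshold=20.0, sixth_threshold=20.0):
    """Length-weighted angle deviations (degrees) of the foreground
    contour relative to the vertical and horizontal directions.

    Canny + HoughLinesP stand in for the unspecified line detector;
    all parameter values are illustrative.
    """
    edges = cv2.Canny(mask.astype(np.uint8) * 255, 50, 150)
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, 40,
                               minLineLength=30, maxLineGap=5)
    if segments is None:
        return None, None

    vertical, horizontal = [], []
    for x1, y1, x2, y2 in segments[:, 0]:
        length = float(np.hypot(x2 - x1, y2 - y1))
        angle = np.degrees(np.arctan2(y2 - y1, x2 - x1))
        # Fold into (-90, 90] so the angle is measured from the horizontal.
        if angle > 90.0:
            angle -= 180.0
        elif angle <= -90.0:
            angle += 180.0
        to_vertical = angle - 90.0 if angle >= 0.0 else angle + 90.0
        if abs(to_vertical) < fifth_threshold:
            vertical.append((to_vertical, length))
        if abs(angle) < sixth_threshold:
            horizontal.append((angle, length))

    def length_weighted_mean(pairs):
        if not pairs:
            return None
        arr = np.asarray(pairs)
        return float(np.average(arr[:, 0], weights=arr[:, 1]))

    return length_weighted_mean(vertical), length_weighted_mean(horizontal)
```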
8. A control device for unmanned equipment, wherein image acquisition equipment is mounted on the unmanned equipment, targets of different types are preset in the field where the unmanned equipment is located, and the different types of targets are used to guide the unmanned equipment, the device comprising:
a foreground extraction module, configured to identify a foreground area in an image acquired by the image acquisition equipment and take the pixels in the foreground area as foreground pixels;
a statistics module, configured to determine number distribution histograms of the foreground pixels in different directions in the image, wherein the different directions include a horizontal direction and a vertical direction;
an identification module, configured to identify the type of a target in the image according to the number distribution histograms of the foreground pixels in the different directions;
a determination module, configured to determine the currently selectable driving directions of the unmanned equipment according to the identified type of the target; and
a control module, configured to control the unmanned equipment according to the determined selectable driving directions and a driving route corresponding to a task currently executed by the unmanned equipment.
9. A computer-readable storage medium, wherein the storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1-7.
10. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any one of claims 1-7 when executing the program.
CN202110248684.7A 2021-03-08 2021-03-08 Unmanned equipment control method and device, storage medium and electronic equipment Active CN112631312B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110248684.7A CN112631312B (en) 2021-03-08 2021-03-08 Unmanned equipment control method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112631312A true CN112631312A (en) 2021-04-09
CN112631312B CN112631312B (en) 2021-06-04

Family

ID=75297731

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110248684.7A Active CN112631312B (en) 2021-03-08 2021-03-08 Unmanned equipment control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112631312B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103903224A (en) * 2012-12-25 2014-07-02 腾讯科技(深圳)有限公司 Digital image banding noise processing method and apparatus
CN104483966A (en) * 2014-11-17 2015-04-01 范良志 AGV (Automatic Guided Vehicle) navigation control system
CN204480047U (en) * 2015-03-17 2015-07-15 武汉纺织大学 A kind of self-adaptation AGV vision guided navigation sight line adjusting gear
CN106169068A (en) * 2016-07-01 2016-11-30 蔡雄 One can independent navigation wheeled robot locomotive
CN108181897A (en) * 2017-12-11 2018-06-19 华侨大学 A kind of method of biped robot's automatic tracking
CN110135260A (en) * 2019-04-15 2019-08-16 青岛小鸟看看科技有限公司 The determination method, apparatus and electronic equipment on the boundary of area-of-interest in image
US20190325739A1 (en) * 2018-04-18 2019-10-24 Here Global B.V. Lane-level geometry and traffic information
CN111427349A (en) * 2020-03-27 2020-07-17 齐鲁工业大学 Vehicle navigation obstacle avoidance method and system based on laser and vision
CN112149458A (en) * 2019-06-27 2020-12-29 商汤集团有限公司 Obstacle detection method, intelligent driving control method, device, medium, and apparatus

Also Published As

Publication number Publication date
CN112631312B (en) 2021-06-04

Similar Documents

Publication Publication Date Title
CN111010590B (en) Video clipping method and device
CN112801229B (en) Training method and device for recognition model
CN111311709B (en) Method and device for generating high-precision map
CN108320296B (en) Method, device and equipment for detecting and tracking target object in video
CN111508258A (en) Positioning method and device
CN112001456A (en) Vehicle positioning method and device, storage medium and electronic equipment
CN111882611A (en) Map construction method and device
CN111797698A (en) Target object identification method and identification device
CN112327864A (en) Control method and control device of unmanned equipment
CN112465029A (en) Instance tracking method and device
CN112990099B (en) Method and device for detecting lane line
CN114440903A (en) High-precision map construction method and device, storage medium and electronic equipment
CN112365513A (en) Model training method and device
CN111426299B (en) Method and device for ranging based on depth of field of target object
CN111476729B (en) Target identification method and device
CN112818968A (en) Target object classification method and device
CN112712009A (en) Method and device for detecting obstacle
CN112631312B (en) Unmanned equipment control method and device, storage medium and electronic equipment
CN112902987A (en) Pose correction method and device
CN112861831A (en) Target object identification method and device, storage medium and electronic equipment
CN112561961A (en) Instance tracking method and device
CN112734851B (en) Pose determination method and device
CN114623824A (en) Method and device for determining barrier speed
CN114549579A (en) Target tracking method and device
CN114187355A (en) Image calibration method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant