CN112305638A - Effective perception range identification method and related equipment - Google Patents


Info

Publication number
CN112305638A
CN112305638A
Authority
CN
China
Prior art keywords
sector
longitude
arc
latitude
photoelectric sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910684828.6A
Other languages
Chinese (zh)
Inventor
刘若鹏
栾琳
季春霖
豆晓宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Guangqi Intelligent Technology Co.,Ltd.
Original Assignee
Xi'an Guangqi Future Technology Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Guangqi Future Technology Research Institute filed Critical Xi'an Guangqi Future Technology Research Institute
Priority to CN201910684828.6A priority Critical patent/CN112305638A/en
Publication of CN112305638A publication Critical patent/CN112305638A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01V — GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V13/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices covered by groups G01V1/00–G01V11/00
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 — Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 — Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 — Detection; Localisation; Normalisation

Abstract

An effective sensing range identification method and related equipment are used for realizing automatic measurement of the effective sensing range of a photoelectric sensing device and improving measurement efficiency. The method comprises the following steps: acquiring the longitude coordinate lon and latitude coordinate lat of the photoelectric sensing device and installation parameters of the photoelectric sensing device, wherein the installation parameters at least comprise a horizontal field angle f_h and an effective shooting distance L; calculating the blind area distance b of the photoelectric sensing device; taking the longitude coordinate lon and the latitude coordinate lat as the origin and the horizontal field angle f_h as the included angle, forming a first sector with radius r1 = b and a second sector with radius r2 = b + L, and taking the target area of the second sector, excluding its overlapping area with the first sector, as the effective sensing range of the photoelectric sensing device.

Description

Effective perception range identification method and related equipment
Technical Field
The invention relates to the technical field of positioning measurement, in particular to an effective sensing range identification method and related equipment.
Background
With the development of artificial intelligence, photoelectric sensing devices are widely used, for example, fixed camera devices for monitoring security on roads and photoelectric sensing devices for collecting driving environments in automatic driving devices.
To determine the close-range spatial sensing range of a photoelectric sensing device, the existing means is field measurement performed manually. Field measurement generally depends on the experience of observers, which requires long training and extensive working experience; such manual schemes have low measurement efficiency and poor measurement accuracy, and manual measurement is also costly.
Disclosure of Invention
The embodiment of the invention provides an effective sensing range identification method and related equipment, which are used for realizing automatic measurement of the effective sensing range of photoelectric sensing equipment and improving the measurement efficiency.
The first aspect of the embodiments of the present invention provides a method for identifying an effective sensing range, including:
acquiring a longitude coordinate lon and a latitude coordinate lat of a photoelectric sensing device and installation parameters of the photoelectric sensing device, wherein the installation parameters at least comprise a horizontal field angle f_h and an effective shooting distance L;
acquiring a blind area distance b of the photoelectric sensing equipment;
taking the longitude coordinate lon and the latitude coordinate lat as the origin and the horizontal field angle f_h as the included angle, forming a first sector with radius r1 = b and a second sector with radius r2 = b + L, and taking the target region of the second sector, excluding its overlapping region with the first sector, as the effective sensing range of the photoelectric sensing device;
and the photoelectric sensing equipment detects and identifies the effective sensing range.
Optionally, as a possible implementation manner, the installation parameters in the embodiment of the present invention further include a height h, a pitch angle tilt, and a vertical field angle f_v, and the step of obtaining the blind area distance b of the photoelectric sensing device comprises:
calculating b = h*tan(tilt - f_v/2).
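The formula can be sketched directly in code — a minimal illustration, assuming angles in degrees and height in metres; the function name is not from the patent.

```python
import math

def blind_distance(h, tilt_deg, fv_deg):
    """Blind-area distance b = h * tan(tilt - f_v / 2).

    h        -- mounting height of the device (metres, assumed unit)
    tilt_deg -- pitch angle between the bisector of the vertical field
                angle and the vertical direction, in degrees
    fv_deg   -- vertical field angle f_v, in degrees
    """
    return h * math.tan(math.radians(tilt_deg - fv_deg / 2.0))
```

For example, a device mounted at h = 5 with tilt = 45 degrees and f_v = 30 degrees yields b = 5*tan(30°) ≈ 2.89.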
Optionally, as a possible implementation manner, the method for identifying an effective sensing range in the embodiment of the present invention may further include:
dividing the arc of the first sector into n equal parts to obtain n+1 points, dividing the arc of the second sector into m equal parts to obtain m+1 points, and calculating the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector, wherein both m and n are integers not less than 2;
and drawing the boundary range corresponding to the target area on a map according to the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector.
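The two-arc boundary described above can be assembled into a closed ring for map drawing. The sketch below is an illustrative assumption built on the patent's flat-earth conversion (metres divided by c ≈ 111194.872221777 to give degrees); the function and variable names are not from the patent, and a real implementation would hand the ring to a GIS or map layer.

```python
import math

C = 111194.872221777  # conversion parameter c between degrees and metres (WGS84)

def boundary_polygon(lon, lat, b, L, fh_deg, azimuth_deg, n, m):
    """Closed (lon, lat) ring bounding the effective sensing range:
    the outer arc of the second sector (radius b + L, m + 1 points)
    joined to the inner arc of the first sector (radius b, n + 1 points)
    traversed in the opposite direction."""
    def arc(radius_m, k, reverse):
        r = radius_m / C  # radius in map degrees
        idx = range(k, -1, -1) if reverse else range(k + 1)
        pts = []
        for i in idx:
            ang = math.radians(azimuth_deg - fh_deg / 2.0 + i * fh_deg / k)
            pts.append((lon + r * math.sin(ang), lat + r * math.cos(ang)))
        return pts
    ring = arc(b + L, m, reverse=False) + arc(b, n, reverse=True)
    ring.append(ring[0])  # close the ring so it can be drawn as a polygon
    return ring
```

The ring visits the outer arc left-to-right and the inner arc right-to-left, so the enclosed polygon is exactly the second sector minus its overlap with the first.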
Optionally, as a possible implementation manner, in the embodiment of the present invention, the installation parameters further include a current azimuth angle azimuth of the photoelectric sensing device;
calculating the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector comprises:
the longitude coordinate and the latitude coordinate of the N-th point on the arc of the first sector are respectively:
N_lon = lon + r3*sin(((azimuth - f_h/2) + (N-1)*f_h/n)*π/180),
N_lat = lat + r3*cos(((azimuth - f_h/2) + (N-1)*f_h/n)*π/180),
where r3 = b/c, and c is a conversion parameter;
the longitude coordinate and the latitude coordinate of the M-th point on the arc of the second sector are respectively:
M_lon = lon + r4*sin(((azimuth + f_h/2) - (M-1)*f_h/m)*π/180),
M_lat = lat + r4*cos(((azimuth + f_h/2) - (M-1)*f_h/m)*π/180),
where r4 = (b + L)/c.
A second aspect of an embodiment of the present invention provides a computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, carries out the steps of the method for identifying an effective sensing range as described in the first aspect and any one of its possible implementations.
A third aspect of an embodiment of the present invention provides an automatic driving control device, where the automatic driving control device includes a memory and a processor, where the memory stores a computer program that is executable on the processor, and the computer program, when executed by the processor, implements the steps in the effective sensing range recognition method according to the first aspect and any one of the possible implementations of the first aspect, so as to determine an effective sensing range of a photoelectric sensing device associated with the automatic driving control device.
A fourth aspect of the embodiments of the present invention provides a face recognition device, where the face recognition device includes a memory and a processor, where the memory stores a computer program that is executable on the processor, and the computer program, when executed by the processor, implements the steps in the effective sensing range recognition method according to the first aspect and any one of the possible implementation manners of the first aspect, so as to determine an effective sensing range of a photoelectric sensing device associated with the face recognition device.
According to the technical scheme, the embodiment of the invention has the following advantages:
in the embodiment of the invention, according to the acquired longitude and latitude coordinates of the photoelectric sensing device and its installation parameters, the system can form a first sector and a second sector, taking the longitude and latitude coordinates as the origin, the horizontal field angle as the included angle, and the blind area distance and the sum of the blind area distance and the effective shooting distance as the respective radii. The target area of the second sector, excluding its overlap with the first sector, is taken as the effective sensing range of the photoelectric sensing device. The effective sensing range can thus be calculated and determined automatically, which improves measurement efficiency and saves measurement cost compared with manual measurement.
Drawings
Fig. 1 is a schematic diagram of an embodiment of a valid sensing range identification method according to an embodiment of the present invention;
FIG. 2 is a side view of the installation of the optoelectronic sensing device in an embodiment of the present invention;
FIG. 3 is a top view of the installation of the optoelectronic sensing device in the embodiment of the present invention;
FIG. 4 is a schematic diagram of an embodiment of an automatic driving control apparatus according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an embodiment of a face recognition device in the embodiment of the present invention.
Detailed Description
The embodiment of the invention provides an effective sensing range identification method and related equipment, which are used for realizing automatic measurement of the effective sensing range of photoelectric sensing equipment and improving the measurement efficiency.
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that the photoelectric sensing device in the embodiment of the present invention may be a device having an image capturing function or a device having a photoelectric sensor, such as a camera in a shopping mall, a station, or a similar scene, or an image capturing device in a driving device (e.g., an unmanned automobile or an industrial robot).
For convenience of understanding, a detailed process in the embodiment of the present invention is described below, and referring to fig. 1, an embodiment of a method for identifying an effective sensing range in the embodiment of the present invention may include:
101. acquiring longitude coordinates lon and latitude coordinates lat of the photoelectric sensing equipment and installation parameters of the photoelectric sensing equipment;
the effective sensing range of the photoelectric sensing equipment is related to the geographical position of the photoelectric sensing equipment and the installation parameters of the photoelectric sensing equipment, in order to automatically calculate the effective sensing range, the geographical position of the photoelectric sensing equipment and the installation parameters of the photoelectric sensing equipment need to be acquired, and the installation parameters at least comprise a horizontal field angle fhAnd an effective shooting distance L. The photoelectric sensing equipment can measure the geographical position information and the installation parameters of the photoelectric sensing equipment per se, and can also receive the geographical position information and the installation parameters of the photoelectric sensing equipment per se which are measured in advance from external equipment. The pitch angle is an included angle between an angular bisector of the vertical field angle and the vertical direction.
102. Acquiring a blind area distance b of the photoelectric sensing equipment;
because the installation parameters of the photoelectric sensing equipment are different, the corresponding blind area distances are also different, the blind area distance after installation can be calculated according to the installation parameters, and the specific calculation mode is not limited here.
Optionally, as a possible implementation manner, the embodiment of the present invention may calculate the blind area distance from the installation parameters, where the installation parameters further include a height h, a pitch angle tilt, and a vertical field angle f_v. Exemplarily, as shown in fig. 2, in the installation side view of the photoelectric sensing device, the height h, the vertical field angle f_v, and the pitch angle tilt of the photoelectric sensing node are known, and the blind area distance of the photoelectric sensing node is b, where b = h*tan(tilt - f_v/2). It is understood that the above formula for calculating the blind area distance is only exemplary; in practical applications, the formula may also be modified according to trigonometric transformations, and the specific form is not limited here.
103. Taking the longitude coordinate lon and the latitude coordinate lat as the origin and the horizontal field angle f_h as the included angle, forming a first sector with radius r1 = b and a second sector with radius r2 = b + L, and taking the target area of the second sector, excluding its overlapping area with the first sector, as the effective sensing range of the photoelectric sensing device.
As shown in fig. 3, in the top view of the installation of the photoelectric sensing device, the applicant noticed that a plane can be established with the photoelectric sensing device as the origin and the due north and due east directions as axes. Taking the horizontal field angle f_h as the included angle, a first sector with radius r1 = b and a second sector with radius r2 = b + L are formed respectively, wherein the first sector is the blind area, and the target area of the second sector, excluding its overlapping area with the first sector, is taken as the effective sensing range of the photoelectric sensing device. Optionally, the photoelectric sensing device may be controlled to perform detection and identification within the effective sensing range.
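As an illustration of step 103, membership in the resulting annular sector can be tested in a local east/north plane (metres) centred on the device. This helper is an assumption added for clarity — the patent itself only defines the region, not a point-in-region test.

```python
import math

def in_effective_range(east_m, north_m, b, L, fh_deg, azimuth_deg):
    """True if a ground point (east/north offsets from the device, metres)
    lies in the effective sensing range: between radii b and b + L and
    within fh degrees centred on the azimuth."""
    r = math.hypot(east_m, north_m)
    if not (b <= r <= b + L):
        return False  # inside the blind-area sector or beyond b + L
    # Bearing of the point, clockwise from due north.
    bearing = math.degrees(math.atan2(east_m, north_m)) % 360.0
    # Smallest angular difference to the azimuth (the sector bisector).
    diff = abs((bearing - azimuth_deg + 180.0) % 360.0 - 180.0)
    return diff <= fh_deg / 2.0
```

With b = 2 m, L = 50 m, f_h = 60° and azimuth due north, a point 10 m north is in range, a point 1 m away falls in the blind area, and a point due east falls outside the field angle.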
In the embodiment of the invention, according to the acquired longitude and latitude coordinates of the photoelectric sensing device and its installation parameters, the system can form a first sector and a second sector, taking the longitude and latitude coordinates as the origin, the horizontal field angle as the included angle, and the blind area distance and the sum of the blind area distance and the effective shooting distance as the respective radii. The target area of the second sector, excluding its overlap with the first sector, is taken as the effective sensing range of the photoelectric sensing device. The effective sensing range can thus be calculated and determined automatically, which improves measurement efficiency and saves measurement cost compared with manual measurement.
On the basis of the embodiments shown in fig. 1 to fig. 3, in order to more intuitively show the effective sensing range of the photoelectric sensing device, a boundary range corresponding to the target area needs to be drawn on a map.
Optionally, as a possible implementation manner, in the embodiment of the present invention, the arc of the first sector may be divided into n equal parts to obtain n+1 points and the arc of the second sector into m equal parts to obtain m+1 points; the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector are calculated, and the boundary range corresponding to the target area is drawn on the map according to these coordinates.
It can be understood that m and n are integers not less than 2, the larger the values of m and n are, the higher the drawing precision is, and the specific values of m and n can be reasonably set according to the use precision requirement of a user, which is not limited herein.
Optionally, as a possible implementation manner, in the embodiment of the present invention, the installation parameters further include a current azimuth angle azimuth of the photoelectric sensing device, where the azimuth angle is the included angle between the angular bisector of the horizontal field angle f_h and the due north direction;
Taking the WGS84 coordinate system as an example, the process of calculating the longitude and latitude of a point on an arc is described. The process of calculating the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector is as follows:
when the arc of the first sector is divided into n equal parts, n+1 points are generated, and the radian of each arc segment is f_h*π/(180*n). Taking the highest point on the arc of the first sector in fig. 3 as the first point and counting clockwise, the angle between the due north direction and the line connecting the N-th point on the arc with the origin is (azimuth - f_h/2) + (N-1)*f_h/n degrees, which after conversion to radians is ((azimuth - f_h/2) + (N-1)*f_h/n)*π/180. The projection of the line connecting the N-th point with the origin in the due east direction is b*sin(((azimuth - f_h/2) + (N-1)*f_h/n)*π/180), and its projection in the due north direction is b*cos(((azimuth - f_h/2) + (N-1)*f_h/n)*π/180), both in meters. In the WGS84 coordinate system the map is in degrees, and c is the conversion parameter between map units (degrees) and meters (usually 111194.872221777), so the longitude and latitude coordinates of the N-th point are:
N_lon = lon + r3*sin(((azimuth - f_h/2) + (N-1)*f_h/n)*π/180),
N_lat = lat + r3*cos(((azimuth - f_h/2) + (N-1)*f_h/n)*π/180),
where r3 = b/c;
After the arc of the second sector is divided into m equal parts to obtain m+1 points, taking the lowest point of the arc of the second sector in fig. 3 as the first point and counting counterclockwise, the angle between the due north direction and the line connecting the M-th point on the arc with the origin, converted to radians, is ((azimuth + f_h/2) - (M-1)*f_h/m)*π/180. The projection of this line in the due east direction is (b + L)*sin(((azimuth + f_h/2) - (M-1)*f_h/m)*π/180), and its projection in the due north direction is (b + L)*cos(((azimuth + f_h/2) - (M-1)*f_h/m)*π/180). The longitude and latitude coordinates of the M-th point are respectively:
M_lon = lon + r4*sin(((azimuth + f_h/2) - (M-1)*f_h/m)*π/180),
M_lat = lat + r4*cos(((azimuth + f_h/2) - (M-1)*f_h/m)*π/180),
where r4 = (b + L)/c.
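The per-point formulas translate almost verbatim into code. A minimal sketch, assuming the patent's conversion constant c; the zero-based index i = N - 1 and the function name are illustrative choices.

```python
import math

C = 111194.872221777  # conversion parameter c between degrees and metres (WGS84)

def sector_arc_points(lon, lat, radius_m, fh_deg, azimuth_deg, k):
    """(lon, lat) of the k + 1 points dividing a sector arc of the given
    radius into k equal parts, starting at azimuth - f_h/2 and moving
    clockwise, per the N-th point formulas."""
    r = radius_m / C  # radius in map degrees
    pts = []
    for i in range(k + 1):  # i = N - 1
        ang = math.radians(azimuth_deg - fh_deg / 2.0 + i * fh_deg / k)
        pts.append((lon + r * math.sin(ang), lat + r * math.cos(ang)))
    return pts
```

The first sector's arc uses radius_m = b (so r = r3) and the second sector's uses radius_m = b + L (so r = r4); the counterclockwise ordering given in the text for the second sector yields the same point set in reverse.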
It is to be understood that the above calculation process and its formulas are only exemplary and may be adjusted as required in practical applications. For example, the process above takes the lowest point of the arc of the second sector as the first point and counts counterclockwise; in practice the highest point may instead be taken as the first point and counted clockwise, with the formulas adjusted correspondingly, and the specific calculation order is not limited here. The formulas may also be transformed according to trigonometric identities, as long as the results are the longitudes and latitudes of the points on the first and second sectors; the specific transformations are not limited here.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the above steps do not mean the execution sequence, and the execution sequence of each step should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
An embodiment of the present invention further provides an automatic driving control apparatus, please refer to fig. 4, which includes a memory 410, a processor 420, a wired or wireless network module 430, and a computer program stored in the memory and running on the processor. The processor, when executing the computer program, implements the steps in each of the above embodiments of the effective sensing range identification method, such as steps 101 to 103 shown in fig. 1. Or the processor executes the computer program to realize the functions of the modules or units in the device embodiments so as to determine the effective sensing range of the photoelectric sensing device associated with the automatic driving control device.
An embodiment of the present invention further provides a face recognition apparatus, please refer to fig. 5, which includes a memory 510, a processor 520, a wired or wireless network module 530, and a computer program stored in the memory and running on the processor. The processor, when executing the computer program, implements the steps in each of the above embodiments of the effective sensing range identification method, such as steps 101 to 103 shown in fig. 1. Or the processor executes the computer program to realize the functions of the modules or units in the above device embodiments, so as to determine the effective sensing range of the photoelectric sensing device associated with the face recognition device.
Those skilled in the art will appreciate that the configurations shown in fig. 4, 5 do not constitute a limitation of an automatic driving control apparatus or a face recognition apparatus, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components, e.g., a computer apparatus may also include an input-output device, a bus, etc.
The Processor may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or the like. The general purpose processor may be a microprocessor or any conventional processor; the processor is the control center of the computer device, connecting the various parts of the overall computer device through various interfaces and lines.
The memory may be used to store computer programs and/or modules, and the processor implements various functions of the computer device by executing the computer programs and/or modules stored in the memory and by invoking data stored in the memory. The memory may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like, and the storage data area may store data created according to use of the device (such as audio data, a phonebook, etc.). In addition, the memory may include high speed random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or other non-volatile solid state storage device.
The present application further provides a computer-readable storage medium having a computer program stored thereon, which when executed by a processor, can implement the steps of:
acquiring the longitude coordinate lon and latitude coordinate lat of the photoelectric sensing device and installation parameters of the photoelectric sensing device, wherein the installation parameters at least comprise a horizontal field angle f_h and an effective shooting distance L;
acquiring a blind area distance b of the photoelectric sensing equipment;
taking the longitude coordinate lon and the latitude coordinate lat as the origin and the horizontal field angle f_h as the included angle, forming a first sector with radius r1 = b and a second sector with radius r2 = b + L, and taking the target area of the second sector, excluding its overlapping area with the first sector, as the effective sensing range of the photoelectric sensing device.
Optionally, in some embodiments of the present application, the installation parameters further include a height h, a pitch angle tilt, and a vertical field angle f_v, and the processor may be further configured to implement the step of:
calculating b = h*tan(tilt - f_v/2).
Optionally, in some embodiments of the present application, the processor may be further configured to implement the following steps:
dividing the arc of the first sector into n equal parts to obtain n+1 points, dividing the arc of the second sector into m equal parts to obtain m+1 points, and calculating the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector, wherein both m and n are integers not less than 2;
and drawing the boundary range corresponding to the target area on the map according to the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector.
Optionally, in some embodiments of the present application, the installation parameters further include a current azimuth angle azimuth of the photoelectric sensing device;
the processor may be further configured to:
calculating the longitude and latitude coordinates of the n+1 points on the arc of the first sector, wherein the longitude coordinate and the latitude coordinate of the N-th point are respectively:
N_lon = lon + r3*sin(((azimuth - f_h/2) + (N-1)*f_h/n)*π/180),
N_lat = lat + r3*cos(((azimuth - f_h/2) + (N-1)*f_h/n)*π/180),
where r3 = b/c, and c is the conversion parameter between degrees and meters in the WGS84 coordinate system, i.e. it converts angle into length;
and calculating the longitude and latitude coordinates of the m+1 points on the arc of the second sector, wherein the longitude coordinate and the latitude coordinate of the M-th point are respectively:
M_lon = lon + r4*sin(((azimuth + f_h/2) - (M-1)*f_h/m)*π/180),
M_lat = lat + r4*cos(((azimuth + f_h/2) - (M-1)*f_h/m)*π/180),
where r4 = (b + L)/c.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and that such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for identifying a valid sensing range, comprising:
acquiring a longitude coordinate lon and a latitude coordinate lat of a photoelectric sensing device and installation parameters of the photoelectric sensing device, wherein the installation parameters at least comprise a horizontal field angle fh and an effective shooting distance L;
acquiring a blind area distance b of the photoelectric sensing equipment;
taking the point (lon, lat) as the origin and the horizontal field angle fh as the included angle, forming a first sector with radius r1 = b and a second sector with radius r2 = (b + L), and taking the target region of the second sector excluding its overlap with the first sector as the effective sensing range of the photoelectric sensing device;
and the photoelectric sensing equipment detects and identifies the effective sensing range.
2. The method of claim 1, wherein the installation parameters further comprise a mounting height h, a pitch angle tilt, and a vertical field angle fv, and the blind area distance b of the photoelectric sensing device is obtained as:
b = h * tan(tilt - fv/2).
3. the method of claim 1 or 2, further comprising:
dividing the arc of the first sector into n equal parts to obtain n+1 points, dividing the arc of the second sector into m equal parts to obtain m+1 points, and calculating the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector, wherein m and n are both integers not less than 2;
and drawing the boundary of the target region on a map according to the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector.
4. The method of claim 3, wherein the installation parameters further comprise a current azimuth angle azimuth of the photoelectric sensing device;
calculating the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector comprises:
the longitude coordinate and the latitude coordinate of the N-th point on the arc of the first sector are respectively:
Nlon=lon+r3*sin(((azimuth-fh/2)+(N-1)*fh/n)*π/180),
Nlat=lat+r3*cos(((azimuth-fh/2)+(N-1)*fh/n)*π/180),
wherein r3 = b/c, and c is a conversion parameter;
the longitude coordinate and the latitude coordinate of the M-th point on the arc of the second sector are respectively:
Mlon=lon+r4*sin(((azimuth+fh/2)-(M-1)*fh/m)*π/180),
Mlat=lat+r4*cos(((azimuth+fh/2)-(M-1)*fh/m)*π/180),
wherein r4 = (b + L)/c.
5. The method of claim 4, wherein calculating the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector comprises:
calculating, in the WGS84 coordinate system, the longitude and latitude coordinates of the n+1 points on the arc of the first sector and of the m+1 points on the arc of the second sector.
6. The method of claim 4, wherein c is a conversion parameter between angle and length in the WGS84 coordinate system.
7. The method of claim 6, wherein c is the conversion parameter between degrees and meters in the WGS84 coordinate system, c being equal to 111194.872221777.
8. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program, when executed by a processor, implements the method of any one of claims 1 to 7.
9. An automatic driving control device, characterized in that it comprises a memory and a processor, said memory having stored thereon a computer program operable on said processor, said computer program, when executed by said processor, implementing a method according to any one of claims 1 to 7 for determining an effective sensing range of an opto-electronic sensing device associated with said automatic driving control device.
10. A face recognition device, characterized in that the face recognition device comprises a memory and a processor, the memory having stored thereon a computer program operable on the processor, the computer program, when executed by the processor, implementing the method according to any one of claims 1 to 7 for determining an effective sensing range of a photoelectric sensing device associated with the face recognition device.
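The geometry described in claims 1, 2, and 4 can be sketched in code. The following Python fragment is an illustrative reconstruction, not part of the patent text: the function names and the example camera parameters are assumptions, while the formulas and the constant c follow claims 2, 4, and 7 directly.

```python
import math

# Degrees-to-metres conversion parameter c from claim 7 (WGS84 approximation)
C = 111194.872221777

def blind_distance(h, tilt_deg, fv_deg):
    """Blind-area distance b = h * tan(tilt - fv/2), angles in degrees (claim 2)."""
    return h * math.tan(math.radians(tilt_deg - fv_deg / 2))

def sector_arc_points(lon, lat, radius_m, azimuth_deg, fh_deg, n):
    """n+1 equally spaced (lon, lat) points on a sector arc of horizontal
    field angle fh centred on the azimuth, per the claim-4 formulas."""
    r = radius_m / C  # radius converted from metres to degrees
    pts = []
    for i in range(n + 1):  # i = N-1 in the claim's numbering
        ang = math.radians((azimuth_deg - fh_deg / 2) + i * fh_deg / n)
        pts.append((lon + r * math.sin(ang), lat + r * math.cos(ang)))
    return pts

# Hypothetical example: camera at (108.88, 34.21), height 6 m, tilt 20 deg,
# vertical field angle 30 deg, horizontal field angle 60 deg, range 100 m
b = blind_distance(6, 20, 30)
inner = sector_arc_points(108.88, 34.21, b, azimuth_deg=90, fh_deg=60, n=10)
outer = sector_arc_points(108.88, 34.21, b + 100, azimuth_deg=90, fh_deg=60, n=10)
# The effective sensing range (claim 1's target region) is the ring-shaped
# sector polygon bounded by the `inner` and `outer` arcs.
```

Note that claim 4 traverses the second arc in the opposite direction (starting from azimuth + fh/2), which matters only for the winding order when the two arcs are joined into one boundary polygon.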
CN201910684828.6A 2019-07-26 2019-07-26 Effective perception range identification method and related equipment Pending CN112305638A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910684828.6A CN112305638A (en) 2019-07-26 2019-07-26 Effective perception range identification method and related equipment


Publications (1)

Publication Number Publication Date
CN112305638A 2021-02-02

Family

ID=74329843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910684828.6A Pending CN112305638A (en) 2019-07-26 2019-07-26 Effective perception range identification method and related equipment

Country Status (1)

Country Link
CN (1) CN112305638A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000345767A (en) * 1999-06-08 2000-12-12 Okamura Corp Safety stopper in electric partitioning system
CN101473652A (en) * 2006-04-26 2009-07-01 Opt株式会社 Camera apparatus and image recording/reproducing method
CN102656547A (en) * 2009-10-19 2012-09-05 平蛙实验室股份公司 Determining touch data for one or more objects on a touch surface
CN103370495A (en) * 2011-01-20 2013-10-23 光帆能源公司 Compressed air energy storage system utilizing two-phase flow to facilitate heat exchange
CN105044673A (en) * 2015-07-31 2015-11-11 中北大学 Distributed target tracking and positioning system and method based on pyroelectric sensing
CN105163385A (en) * 2015-08-25 2015-12-16 华南理工大学 Localization algorithm based on sector overlapping area of clustering analysis
CN106595658A (en) * 2016-10-21 2017-04-26 乐视控股(北京)有限公司 Method and system for positioning objects in object group in local scope
CN107295534A (en) * 2017-07-06 2017-10-24 北京农业信息技术研究中心 A directional sensor coverage enhancement method for agricultural wireless multimedia sensor networks
JP2019009752A (en) * 2017-06-20 2019-01-17 一般社団法人 日本画像認識協会 Image processing device


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Wu Hai; Zhao Wei; Tian Bin: "Weighted-distance node selection method in wireless sensor networks", Journal of Beijing University of Aeronautics and Astronautics, no. 03, 15 March 2008 (2008-03-15) *
Zhang Shuye: "Research on VR camera field of view and on-site scheduling forms", New Media Research, no. 12, 5 July 2018 (2018-07-05) *
Yang Guodong: "Automotive Electronic Control Technology", 31 January 2015, Southwest Jiaotong University Press, page 234 *
Wang Chunguang; Ye Jianghua; Wu Xianxiang: "Airborne early-warning radar networking based on improved particle swarm optimization", Computer Simulation, no. 08, 15 August 2013 (2013-08-15) *
Gao Yude: "Navigation, 2nd Edition", 31 August 2007, Dalian Maritime University Press, pages 440-442 *

Similar Documents

Publication Publication Date Title
CN108369743B (en) Mapping a space using a multi-directional camera
CN106530218B (en) Coordinate conversion method and device
KR101900873B1 (en) Method, device and system for acquiring antenna engineering parameters
JP6676082B2 (en) Indoor positioning method and system, and device for creating the indoor map
CN111337947A (en) Instant mapping and positioning method, device, system and storage medium
CN109559349B (en) Method and device for calibration
US11204249B2 (en) Positioning method and robot with the same
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
US20160183116A1 (en) Method and apparatus of positioning mobile terminal based on geomagnetism
CN111383279B (en) External parameter calibration method and device and electronic equipment
CN110146096B (en) Vehicle positioning method and device based on image perception
US8155387B2 (en) Method and system for position determination using image deformation
CN113592989B (en) Three-dimensional scene reconstruction system, method, equipment and storage medium
CN103256920A (en) Determining tilt angle and tilt direction using image processing
CN112556685B (en) Navigation route display method and device, storage medium and electronic equipment
CN113447923A (en) Target detection method, device, system, electronic equipment and storage medium
EP3321882A1 (en) Matching cost computation method and device
CN107193820B (en) Position information acquisition method, device and equipment
WO2022088613A1 (en) Robot positioning method and apparatus, device and storage medium
US20170091945A1 (en) Point and sensor estimation from images
CN111862208A (en) Vehicle positioning method and device based on screen optical communication and server
CN112305638A (en) Effective perception range identification method and related equipment
CN111104861A (en) Method and apparatus for determining position of electric wire and storage medium
CN113932793B (en) Three-dimensional coordinate positioning method, three-dimensional coordinate positioning device, electronic equipment and storage medium
CN107703954B (en) Target position surveying method and device for unmanned aerial vehicle and unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221130

Address after: 710000 second floor, building B3, yunhuigu, No. 156, Tiangu 8th Road, software new town, high tech Zone, Xi'an, Shaanxi

Applicant after: Xi'an Guangqi Intelligent Technology Co.,Ltd.

Address before: Second floor, B3, yunhuigu, 156 Tiangu 8th Road, software new town, Xi'an City, Shaanxi Province 710000

Applicant before: Xi'an Guangqi Future Technology Research Institute