CN109444905B - Dynamic object detection method and device based on laser and terminal equipment - Google Patents

Dynamic object detection method and device based on laser and terminal equipment

Info

Publication number
CN109444905B
CN109444905B (Application CN201811064921.9A)
Authority
CN
China
Prior art keywords
length
data point
angle
marked data
distance data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811064921.9A
Other languages
Chinese (zh)
Other versions
CN109444905A (en)
Inventor
杨勇 (Yang Yong)
宫海涛 (Gong Haitao)
吴泽晓 (Wu Zexiao)
宋昱慧 (Song Yuhui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen 3irobotix Co Ltd
Original Assignee
Shenzhen 3irobotix Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen 3irobotix Co Ltd filed Critical Shenzhen 3irobotix Co Ltd
Priority to CN201811064921.9A priority Critical patent/CN109444905B/en
Publication of CN109444905A publication Critical patent/CN109444905A/en
Application granted granted Critical
Publication of CN109444905B publication Critical patent/CN109444905B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/50 Systems of measurement based on relative movement of target
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G01S 17/88 Lidar systems specially adapted for specific applications

Abstract

The application is applicable to the technical field of data processing, and provides a laser-based dynamic object detection method, device and terminal equipment. The method comprises the following steps: acquiring the laser distance data of each angle in a laser distance data frame, and respectively calculating the difference between the background distance data of each angle and the laser distance data of the corresponding angle; marking the data points corresponding to laser distance data whose difference is greater than a preset distance threshold, and judging whether the length of each run of continuously marked data points is greater than a preset length; and when the length of a run of continuously marked data points is greater than the preset length, determining that the object corresponding to that run is a dynamic object. This solves the problem in the prior art that dynamic object detection performed through a camera has a poor recognition effect in a dark environment.

Description

Dynamic object detection method and device based on laser and terminal equipment
Technical Field
The application belongs to the technical field of data processing, and particularly relates to a dynamic object detection method and device based on laser and terminal equipment.
Background
With the development of smart homes, the demand for dynamic object detection is growing, both to improve the intelligence of household appliances and for security. For example, dynamic object detection technology can detect the position of a user and intelligently turn on the lights in the corresponding area, or warn of a burglary when no one is at home.
At present, a camera is mainly used to shoot a video picture in real time, and image recognition is performed on the video picture to judge whether a dynamic object exists. However, detecting a dynamic object with a camera is limited by the ambient brightness: when the environment is dark, it is difficult to recognize a dynamic object from the picture shot by the camera.
In summary, in the prior art, dynamic object detection is performed through a camera, and the recognition effect is poor in a dark environment.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and an apparatus for detecting a dynamic object based on laser, and a terminal device, so as to solve the problem in the prior art that the recognition effect is poor in a dark environment when a camera is used for detecting the dynamic object.
A first aspect of an embodiment of the present application provides a dynamic object detection method based on laser, including:
acquiring laser distance data of each angle in a laser distance data frame, and respectively calculating the difference value between the background distance data of each angle and the laser distance data of the corresponding angle;
marking data points corresponding to the laser distance data with the difference value larger than a preset distance threshold value, and judging whether the length of each continuously marked data point is larger than a preset length or not;
and when the length of each continuously marked data point is greater than the preset length, determining that the object corresponding to each continuously marked data point is a dynamic object.
A second aspect of an embodiment of the present application provides a dynamic object detection apparatus based on laser, including:
the difference value calculation module is used for acquiring laser distance data of each angle in the laser distance data frame and calculating the difference value between the background distance data of each angle and the laser distance data of the corresponding angle respectively;
the length comparison module is used for marking data points corresponding to the laser distance data with the difference value larger than the preset distance threshold value and judging whether the length of each continuously marked data point is larger than the preset length or not;
and the dynamic judgment module is used for judging that the object corresponding to each continuously marked data point is a dynamic object when the length of each continuously marked data point is greater than the preset length.
A third aspect of the embodiments of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the method when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, which, when executed by a processor, implements the steps of the method as described above.
Compared with the prior art, the embodiment of the application has the advantages that:
according to the dynamic object detection method based on laser, dynamic object detection is carried out by using laser distance data obtained by laser ranging, the difference value between background distance data and the laser distance data of each angle is calculated, when the difference value is larger than a preset distance threshold value, it is shown that a data point corresponding to the laser distance data possibly falls on a dynamic object, the data point is marked, when the length of a continuously marked data point is larger than a preset length, the object corresponding to the continuously marked data point is judged to be the dynamic object, only the laser distance data obtained by the laser ranging is used in the detection process, normal work can be carried out under the environment with normal light or darker light, the problems that dynamic object detection is carried out through a camera in the prior art, and the identification effect is poor under the environment with darker light are solved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic flow chart illustrating an implementation of a dynamic object detection method based on laser according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a laser-based dynamic object detection apparatus according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In particular implementations, the mobile terminals described in the embodiments of the present application include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the devices described above are not portable communication devices, but desktop computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads).
In the discussion that follows, a mobile terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the mobile terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The mobile terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the mobile terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
Example one:
referring to fig. 1, a method for detecting a dynamic object based on laser according to a first embodiment of the present application is described below, where the method for detecting a dynamic object based on laser according to the first embodiment of the present application includes:
step S101, laser distance data of each angle in a laser distance data frame are obtained, and the difference value between the background distance data of each angle and the laser distance data of the corresponding angle is calculated respectively;
the laser distance data of each angle can be detected through the laser radar, for example, the laser distance data of each angle at different time can be detected by setting the laser radar to rotate at a preset speed on the same plane, and one laser distance data frame comprises the laser distance data of each angle detected by the laser radar in one period.
After the laser distance data frame is acquired, the laser distance data in the laser distance data frame is divided, for example, when the detection range of the laser radar is 360 degrees, the detected data in the laser distance data frame may be divided into 360 parts, that is, 0 degree to 359 degrees.
Because errors exist in the ranging process, the laser distance data obtained by ranging the same object may differ, so several laser distance readings may exist at each angle. In that case, the average of these readings may be calculated and used as the laser distance data for the angle; alternatively, the maximum value or the minimum value may be used. The specific choice may be determined according to the actual situation.
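As a minimal sketch of this per-angle aggregation, the snippet below bins raw readings by integer degree and reduces each bin with a selectable statistic. The data layout (a list of (angle, distance) tuples) and the function name are illustrative assumptions, not taken from the patent.

```python
def aggregate_frame(readings, mode="mean"):
    """Collapse raw (angle_deg, distance_m) readings into one distance
    per integer-degree bin (0-359). `mode` selects mean, max, or min,
    since the description leaves this choice to the application."""
    bins = {}
    for angle, dist in readings:
        bins.setdefault(int(angle) % 360, []).append(dist)
    agg = {"mean": lambda v: sum(v) / len(v), "max": max, "min": min}[mode]
    return {deg: agg(vals) for deg, vals in bins.items()}
```

For example, two readings near 30 degrees would be averaged into a single value for the 30-degree bin.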
After the laser distance data of each angle is obtained, the difference between the background distance data of each angle and the laser distance data of the corresponding angle can be calculated respectively.
Step S102, marking data points corresponding to the laser distance data with the difference value larger than a preset distance threshold value, and judging whether the length of each continuously marked data point is larger than a preset length or not;
and when the difference is larger than the preset distance threshold, the data point corresponding to the laser distance data is considered to possibly fall on the dynamic object, and the data point is marked.
The data points are determined through laser distance data and angles, and represent the positions of the laser falling points on the object when the laser radar carries out laser ranging.
The preset distance threshold may be set as a fixed value or as a proportional value. For example, the preset distance threshold may be set to 10% of the background distance data of the angle: if the background distance data of the angle is 10 m and the laser distance data is 5.3 m, the difference is 4.7 m, which is greater than 1 m (10% of the background distance data), so the data point corresponding to that laser distance data is marked.
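The marking step with a proportional threshold can be sketched as follows, assuming per-degree dictionaries of background and frame distances (an illustrative layout) and marking points whose measured distance is less than the background by more than the threshold:

```python
def mark_points(background, frame, ratio=0.10):
    """Mark angles whose background-minus-measured difference exceeds a
    proportional threshold (10% of the background here, per the example).
    `background` and `frame` map integer degrees to distances in metres."""
    marked = []
    for deg, bg in background.items():
        laser = frame.get(deg)
        if laser is None:
            continue
        if bg - laser > ratio * bg:  # point is closer than the background
            marked.append((deg, laser))
    return sorted(marked)
```

With the example values above (background 10 m, measurement 5.3 m), the 4.7 m difference exceeds the 1 m threshold and the point is marked.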
A single data point may be marked due to environmental interference or calculation error; therefore, only continuously marked data points are analyzed. Continuously marked data points are marked data points whose angles are consecutive. For example, if the marked data points are (12 degrees, 8.0 m), (30 degrees, 5.7 m), (31 degrees, 5.8 m), (32 degrees, 5.6 m), (45 degrees, 3.2 m) and (67 degrees, 6.9 m), then the continuously marked data points are (30 degrees, 5.7 m), (31 degrees, 5.8 m) and (32 degrees, 5.6 m).
The length of the continuously marked data points is then calculated, and it is judged whether the length of each run is greater than a preset length. The preset length can be set according to the actual situation; for example, when dynamic objects shorter than 15 cm are of no concern, the preset length can be set to 15 cm.
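Grouping marked points into runs of consecutive angles, discarding isolated marks, can be sketched as below. This is a simplified illustration (wrap-around at 359/0 degrees is ignored), not the patent's exact procedure:

```python
def consecutive_runs(marked):
    """Group marked (angle_deg, distance_m) points into runs of
    consecutive angles; single isolated marks are dropped as noise,
    following the description above."""
    runs, current = [], []
    for point in sorted(marked):
        if current and point[0] == current[-1][0] + 1:
            current.append(point)
        else:
            if len(current) > 1:
                runs.append(current)
            current = [point]
    if len(current) > 1:
        runs.append(current)
    return runs
```

Applied to the worked example, only the 30-31-32 degree group survives as a run.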
Step S103, when the length of each continuously marked data point is greater than a preset length, determining that the object corresponding to each continuously marked data point is a dynamic object.
And when the length of each continuously marked data point is greater than the preset length, determining that the object corresponding to each continuously marked data point is a dynamic object.
In this laser-based dynamic object detection method, dynamic objects are detected using laser distance data obtained by laser ranging. The method works normally at any ambient brightness, and the recognition effect does not degrade even when the light is dim.
After determining that the object corresponding to each of the continuously marked data points is a dynamic object, the method may further locate the center position of the dynamic object, including:
and acquiring a median angle of angles corresponding to each continuously marked data point, and taking the median angle and laser distance data corresponding to the median angle as the central position of the dynamic object.
In some smart home applications, such as smart fans and smart interactive products, it is necessary to acquire the position of the dynamic object, so as to control the orientation of the fan and the orientation of the interactive interface to point at the dynamic object.
At this time, the median angle of the angles corresponding to the continuously marked data points may be obtained, and the median angle together with its corresponding laser distance data may be used as the center position of the dynamic object. For example, if the continuously marked data points are (30 degrees, 5.7 m), (31 degrees, 5.8 m) and (32 degrees, 5.6 m), then (31 degrees, 5.8 m) may be used as the center position of the dynamic object.
In addition, the method of finding the center position by the median angle is only one of the center position calculation methods, and other calculation methods may be selected according to actual situations to calculate the center position of each continuously marked data point, for example, a position with the minimum sum of distances to each continuously marked data point is used as the center position, and the specific calculation method of the center position is not limited herein.
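The median-angle approach described above can be sketched as follows; representing the run as a sorted list of (angle, distance) tuples is an assumption for illustration:

```python
def center_position(run):
    """Return the point at the median angle of a run of continuously
    marked (angle_deg, distance_m) points as the object's center,
    mirroring the worked example in the text."""
    run = sorted(run)
    return run[len(run) // 2]
```

For the example run, the median angle is 31 degrees, so (31, 5.8) is reported as the center.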
Meanwhile, the background distance data can be updated by using a laser distance data frame obtained by each laser ranging, and the method comprises the following steps:
and comparing the background distance data of each angle with the corresponding laser distance data, and taking the larger value as new background distance data of each angle.
After each laser ranging pass produces a laser distance data frame, the background distance data may be updated using the laser distance data in that frame: the background distance data of each angle is compared with the corresponding laser distance data, and the larger value is taken as the new background distance data of the angle. For example, if the background distance data at 30 degrees is 8 m and the laser distance data at 30 degrees in the frame is 9 m, the background distance data at 30 degrees is updated to 9 m; if the laser distance data at 30 degrees in the frame is 7 m, the background distance data at 30 degrees is not updated.
The updating process of the background distance data can be set before the difference value between the background distance data and the corresponding laser distance data is obtained, or can be set after the difference value between the background distance data and the corresponding laser distance data is obtained, and the effects of the two setting modes are consistent in the actual use process.
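A minimal sketch of this take-the-larger-value update, again assuming per-degree dictionaries as the data layout:

```python
def update_background(background, frame):
    """Per angle, keep the larger of the stored background distance and
    the new frame distance, as described above. Returns a new dict so
    the caller can choose when to swap it in."""
    updated = dict(background)
    for deg, dist in frame.items():
        if deg not in updated or dist > updated[deg]:
            updated[deg] = dist
    return updated
```

Matching the example, a 9 m reading replaces an 8 m background value, while a 7 m reading leaves it unchanged.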
The length of each run of continuously marked data points is calculated in one of the following ways:
and A1, in each continuously marked data point, respectively calculating the length of a line between the adjacent data points according to the corresponding angle and laser distance data of the adjacent data points, and summing the lengths of the line between the adjacent data points to obtain the length of each continuously marked data point.
Calculating the length of a run of continuously marked data points may proceed by calculating the length of the line segment between each pair of adjacent data points and then summing. For example, if the continuously marked data points are (30 degrees, 5.7 m), (31 degrees, 5.8 m) and (32 degrees, 5.6 m), the length of the segment between (30 degrees, 5.7 m) and (31 degrees, 5.8 m) and the length of the segment between (31 degrees, 5.8 m) and (32 degrees, 5.6 m) are first obtained, and the two lengths are then summed to obtain the length of the run.
The length of the line segment between two data points can be calculated from the laser distance data of the two data points and the angle difference between them.
Or a2, selecting a first marked data point and a last marked data point from the continuously marked data points, calculating a line segment length between the first marked data point and the last marked data point according to the corresponding angle of the first marked data point and the last marked data point and the laser distance data, and taking the line segment length as the length of each continuously marked data point.
Alternatively, the length of a run of continuously marked data points may be calculated directly as the length of the line segment between the first marked data point and the last marked data point. For example, for the continuously marked data points (30 degrees, 5.7 m), (31 degrees, 5.8 m) and (32 degrees, 5.6 m), the length of the line segment between (30 degrees, 5.7 m) and (32 degrees, 5.6 m) may be calculated directly and used as the length of the run.
Or, a3, selecting a first marked data point and a last marked data point from the continuously marked data points, calculating an arc length according to the radius and the angle corresponding to the first marked data point and the last marked data point by taking the laser distance data of the first marked data point as a radius, and taking the arc length as the length of each continuously marked data point.
The length of a run of continuously marked data points may also be calculated as an arc length: the laser distance data of the first marked data point is taken as the radius, and the arc length is calculated from this radius and the angle difference between the first and last marked data points. For example, for the continuously marked data points (30 degrees, 5.7 m), (31 degrees, 5.8 m) and (32 degrees, 5.6 m), the radius is 5.7 m (the laser distance data of the first data point) and the angle difference between (30 degrees, 5.7 m) and (32 degrees, 5.6 m) is 2 degrees; the arc length calculated from this radius and angle difference is used as the length of the run.
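The three length calculations (A1: sum of adjacent segments, A2: first-to-last segment, A3: arc length) can be sketched as below. Segment lengths between polar points follow from the law of cosines; the helper names are illustrative, not from the patent:

```python
import math

def chord(p, q):
    """Straight-line distance between two polar points (angle_deg, r_m),
    via the law of cosines."""
    (a1, r1), (a2, r2) = p, q
    d = math.radians(a2 - a1)
    return math.sqrt(r1 * r1 + r2 * r2 - 2.0 * r1 * r2 * math.cos(d))

def length_a1(run):
    """A1: sum the segment lengths between adjacent marked points."""
    return sum(chord(run[i], run[i + 1]) for i in range(len(run) - 1))

def length_a2(run):
    """A2: single segment from the first to the last marked point."""
    return chord(run[0], run[-1])

def length_a3(run):
    """A3: arc length, using the first point's distance as the radius."""
    return run[0][1] * math.radians(run[-1][0] - run[0][0])
```

By the triangle inequality, A1 never yields a shorter length than A2; which variant to use is, per the text, an application-level choice.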
The initial background distance data of each angle is obtained by the following method:
and receiving a dynamic object detection instruction, and taking the laser distance data of each angle in the first frame of laser distance data frame obtained by detection as the initial background distance data of each angle according to the dynamic object detection instruction.
The initial background distance data may be preset, or laser ranging may be performed according to the dynamic object detection instruction when the dynamic object detection instruction is received, and the laser distance data of each angle in the first frame of laser distance data frame obtained by the laser ranging is used as the initial background distance data of each angle.
In the laser-based dynamic object detection method provided by this embodiment, dynamic objects are detected using laser distance data obtained by laser ranging. The method works normally whether the light is normal or dim, solving the prior-art problem that dynamic object detection through a camera has a poor recognition effect in a dark environment.
In addition, when laser distance data obtained by laser ranging is used to detect dynamic objects, the scanning frequency is high and the processing is fast, so dynamic objects are detected in real time. Detecting dynamic objects through a camera, by contrast, requires a complex image recognition algorithm to identify the dynamic object; the processing efficiency is low and the real-time performance of the recognition results is poor.
Meanwhile, the center position of the dynamic object can be determined by finding a median angle and the like, so that the center position of the dynamic object can be tracked, equipment such as an intelligent fan and an intelligent interactive product can interact with a user better, and the experience of the user is improved.
When a new laser distance data frame is obtained every time, the background distance data can be updated according to the laser distance data of each angle in the laser distance data frame, so that the timeliness of the background distance data is improved, and the dynamic object can be identified more accurately.
In the process of identifying the dynamic object, the length of each continuously marked data point needs to be calculated, calculation modes such as calculating the length of the line segments of adjacent data points and then summing, calculating the length of the line segments of head and tail data points or calculating the length of the arc line of the head and tail data points can be selected for calculation, and the selection of a specific calculation mode can be determined according to an actual application scene.
The initial background distance data may be preset, or the laser distance data of the first frame of laser distance data frame may be used as the background distance data.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Example two:
the second embodiment of the present application provides a laser-based dynamic object detecting device, which is only shown in relevant parts for convenience of description, and as shown in fig. 2, the laser-based dynamic object detecting device includes,
a difference value calculating module 201, configured to obtain laser distance data of each angle in a laser distance data frame, and calculate a difference value between background distance data of each angle and the laser distance data of a corresponding angle;
a length comparison module 202, configured to mark data points corresponding to the laser distance data whose difference is greater than the preset distance threshold, and determine whether the length of each continuously marked data point is greater than a preset length;
and the dynamic determination module 203 is configured to determine that the object corresponding to each continuously marked data point is a dynamic object when the length of each continuously marked data point is greater than a preset length.
The device further comprises:
and the central positioning module is used for acquiring a median angle of angles corresponding to each continuously marked data point, and taking the median angle and laser distance data corresponding to the median angle as the central position of the dynamic object.
The device further comprises:
and the background updating module is used for respectively comparing the background distance data of each angle with the corresponding laser distance data and taking the larger value as the new background distance data of each angle.
The device further comprises:
A length calculation module, configured to calculate, among the continuously marked data points, the segment length between each pair of adjacent data points according to the angles and laser distance data of the adjacent points, and to sum the segment lengths between all adjacent data points to obtain the length of the continuously marked data points.
Or, the length calculation module is configured to select the first marked data point and the last marked data point among the continuously marked data points, calculate the segment length between them according to their corresponding angles and laser distance data, and take this segment length as the length of the continuously marked data points.
Or, the length calculation module is configured to select the first marked data point and the last marked data point among the continuously marked data points, take the laser distance data of the first marked data point as a radius, calculate the arc length from this radius and the angle spanned between the first and last marked data points, and take the arc length as the length of the continuously marked data points.
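The three length variants can be sketched in Python as follows; the function names, degree-based angles, and list representation are illustrative assumptions, not part of the patent:

```python
import math

def chord(r1, a1_deg, r2, a2_deg):
    # Straight-line distance between two polar points sharing the sensor
    # origin (law of cosines).
    da = math.radians(a2_deg - a1_deg)
    return math.sqrt(r1 * r1 + r2 * r2 - 2.0 * r1 * r2 * math.cos(da))

def polyline_length(angles_deg, dists):
    """Variant 1: sum of segment lengths between adjacent marked points."""
    return sum(
        chord(dists[i], angles_deg[i], dists[i + 1], angles_deg[i + 1])
        for i in range(len(dists) - 1)
    )

def endpoint_segment_length(angles_deg, dists):
    """Variant 2: one segment from the first to the last marked point."""
    return chord(dists[0], angles_deg[0], dists[-1], angles_deg[-1])

def arc_length(angles_deg, dists):
    """Variant 3: arc of radius = first point's distance over the span."""
    return dists[0] * math.radians(abs(angles_deg[-1] - angles_deg[0]))
```

For the small angular spans and roughly constant distances typical of a single object, the three measures nearly coincide, which suggests why they are offered as interchangeable alternatives.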
The device further comprises:
A background modeling module, configured to receive a dynamic object detection instruction and, according to the instruction, to take the laser distance data of each angle in the first detected laser distance data frame as the initial background distance data of that angle.
It should be noted that the information interaction between the above devices/units, their execution processes, and their specific functions and technical effects are based on the same concept as the method embodiments of the present application; for details, reference may be made to the method embodiments, which are not repeated here.
Embodiment three:
fig. 3 is a schematic diagram of a terminal device provided in the third embodiment of the present application. As shown in fig. 3, the terminal device 3 of this embodiment includes: a processor 30, a memory 31 and a computer program 32 stored in said memory 31 and executable on said processor 30. The processor 30, when executing the computer program 32, implements the steps in the above-described embodiment of the laser-based dynamic object detection method, such as steps S101 to S103 shown in fig. 1. Alternatively, the processor 30, when executing the computer program 32, implements the functions of each module/unit in each device embodiment described above, for example, the functions of the modules 201 to 203 shown in fig. 2.
Illustratively, the computer program 32 may be partitioned into one or more modules/units that are stored in the memory 31 and executed by the processor 30 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 32 in the terminal device 3. For example, the computer program 32 may be divided into a difference calculation module, a length comparison module and a dynamic determination module, and each module has the following specific functions:
acquiring laser distance data of each angle in a laser distance data frame, and respectively calculating the difference value between the background distance data of each angle and the laser distance data of the corresponding angle;
marking data points corresponding to the laser distance data with the difference value larger than a preset distance threshold value, and judging whether the length of each continuously marked data point is larger than a preset length or not;
and when the length of each continuously marked data point is greater than the preset length, determining that the object corresponding to each continuously marked data point is a dynamic object.
The terminal device 3 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device. The terminal device may include, but is not limited to, the processor 30 and the memory 31. Those skilled in the art will understand that fig. 3 is only an example of the terminal device 3 and does not constitute a limitation on it; the terminal device may include more or fewer components than those shown, combine some components, or use different components; for example, it may also include input/output devices, network access devices, a bus, etc.
The Processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 31 may be an internal storage unit of the terminal device 3, such as a hard disk or a memory of the terminal device 3. The memory 31 may also be an external storage device of the terminal device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the terminal device 3. The memory 31 is used for storing the computer program and other programs and data required by the terminal device. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
If the integrated modules/units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the method embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in each jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (7)

1. A dynamic object detection method based on laser is characterized by comprising the following steps:
acquiring laser distance data of each angle in a laser distance data frame, and respectively calculating the difference value between the background distance data of each angle and the laser distance data of the corresponding angle;
marking data points corresponding to the laser distance data with the difference value larger than a preset distance threshold value, and judging whether the length of each continuously marked data point is larger than a preset length or not;
when the length of each continuously marked data point is greater than the preset length, determining that the object corresponding to each continuously marked data point is a dynamic object;
the length of each of the successively labeled data points is calculated by:
in the continuously marked data points, respectively calculating the segment length between adjacent data points according to the angles and laser distance data corresponding to the adjacent data points, and summing the segment lengths between all adjacent data points to obtain the length of the continuously marked data points;
or, in each of the continuously marked data points, selecting a first marked data point and a last marked data point, calculating a segment length between the first marked data point and the last marked data point according to the corresponding angle and laser distance data of the first marked data point and the last marked data point, and taking the segment length as the length of each of the continuously marked data points;
or, in each of the continuously marked data points, selecting a first marked data point and a last marked data point, taking the laser distance data of the first marked data point as a radius, calculating the length of an arc according to the radius and the angle corresponding to the first marked data point and the last marked data point, and taking the length of the arc as the length of each continuously marked data point.
2. The laser-based dynamic object detection method of claim 1, wherein after determining that the object corresponding to the continuously marked data points is a dynamic object, the method further comprises:
and acquiring a median angle of angles corresponding to each continuously marked data point, and taking the median angle and laser distance data corresponding to the median angle as the central position of the dynamic object.
3. The laser-based dynamic object detection method of claim 1, wherein the method further comprises:
and comparing the background distance data of each angle with the corresponding laser distance data, and taking the larger value as new background distance data of each angle.
4. A laser-based dynamic object detection method as claimed in any one of claims 1 to 3, wherein the initial background distance data of each angle is obtained by:
and receiving a dynamic object detection instruction, and taking the laser distance data of each angle in the first frame of laser distance data frame obtained by detection as the initial background distance data of each angle according to the dynamic object detection instruction.
5. A laser-based dynamic object detection device, comprising:
the difference value calculation module is used for acquiring laser distance data of each angle in the laser distance data frame and calculating the difference value between the background distance data of each angle and the laser distance data of the corresponding angle respectively;
the length comparison module is used for marking data points corresponding to the laser distance data with the difference value larger than the preset distance threshold value and judging whether the length of each continuously marked data point is larger than the preset length or not;
the dynamic judgment module is used for judging that the object corresponding to each continuously marked data point is a dynamic object when the length of each continuously marked data point is greater than the preset length;
the device further comprises:
the length calculation module is configured to calculate, among the continuously marked data points, the segment length between each pair of adjacent data points according to the angles and laser distance data of the adjacent points, and to sum the segment lengths between all adjacent data points to obtain the length of the continuously marked data points;
or, the length calculating module is configured to select a first marked data point and a last marked data point from the continuously marked data points, calculate a segment length between the first marked data point and the last marked data point according to the angle and the laser distance data corresponding to the first marked data point and the last marked data point, and use the segment length as the length of each continuously marked data point;
or, the length calculating module is configured to select a first marked data point and a last marked data point from the continuously marked data points, calculate an arc length according to the radius and an angle corresponding to the first marked data point and the last marked data point by using the laser distance data of the first marked data point as a radius, and use the arc length as the length of each continuously marked data point.
6. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 4 when executing the computer program.
7. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 4.
CN201811064921.9A 2018-09-12 2018-09-12 Dynamic object detection method and device based on laser and terminal equipment Active CN109444905B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811064921.9A CN109444905B (en) 2018-09-12 2018-09-12 Dynamic object detection method and device based on laser and terminal equipment

Publications (2)

Publication Number Publication Date
CN109444905A CN109444905A (en) 2019-03-08
CN109444905B true CN109444905B (en) 2020-08-25

Family

ID=65532798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811064921.9A Active CN109444905B (en) 2018-09-12 2018-09-12 Dynamic object detection method and device based on laser and terminal equipment

Country Status (1)

Country Link
CN (1) CN109444905B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112629470A (en) * 2020-12-22 2021-04-09 石家庄市科恒电子有限公司 Depth measurement method and depth measurement system
CN113156453A (en) * 2021-04-09 2021-07-23 武汉联一合立技术有限公司 Moving object detection method, apparatus, device and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101957197A (en) * 2009-07-17 2011-01-26 株式会社拓普康 Location measurement method and position measuring instrument
CN103780837A (en) * 2014-01-02 2014-05-07 中安消技术有限公司 Motion detection and positioning photography method and device thereof
CN104318561A (en) * 2014-10-22 2015-01-28 上海理工大学 Method for detecting vehicle motion information based on integration of binocular stereoscopic vision and optical flow
CN106226777A (en) * 2016-08-17 2016-12-14 乐视控股(北京)有限公司 Infrared acquisition localization method and system
CN106529528A (en) * 2016-09-30 2017-03-22 浙江宇视科技有限公司 Method and equipment for identifying effective moving target
CN106530328A (en) * 2016-11-04 2017-03-22 深圳维周机器人科技有限公司 Method for detecting and smoothly following moving object based on video images
CN107798700A (en) * 2017-09-27 2018-03-13 歌尔科技有限公司 Determination method and device, projecting apparatus, the optical projection system of user's finger positional information
CN108363060A (en) * 2018-01-19 2018-08-03 上海思岚科技有限公司 A kind of dynamic disorder object detecting method and equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105157583B (en) * 2015-09-16 2018-06-29 北京新联铁集团股份有限公司 A kind of axle journal length measuring system
CN106323182A (en) * 2016-08-08 2017-01-11 北京农业信息技术研究中心 Plant stalk diameter measuring method and device
CN108196468A (en) * 2018-03-26 2018-06-22 京东方科技集团股份有限公司 Intelligent home furnishing control method and intelligent domestic system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant