CN113446971B - Space recognition method, electronic device and non-transitory computer readable storage medium - Google Patents

Publication number
CN113446971B
CN113446971B (application CN202110312927.9A)
Authority
CN
China
Prior art keywords
electronic device
boundary line
obstacle
boundary
processor
Prior art date
Legal status
Active
Application number
CN202110312927.9A
Other languages
Chinese (zh)
Other versions
CN113446971A (en)
Inventor
房育维
陈水石
郭家瑞
Current Assignee
Ali Corp
Original Assignee
Ali Corp
Priority date
Filing date
Publication date
Priority claimed from US 17/180,684 (external priority patent US11875572B2)
Application filed by Ali Corp filed Critical Ali Corp
Publication of CN113446971A
Application granted
Publication of CN113446971B

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a space recognition method, an electronic device, and a non-transitory computer readable storage medium. The method includes the following steps. Sensor data for detecting obstacle positions is obtained from a sensor associated with the electronic device. A plurality of coordinates respectively corresponding to the obstacle positions are generated based on the sensor data. Boundary line information of a space around the electronic device is updated according to the coordinates until an optimization condition of each boundary line is satisfied. A spatial range of the space around the electronic device is identified based on the boundary line information. The movement of the electronic device is guided by using the spatial range.

Description

Space recognition method, electronic device and non-transitory computer readable storage medium
Technical Field
The present invention relates to an environmental information recognition technology, and more particularly, to a spatial recognition method, an electronic device and a non-transitory computer readable storage medium.
Background
With advances in technology, intelligent mobile devices such as automatic sweeping robots, navigation robots, industrial robots, and unmanned vehicles have become increasingly common in recent years. Depending on the application, these smart mobile devices may move by themselves in a workspace without manual assistance. To do so, a smart mobile device requires environmental information of the workspace, and this information directly influences both the behavioral decisions of the smart mobile device and its working efficiency. Typically, the smart mobile device holds a map of the workspace or a spatial range of the workspace, and moves in the workspace by using such information. The map may include boundary information of the workspace, obstacle positions, movable ranges, and the like. A map of the workspace may be obtained through interaction with a person, pre-stored in the smart mobile device, or automatically constructed by the smart mobile device performing a specific fixed movement behavior. However, since the map, which generally has a huge data amount, must be stored in the smart mobile device, there is a high demand on the memory that stores the map. In addition, when the smart mobile device performs a specific fixed movement behavior, such as moving along a wall, to identify the spatial range of the workspace, some fragmented areas with small area may be generated due to the limitations of that fixed movement behavior. The presence of fragmented areas may result in poor operating efficiency of the smart mobile device.
Disclosure of Invention
In view of this, the present disclosure provides a space identification method, an electronic device and a non-transitory computer readable storage medium, which can identify a complete space range of a space around the electronic device without constructing a map.
Embodiments of the present disclosure provide a spatial recognition method, which includes the following steps. Sensor data for detecting obstacle positions is obtained from a sensor associated with an electronic device. A plurality of coordinates respectively corresponding to the obstacle positions are generated based on the sensor data. Boundary line information of a space around the electronic device is updated according to the coordinates until an optimization condition of each boundary line is satisfied. A spatial range of the space around the electronic device is identified based on the boundary line information. The movement of the electronic device is guided by using the spatial range.
Embodiments of the present disclosure provide an electronic device that includes one or more sensors, a memory, and a processor. The one or more sensors are disposed on the electronic device and configured to acquire sensor data for detecting obstacle positions. The memory records a plurality of instructions. The processor is coupled to the sensor and the memory, and executes the instructions to perform the following steps. Sensor data is obtained from the sensor. A plurality of coordinates respectively corresponding to the obstacle positions are generated based on the sensor data. Boundary line information of a space around the electronic device is updated according to the coordinates until an optimization condition of each boundary line is satisfied. A spatial range of the space around the electronic device is identified based on the boundary line information. The movement of the electronic device is guided by using the spatial range.
Embodiments of the present disclosure provide a non-transitory computer-readable storage medium recording a set of instructions for execution by one or more processors of a device to cause the device to perform a spatial recognition method. The spatial recognition method includes the following steps. Sensor data for detecting obstacle positions is obtained from a sensor associated with an electronic device. A plurality of coordinates respectively corresponding to the obstacle positions are generated based on the sensor data. Boundary line information of a space around the electronic device is updated according to the coordinates until an optimization condition of each boundary line is satisfied. A spatial range of the space around the electronic device is identified based on the boundary line information. The movement of the electronic device is guided by using the spatial range.
Based on the above, in the embodiments of the disclosure, when the electronic device moves to a current position, the electronic device may acquire a plurality of coordinates corresponding to the obstacle positions by using the sensor data, and these coordinates are used to determine boundary line information of the space around the electronic device. The boundary line information may then be updated based on further sensor data. In response to the optimization condition of each boundary line being satisfied, the electronic device ceases optimizing the boundary line information. Therefore, without storing or constructing a map having a huge amount of data, a spatial range that guides the movement of the electronic device can be identified from the boundary line information. In addition, the probability that fragmented areas occur can be significantly reduced, which improves the working efficiency of the smart mobile device.
In order to make the above features and advantages of the present invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
FIG. 1 is a block diagram of an electronic device according to an embodiment of the disclosure;
FIG. 2 is a flow chart of a method of spatial identification according to an embodiment of the disclosure;
FIG. 3 is a flow chart of a method of spatial identification according to an embodiment of the disclosure;
FIGS. 4A to 4C are schematic diagrams illustrating optimization of boundary line information to obtain a spatial range according to an embodiment of the disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments of the present invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
FIG. 1 is a block diagram of an electronic device according to an embodiment of the disclosure. All components of the electronic device and their configuration are first described in fig. 1. The functions of the above components are disclosed in more detail together with fig. 2.
Referring to fig. 1, an electronic device 10 includes one or more sensors 110, a memory 120, and a processor 130. In various embodiments, the electronic device 10 may be any self-propelled apparatus that moves over the floor of a work space without manual intervention. For example, the electronic device 10 may be an autonomous mobile robot, an unmanned ground vehicle, an automated guided vehicle, a sweeping robot (robotic vacuum cleaner), or another similar mobile electronic device, but the disclosure is not limited thereto. In some embodiments, the electronic device 10 may make behavioral decisions based on the environmental information to complete its tasks.
The sensor 110 is disposed on the electronic device 10 and is used to acquire sensor data for detecting obstacle positions. The sensor 110 may sense environmental information of the space surrounding the electronic device 10 by different sensing means to provide sensor data to the processor 130. In some embodiments, the sensor 110 may be, for example, a radar, an acoustic sensing device, a LiDAR using optical ranging, a depth-of-field camera, or an image capture device, among others, although the disclosure is not limited in this regard. In some embodiments, the sensor 110 may detect distance information between obstacles in the space in which the electronic device 10 is located and the electronic device 10. For example, the sensor 110 may be a LiDAR implemented as a 360-degree two-dimensional laser range finder, and the sensor 110 may detect obstacle distances at a plurality of sensing angles. Alternatively, the sensor 110 may be a depth-of-field camera or any other image capturing device having at least two lenses and at least two image sensors, and the sensor 110 may detect depth information about a plurality of image feature points.
The memory 120 is used for storing files, images, instructions, program codes, software components, etc., and may be, for example, any type of fixed or removable random access memory (random access memory, RAM), read-only memory (ROM), flash memory (flash memory), hard disk or other similar devices, integrated circuits, or combinations thereof.
The processor 130 is coupled to the sensor 110 and the memory 120, and controls the operation of the components of the electronic device 10. The processor 130 may be, for example, a central processing unit (Central Processing Unit, CPU), another general-purpose or special-purpose microprocessor (Microprocessor), a digital signal processor (Digital Signal Processor, DSP), a programmable controller, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic device (Programmable Logic Device, PLD), or other similar devices or a combination thereof.
However, in addition to the sensor 110, the memory 120, and the processor 130, the electronic device 10 may further include components not shown in fig. 1, such as an odometer, a display, a gyroscope, a power supply device, a motor, etc., which are not limited in this disclosure.
In an embodiment of the present disclosure, processor 130 is configured to access and execute modules, sets of instructions, and/or program code stored in memory 120 to implement the spatial identification methods provided by the present disclosure, as will be further described below.
Referring to fig. 2, fig. 2 is a flowchart illustrating a spatial recognition method according to an embodiment of the disclosure. Referring to fig. 1 and fig. 2, the manner of the present embodiment is applicable to the electronic device 10 in the above embodiment, and the detailed steps of the spatial recognition method of the present embodiment are described below with respect to each component in the electronic device 10.
In step S210, the processor 130 may obtain sensor data for detecting obstacle positions from the sensor 110 associated with the electronic device 10. In some embodiments, the sensor data may include data reflecting a plurality of obstacle distances at a plurality of sensing angles. For example, the sensing angles may be 1 degree, 2 degrees, and so on up to 360 degrees. However, the present disclosure does not limit the accuracy and number of the sensing angles, which may be configured according to actual needs. In some embodiments, the sensor data may include depth information about a plurality of image feature points. For example, the sensor data may be a depth map generated from at least two images captured from different perspectives. In some embodiments, an obstacle position is the location of an obstacle located in the same space as the electronic device 10. An obstacle, such as a wall or furniture, obstructs the movement of the electronic device 10. The sensor 110 may detect distance information between the position of the electronic device 10 and the obstacle positions.
In step S220, the processor 130 may generate a plurality of coordinates respectively corresponding to the obstacle positions based on the sensor data. In detail, since distance information between the position of the electronic device 10 and the obstacle positions may be detected by the sensor 110, the processor 130 may determine 2D coordinates respectively corresponding to the obstacle positions based on the position of the electronic device 10 and the sensor data provided by the sensor 110. Specifically, each 2D coordinate corresponding to one of the obstacle positions may include an X-axis coordinate component and a Y-axis coordinate component, and the coordinates respectively corresponding to the obstacle positions may be generated with respect to the origin of a rectangular coordinate system (Cartesian coordinate system). It should be noted that the obstacle positions detected by the sensor 110 may vary, since the electronic device 10 may move to different locations. That is, the coordinates of the obstacle positions detected at different locations vary as the electronic device 10 moves from location to location.
In some embodiments, processor 130 may calculate a plurality of coordinates corresponding to the obstacle locations, respectively, from the obstacle distance at the sensing angle and the current location of electronic device 10. In detail, the sensor 110 may provide sensor data, which may be data reflecting a plurality of obstacle distances at a plurality of sensing angles. In some embodiments, the sensor 110 may emit light beams directed at a plurality of sensing angles, respectively, and data reflecting a plurality of obstacle distances at the plurality of sensing angles may be estimated from the reflected light beams received by the sensor 110. Based on this, the processor 130 may calculate a plurality of coordinates corresponding to the obstacle positions, respectively, from the obstacle distance at the sensing angle and the current position of the electronic device 10.
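The angle-and-distance conversion described above can be sketched as follows. This is a minimal illustration of converting polar sensor readings into 2D coordinates, not the patented implementation; the function and parameter names (`obstacle_coordinates`, `distances_by_angle`) are hypothetical.

```python
import math

def obstacle_coordinates(current_pos, distances_by_angle):
    """Convert LiDAR-style (sensing angle, obstacle distance) readings into
    2D obstacle coordinates in the work-space coordinate system.

    current_pos        -- (x, y) of the electronic device
    distances_by_angle -- dict mapping sensing angle (degrees) to distance
    """
    cx, cy = current_pos
    coords = []
    for angle_deg, dist in distances_by_angle.items():
        theta = math.radians(angle_deg)
        # Offset the device's current position by the polar reading.
        coords.append((cx + dist * math.cos(theta),
                       cy + dist * math.sin(theta)))
    return coords

# Device at (1.0, 2.0); obstacles seen at 0 and 90 degrees, both 3 units away.
print(obstacle_coordinates((1.0, 2.0), {0: 3.0, 90: 3.0}))
# approximately [(4.0, 2.0), (1.0, 5.0)], up to floating-point rounding
```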
In some embodiments, processor 130 may calculate a plurality of coordinates corresponding to the obstacle positions, respectively, from the depth information about the image feature points and the current position of electronic device 10. In detail, the sensor 110 may provide sensor data, which may be depth information about a plurality of image feature points. In some embodiments, the sensor 110 may estimate depth information of image feature points by capturing images from at least two perspectives and using a depth estimation algorithm. For example, the processor 130 may obtain depth information by a visual simultaneous localization and mapping (v-SLAM) technique to calculate a plurality of coordinates respectively corresponding to the obstacle positions. In this way, the processor 130 may calculate a plurality of coordinates corresponding to the obstacle positions, respectively, from the depth information and the current position of the electronic device 10.
In step S230, the processor 130 may update boundary line information of the space around the electronic device 10 according to the coordinates until an optimization condition of each boundary line is satisfied. In detail, after the electronic device 10 at the current position calculates coordinates respectively corresponding to the obstacle positions, the processor 130 may update boundary line information of the space by using the coordinates associated with the current position. Here, the boundary line information of the space may be updated by using a local extremum in the coordinate component of the coordinates associated with the current position. When the optimization condition for each boundary line is satisfied, the processor 130 may stop updating the boundary line information of the space. When the optimization condition for each boundary line has not been satisfied, the processor 130 may continue to update boundary line information for the space.
In some embodiments, the processor 130 may continue to collect boundary line information for a space until an optimization condition for each boundary line is satisfied, where the boundary line information for the space may include a plurality of possible locations for a plurality of boundary lines. For example, boundary line information collected at two different locations may be represented as table 1.
TABLE 1

Position of electronic device | Upper boundary line | Lower boundary line | Right boundary line | Left boundary line
Previous position             | y_max1              | y_min1              | x_max1              | x_min1
Current position              | y_max2              | y_min2              | x_max2              | x_min2
In some embodiments, the processor 130 may update the positions of the temporary boundary lines by comparing the temporary boundary lines with extrema in the coordinates respectively corresponding to the obstacle positions, until the optimization condition of each boundary line is satisfied. For example, in a rectangular coordinate system, the four temporary boundary lines in the boundary line information may be represented as x=a1, x=a2, y=b1, and y=b2, where a1≠a2 and b1≠b2. When the processor 130 obtains a coordinate (a3, b3) corresponding to one obstacle position, if a3 is greater than a1 and a1 is greater than a2, the boundary line x=a1 may be updated to x=a3.
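The extremum comparison in the example above can be sketched as the following update loop. This is a minimal sketch assuming the four boundary lines are kept as a dictionary of scalar positions; the names `bounds` and `update_boundaries` are illustrative, not from the patent.

```python
def update_boundaries(bounds, coords):
    """Widen the four temporary boundary lines so every newly observed
    obstacle coordinate lies inside them.

    bounds -- dict with keys 'x_min', 'x_max', 'y_min', 'y_max'
    coords -- iterable of (x, y) obstacle coordinates at the current position
    """
    for x, y in coords:
        bounds['x_max'] = max(bounds['x_max'], x)  # right boundary line
        bounds['x_min'] = min(bounds['x_min'], x)  # left boundary line
        bounds['y_max'] = max(bounds['y_max'], y)  # upper boundary line
        bounds['y_min'] = min(bounds['y_min'], y)  # lower boundary line
    return bounds

b = {'x_min': -1.0, 'x_max': 1.0, 'y_min': -1.0, 'y_max': 1.0}
update_boundaries(b, [(2.5, 0.0)])  # a3 = 2.5 exceeds the right boundary a1 = 1.0
print(b['x_max'])                   # prints 2.5
```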
In step S240, the processor 130 may identify a spatial range of the space around the electronic device 10 based on the boundary line information. That is, after the optimization condition of each boundary line is satisfied, the processor 130 may identify the spatial range of the space according to the coordinate information of the boundary line. In some embodiments, the processor 130 may identify the quadrilateral region as a spatial range based on the boundary line information.
In some embodiments, the processor 130 may obtain the first maximum value and the first minimum value of the first axis (i.e., the X-axis) based on the boundary line information. The processor 130 may obtain a second maximum value and a second minimum value of the second axis (i.e., the Y-axis) based on the boundary line information. For example, the processor 130 may obtain the first maximum value "Amax" of the X-axis from boundary line information including a plurality of possible positions of the right boundary line. The processor 130 may obtain a first minimum value "Amin" of the X-axis from boundary line information including a plurality of possible positions of the left boundary line. The processor 130 may obtain the second minimum value "Bmin" of the Y-axis from boundary line information including a plurality of possible positions of the lower boundary line. The processor 130 may obtain the second maximum value "Bmax" of the Y-axis from boundary line information including a plurality of possible positions of the upper boundary line. In this way, the spatial range of the space can be identified based on four boundary lines whose four linear equations are x=amax, x=amin, y=bmax, and y=bmin, respectively.
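Once the four final boundary lines x=Amax, x=Amin, y=Bmax, and y=Bmin are fixed, the quadrilateral spatial range follows directly. A hypothetical sketch (the `spatial_range` helper and the dictionary layout are assumptions of this illustration):

```python
def spatial_range(bounds):
    """Return the corners of the quadrilateral spatial range bounded by
    x = Amin, x = Amax, y = Bmin, y = Bmax, in counter-clockwise order."""
    a_min, a_max = bounds['x_min'], bounds['x_max']
    b_min, b_max = bounds['y_min'], bounds['y_max']
    return [(a_min, b_min), (a_max, b_min), (a_max, b_max), (a_min, b_max)]

corners = spatial_range({'x_min': 0.0, 'x_max': 4.0, 'y_min': 0.0, 'y_max': 3.0})
print(corners)  # prints [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]
```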
In some embodiments, the boundary line information may include a most recent position for each boundary line. Thus, after the optimization condition of each boundary line is satisfied, the processor 130 may identify a spatial range of the space using the final position of each boundary line.
In step S250, the processor 130 may use the spatial range to guide the movement of the electronic device 10. In some embodiments, the processor 130 determines the path of movement of the electronic device 10 from the spatial range. For example, the cleaning path of the sweeping robot may be determined according to the spatial range.
Referring to fig. 3, fig. 3 is a flowchart illustrating a spatial recognition method according to an embodiment of the disclosure. Referring to fig. 1 and 3, the manner of the present embodiment is applicable to the electronic device 10 in the above embodiment, and the detailed steps of the spatial recognition method of the present embodiment are described below with respect to each component in the electronic device 10.
In step S310, the processor 130 may acquire the current location of the electronic device 10. In some embodiments, the processor 130 may acquire displacement data of the electronic device 10 by using an odometer to obtain displacement amounts along the X-axis and the Y-axis. The processor 130 may then identify the current location of the electronic device 10 based on a reference location and the displacement data. The reference location may be the origin of the coordinate system or a previous location of the electronic device 10. In other embodiments, the processor 130 may obtain the current location of the electronic device 10 by other localization means, which is not limited in this disclosure.
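The odometer-based localization in step S310 amounts to dead reckoning from a reference location. A minimal sketch, assuming the displacement is already resolved into X-axis and Y-axis components (the `current_position` name is hypothetical):

```python
def current_position(reference_pos, displacement):
    """Dead-reckon the device's current position from a reference location
    (the coordinate origin or the previous position) plus the odometer's
    accumulated displacement along the X and Y axes."""
    rx, ry = reference_pos
    dx, dy = displacement
    return (rx + dx, ry + dy)

# From the previous position (2.0, 1.0) with displacement (0.5, -1.5):
print(current_position((2.0, 1.0), (0.5, -1.5)))  # prints (2.5, -0.5)
```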
In step S320, the processor 130 may obtain sensor data for detecting the position of the obstacle from the sensor 110 associated with the electronic device 10. In step S330, the processor 130 may generate a plurality of coordinates respectively corresponding to the obstacle positions based on the sensor data. The operations and details of step S320 and step S330 are similar to those of step S210 and step S220, and thus will not be described again.
In step S340, the processor 130 may update boundary line information of the space around the electronic device 10 according to the coordinates until the optimization condition of each boundary line is satisfied. In the present embodiment, step S340 may be implemented according to steps S341 to S342.
In step S341, the processor 130 may determine whether the current local extremum in the obstacle position is less than the minimum boundary parameter of the temporary boundary line or greater than the maximum boundary parameter of the temporary boundary line. In detail, the electronic device 10 can be moved from the previous position to the current position, and the boundary line information is updated when the electronic device 10 is located at the current position, and the temporary boundary line is determined when the electronic device 10 is located at the previous position. More specifically, the processor 130 may obtain a plurality of coordinates respectively corresponding to the obstacle positions, and the processor 130 may obtain the current local extremum in four axis directions (such as a negative X-axis direction, a positive X-axis direction, a negative Y-axis direction, a positive Y-axis direction) from the plurality of coordinates respectively corresponding to the obstacle positions. The processor 130 may then compare the current local extremum in the four axis directions to the position of the temporary boundary line, respectively.
If the determination in step S341 is yes, in step S342, the processor 130 may use the current local extremum to move the temporary boundary line and obtain the updated boundary line.
For example, in a rectangular coordinate system, the four temporary boundary lines of the boundary line information may be represented as x=a1, x=a2, y=b1, and y=b2, where a1≠a2 and b1≠b2. The processor 130 may obtain the current local extrema x=a3, x=a4, y=b3, and y=b4 in the four axis directions from the coordinates respectively corresponding to the obstacle positions. If the processor 130 determines that the current local extremum "a3" is greater than a1 and a1 is greater than a2, the processor 130 may move the temporary boundary line x=a1 toward the current local extremum "a3" to obtain an updated boundary line x=a3. If the processor 130 determines that the current local extremum "b3" is less than b1 and b1 is less than b2, the processor 130 may move the temporary boundary line y=b1 toward the current local extremum "b3" to obtain an updated boundary line y=b3.
In other embodiments, as described above, the processor 130 may update boundary line information of the space around the electronic device 10 by adding current local extrema corresponding to the four axis directions to the boundary line information of the space.
In step S350, the processor 130 may determine whether the optimization condition of each boundary line is satisfied. If the determination in step S350 is negative, the processor 130 may control the electronic device 10 to move to the next position, and the processor 130 repeats steps S310 to S340. In some embodiments, the electronic device 10 may move randomly. In some embodiments, if the optimization condition of one of the boundary lines is satisfied, the electronic device 10 may be controlled by the processor 130 to move toward another boundary line. For example, if the processor 130 stops updating the boundary line information associated with the left boundary line, the processor 130 may control the electronic device 10 to move toward the right boundary line, the lower boundary line, or the upper boundary line. Thus, in the present embodiment, the four boundary lines may move along the negative X-axis direction, the positive X-axis direction, the negative Y-axis direction, and the positive Y-axis direction, respectively, while the electronic device 10 continuously moves from one position to another. That is, when the optimization condition of each boundary line is not yet satisfied, the processor 130 may control the electronic device 10 to move to the next position, and the moving direction of the electronic device 10 may be determined according to the determination result of step S350.
It should be noted that in some embodiments, the optimization condition may be determined to be satisfied when the distance between the boundary line and the current position of the electronic device 10 is less than a distance threshold. That is, if the electronic device 10 is sufficiently close to a boundary line, the processor 130 may cease updating boundary line information associated with the boundary line that the electronic device 10 is approaching.
In some embodiments, the sensor data for detecting the obstacle positions includes data reflecting obstacle distances at the sensing angles. When the difference in obstacle distance between two adjacent sensing angles is smaller than a boundary continuity threshold, it can be determined that the optimization condition of a boundary line is satisfied. Specifically, a sensor 110 such as a LiDAR may detect a plurality of obstacle distances respectively corresponding to a plurality of sensing angles. If the difference in obstacle distance between any two adjacent sensing angles is less than the boundary continuity threshold, the processor 130 may cease updating the boundary line information associated with the boundary line that the electronic device 10 is approaching.
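The two stopping criteria described in the preceding paragraphs (distance to a boundary line below a threshold, and obstacle-distance continuity across adjacent sensing angles) might be checked as below. How the criteria are combined is left open by the description, so this sketch checks them independently; all names are illustrative.

```python
def close_to_boundary(device_x, boundary_x, distance_threshold):
    """Criterion 1: the device is sufficiently close to a vertical
    boundary line x = boundary_x."""
    return abs(device_x - boundary_x) < distance_threshold

def boundary_is_continuous(distances_by_angle, continuity_threshold):
    """Criterion 2: obstacle distances at every pair of adjacent sensing
    angles differ by less than a boundary-continuity threshold, i.e. the
    detected contour is smooth and wall-like."""
    angles = sorted(distances_by_angle)
    return all(
        abs(distances_by_angle[a] - distances_by_angle[b]) < continuity_threshold
        for a, b in zip(angles, angles[1:])
    )

print(close_to_boundary(0.9, 1.0, 0.2))                        # prints True
print(boundary_is_continuous({0: 5.0, 1: 5.05, 2: 5.1}, 0.1))  # prints True
print(boundary_is_continuous({0: 5.0, 1: 7.0}, 0.5))           # prints False
```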
If the determination in step S350 is affirmative, in step S360, the processor 130 may identify the spatial range of the space around the electronic device 10 based on the boundary line information. In the present embodiment, after the processor 130 stops updating the positions of the boundary lines, the processor 130 may identify the spatial range of the space from the final boundary lines. In step S370, the processor 130 may use the spatial range to guide the movement of the electronic device 10.
FIGS. 4A to 4C are schematic diagrams illustrating optimization of boundary line information to obtain a spatial range according to an embodiment of the disclosure. In the example of FIGS. 4A to 4C, it is assumed that the sensor 110 is a LiDAR and that the sensor data is data reflecting obstacle distances at the sensing angles.
Referring to fig. 4A, at a time point T1, the electronic device 10 is located at a position P1. The processor 130 of the electronic device 10 may obtain, from the LiDAR, sensor data 40 for detecting obstacle positions. The sensor data 40 reflects the obstacle distance at each sensing angle when the electronic device 10 is located at the position P1. The processor 130 may generate a plurality of coordinates respectively corresponding to the obstacle positions based on the sensor data 40, and each of the coordinates includes an X-axis coordinate component and a Y-axis coordinate component. The processor 130 may then obtain four current local extrema from the X-axis coordinate components and the Y-axis coordinate components of the coordinates corresponding to the obstacle positions: two maxima and two minima, one of each per axis. In FIG. 4A, the current local extrema may be represented as { b, d, a, c }, and the four boundary lines may be denoted as x = b, x = d, y = a and y = c. The boundary line information of the space can then be represented as Table 2 below.
TABLE 2
Position of electronic device | Upper boundary line | Lower boundary line | Right boundary line | Left boundary line
P1                            | c                   | a                   | d                   | b
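The coordinate-generation and local-extrema step above can be sketched in Python. This is a minimal illustrative sketch, not the patented implementation; the function names and the sample readings are assumptions, and the readings are taken as (angle, distance) pairs in the world frame:

```python
import math

def obstacle_coordinates(position, readings):
    """Convert (angle, distance) LiDAR readings taken at `position`
    into (x, y) obstacle coordinates."""
    px, py = position
    return [(px + d * math.cos(a), py + d * math.sin(a)) for a, d in readings]

def local_extrema(coords):
    """Return the four current local extrema (x_min, x_max, y_min, y_max),
    which define the boundary lines x = x_min, x = x_max, y = y_min, y = y_max."""
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return min(xs), max(xs), min(ys), max(ys)

# Four readings at angles 0, 90, 180 and 270 degrees from the origin:
readings = [(0.0, 2.0), (math.pi / 2, 1.0), (math.pi, 3.0), (3 * math.pi / 2, 1.5)]
coords = obstacle_coordinates((0.0, 0.0), readings)
print(local_extrema(coords))   # (-3.0, 2.0, -1.5, 1.0)
```

In the notation of Fig. 4A, the tuple returned by `local_extrema` corresponds to { b, d, a, c }.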
Referring to fig. 4B, it is assumed that the optimization condition of each boundary line is not yet satisfied. At a time point T2, the processor 130 may control the electronic device 10 to move to the next location, position P2. The sensor data 41 reflects the obstacle distance at each sensing angle when the electronic device 10 is located at the position P2. The processor 130 may generate a plurality of coordinates respectively corresponding to the obstacle positions based on the sensor data 41, each of the coordinates including an X-axis coordinate component and a Y-axis coordinate component. The processor 130 may then obtain four current local extrema from the X-axis coordinate components and the Y-axis coordinate components of those coordinates. In FIG. 4B, the current local extrema may be represented as { g, e, h, f }, and the four boundary lines may be denoted as x = g, x = e, y = h and y = f. The boundary line information of the space can therefore be updated as shown in Table 3.
TABLE 3
Position of electronic device | Upper boundary line | Lower boundary line | Right boundary line | Left boundary line
P1                            | c                   | a                   | d                   | b
P2                            | f                   | h                   | e                   | g
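The temporary-boundary update described in the embodiments (moving a boundary line when a current local extremum falls outside it) can be sketched as follows. This is an illustrative sketch under assumed names and sample values, not the patented implementation:

```python
def update_boundary(temp_min, temp_max, current_extrema):
    """Move a temporary boundary line outward when a current local extremum
    lies outside it: the minimum boundary parameter shrinks and the maximum
    boundary parameter grows as new obstacle positions are observed."""
    lo, hi = current_extrema
    if lo < temp_min:
        temp_min = lo          # move the left/lower boundary line outward
    if hi > temp_max:
        temp_max = hi          # move the right/upper boundary line outward
    return temp_min, temp_max

# X-axis boundary lines from Fig. 4A were x = b and x = d; at P2 the new
# extrema are g and e.  With b < g and e > d, only the maximum moves:
print(update_boundary(-2.0, 4.0, (-1.5, 5.0)))   # (-2.0, 5.0)
```

The same update is applied independently to the X-axis and Y-axis boundary pairs, so each of the four boundary lines only ever moves away from the device's starting region.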
Referring to fig. 4C, it is assumed that the optimization condition of each boundary line is satisfied. The processor 130 may then identify the spatial range of the space around the electronic device 10 based on the boundary line information. In the current example, the processor 130 may obtain the maximum value on the X-axis from the boundary line information { d, e }, the minimum value on the X-axis from the boundary line information { b, g }, the minimum value on the Y-axis from the boundary line information { a, h }, and the maximum value on the Y-axis from the boundary line information { c, f }. Assuming a = h = i, c = f = j, b < g and e > d, the spatial range of the space can be identified as a quadrilateral region with four corner points (b, j), (e, j), (b, i) and (e, i).
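The final combination of per-position boundary records into a quadrilateral spatial range can be sketched as follows. This is a minimal sketch with assumed names and sample values corresponding loosely to Fig. 4A-4C, not the patented implementation:

```python
def spatial_range(boundary_records):
    """Given per-position records {position: (upper, lower, right, left)}
    of the boundary parameters seen at each position, combine them into the
    final quadrilateral spatial range by taking the outermost value per side."""
    uppers, lowers, rights, lefts = zip(*boundary_records.values())
    j, i = max(uppers), min(lowers)      # final upper / lower boundary (Y-axis)
    e, b = max(rights), min(lefts)       # final right / left boundary (X-axis)
    # Corner points (b, j), (e, j), (b, i), (e, i) of the quadrilateral region:
    return [(b, j), (e, j), (b, i), (e, i)]

records = {'P1': (3.0, -1.0, 4.0, -2.0),   # c, a, d, b from Fig. 4A
           'P2': (3.0, -1.0, 5.0, -1.5)}   # f, h, e, g from Fig. 4B
print(spatial_range(records))  # [(-2.0, 3.0), (5.0, 3.0), (-2.0, -1.0), (5.0, -1.0)]
```

With the sample values, c = f = 3.0 and a = h = -1.0 as in the embodiment, while b < g and e > d, so the left and right boundaries come from different positions.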
The present disclosure further provides a non-transitory computer-readable storage medium. The computer-readable storage medium may store a plurality of program code sections. After the processor 130 of the electronic device 10 loads and executes the program code sections, the steps of the spatial recognition method described above can be completed.
In summary, in the embodiments of the invention, by using the sensor data, the electronic device can identify the spatial range of the working space in which it is located without constructing a map of the working space. Therefore, the electronic device does not need storage space for recording a map with a huge amount of data. In addition, the electronic device can identify the spatial range of the working space without performing any specific fixed movement behavior, so the possibility of fragmented areas occurring is significantly reduced. Thus, the efficiency of identifying the spatial range of the working space, or of performing other tasks based on that spatial range, can be improved.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solutions of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and that such modifications and substitutions do not depart from the spirit of the invention.

Claims (18)

1. A method of spatial recognition, comprising:
obtaining sensor data for detecting an obstacle location from a sensor associated with an electronic device, wherein the electronic device is a self-propelled apparatus;
generating a plurality of coordinates respectively corresponding to the obstacle positions based on the sensor data;
updating boundary line information of a plurality of boundary lines of a space around the electronic device according to the coordinates, and determining whether an optimization condition of each boundary line is satisfied, until the optimization condition of each boundary line is satisfied;
stopping updating the boundary line information of the space when the optimization condition of each boundary line is satisfied, and identifying a space range of the space around the electronic device based on the boundary line information; and
the spatial extent is utilized to guide the movement of the electronic device,
wherein the optimization condition is determined to be satisfied when a distance between a boundary line and a current position of the electronic device is less than a distance threshold.
2. The spatial recognition method according to claim 1, wherein the sensor data for detecting the obstacle position includes data reflecting an obstacle distance at a sensing angle, and the optimization condition of each boundary line is determined to be satisfied when a difference in obstacle distances relative to two adjacent sensing angles is smaller than a boundary continuity threshold.
3. The spatial identification method as set forth in claim 1, further comprising:
acquiring displacement data of the electronic device; and
identifying a current position of the electronic device according to a reference position and the displacement data.
4. The spatial recognition method according to claim 1, wherein the step of updating the boundary line information includes:
judging whether the current local extremum in the obstacle position is smaller than the minimum boundary parameter of a temporary boundary line or larger than the maximum boundary parameter of the temporary boundary line;
when the current local extremum is less than the minimum boundary parameter or greater than the maximum boundary parameter, the temporary boundary line is moved using the current local extremum to obtain an updated boundary line,
wherein the electronic device moves from a previous location to a current location, the boundary line information is updated when the electronic device is located at the current location, and the temporary boundary line is determined when the electronic device is located at the previous location.
5. The spatial identification method according to claim 1, wherein the step of identifying the spatial range comprises:
acquiring a first maximum value and a first minimum value of a first axis based on the boundary line information; and
acquiring a second maximum value and a second minimum value of a second axis based on the boundary line information.
6. The method of spatial identification of claim 1, wherein the sensor data comprises data reflecting a plurality of obstacle distances at a plurality of sensing angles, and the step of generating the coordinates comprises:
calculating the coordinates respectively corresponding to the obstacle positions according to the obstacle distances at the sensing angles and a current position of the electronic device.
7. The spatial recognition method according to claim 1, wherein the sensor data includes depth information about a plurality of image feature points, and the step of generating the coordinates includes:
calculating the coordinates corresponding to the obstacle positions according to the depth information about the image feature points and a current position of the electronic device.
8. The spatial identification method according to claim 1, wherein the step of identifying the spatial range comprises:
identifying a quadrilateral region as the spatial range based on the boundary line information.
9. An electronic device, comprising:
one or more sensors arranged on the electronic device and used for acquiring sensor data for detecting the position of the obstacle, wherein the electronic device is self-propelled equipment;
a memory that records a plurality of instructions;
a processor, coupled to the sensor and the memory, for executing the instructions to:
obtaining the sensor data from the sensor;
generating a plurality of coordinates respectively corresponding to the obstacle positions based on the sensor data;
updating boundary line information of a plurality of boundary lines of a space around the electronic device according to the coordinates, and determining whether an optimization condition of each boundary line is satisfied, until the optimization condition of each boundary line is satisfied;
stopping updating the boundary line information of the space when the optimization condition of each boundary line is satisfied, and identifying a space range of the space around the electronic device based on the boundary line information; and
the spatial extent is utilized to guide the movement of the electronic device,
wherein the optimization condition is determined to be satisfied when a distance between a boundary line and a current position of the electronic device is less than a distance threshold.
10. The electronic device according to claim 9, wherein the sensor data for detecting the obstacle position includes data reflecting an obstacle distance at a sensing angle, and the optimization condition of each boundary line is determined to be satisfied when a difference in obstacle distances relative to two adjacent sensing angles is smaller than a boundary continuity threshold.
11. The electronic device of claim 9, further comprising a positioning device coupled to the processor and configured to obtain displacement data of the electronic device, the processor further configured to:
acquiring the displacement data from the positioning device; and
identify a current position of the electronic device according to a reference position and the displacement data.
12. The electronic device of claim 9, wherein the processor is further configured to:
judging whether the current local extremum in the obstacle position is smaller than the minimum boundary parameter of a temporary boundary line or larger than the maximum boundary parameter of the temporary boundary line;
when the current local extremum is smaller than the minimum boundary parameter or larger than the maximum boundary parameter, the temporary boundary line is moved by using the current local extremum to obtain an updated boundary line,
wherein the electronic device moves from a previous location to a current location, the boundary line information is updated when the electronic device is located at the current location, and the temporary boundary line is determined when the electronic device is located at the previous location.
13. The electronic device of claim 9, wherein the processor is further configured to:
acquiring a first maximum value and a first minimum value of a first axis based on the boundary line information; and
acquire a second maximum value and a second minimum value of a second axis based on the boundary line information.
14. The electronic device of claim 9, wherein the sensor data comprises data reflecting a plurality of obstacle distances at a plurality of sensing angles, and the processor is further configured to:
calculate the coordinates respectively corresponding to the obstacle positions according to the obstacle distances at the sensing angles and a current position of the electronic device.
15. The electronic device of claim 9, wherein the sensor data includes depth information about a plurality of image feature points, and the processor is further configured to:
calculate the coordinates corresponding to the obstacle positions according to the depth information about the image feature points and a current position of the electronic device.
16. The electronic device of claim 9, wherein the spatial range is a quadrilateral region based on the boundary line information.
17. A non-transitory computer-readable storage medium, recording a set of instructions for execution by one or more processors of an apparatus to cause the apparatus to perform a spatial identification method, the spatial identification method comprising:
obtaining sensor data for detecting an obstacle location from a sensor associated with an electronic device, wherein the electronic device is a self-propelled apparatus;
generating a plurality of coordinates respectively corresponding to the obstacle positions based on the sensor data;
updating boundary line information of a plurality of boundary lines of a space around the electronic device according to the coordinates, and determining whether an optimization condition of each boundary line is satisfied, until the optimization condition of each boundary line is satisfied;
stopping updating the boundary line information of the space when the optimization condition of each boundary line is satisfied, and identifying a space range of the space around the electronic device based on the boundary line information; and
the spatial extent is utilized to guide the movement of the electronic device,
wherein the optimization condition is determined to be satisfied when a distance between a boundary line and a current position of the electronic device is less than a distance threshold.
18. The non-transitory computer-readable storage medium of claim 17, wherein the sensor data includes data reflecting an obstacle distance at a sensing angle, and the optimization condition of each boundary line is determined to be satisfied when a difference in obstacle distances relative to two adjacent sensing angles is less than a boundary continuity threshold.
CN202110312927.9A 2020-03-25 2021-03-24 Space recognition method, electronic device and non-transitory computer readable storage medium Active CN113446971B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202062994288P 2020-03-25 2020-03-25
US62/994,288 2020-03-25
US17/180,684 US11875572B2 (en) 2020-03-25 2021-02-19 Space recognition method, electronic device and non-transitory computer-readable storage medium
US17/180,684 2021-02-19

Publications (2)

Publication Number Publication Date
CN113446971A CN113446971A (en) 2021-09-28
CN113446971B true CN113446971B (en) 2023-08-08

Family

ID=77809411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110312927.9A Active CN113446971B (en) 2020-03-25 2021-03-24 Space recognition method, electronic device and non-transitory computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113446971B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114527880B (en) * 2022-02-25 2024-07-02 歌尔科技有限公司 Spatial position identification method, device, equipment and storage medium

Citations (12)

Publication number Priority date Publication date Assignee Title
CN101000507A (en) * 2006-09-29 2007-07-18 浙江大学 Method for moving robot simultanously positioning and map structuring at unknown environment
CN101093503A (en) * 2006-06-20 2007-12-26 三星电子株式会社 Method, apparatus, and medium for building grid map in mobile robot and method, apparatus, and medium for cell decomposition that uses grid map
CN106054882A (en) * 2016-06-15 2016-10-26 深圳市金佰科创发展有限公司 Robot obstacle avoidance method
CN106441275A (en) * 2016-09-23 2017-02-22 深圳大学 Method and device for updating planned path of robot
US9626874B1 (en) * 2016-01-06 2017-04-18 Qualcomm Incorporated Systems and methods for managing restricted areas for unmanned autonomous vehicles
CN106997721A (en) * 2017-04-17 2017-08-01 深圳奥比中光科技有限公司 Draw method, device and the storage device of 2D maps
CN108290294A (en) * 2015-11-26 2018-07-17 三星电子株式会社 Mobile robot and its control method
CN108628318A (en) * 2018-06-28 2018-10-09 广州视源电子科技股份有限公司 Congestion environment detection method and device, robot and storage medium
CN109464074A (en) * 2018-11-29 2019-03-15 深圳市银星智能科技股份有限公司 Area division method, subarea cleaning method and robot thereof
CN109813317A (en) * 2019-01-30 2019-05-28 京东方科技集团股份有限公司 A kind of barrier-avoiding method, electronic equipment and virtual reality device
JP2019145039A (en) * 2018-02-23 2019-08-29 Cyberdyne株式会社 Self-traveling robot and self-traveling robot control method
KR20190109632A (en) * 2018-02-28 2019-09-26 엘지전자 주식회사 Moving robot and Moving robot system

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
TWI423779B (en) * 2011-01-28 2014-01-21 Micro Star Int Co Ltd Cleaning robot and control method thereof
KR101610502B1 (en) * 2014-09-02 2016-04-07 현대자동차주식회사 Apparatus and method for recognizing driving enviroment for autonomous vehicle
CN106705951A (en) * 2015-11-13 2017-05-24 恩斯迈电子(深圳)有限公司 Movable device
EP3525002A1 (en) * 2018-02-12 2019-08-14 Imec Methods for the determination of a boundary of a space of interest using radar sensors
US10816984B2 (en) * 2018-04-13 2020-10-27 Baidu Usa Llc Automatic data labelling for autonomous driving vehicles
JP2020017111A (en) * 2018-07-26 2020-01-30 ファナック株式会社 Work measurement device, work measurement method and program


Also Published As

Publication number Publication date
CN113446971A (en) 2021-09-28

Similar Documents

Publication Publication Date Title
US11567502B2 (en) Autonomous exploration framework for indoor mobile robotics using reduced approximated generalized Voronoi graph
US10127677B1 (en) Using observations from one or more robots to generate a spatio-temporal model that defines pose values for a plurality of objects in an environment
CN108007452B (en) Method and device for updating environment map according to obstacle and robot
CN110858076B (en) Equipment positioning and grid map construction method and mobile robot
US8903160B2 (en) Apparatus and method with traveling path planning
KR102041664B1 (en) Method and apparatus for estimating localization of robot in wide range of indoor space using qr marker and laser scanner
CN111609852A (en) Semantic map construction method, sweeping robot and electronic equipment
CN112180931B (en) Cleaning path planning method and device of sweeper and readable storage medium
CN108628318B (en) Congestion environment detection method and device, robot and storage medium
CN104737085A (en) Robot and method for autonomous inspection or processing of floor areas
CN111258320A (en) Robot obstacle avoidance method and device, robot and readable storage medium
KR101341204B1 (en) Device and method for estimating location of mobile robot using raiser scanner and structure
US10778902B2 (en) Sensor control device, object search system, object search method, and program
JP2017217726A (en) robot
CN113446971B (en) Space recognition method, electronic device and non-transitory computer readable storage medium
CN114812539B (en) Map searching method, map using method, map searching device, map using device, robot and storage medium
CN115437384A (en) Obstacle avoidance method, equipment and medium for mobile robot
CN115179287A (en) Path planning method of mechanical arm
CN114489050A (en) Obstacle avoidance route control method, device, equipment and storage medium for straight line driving
US11875572B2 (en) Space recognition method, electronic device and non-transitory computer-readable storage medium
Miura et al. Adaptive robot speed control by considering map and motion uncertainty
EP4390313A1 (en) Navigation method and self-propelled apparatus
CN112214018B (en) Robot path planning method and device
KR102009479B1 (en) Apparatus and method for controlling mobile robot
KR102275671B1 (en) Object contour detection apparatus and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant