CN108724172B - Method and device for controlling leading equipment - Google Patents


Info

Publication number
CN108724172B
CN108724172B CN201711252444.4A
Authority
CN
China
Prior art keywords
leading
guided
guided object
leading device
image
Prior art date
Legal status
Active
Application number
CN201711252444.4A
Other languages
Chinese (zh)
Other versions
CN108724172A (en)
Inventor
张胜美
周金利
王雪松
Current Assignee
Beijing Orion Star Technology Co Ltd
Original Assignee
Beijing Orion Star Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Orion Star Technology Co Ltd filed Critical Beijing Orion Star Technology Co Ltd
Priority to CN201711252444.4A priority Critical patent/CN108724172B/en
Publication of CN108724172A publication Critical patent/CN108724172A/en
Application granted granted Critical
Publication of CN108724172B publication Critical patent/CN108724172B/en


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1628: Programme controls characterised by the control loop
    • B25J9/1653: parameters identification, estimation, stiffness, accuracy, error analysis
    • B25J9/1656: programming, planning systems for manipulators
    • B25J9/1671: simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • B25J9/1679: Programme controls characterised by the tasks executed
    • B25J9/1689: Teleoperation
    • B25J9/1694: use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697: Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a method and a device for controlling a leading device. The method includes: monitoring a guided object and determining whether the guided object is lost; when the guided object is lost, controlling the leading device to stop traveling and detecting whether the guided object is present in the surrounding environment; and if the guided object is not present in the surrounding environment, controlling the leading device to travel to a preset destination while detecting whether the guided object is present in the en-route environment. In this way, a lost guided object can be found again either nearby or en route, and is then led onward to the preset destination, which improves both the leading efficiency and the guided object's leading experience.

Description

Method and device for controlling leading equipment
Technical Field
The invention relates to the technical field of robot service, in particular to a method and a device for controlling a leading device.
Background
Conventionally, when a robot leads a guided object, it simply travels to the guided object's destination. If the guided object leaves the robot for any reason, for example to go back for something or to detour to another place, it can no longer follow the robot to the destination, which reduces the robot's leading efficiency.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, a first object of the present invention is to provide a method for controlling a leading device, which is used to solve the problem of low leading efficiency in the prior art.
A second object of the present invention is to provide a control device for a lead device.
A third object of the invention is to propose an electronic device.
A fourth object of the invention is to propose a non-transitory computer-readable storage medium.
A fifth object of the invention is to propose a computer program product.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides a method for controlling a lead device, including:
monitoring a guided object and judging whether the guided object is lost or not;
when the guided object is lost, controlling a guiding device to stop traveling and detecting whether the guided object exists in the surrounding environment;
and if the guided object does not exist in the surrounding environment, controlling the guiding equipment to move to a preset destination, and detecting whether the guided object exists in the environment along the way.
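For illustration only, the three steps above can be sketched as a single decision function; the action names below are hypothetical placeholders, not part of the claims.

```python
def next_action(object_lost: bool, object_in_surroundings: bool) -> str:
    """Pick the leading device's next action per the three steps above.

    While the guided object is still monitored, keep leading; on loss,
    stop and search the surroundings; if that search fails, travel to
    the preset destination while scanning the en-route environment.
    """
    if not object_lost:
        return "continue_leading"
    if object_in_surroundings:
        return "resume_leading"
    return "travel_to_destination_scanning_en_route"
```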
Further, according to an embodiment of the first aspect, in a first implementation manner of the embodiment of the first aspect, before the controlling the leading device to stop traveling and detecting whether the led object exists in the surrounding environment, the method further includes:
acquiring the distance between the leading device and the preset destination;
determining that a distance between the lead device and the preset destination is greater than a preset distance threshold.
Further, according to the first implementation manner of the embodiment of the first aspect, in a second implementation manner of the embodiment of the first aspect, the method further includes:
and if the distance between the leading device and the preset destination is not larger than a preset distance threshold value, controlling the leading device to travel to the preset destination.
Based on the embodiment of the first aspect, or the first implementation manner of the embodiment of the first aspect, or the second implementation manner of the embodiment of the first aspect, in a third implementation manner of the embodiment of the first aspect, the method further includes:
and when the guided object is lost, controlling the leading device to play a first voice, where the first voice prompts the guided object to approach the leading device.
Based on the embodiment of the first aspect, or the first implementation manner of the embodiment of the first aspect, or the second implementation manner of the embodiment of the first aspect, in a fourth implementation manner of the embodiment of the first aspect, the method further includes:
and if the guided object exists in the surrounding environment or the guided object exists in the en-route environment, controlling the guiding device to play a second voice, and prompting the guided object to follow the guiding device by the second voice.
Based on the embodiment of the first aspect, or the first implementation manner of the embodiment of the first aspect, or the second implementation manner of the embodiment of the first aspect, in a fifth implementation manner of the embodiment of the first aspect, the method further includes:
and when the leading device reaches the preset destination, controlling the leading device to turn around and play a third voice, where the third voice prompts the guided object that the preset destination has been reached.
Based on the third implementation manner of the embodiment of the first aspect, in a sixth implementation manner of the embodiment of the first aspect, before detecting whether the guided object exists in the surrounding environment, the method further includes:
determining that the guided object is not detected within a preset time period after the first voice is played by the guiding device.
Based on the first aspect, in a seventh implementation manner of the first aspect, the monitoring the guided object and determining whether the guided object is lost includes:
detecting the guided object by a rear camera of the guiding device;
when the guided object is not detected, calculating the duration for which the guided object has gone undetected, and acquiring the current leading scene of the leading device;
and when the duration reaches a time length threshold corresponding to the current leading scene, determining that the led object is lost.
In an eighth implementation manner of the embodiment of the first aspect, based on the seventh implementation manner of the embodiment of the first aspect, the lead scenario includes: a normal leading scene, a turning leading scene and an obstacle leading scene;
and the time length threshold corresponding to the normal leading scene is smaller than the time length threshold corresponding to the turning leading scene and smaller than the time length threshold corresponding to the obstacle leading scene.
Based on the embodiments of the first aspect, in a ninth implementation manner of the embodiments of the first aspect, the detecting whether the led object exists in the surrounding environment includes:
predicting the loss direction of the guided object according to the last frame of image monitored to the guided object;
acquiring an image of the lost direction;
detecting whether the guided object exists in the image of the loss direction.
In a tenth implementation manner of the embodiment of the first aspect, based on the ninth implementation manner of the embodiment of the first aspect, the detecting whether the led object exists in the surrounding environment further includes:
if the guided object does not exist in the image in the losing direction, acquiring images in other directions except the losing direction;
detecting whether the guided object exists in the images in other directions.
In an eleventh implementation manner of the embodiment of the first aspect, before predicting the losing direction of the guided object according to the monitored last frame image of the guided object, the method further includes:
acquiring the distance between the leading device and the led object when the last frame image of the led object is monitored;
determining that a distance between the leading device and the guided object is less than a specified distance threshold.
Based on the eleventh implementation manner of the embodiment of the first aspect, in a twelfth implementation manner of the embodiment of the first aspect, the method further includes:
and if the distance between the leading device and the led object is not smaller than a specified distance threshold value, determining that the loss direction of the led object is the rear of the leading device.
In a thirteenth implementation manner of the first aspect, based on the ninth implementation manner of the first aspect, the predicting the loss direction of the guided object according to the monitored last frame image of the guided object includes:
detecting a center position of the guided object in a last frame image of the guided object monitored by a rear camera of the guiding device;
if the central position of the guided object is positioned at the left half part of the last frame image, determining that the loss direction of the guided object is the right rear part of the guiding device;
and if the central position of the guided object is positioned in the right half part of the last frame image, determining that the loss direction of the guided object is the left rear part of the guiding device.
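The two half-image rules above, combined with the distance check from the eleventh and twelfth implementation manners, can be sketched as follows; the threshold value and function name are illustrative assumptions, not taken from the patent.

```python
def predict_loss_direction(center_x: float, image_width: int,
                           distance_m: float,
                           specified_threshold_m: float = 1.5) -> str:
    """Predict the guided object's loss direction from the last frame.

    A distant object (distance not less than the specified threshold) is
    assumed lost directly behind the leading device. For a nearby object,
    the half of the rear-camera frame holding its center decides: left
    half means lost to the right rear, right half means lost to the left
    rear (the rear camera's view is reversed relative to the device's
    heading).
    """
    if distance_m >= specified_threshold_m:
        return "rear"
    if center_x < image_width / 2:
        return "right_rear"   # center in the left half of the frame
    return "left_rear"        # center in the right half of the frame
```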
Based on the ninth implementation manner of the embodiment of the first aspect, in a fourteenth implementation manner of the embodiment of the first aspect, the acquiring the image in the losing direction includes:
if the loss direction is the rear of the leading device, acquiring an image of the rear of the leading device in a specified time period through a rear camera of the leading device;
if the loss direction is the right rear side of the leading device, controlling the leading device or a component where a rear camera of the leading device is located to rotate anticlockwise, and acquiring an image of the right rear side of the leading device through the rear camera of the leading device;
and if the loss direction is the left rear side of the leading device, controlling the leading device or a component where a rear camera of the leading device is located to rotate clockwise, and acquiring an image of the left rear side of the leading device through the rear camera of the leading device.
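A minimal mapping from the predicted loss direction to the search motion just described; the action strings are placeholders for the device's real motion commands.

```python
def search_motion(loss_direction: str) -> str:
    """Map the loss direction to the rear-camera search motion above."""
    motions = {
        # lost straight behind: keep the rear camera on the area behind
        # for a specified time period
        "rear": "hold_rear_camera_for_specified_period",
        # lost to the right rear: rotate the device or the rear-camera
        # assembly anticlockwise so the rear camera sweeps that way
        "right_rear": "rotate_anticlockwise",
        # lost to the left rear: rotate clockwise
        "left_rear": "rotate_clockwise",
    }
    return motions[loss_direction]
```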
In a fifteenth implementation of the embodiment of the first aspect, the detecting whether the led object exists in the en-route environment includes:
and detecting whether the guided object exists in the en-route environment or not through a front camera and/or a rear camera of the guiding device.
With the method for controlling a leading device provided by this embodiment, the guided object is monitored to judge whether it is lost; when the guided object is lost, the leading device is controlled to stop traveling and the surrounding environment is checked for the guided object; and if the guided object is not found there, the leading device is controlled to travel to the preset destination while the en-route environment is checked. A lost guided object can thus be found again either nearby or en route and then led onward to the preset destination, improving both the leading efficiency and the guided object's leading experience.
In order to achieve the above object, a second aspect of the present invention provides a leading device control apparatus, including:
the monitoring module is used for monitoring the guided object and judging whether the guided object is lost;
the control module is used for controlling the leading equipment to stop advancing and detecting whether the led object exists in the surrounding environment when the led object is lost;
the control module is further configured to control the guidance device to travel to a preset destination when the guided object does not exist in the surrounding environment, and detect whether the guided object exists in the en-route environment.
Further, according to an embodiment of the second aspect, in the first implementation manner of the embodiment of the second aspect, the control module is further configured to,
before controlling a leading device to stop traveling and detecting whether the led object exists in the surrounding environment, acquiring the distance between the leading device and the preset destination; determining that a distance between the lead device and the preset destination is greater than a preset distance threshold.
Further, according to the first implementation manner of the embodiment of the second aspect, in the second implementation manner of the embodiment of the second aspect, the control module is further configured to,
and if the distance between the leading device and the preset destination is not larger than a preset distance threshold value, controlling the leading device to travel to the preset destination.
Based on the embodiment of the second aspect, or the first implementation manner of the embodiment of the second aspect, or the second implementation manner of the embodiment of the second aspect, in a third implementation manner of the embodiment of the second aspect, the apparatus further includes: a playing module;
the playing module is used for controlling the leading device to play a first voice when the guided object is lost, where the first voice prompts the guided object to approach the leading device.
Based on the embodiment of the second aspect, or the first implementation manner of the embodiment of the second aspect, or the second implementation manner of the embodiment of the second aspect, in a fourth implementation manner of the embodiment of the second aspect, the apparatus further includes: a playing module;
the playing module is used for controlling the leading device to play a second voice when the led object exists in the surrounding environment or the led object exists in the en-route environment, and the second voice prompts the led object to follow the leading device.
Based on the embodiment of the second aspect, or the first implementation manner of the embodiment of the second aspect, or the second implementation manner of the embodiment of the second aspect, in a fifth implementation manner of the embodiment of the second aspect, the apparatus further includes: a playing module;
the playing module is configured to control the leading device to turn around and play a third voice when the leading device reaches the preset destination, where the third voice prompts the guided object that the preset destination has been reached.
In a sixth implementation manner of the second aspect, based on the third implementation manner of the second aspect, the control module is further configured to determine that the guided object is not detected within a preset time period after the first voice is played by the guiding device before detecting whether the guided object exists in the surrounding environment.
In a seventh implementation manner of the embodiment of the second aspect, based on the embodiment of the second aspect, the monitoring module is specifically configured to,
detecting the guided object by a rear camera of the guiding device;
when the guided object is not detected, calculating the duration length of the undetected guided object, and acquiring the current guided scene of the guided device;
and when the duration reaches a time length threshold corresponding to the current leading scene, determining that the led object is lost.
In an eighth implementation form of the embodiment of the second aspect, based on the seventh implementation form of the embodiment of the second aspect, the lead scenario includes: a normal leading scene, a turning leading scene and an obstacle leading scene;
and the time length threshold corresponding to the normal leading scene is smaller than the time length threshold corresponding to the turning leading scene and smaller than the time length threshold corresponding to the obstacle leading scene.
In a ninth implementation form of the embodiment of the second aspect, based on the embodiment of the second aspect, the control module is specifically configured to,
predicting the loss direction of the guided object according to the last frame of image monitored to the guided object;
acquiring an image of the lost direction;
detecting whether the guided object exists in the image of the loss direction.
In a tenth implementation manner of the second aspect, based on the ninth implementation manner of the second aspect, the control module is further specifically configured to,
if the guided object does not exist in the image in the losing direction, acquiring images in other directions except the losing direction;
detecting whether the guided object exists in the images in other directions.
In an eleventh implementation manner of the second aspect, based on the ninth implementation manner of the second aspect, the control module is further specifically configured to, before predicting the loss direction of the guided object according to the monitored last frame image of the guided object,
acquiring the distance between the leading device and the led object when the last frame image of the led object is monitored;
determining that a distance between the leading device and the guided object is less than a specified distance threshold.
In a twelfth implementation form of the embodiment of the second aspect, based on the eleventh implementation form of the embodiment of the second aspect, the control module is further specifically configured to,
and if the distance between the leading device and the led object is not smaller than a specified distance threshold value, determining that the loss direction of the led object is the rear of the leading device.
In a thirteenth implementation form of the second aspect, based on the ninth implementation form of the second aspect, the control module is specifically configured to,
detecting a center position of the guided object in a last frame image of the guided object monitored by a rear camera of the guiding device;
if the central position of the guided object is positioned at the left half part of the last frame image, determining that the loss direction of the guided object is the right rear part of the guiding device;
and if the central position of the guided object is positioned in the right half part of the last frame image, determining that the loss direction of the guided object is the left rear part of the guiding device.
In a fourteenth implementation manner of the second aspect, based on the ninth implementation manner of the second aspect, the control module is specifically configured to,
if the loss direction is the rear of the leading device, acquiring an image of the rear of the leading device in a specified time period through a rear camera of the leading device;
if the loss direction is the right rear side of the leading device, controlling the leading device or a component where a rear camera of the leading device is located to rotate anticlockwise, and acquiring an image of the right rear side of the leading device through the rear camera of the leading device;
and if the loss direction is the left rear side of the leading device, controlling the leading device or a component where a rear camera of the leading device is located to rotate clockwise, and acquiring an image of the left rear side of the leading device through the rear camera of the leading device.
In a fifteenth implementation manner of the embodiment of the second aspect, based on the embodiment of the second aspect, the control module is specifically configured to,
and detecting whether the guided object exists in the en-route environment or not through a front camera and/or a rear camera of the guiding device.
The leading device control apparatus provided in this embodiment monitors the guided object to judge whether it is lost; when the guided object is lost, it controls the leading device to stop traveling and checks the surrounding environment for the guided object; and if the guided object is not found there, it controls the leading device to travel to the preset destination while checking the en-route environment. A lost guided object can thus be found again either nearby or en route and then led onward to the preset destination, improving both the leading efficiency and the guided object's leading experience.
To achieve the above object, a third aspect of the present invention provides an electronic device, including: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the leading device control method described above.
In order to achieve the above object, a fourth aspect embodiment of the present invention proposes a computer-readable storage medium on which a computer program is stored, and the program, when executed by a processor, implements the leading device control method described above.
In order to achieve the above object, a fifth aspect of the present invention provides a computer program product; when instructions in the computer program product are executed by a processor, a method for controlling a leading device is performed, the method comprising:
monitoring a guided object and judging whether the guided object is lost or not;
when the guided object is lost, controlling a guiding device to stop traveling and detecting whether the guided object exists in the surrounding environment;
and if the guided object does not exist in the surrounding environment, controlling the guiding equipment to move to a preset destination, and detecting whether the guided object exists in the environment along the way.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart of a method for controlling a lead device according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of another method for controlling a lead device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a control device of a lead device according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
A method and apparatus for controlling a lead device according to an embodiment of the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a method for controlling a leading device according to an embodiment of the present invention. As shown in fig. 1, the leading device control method includes the following steps:
s101, monitoring the guided object and judging whether the guided object is lost or not.
The method for controlling the leading device is executed by a leading device control apparatus, which may be hardware or software installed on the leading device, or a terminal that communicates with the leading device in order to control it. The leading device may be, for example, a robot or another device with a leading function. The guided object may be, for example, a person, or another device that can move by following the leading device.
In this embodiment, when executing step S101, the leading device control apparatus may specifically detect the guided object through a rear camera of the leading device; when the guided object is not detected, calculate the duration for which the guided object has gone undetected and acquire the current leading scene of the leading device; and when the duration reaches the time length threshold corresponding to the current leading scene, determine that the guided object is lost. The leading device may be provided with at least one rear camera for acquiring images within a certain angular range and a certain distance range behind the leading device.
Before monitoring the guided object, the guiding device may determine a navigation route according to the map information, the destination of the guided object, and the current location, and advance and guide the guided object according to the navigation route.
The leading scenes include a normal leading scene, a turning leading scene, and an obstacle leading scene, where the time length threshold corresponding to the normal leading scene is smaller than both the threshold corresponding to the turning leading scene and the threshold corresponding to the obstacle leading scene. For example, the thresholds may be 2 seconds for the normal leading scene, 3 seconds for the turning leading scene, and 4 seconds for the obstacle leading scene. A normal leading scene is one with no turn and no obstacle. In a turning or obstacle leading scene, an obstacle or other object may block the view, so the guided object can be close to the leading device and yet go unmonitored; the thresholds for these scenes are therefore set larger than the one for the normal leading scene.
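The scene-dependent lost check can be sketched as a small detector; the threshold values follow the 2 s / 3 s / 4 s example in the text, while the class and method names are illustrative assumptions.

```python
import time

# Example per-scene thresholds from the description: an occluded object
# (turning or obstacle scene) gets more time before being declared lost.
LOST_THRESHOLDS = {"normal": 2.0, "turning": 3.0, "obstacle": 4.0}

class LostDetector:
    """Declare the guided object lost once it has gone undetected for
    longer than the threshold of the current leading scene."""

    def __init__(self):
        self.undetected_since = None

    def update(self, detected: bool, scene: str, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        if detected:
            self.undetected_since = None  # object visible: reset the timer
            return False
        if self.undetected_since is None:
            self.undetected_since = now   # start timing the disappearance
        return now - self.undetected_since >= LOST_THRESHOLDS[scene]
```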
And S102, when the guided object is lost, controlling the guiding equipment to stop moving, and detecting whether the guided object exists in the surrounding environment.
In this embodiment, when the guided object is lost, the leading device control apparatus may control the leading device to stop traveling and play a first voice, where the first voice prompts the guided object to approach the leading device. If the guided object is detected within a preset time period after the first voice is played, the leading device continues traveling and leading the guided object; if the guided object is not detected within that preset time period, the apparatus detects whether the guided object exists in the surrounding environment. The preset time period may be, for example, 2 seconds. The first voice may be, for example, "I can't see you, please follow behind me."
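The stop / prompt / wait sequence just described might look like the following; the device interface and the voice strings are illustrative assumptions, not the patent's own API.

```python
def handle_lost_object(device, preset_wait_s: float = 2.0) -> str:
    """On loss: stop, play the first voice, then wait a preset period.

    If the guided object reappears within the preset period (example:
    2 s), play the second voice and resume leading; otherwise start
    searching the surrounding environment.
    """
    device.stop()
    device.play_voice("I can't see you, please follow behind me")  # first voice
    if device.wait_for_object(preset_wait_s):
        device.play_voice("I see you, please keep up with me")     # second voice
        return "resume_leading"
    return "search_surroundings"
```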
In addition, if the guided object is detected within the preset time period after the first voice is played, the leading device control apparatus may control the leading device to play a second voice, where the second voice prompts the guided object to follow the leading device, and then continue traveling and leading. The second voice may be, for example, "I see you, please keep up with me."
Further, in this embodiment, before the leading device controlling means controls the leading device to stop traveling and detects whether there is a led object in the surrounding environment, the method may further include: acquiring the distance between the leading device and a preset destination; determining that the distance between the leading device and the preset destination is greater than a preset distance threshold. The preset distance threshold may be 3m, for example.
In addition, if the leading device control apparatus determines that the distance between the leading device and the preset destination is not greater than the preset distance threshold, the leading device and the guided object are already close to the destination, and the guided object can reach it by itself without further leading. The apparatus can therefore directly control the leading device to travel to the preset destination and, when the leading device arrives, control it to turn around and play a third voice indicating that the preset destination has been reached, so that the guided object can reach the destination by itself according to the voice prompt. The third voice may be, for example, "XX is right here; I am heading back."
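The destination-distance decision described above can be sketched as a small branch. The action names and the function name are assumptions for this sketch; the 3 m default is the example threshold from the text.

```python
def handle_lost_object(distance_to_destination: float,
                       distance_threshold: float = 3.0) -> str:
    """Decide the recovery action when the guided object is lost.

    If the leading device is already within the threshold distance of
    the preset destination, it travels there directly, turns around,
    and announces arrival; otherwise it stops and searches the
    surrounding environment for the guided object.
    """
    if distance_to_destination <= distance_threshold:
        return "go_to_destination_and_announce"
    return "stop_and_search_surroundings"
```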
S103, if the guided object does not exist in the surrounding environment, controlling the guiding device to move to a preset destination, and detecting whether the guided object exists in the environment along the way.
In this embodiment, while controlling the leading device to travel to the preset destination, the leading device control apparatus may detect whether the guided object exists in the en-route environment through a front camera and/or a rear camera of the leading device. The apparatus can control the front camera and/or the rear camera to rotate during travel so as to acquire images in all directions, or keep them from rotating and acquire only the image of the current field of view, judging from that image whether the guided object is near the current position.
In addition, in the process of controlling the leading device to travel to the preset destination, if the led object is detected in the en-route environment, the leading device can be controlled to play a second voice, and the second voice prompts the led object to follow the leading device, so that the led object is continuously led.
In the leading device control method provided by this embodiment, the guided object is monitored to determine whether it is lost; when the guided object is lost, the leading device is controlled to stop traveling, and it is detected whether the guided object exists in the surrounding environment; if not, the leading device is controlled to travel to the preset destination while it is detected whether the guided object exists in the en-route environment. The guided object can thus be found in the surrounding or en-route environment after being lost and, once found, continue to be led to the preset destination, which improves leading efficiency and the leading experience of the guided object.
Fig. 2 is a schematic flow chart of another method for controlling a lead device according to an embodiment of the present invention, and as shown in fig. 2, on the basis of the embodiment shown in fig. 1, a process of detecting whether a led object exists in a surrounding environment in step 102 may specifically include the following steps:
S1021, predicting the loss direction of the guided object according to the last frame image in which the guided object was monitored.
In this embodiment, the leading device control apparatus may obtain a monitoring image within a certain time period, for example, a monitoring image from 4 seconds before the current time to the current time, analyze the monitoring image, obtain the last frame image of the monitored led object, and predict the loss direction of the led object according to the last frame image of the monitored led object.
The leading device control apparatus may execute step 1021 as follows: detect the center position of the guided object in the last frame image in which the guided object was monitored by the rear camera of the leading device; if the center position is in the left half of that image, determine the loss direction of the guided object to be the right rear of the leading device; if the center position is in the right half, determine the loss direction to be the left rear of the leading device.
Further, once the distance between the guided object and the leading device reaches a specified distance threshold, the guided object may fall outside the monitoring range of the leading device; subsequent monitored images then no longer contain the guided object, and the loss direction cannot be predicted by analyzing them. Since, in general, the only positions outside the monitoring range lie behind the leading device, before step 1021 the leading device control apparatus may first obtain the distance between the leading device and the guided object at the moment the last frame image containing the guided object was monitored, and confirm that this distance is less than the specified distance threshold.
In addition, if it is determined that the distance between the guidance device and the guided object is not less than the specified distance threshold, it is determined that the loss direction of the guided object is the rear of the guidance device.
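The loss-direction prediction of step 1021, together with the distance check just described, can be sketched as follows. The function name, parameter names, and the 5 m default threshold are assumptions for this sketch; the half-image mapping and the fallback to "rear" come from the text.

```python
def predict_loss_direction(center_x: float, image_width: float,
                           distance: float,
                           distance_threshold: float = 5.0) -> str:
    """Predict where the guided object was lost.

    If the object was already at or beyond the specified distance
    threshold when last seen, it is assumed to be directly behind the
    leading device. Otherwise the half of the last rear-camera frame
    containing the object's center decides the direction: the left
    half of the image maps to the right rear of the device, the right
    half to the left rear.
    """
    if distance >= distance_threshold:
        return "rear"
    if center_x < image_width / 2:
        return "right_rear"
    return "left_rear"
```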
S1022, acquiring the image in the loss direction.
The leading device control apparatus may execute step 1022 as follows: if the loss direction is the rear of the leading device, acquire an image of the rear of the leading device within a specified time period through the rear camera of the leading device; if the loss direction is the right rear of the leading device, control the leading device, or the component carrying its rear camera, to rotate counterclockwise and acquire an image of the right rear through the rear camera; and if the loss direction is the left rear of the leading device, control the leading device, or the component carrying its rear camera, to rotate clockwise and acquire an image of the left rear through the rear camera.
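The direction-to-rotation mapping of step 1022 can be sketched as a small lookup. The command strings and function name are hypothetical placeholders for the device's motion and capture interfaces; the rotation directions themselves follow the text.

```python
# Which way to rotate so the rear camera faces the loss direction;
# the rear camera already covers the "rear" case without rotation.
ROTATION_FOR_DIRECTION = {
    "rear": None,
    "right_rear": "counterclockwise",
    "left_rear": "clockwise",
}

def acquire_loss_direction_image(direction: str) -> list[str]:
    """Return the (hypothetical) command sequence that points the
    rear camera at the loss direction and captures an image."""
    commands = []
    rotation = ROTATION_FOR_DIRECTION[direction]
    if rotation is not None:
        commands.append(f"rotate_{rotation}")
    commands.append("capture_rear_image")
    return commands
```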
S1023, detecting whether the guided object exists in the image in the loss direction.
In addition, if the guided object does not exist in the image in the loss direction, images in directions other than the loss direction are acquired, and it is detected whether the guided object exists in them. If the guided object does not exist in those images either, it is determined that the guided object does not exist in the surrounding environment; the leading device is then controlled to travel to the preset destination while it is detected whether the guided object exists in the en-route environment.
In this embodiment, if the guided object exists in the image in the loss direction, or in the images in the other directions, it is determined that the guided object exists in the surrounding environment; the leading device control apparatus controls the leading device to play the second voice, which prompts the guided object to follow the leading device, and then travels and leads the guided object to the preset destination.
In this embodiment, the leading device control apparatus may acquire images in directions other than the loss direction as follows: if the loss direction is the right rear of the leading device, control the leading device, or the component carrying its rear camera, to rotate counterclockwise starting from the loss direction and acquire images in the other directions; if the loss direction is the left rear, rotate clockwise starting from the loss direction; and if the loss direction is the rear, rotate either clockwise or counterclockwise starting from the loss direction to acquire images in the other directions.
In summary, in actual implementation, when the leading apparatus is a robot, after predicting the direction of loss of the leading object, the following control scheme may be adopted:
If the loss direction is the right rear of the robot, the robot can be controlled to rotate counterclockwise through a full circle while its rear camera acquires, in turn, the right-rear image and the images in the other directions, and it is detected whether the guided object exists. Alternatively, the robot body can be kept still and only the component carrying the rear camera rotated: the robot's pan-tilt unit is first rotated counterclockwise by a certain angle so that the rear camera acquires the right-half image, then rotated clockwise by a certain angle so that the rear camera acquires the left-half image, and it is detected whether the guided object exists.

If the loss direction is the left rear of the robot, the robot can be controlled to rotate clockwise through a full circle while its rear camera acquires, in turn, the left-rear image and the images in the other directions, and it is detected whether the guided object exists. Alternatively, only the pan-tilt unit is rotated: first clockwise by a certain angle so that the rear camera acquires the left-half image, then counterclockwise so that the rear camera acquires the right-half image, and it is detected whether the guided object exists.

If the loss direction is the rear of the robot, the rear camera can directly acquire the rear image; the robot can then be rotated counterclockwise or clockwise through a full circle to acquire images in the other directions, and it is detected whether the guided object exists. Alternatively, only the pan-tilt unit is rotated: either counterclockwise first so that the rear camera acquires the right-half image and then clockwise for the left-half image, or clockwise first for the left-half image and then counterclockwise for the right-half image, and it is detected whether the guided object exists.
Of course, whenever the guided object is detected during the rotation, the rotation stops immediately, and subsequent interaction with the guided object takes place to resume leading.
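The early-stopping sweep described in the control schemes above can be sketched as a simple loop. Here `images_by_step` stands for the images captured as the robot (or its pan-tilt unit) rotates step by step starting from the loss direction, and `detect` is a hypothetical detector callback; both names are assumptions for this sketch.

```python
def surround_search(images_by_step, detect):
    """Sweep the surroundings step by step, starting from the loss
    direction; stop as soon as the guided object is detected and
    report whether the search succeeded.

    Returns ("found", step_index) when the object is detected, so the
    rotation can stop immediately and leading can resume; returns
    ("not_found", None) when the full sweep fails, so the device
    travels to the destination while still watching en route.
    """
    for step, image in enumerate(images_by_step):
        if detect(image):
            return ("found", step)
    return ("not_found", None)
```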
In the leading device control method provided by this embodiment, the guided object is monitored to determine whether it is lost; when the guided object is lost, the leading device is controlled to stop traveling, the loss direction of the guided object is predicted from the last frame image in which it was monitored, the image in the loss direction is acquired, and it is detected whether the guided object exists in that image, thereby determining whether the guided object exists in the surrounding environment; if not, the leading device is controlled to travel to the preset destination while it is detected whether the guided object exists in the en-route environment. The guided object can thus be found in the surrounding or en-route environment after being lost and, once found, continue to be led to the preset destination, which improves leading efficiency and the leading experience of the guided object.
Fig. 3 is a schematic structural diagram of a control device of a leader apparatus according to an embodiment of the present invention. As shown in fig. 3, includes: a monitoring module 31 and a control module 32.
The monitoring module 31 is configured to monitor a guided object and determine whether the guided object is lost;
a control module 32, configured to control a guidance device to stop traveling when the guided object is lost, and detect whether the guided object exists in a surrounding environment;
the control module 32 is further configured to control the guiding device to travel to a preset destination when the guided object does not exist in the surrounding environment, and detect whether the guided object exists in the en-route environment.
The leading device control apparatus provided by the invention may be hardware or software installed on the leading device, or a terminal that communicates with the leading device to control it. The leading device may be, for example, a robot or another device with a leading function. The guided object may be, for example, a person, or another device that can move following the leading device.
In this embodiment, the monitoring module 31 is specifically configured to detect the guided object through a rear camera of the leading device; when the guided object is not detected, calculate the duration for which the guided object has gone undetected and acquire the current leading scene of the leading device; and when the duration reaches the time length threshold corresponding to the current leading scene, determine that the guided object is lost. The leading device may be provided with at least one rear camera for acquiring images within a certain angle range and a certain distance range behind the leading device.
Before monitoring the guided object, the guiding device may determine a navigation route according to the map information, the destination of the guided object, and the current location, and advance and guide the guided object according to the navigation route.
Wherein the leading scenario includes: a normal leading scene, a turning leading scene, and an obstacle leading scene; the time length threshold corresponding to the normal leading scene is smaller than the time length thresholds corresponding to the turning leading scene and the obstacle leading scene. For example, the time length threshold corresponding to the normal leading scene may be 2 seconds, that corresponding to the turning leading scene may be 3 seconds, and that corresponding to the obstacle leading scene may be 4 seconds. The normal leading scene is a scene without turns or obstacles. In the turning leading scene and the obstacle leading scene, an obstacle or other object may block the view, so the guided object may be close to the leading device and yet not be visible to it; the time length thresholds for these two scenes may therefore be set larger than that for the normal leading scene.
In this embodiment, the apparatus may further include a playing module configured to control the leading device to play a first voice when the guided object is lost, the first voice prompting the guided object to approach the leading device. Correspondingly, the control module 32 is further configured to determine, before detecting whether the guided object exists in the surrounding environment, that the guided object is not detected within a preset time period after the leading device plays the first voice. The preset time period may be, for example, 2 seconds, and the first voice may be, for example, "I cannot see you, please follow behind me."
In addition, the playing module may be further configured to control the leading device to play a second voice when the guided object is detected within the preset time period after the leading device plays the first voice, the second voice prompting the guided object to follow the leading device so that leading can continue. The second voice may be, for example, "I see you now, let's keep going."
Further, in this embodiment, the control module 32 is further configured to, before controlling a leading device to stop traveling and detecting whether the led object exists in the surrounding environment, obtain a distance between the leading device and the preset destination; determining that a distance between the lead device and the preset destination is greater than a preset distance threshold. The preset distance threshold may be 3m, for example.
In addition, the control module 32 is further configured to control the leading device to travel to the preset destination if it determines that the distance between the leading device and the preset destination is not greater than the preset distance threshold. Correspondingly, the playing module is further configured to control the leading device to turn around and play a third voice when the leading device reaches the preset destination, the third voice prompting the guided object that the preset destination has been reached.
In this embodiment, the control module 32 may detect whether the guided object exists in the en-route environment through a front camera and/or a rear camera of the leading device while controlling the leading device to travel to the preset destination. The control module 32 may control the front camera and/or the rear camera to rotate during travel so as to acquire images in all directions, or keep them from rotating and acquire only the image of the current field of view, judging from that image whether the guided object is near the current position.
In addition, in the process of controlling the leading device to travel to the preset destination, if the led object is detected in the en-route environment, the leading device can be controlled to play a second voice, and the second voice prompts the led object to follow the leading device, so that the led object is continuously led.
The leading device control apparatus provided by this embodiment monitors the guided object to determine whether it is lost; when the guided object is lost, it controls the leading device to stop traveling and detects whether the guided object exists in the surrounding environment; if not, it controls the leading device to travel to the preset destination while detecting whether the guided object exists in the en-route environment. The guided object can thus be found in the surrounding or en-route environment after being lost and, once found, continue to be led to the preset destination, which improves leading efficiency and the leading experience of the guided object.
Further, on the basis of the embodiment shown in fig. 3, the control module 32 is specifically configured to predict the loss direction of the guided object according to the last frame of image monitored to the guided object; acquiring an image of the lost direction; detecting whether the guided object exists in the image of the loss direction.
In this embodiment, the leading device control apparatus may obtain a monitoring image within a certain time period, for example, a monitoring image from 4 seconds before the current time to the current time, analyze the monitoring image, obtain the last frame image of the monitored led object, and predict the loss direction of the led object according to the last frame image of the monitored led object.
The control module 32 may predict the loss direction of the guided object from the last frame image in which the guided object was monitored as follows: detect the center position of the guided object in the last frame image monitored by the rear camera of the leading device; if the center position is in the left half of that image, determine the loss direction to be the right rear of the leading device; if it is in the right half, determine the loss direction to be the left rear of the leading device.
Further, once the distance between the guided object and the leading device reaches a specified distance threshold, the guided object may fall outside the monitoring range of the leading device; the monitored images then no longer contain the guided object, and the loss direction cannot be predicted by analyzing them. Since, in general, the only positions outside the monitoring range lie behind the leading device, the control module 32 is further configured, before predicting the loss direction from the last frame image of the guided object, to obtain the distance between the leading device and the guided object at the moment the last frame image containing the guided object was monitored, and to confirm that this distance is less than the specified distance threshold.
In addition, if it is determined that the distance between the guidance device and the guided object is not less than the specified distance threshold, the control module 32 determines that the loss direction of the guided object is the rear of the guidance device.
Further, the process of acquiring the image in the loss direction by the control module 32 may specifically be that, if the loss direction is the rear of the lead device, the image at the rear of the lead device in the specified time period is acquired by a rear camera of the lead device; if the losing direction is the right rear side of the leading device, controlling the leading device or a component where a rear camera of the leading device is located to rotate anticlockwise, and acquiring an image of the right rear side of the leading device through the rear camera of the leading device; and if the loss direction is the left rear side of the leading device, controlling the leading device or a component where a rear camera of the leading device is located to rotate clockwise, and acquiring an image of the left rear side of the leading device through the rear camera of the leading device.
In addition, the control module 32 is specifically further configured to, if the guided object does not exist in the image in the losing direction, acquire images in directions other than the losing direction; detecting whether the guided object exists in the images in other directions.
In this embodiment, if the guided object exists in the image in the loss direction, or in the images in the other directions, it is determined that the guided object exists in the surrounding environment; the playing module controls the leading device to play the second voice, which prompts the guided object to follow the leading device, after which the leading device travels and leads the guided object to the preset destination.
The leading device control apparatus provided by this embodiment monitors the guided object to determine whether it is lost; when the guided object is lost, it controls the leading device to stop traveling, predicts the loss direction of the guided object from the last frame image in which it was monitored, acquires the image in the loss direction, and detects whether the guided object exists in that image, thereby determining whether the guided object exists in the surrounding environment; if not, it controls the leading device to travel to the preset destination while detecting whether the guided object exists in the en-route environment. The guided object can thus be found in the surrounding or en-route environment after being lost and, once found, continue to be led to the preset destination, which improves leading efficiency and the leading experience of the guided object.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device includes:
memory 1001, processor 1002, and computer programs stored on memory 1001 and executable on processor 1002.
The processor 1002, when executing the program, realizes the lead device control method provided in the above-described embodiment.
Further, the electronic device further includes:
a communication interface 1003 for communicating between the memory 1001 and the processor 1002.
A memory 1001 for storing computer programs that may be run on the processor 1002.
Memory 1001 may include high-speed RAM, and may also include non-volatile memory, such as at least one disk storage device.
The processor 1002 is configured to implement the method for controlling a lead device according to the foregoing embodiment when executing the program.
If the memory 1001, the processor 1002, and the communication interface 1003 are implemented independently, the communication interface 1003, the memory 1001, and the processor 1002 may be connected to each other through a bus and communicate with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in FIG. 4, but this does not indicate only one bus or one type of bus.
Optionally, in a specific implementation, if the memory 1001, the processor 1002, and the communication interface 1003 are integrated on one chip, the memory 1001, the processor 1002, and the communication interface 1003 may complete communication with each other through an internal interface.
The processor 1002 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement embodiments of the present invention.
The present invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the piloting device control method as described above.
The invention also provides a computer program product in which instructions, when executed by a processor, perform the piloting device control method as described above.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be implemented by program instructions executed by related hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware, or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (30)

1. A method for controlling a leading device, comprising:
monitoring a guided object and determining whether the guided object is lost;
when the guided object is lost, controlling the leading device to stop traveling and detecting whether the guided object exists in the surrounding environment;
if the guided object does not exist in the surrounding environment, controlling the leading device to travel to a preset destination and detecting whether the guided object exists in the en-route environment;
wherein before controlling the leading device to stop traveling and detecting whether the guided object exists in the surrounding environment, the method further comprises:
acquiring the distance between the leading device and the preset destination; and
determining that the distance between the leading device and the preset destination is greater than a preset distance threshold;
wherein if the distance between the leading device and the preset destination is not greater than the preset distance threshold, the leading device is controlled to travel to the preset destination.
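Purely as an illustration (not part of the claims), the claim-1 control flow could be sketched as below. The `LeadingDevice` class and every method on it are hypothetical stand-ins for a real device interface, and the one-dimensional positions are a deliberate simplification:

```python
class LeadingDevice:
    """Minimal mock of a leading device; all names and behavior are illustrative."""

    def __init__(self, position, sightings=()):
        self.position = position         # 1-D position for simplicity
        self.moving = True
        self.sightings = set(sightings)  # positions where the guided object is visible

    def distance_to(self, destination):
        return abs(destination - self.position)

    def stop(self):
        self.moving = False

    def travel_to(self, point):
        self.moving = True
        self.position = point

    def detect_in_surroundings(self):
        return self.position in self.sightings

    def route_to(self, destination):
        step = 1 if destination >= self.position else -1
        return range(self.position + step, destination + step, step)


def handle_lost_guided_object(device, destination, distance_threshold=2):
    # Claim-1 precondition: only stop and search if still far from the destination;
    # otherwise simply finish the route.
    if device.distance_to(destination) <= distance_threshold:
        device.travel_to(destination)
        return "arrived"

    device.stop()                        # stop traveling
    if device.detect_in_surroundings():  # search the surrounding environment
        return "reacquired"

    # Not found nearby: travel to the destination, scanning the en-route environment.
    for waypoint in device.route_to(destination):
        device.travel_to(waypoint)
        if device.detect_in_surroundings():
            return "reacquired"
    return "arrived"
```

In this sketch the "not greater than the preset distance threshold" branch of claim 1 is the early return, since stopping to search just short of the destination would gain nothing.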
2. The method of claim 1, further comprising:
when the guided object is lost, controlling the leading device to play a first voice, wherein the first voice prompts the guided object to approach the leading device.
3. The method of claim 1, further comprising:
if the guided object exists in the surrounding environment or in the en-route environment, controlling the leading device to play a second voice, wherein the second voice prompts the guided object to follow the leading device.
4. The method of claim 1, further comprising:
when the leading device reaches the preset destination, controlling the leading device to turn around and play a third voice, wherein the third voice prompts the guided object that the preset destination has been reached.
5. The method of claim 2, further comprising, prior to detecting the presence of the guided object in the surrounding environment:
determining that the guided object is not detected within a preset time period after the leading device plays the first voice.
6. The method according to claim 1, wherein monitoring the guided object and determining whether the guided object is lost comprises:
detecting the guided object through a rear camera of the leading device;
when the guided object is not detected, calculating the duration for which the guided object has not been detected, and acquiring the current leading scene of the leading device;
and when the duration reaches a time length threshold corresponding to the current leading scene, determining that the guided object is lost.
7. The method of claim 6, wherein the leading scene comprises: a normal leading scene, a turning leading scene and an obstacle leading scene;
and the time length threshold corresponding to the normal leading scene is smaller than the time length threshold corresponding to the turning leading scene and smaller than the time length threshold corresponding to the obstacle leading scene.
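As an illustration of the scene-dependent loss test in claims 6 and 7 (the threshold values and scene keys below are assumptions; the patent only requires the normal-scene threshold to be the smallest of the three):

```python
# Illustrative per-scene thresholds in seconds; turns and obstacles
# naturally break the camera's line of sight, so those scenes tolerate
# a longer gap before the guided object is declared lost.
LOSS_THRESHOLDS = {"normal": 2.0, "turning": 5.0, "obstacle": 8.0}

def is_guided_object_lost(undetected_duration, current_scene):
    """Claims 6-7: declare the guided object lost once it has gone
    undetected for at least the threshold of the current leading scene."""
    return undetected_duration >= LOSS_THRESHOLDS[current_scene]
```

With these example values, a three-second detection gap counts as a loss in the normal scene but not in the turning scene.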
8. The method of claim 1, wherein detecting whether the guided object exists in the surrounding environment comprises:
predicting the loss direction of the guided object according to the last frame image in which the guided object was monitored;
acquiring an image of the loss direction;
and detecting whether the guided object exists in the image of the loss direction.
9. The method of claim 8, wherein detecting whether the guided object exists in the surrounding environment further comprises:
if the guided object does not exist in the image of the loss direction, acquiring images of directions other than the loss direction;
and detecting whether the guided object exists in the images of the other directions.
10. The method according to claim 8, wherein before predicting the loss direction of the guided object according to the last frame image in which the guided object was monitored, the method further comprises:
acquiring the distance between the leading device and the guided object at the time the last frame image of the guided object was monitored;
and determining that the distance between the leading device and the guided object is less than a specified distance threshold.
11. The method of claim 10, further comprising:
if the distance between the leading device and the guided object is not less than the specified distance threshold, determining that the loss direction of the guided object is the rear of the leading device.
12. The method according to claim 8, wherein predicting the loss direction of the guided object according to the last frame image in which the guided object was monitored comprises:
detecting the center position of the guided object in the last frame image of the guided object monitored by a rear camera of the leading device;
if the center position of the guided object is located in the left half of the last frame image, determining that the loss direction of the guided object is the right rear of the leading device;
and if the center position of the guided object is located in the right half of the last frame image, determining that the loss direction of the guided object is the left rear of the leading device.
13. The method of claim 8, wherein acquiring the image of the loss direction comprises:
if the loss direction is the rear of the leading device, acquiring an image of the rear of the leading device within a specified time period through a rear camera of the leading device;
if the loss direction is the right rear of the leading device, controlling the leading device, or the component on which its rear camera is mounted, to rotate anticlockwise, and acquiring an image of the right rear of the leading device through the rear camera;
and if the loss direction is the left rear of the leading device, controlling the leading device, or the component on which its rear camera is mounted, to rotate clockwise, and acquiring an image of the left rear of the leading device through the rear camera.
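Claims 12 and 13 together define a simple mirror-image heuristic; a minimal sketch follows (the pixel convention, image width, and all names are assumptions, not taken from the patent):

```python
def predict_loss_direction(center_x, image_width):
    """Claim 12: a rear-facing camera mirrors the scene, so an object whose
    center was last seen in the left half of the frame is predicted to have
    been lost toward the right rear of the leading device, and vice versa."""
    return "right_rear" if center_x < image_width / 2 else "left_rear"

# Claim 13: each loss direction maps to a rotation of the device (or of the
# component carrying its rear camera) before capturing a new image.
ROTATION_FOR_DIRECTION = {
    "rear": None,                  # straight behind: no rotation needed
    "right_rear": "anticlockwise",
    "left_rear": "clockwise",
}
```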
14. The method of claim 1, wherein detecting whether the guided object exists in the en-route environment comprises:
detecting whether the guided object exists in the en-route environment through a front camera and/or a rear camera of the leading device.
15. A leading device control apparatus, characterized by comprising:
a monitoring module, configured to monitor a guided object and determine whether the guided object is lost;
a control module, configured to control the leading device to stop traveling and detect whether the guided object exists in the surrounding environment when the guided object is lost;
wherein the control module is further configured to control the leading device to travel to a preset destination and detect whether the guided object exists in the en-route environment when the guided object does not exist in the surrounding environment;
and the control module is further configured to, before controlling the leading device to stop traveling and detecting whether the guided object exists in the surrounding environment, acquire the distance between the leading device and the preset destination, and determine that the distance between the leading device and the preset destination is greater than a preset distance threshold; and, if the distance between the leading device and the preset destination is not greater than the preset distance threshold, control the leading device to travel to the preset destination.
16. The apparatus of claim 15, further comprising: a playing module;
the playing module is configured to control the leading device to play a first voice when the guided object is lost, wherein the first voice prompts the guided object to approach the leading device.
17. The apparatus of claim 15, further comprising: a playing module;
the playing module is configured to control the leading device to play a second voice when the guided object exists in the surrounding environment or in the en-route environment, wherein the second voice prompts the guided object to follow the leading device.
18. The apparatus of claim 15, further comprising: a playing module;
the playing module is configured to control the leading device to turn around and play a third voice when the leading device reaches the preset destination, wherein the third voice prompts the guided object that the preset destination has been reached.
19. The apparatus of claim 16, wherein the control module is further configured to determine, before detecting whether the guided object exists in the surrounding environment, that the guided object is not detected within a preset time period after the leading device plays the first voice.
20. The apparatus according to claim 15, wherein the monitoring module is specifically configured to:
detect the guided object through a rear camera of the leading device;
when the guided object is not detected, calculate the duration for which the guided object has not been detected, and acquire the current leading scene of the leading device;
and when the duration reaches a time length threshold corresponding to the current leading scene, determine that the guided object is lost.
21. The apparatus of claim 20, wherein the leading scene comprises: a normal leading scene, a turning leading scene and an obstacle leading scene;
and the time length threshold corresponding to the normal leading scene is smaller than the time length threshold corresponding to the turning leading scene and smaller than the time length threshold corresponding to the obstacle leading scene.
22. The apparatus of claim 15, wherein the control module is specifically configured to:
predict the loss direction of the guided object according to the last frame image in which the guided object was monitored;
acquire an image of the loss direction;
and detect whether the guided object exists in the image of the loss direction.
23. The apparatus of claim 22, wherein the control module is further configured to:
if the guided object does not exist in the image of the loss direction, acquire images of directions other than the loss direction;
and detect whether the guided object exists in the images of the other directions.
24. The apparatus according to claim 22, wherein the control module is further configured to, before predicting the loss direction of the guided object according to the last frame image in which the guided object was monitored:
acquire the distance between the leading device and the guided object at the time the last frame image of the guided object was monitored;
and determine that the distance between the leading device and the guided object is less than a specified distance threshold.
25. The apparatus of claim 24, wherein the control module is further configured to:
if the distance between the leading device and the guided object is not less than the specified distance threshold, determine that the loss direction of the guided object is the rear of the leading device.
26. The apparatus of claim 22, wherein the control module is specifically configured to:
detect the center position of the guided object in the last frame image of the guided object monitored by a rear camera of the leading device;
if the center position of the guided object is located in the left half of the last frame image, determine that the loss direction of the guided object is the right rear of the leading device;
and if the center position of the guided object is located in the right half of the last frame image, determine that the loss direction of the guided object is the left rear of the leading device.
27. The apparatus of claim 22, wherein the control module is specifically configured to:
if the loss direction is the rear of the leading device, acquire an image of the rear of the leading device within a specified time period through a rear camera of the leading device;
if the loss direction is the right rear of the leading device, control the leading device, or the component on which its rear camera is mounted, to rotate anticlockwise, and acquire an image of the right rear of the leading device through the rear camera;
and if the loss direction is the left rear of the leading device, control the leading device, or the component on which its rear camera is mounted, to rotate clockwise, and acquire an image of the left rear of the leading device through the rear camera.
28. The apparatus of claim 15, wherein the control module is specifically configured to:
detect whether the guided object exists in the en-route environment through a front camera and/or a rear camera of the leading device.
29. An electronic device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the leading device control method according to any one of claims 1 to 14.
30. A non-transitory computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the leading device control method according to any one of claims 1 to 14.
CN201711252444.4A 2017-12-01 2017-12-01 Method and device for controlling leading equipment Active CN108724172B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711252444.4A CN108724172B (en) 2017-12-01 2017-12-01 Method and device for controlling leading equipment


Publications (2)

Publication Number Publication Date
CN108724172A (en) 2018-11-02
CN108724172B (en) 2020-09-11

Family

ID=63940893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711252444.4A Active CN108724172B (en) 2017-12-01 2017-12-01 Method and device for controlling leading equipment

Country Status (1)

Country Link
CN (1) CN108724172B (en)

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
CN111202330A (en) * 2020-01-07 2020-05-29 灵动科技(北京)有限公司 Self-driven system and method
CN114237249A (en) * 2021-12-17 2022-03-25 北京云迹科技股份有限公司 Control method and device used in robot leading process and leading robot

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
WO2012020858A1 (en) * 2010-08-11 2012-02-16 (주) 퓨처로봇 Intelligent driving robot for providing customer service and calculation in restaurants
CN102096415B (en) * 2010-12-31 2012-09-26 重庆邮电大学 Multi-robot formation method based on Ad-Hoc network and leader-follower algorithm
CN102360423A (en) * 2011-10-19 2012-02-22 丁泉龙 Intelligent human body tracking method
CN104298240A (en) * 2014-10-22 2015-01-21 湖南格兰博智能科技有限责任公司 Guiding robot and control method thereof
CN106155093A (en) * 2016-07-22 2016-11-23 王威 A kind of robot based on computer vision follows the system and method for human body
CN106570478A (en) * 2016-11-04 2017-04-19 北京智能管家科技有限公司 Object loss determine method and device in visual tracking
CN107160392A (en) * 2017-05-26 2017-09-15 深圳市天益智网科技有限公司 Method, device and terminal device and robot that view-based access control model is positioned and followed
CN107248173A (en) * 2017-06-08 2017-10-13 深圳市智美达科技股份有限公司 Method for tracking target, device, computer equipment and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant