CN112596508B - Control method and device of sensor and storage medium - Google Patents


Info

Publication number
CN112596508B
CN112596508B (application CN201910806166.5A)
Authority
CN
China
Prior art keywords
environment
electronic equipment
map information
target area
sensor
Prior art date
Legal status
Active
Application number
CN201910806166.5A
Other languages
Chinese (zh)
Other versions
CN112596508A (en)
Inventor
陈远
孙淑萍
Current Assignee
Midea Robozone Technology Co Ltd
Original Assignee
Midea Robozone Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Midea Robozone Technology Co Ltd filed Critical Midea Robozone Technology Co Ltd
Priority to CN201910806166.5A (CN112596508B)
Priority to PCT/CN2019/111732 (WO2021035903A1)
Publication of CN112596508A
Application granted
Publication of CN112596508B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS > G05 CONTROLLING; REGULATING > G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/0242: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, using non-visible light signals, e.g. IR or UV signals
    • G05D 1/0246: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means, using a video camera in combination with image processing means
    • G05D 1/0263: Control of position or course in two dimensions, specially adapted to land vehicles, using magnetic or electromagnetic means, using magnetic strips
    • A HUMAN NECESSITIES > A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL > A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 11/24: Floor-sweeping machines, motor-driven
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/29: Geographical information databases

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a control method of a sensor. The method comprises: obtaining map information of the environment in which an electronic device is located; and, in response to a target area being marked in the map information of the environment in which the electronic device is located, controlling the electronic device to turn off its downward-looking sensor. The invention also discloses a control apparatus of the sensor and a storage medium.

Description

Control method and device of sensor and storage medium
Technical Field
The present invention relates to the field of household electrical appliance technologies, and in particular, to a method and an apparatus for controlling a sensor, and a storage medium.
Background
At present, a household sweeping robot is usually equipped with an infrared downward-looking sensor for detecting whether a step exists in the environment in which the robot is located; if a step is detected, the robot is controlled to change direction so as to avoid the step and prevent it from falling.
However, a downward-looking sensor based on the infrared detection principle can be falsely triggered by strong light, dark carpet, glossy floor tiles and similar conditions: a location that is not a step is mistakenly identified as a step, and the sweeping robot is steered away to avoid it. As a result, the cleaning effect is affected and the user experience is poor.
Disclosure of Invention
In view of the above, embodiments of the present invention are intended to provide a method and an apparatus for controlling a sensor, and a storage medium, which are used to solve the above problems in the prior art.
In order to achieve the above purpose, the technical solution of the embodiment of the present invention is realized as follows:
An embodiment of the invention provides a control method of a sensor, comprising the following steps:
obtaining map information of the environment in which the electronic device is located;
in response to a target area being marked in the map information of the environment in which the electronic device is located, controlling the electronic device to turn off a downward-looking sensor; the target area represents an area into which the electronic device could fall.
In the foregoing solution, the acquiring map information of an environment in which the electronic device is located includes:
receiving map information, sent by another electronic device, of the environment in which the electronic device is located;
or,
acquiring, by the electronic device through at least one acquisition unit, environment information of its surroundings, and establishing the map information of the environment based on the environment information.
In the above scheme, the method further comprises:
and receiving the position of the target area marked by the map information of the environment where the electronic equipment is located from the terminal.
In the above scheme, the method further comprises:
and controlling the electronic equipment to be in a first working mode, and determining a target area in the map information based on an area corresponding to a control instruction for adjusting a driving direction in the process of driving along the boundary of the environment where the electronic equipment is located in the first working mode.
In the above scheme, the method further comprises:
detecting a target area identification mark in the environment where the electronic equipment is located to obtain a detection result;
and determining a target area in the map information of the environment where the electronic equipment is located based on the detection result.
In the foregoing solution, the map information of the environment where the electronic device is located sent by the other electronic device further includes: a location of at least one target area.
An embodiment of the present invention further provides a control device of a sensor, the device comprising an acquisition module and a control module; wherein:
the acquisition module is used for acquiring map information of the environment where the electronic equipment is located;
the control module is configured to control the electronic device to turn off the downward-looking sensor in response to a target area being marked in the map information of the environment in which the electronic device is located; the target area represents an area into which the electronic device could fall.
In the above scheme, the obtaining module is specifically configured to receive map information of an environment where the electronic device is located, which is sent by another electronic device; or the electronic equipment acquires the environment information of the electronic equipment through at least one acquisition unit and establishes map information of the environment of the electronic equipment based on the environment information.
In the above solution, the apparatus further includes a receiving module, configured to receive a location of the target area marked by the map information for an environment where the electronic device is located, where the location is sent by a terminal.
In the foregoing solution, the apparatus further includes a first determining module, configured to control the electronic device to be in a first operating mode, and determine, based on an area corresponding to a control instruction for adjusting a driving direction, a target area in the map information during driving along a boundary of an environment where the electronic device is located in the first operating mode.
In the above scheme, the apparatus further includes a second determining module, configured to detect a target area identification mark in an environment where the electronic device is located, and obtain a detection result; and determining a target area in the map information of the environment where the electronic equipment is located based on the detection result.
The embodiment of the present invention further provides a storage medium, on which an executable program is stored, and is characterized in that the executable program implements the steps of the above technical solution when being executed by a processor.
The embodiment of the invention also provides a control device of the sensor, which comprises a memory, a processor and an executable program which is stored on the memory and can be run by the processor, and is characterized in that the steps in the technical scheme are executed when the processor runs the executable program.
According to the control method and apparatus of the sensor and the storage medium provided by the embodiments of the present invention, after the electronic device enters the target environment, the map information of the environment in which it is located is acquired. If a target area is marked in that map information, the electronic device is controlled to turn off the downward-looking sensor. Because the electronic device no longer needs to use the downward-looking sensor to detect in real time, while traveling, whether the area ahead is a target area, misjudgments are effectively avoided, detours caused by sensor misjudgments are avoided, the problem of incomplete cleaning caused by such detours is avoided, and the user experience is improved.
Drawings
FIG. 1 is a schematic flow chart illustrating a method for controlling a sensor according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of controlling a sensor according to an embodiment of the present invention;
FIG. 3 is a first schematic view of a control method of a sensor according to an embodiment of the present invention;
FIG. 4 is a second schematic view of a control method of a sensor according to an embodiment of the present invention;
FIG. 5 is a first schematic structural diagram of a control device of a sensor according to an embodiment of the present invention;
FIG. 6 is a second schematic structural diagram of a control device of a sensor according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a hardware configuration of a control device of a sensor according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Embodiment I
In this embodiment of the present invention, an implementation flow of the control method of the sensor is shown in Fig. 1 and includes the following steps:
Step 101: obtaining map information of the environment in which the electronic device is located;
Step 102: in response to a target area being marked in the map information of the environment in which the electronic device is located, controlling the electronic device to turn off a downward-looking sensor; the target area represents an area into which the electronic device could fall.
In an embodiment of the present invention, the electronic device is a household appliance with a cleaning function, such as a sweeping robot.
Before cleaning the environment, the electronic device needs to acquire map information of the environment to determine a driving route of the electronic device.
Here, in step 101, obtaining the map information of the environment in which the electronic device is located includes: receiving map information, sent by another electronic device, of the environment in which the electronic device is located; or acquiring, by the electronic device through at least one acquisition unit, environment information of its surroundings and establishing the map information of the environment based on that information. Here, the other electronic device is an intelligent electronic device in the same environment that has an image acquisition or camera function, for example a security camera, a computer equipped with a camera, and the like.
Specifically, the other electronic devices in the environment upload the map information of that environment to a cloud server, and when the electronic device enters the environment for the first time it downloads the map information directly from the cloud.
Alternatively, the electronic device acquires environment information through at least one of its own acquisition units, such as a camera or a laser scanning device, and establishes the map information of the environment based on that information. This approach can be used when the electronic device enters the environment for the first time and the map information cannot be downloaded directly from the cloud.
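As an illustration only, the sketch below follows the acquisition order just described (try the cloud copy first, otherwise build the map from the device's own acquisition units). The names CloudClient-style download_map, collect_environment_info and build_map_from_sensors are assumptions for this sketch and do not appear in the disclosure.

```python
# Hypothetical sketch of the map-acquisition fallback described above.
# All class, method and parameter names are assumed for illustration.

def acquire_environment_map(cloud_client, robot):
    """Return the environment map: prefer the cloud copy, else build it locally."""
    env_map = cloud_client.download_map(environment_id=robot.environment_id)
    if env_map is not None:
        # Another electronic device already uploaded this environment's map.
        return env_map

    # First visit with no cloud copy: build the map from the robot's own
    # acquisition units (camera, laser scanning device, ...).
    scan_data = robot.collect_environment_info(units=("camera", "laser"))
    return robot.build_map_from_sensors(scan_data)
```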
In addition, when the camera of the electronic device detects that the environment has changed, the electronic device re-activates the downward-looking sensor, acquires the current environment information with at least one of its acquisition units, and rebuilds the map information of the environment based on that information, ensuring that the map matches the current environment.
Alternatively, when the camera detects that the environment has changed, the electronic device re-acquires the map information from other electronic devices in the current environment, so that the stored map information is consistent with the current environment.
In the solution provided in this embodiment, the target area is also marked in the map information of the environment in which the electronic device is located. Specifically, the target area in the map information can be determined in the following ways.
First, the electronic device receives, from a terminal, the position of the target area marked in the map information of the environment in which it is located.
Specifically, after acquiring the map information of the environment in which the electronic device is located, the terminal marks a target area in the map information and sends the marked map to the electronic device; or the terminal marks the position of the target area in the map information and sends only that position information to the electronic device. The terminal may be a smartphone; for example, the map information of the environment may be obtained or synchronized by a smartphone through an application (APP).
For example, the electronic device acquires the map information of the environment with a vision sensor and sends it to the smartphone; the smartphone marks a target area in the map information through the APP and returns the marked map to the electronic device, or sends only the marked position of the target area.
In addition, the electronic device can determine the map information of the environment with a laser ranging sensor and send it to the smartphone; the smartphone marks a target area in the map information through the APP and returns either the marked map or only the marked position of the target area to the electronic device.
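A minimal sketch of how the electronic device might handle the two message forms described above (a full map with target areas marked, or only the marked positions) is given below; the JSON message layout and all field names are assumptions, not part of the patent.

```python
# Hypothetical handling of target-area messages coming from the terminal APP.
# The message types and field names are illustrative assumptions.
import json

def handle_terminal_message(robot, payload: str) -> None:
    msg = json.loads(payload)
    if msg.get("type") == "map_with_targets":
        # The terminal returned the whole map with target areas already marked.
        robot.replace_map(msg["map"])
    elif msg.get("type") == "target_positions":
        # The terminal sent only the marked positions; merge them into the stored map.
        for area in msg["areas"]:
            robot.map.mark_target_area(area["x"], area["y"], area["w"], area["h"])
```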
Second, the electronic device is controlled to be in a first operating mode, and the target area in the map information is determined based on the area corresponding to a control instruction for adjusting the travel direction while the device travels along the boundary of its environment in the first operating mode. Here, the first operating mode may be an edge-following (wall-following) mode.
The following two ways of determining the control command for adjusting the driving direction are available: in the first mode, a control command for adjusting the traveling direction is received from another device. Here, the other device may be a smart terminal, a remote control, or the like. In a second mode, the electronic device itself generates a control command for adjusting the traveling direction.
Next, a mode of receiving a control command for adjusting the traveling direction from another device will be described. Specifically, the electronic device runs at the boundary of the environment in the first working mode, and receives a control instruction for adjusting the running direction sent by other devices when the electronic device runs to the target area. Then, according to the control instruction for adjusting the driving direction, the target area in the map information can be determined.
Next, the way in which the electronic device itself generates the control instruction for adjusting the travel direction is described. In the first operating mode, the sensors on the sides of the electronic device are active: they emit signals to both sides of the device's position and obtain feedback, and they send the received feedback to the control unit of the electronic device, which thereby determines the boundary of the environment. When no feedback is received, the control unit activates the downward-looking sensor; as the device enters the area from which no feedback was received, the downward-looking sensor sends the acquired information to the control unit, which determines that this area is the target area, generates a control instruction for adjusting the travel direction, and adjusts the travel direction of the electronic device.
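The sketch below illustrates this edge-following pass under the assumption that every direction-adjusting control instruction corresponds to the area the device was about to enter; method names such as area_ahead and mark_target_area are hypothetical.

```python
# Illustrative edge-following (first operating mode) pass: each control
# instruction that adjusts the travel direction marks the corresponding area
# as a target (drop-off) area in the map. Names are assumed.

def follow_boundary_and_mark(robot, env_map):
    robot.set_mode("edge_following")
    while not robot.boundary_loop_closed():
        command = robot.next_drive_command()  # from a remote control or self-generated
        if command.kind == "adjust_direction":
            env_map.mark_target_area(robot.area_ahead())
        robot.execute(command)
    return env_map
```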
Third, the electronic device detects a target-area identification mark in its environment to obtain a detection result, and determines the target area in the map information of the environment based on the detection result.
Here, the target-area identification mark is a mark by which the electronic device identifies a target area in the environment, for example a virtual strip placed in the target area. Specifically, in the environment in which the electronic device is located, a virtual strip is attached to the target area. The virtual strip may be a laser strip, an infrared strip or a magnetic strip; for example, the infrared strip may be an infrared generator composed of infrared diodes, and further variants are not listed exhaustively. When the electronic device enters the environment for the first time, it activates the sensor used to detect the virtual strip and scans the environment; when a virtual strip is detected, the area in which the strip is located is determined as the target area in the map information. For example, after entering the environment for the first time, the electronic device turns on a Hall sensor and scans the environment; when a magnetic strip is detected, the area in which the magnetic strip is located is determined as the target area in the map information.
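As a sketch only, a first-entry scan for a magnetic virtual strip read by a Hall sensor (the example given above) could look as follows; the sensor and map interfaces are assumed names.

```python
# Hypothetical first-entry scan for virtual strips (here a magnetic strip
# detected by a Hall sensor). All method names are assumptions.

def scan_for_virtual_strips(robot, env_map):
    robot.hall_sensor.enable()
    for pose in robot.coverage_scan():              # drive through the environment once
        if robot.hall_sensor.detects_magnetic_strip():
            env_map.mark_target_area(robot.area_at(pose))
    robot.hall_sensor.disable()
    return env_map
```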
Fourth, the electronic device directly receives, from another electronic device, map information in which the target area is already marked.
In addition, if the map information of the environment where the electronic equipment is located is changed, the target area in the map information is determined again according to the updated map information. The manner of updating the map information may adopt any one of the foregoing manners, and a description thereof will not be repeated.
In step 102, in response to the map information of the environment where the electronic device is located being marked with the target area, the electronic device is controlled to turn off the downward-looking sensor. In addition, if there is no target area in the environment in which the electronic device is located that would cause the electronic device to fall, activation or deactivation of a downward-looking sensor of the electronic device may not be limited.
In this embodiment, the electronic device acquires the map information of the environment in which it is located, and when a target area is marked in that map information, the electronic device turns off the downward-looking sensor while traveling in the environment. The electronic device therefore no longer needs to use the downward-looking sensor to detect and judge whether the area ahead is a target area, so misjudgment of the target area does not occur. The problem of incomplete cleaning caused by misjudgment is thus avoided, and the user experience is improved.
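A minimal sketch of the overall decision in steps 101-102 follows, assuming the map object exposes a query for marked target areas; the method names are illustrative and not the actual implementation.

```python
# Minimal sketch of steps 101-102: acquire the map and, if any target area is
# marked in it, travel with the downward-looking sensor turned off.
# Method and attribute names are assumptions.

def control_downward_sensor(robot):
    env_map = robot.acquire_environment_map()       # step 101: cloud copy or local build
    if env_map.has_target_areas():                  # step 102
        robot.downward_sensor.turn_off()
        robot.plan_route(env_map, avoid=env_map.target_areas())
    # If the environment has no drop-off areas, the sensor state is left unrestricted.
```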
Embodiment II
In this embodiment of the present invention, the control process of a sensor of a sweeping robot is described in detail. As shown in Fig. 2, the method includes the following steps:
step 201: obtaining map information of the environment where the sweeping robot is located;
here, the sweeping robot may be in an environment requiring daily sweeping, such as a home, an office, or the like. The method for acquiring the map information of the environment where the sweeping robot is located comprises the following two modes.
In a first mode, the sweeping robot receives map information, sent by another electronic device, of the environment in which it is located.
Here, the other electronic devices are intelligent electronic devices in the same environment that have an image acquisition or camera function, for example a security camera, a computer with a camera, and the like. Specifically, after entering the environment for the first time, the sweeping robot can download from the cloud the map information of the environment acquired by those other electronic devices.
In a second mode, the sweeping robot acquires its own environment information through at least one acquisition unit and establishes the map information of the environment based on that information. Specifically, the robot collects environment information with acquisition units such as its camera or a laser scanning device, and then draws the map of the environment from the collected information. When the sweeping robot enters the environment for the first time and cannot download the map information directly from the cloud, it can proceed in this way.
In addition, when the camera of the sweeping robot detects that the environment has changed, the robot re-activates the downward-looking sensor, acquires the current environment information with at least one of its acquisition units, and rebuilds the map information of the environment based on that information, ensuring that the map matches the current environment.
Alternatively, when the camera detects that the environment has changed, the sweeping robot re-acquires the map information from other electronic devices in the current environment, so that the stored map information is consistent with the current environment.
Step 202: determining a target area in the map information;
here, the target area in the map information is determined in the following ways. The target area is used for representing an area where the sweeping robot can fall.
In a first way, the sweeping robot receives, from a terminal, the position of the target area marked in the map information of the environment in which the robot is located. Specifically, after obtaining the map information of the environment, the terminal marks a target area in the map information and sends the marked map to the sweeping robot; or the terminal marks the position of the target area in the map information and sends only that position information to the sweeping robot.
For example, the sweeping robot acquires the map information of the environment with a vision sensor and sends it to the terminal; the terminal marks a target area in the map information and returns the marked map to the robot, or sends only the marked position of the target area.
Likewise, the sweeping robot can determine the map information of the environment with a laser ranging sensor and send it to the terminal, which marks a target area and returns either the marked map or only the marked position of the target area.
In a second way, the sweeping robot is controlled to be in a first operating mode, and the target area in the map information is determined based on the area corresponding to a control instruction for adjusting the travel direction while the robot travels along the boundary of its environment in the first operating mode.
The following two ways of determining the control command for adjusting the driving direction are available: in the first mode, a control command for adjusting the traveling direction is received from another device. Here, the other device may be a smart terminal, a remote control, or the like. In the second mode, the sweeping robot generates a control instruction for adjusting the driving direction.
Next, the two ways of determining the control instruction for adjusting the travel direction are described in detail. For the first way, as shown in Fig. 3, the sweeping robot is in the edge-following mode and travels along the boundary of its environment; when it reaches a step area, it receives a control instruction for adjusting the travel direction and travels along direction CA instead of direction AB. According to that control instruction, the step area is determined as the target area in the map information.
For the second way, in which the sweeping robot itself generates the control instruction, the sensors on the sides of the robot are active in the edge-following mode: they emit signals to both sides of the robot's position and receive feedback, which they pass to the control unit, and the control unit determines that the robot is on the boundary of the environment. When no feedback is received, the control unit activates the downward-looking sensor; as the robot enters the area from which no feedback was received, the downward-looking sensor sends the acquired information to the control unit, which determines that this area is a step area in the environment, generates a control instruction for adjusting the travel direction, and adjusts the robot's travel direction.
In a third way, the sweeping robot detects a target-area identification mark in its environment to obtain a detection result, and determines the target area in the map information based on that result. As shown in Fig. 4, a virtual strip may be attached to the step area in the robot's environment. The virtual strip may be a laser strip, an infrared strip or a magnetic strip; for example, the infrared strip may be an infrared generator composed of infrared diodes, and further variants are not listed exhaustively. When the sweeping robot enters the environment for the first time, it activates the sensor used to detect the virtual strip and scans the environment; when a virtual strip is detected, the area in which the strip is located is determined as the target area in the map information. For example, after entering the environment for the first time, the sweeping robot turns on a Hall sensor and scans the environment; when a magnetic strip is detected, the area in which the magnetic strip is located is determined as the target area in the map information.
In a fourth way, the sweeping robot directly receives, from another electronic device, map information in which the target area is already marked. In addition, if the map information of the environment changes, the target area in the map information is determined again from the updated map information. Any of the foregoing ways may be used to update the map information, and the description is not repeated here.
Step 203: and controlling the sweeping robot to close the downward-looking sensor in response to the fact that the target area is marked in the map information of the environment where the sweeping robot is located.
Specifically, when the target area is marked in the map information of the environment where the sweeping robot is located, the sweeping robot is controlled to close the downward-looking sensor of the sweeping robot and drive in the environment where the sweeping robot is located.
In addition, if there is no target area in the environment where the sweeping robot is located where the sweeping robot would fall, activation or deactivation of the downward-looking sensor of the sweeping robot may not be limited.
Further, while the downward-looking sensor is turned off, the sweeping robot detects whether it has been lifted off the ground by using a switch sensor or an infrared sensor arranged on the connecting arm of a driving wheel.
Next, the principle by which the sweeping robot detects, using the switch sensor, that it has left the ground is described. The switch sensor automatically switches to the on state when a target object approaches it, and to the off state when the object moves away. When the sweeping robot is lifted off the ground, the driving wheel moves relative to the chassis, so the distance between the driving wheel and the switch sensor increases and the switch sensor changes from the on state to the off state. Therefore, when the switch sensor changes from the on state to the off state, it can be determined that the sweeping robot has left the ground.
Similarly, the infrared sensor arranged on the connecting arm of the driving wheel can be used to determine that the sweeping robot has left the ground: when the robot is lifted, the distance between the driving wheel and the infrared sensor increases, from which it is determined that the robot has left the ground.
Besides these two methods, the sweeping robot can also detect that it has been lifted off the ground through its gravity sensor or through a pressure sensor on a driving wheel. The gravity sensor judges whether the robot is off the ground by detecting a change in the robot's center of gravity; the pressure sensor on the driving wheel judges this by checking whether a pressure value is present. Specifically, when the center of gravity of the sweeping robot rises, it is determined that the robot is off the ground; when the pressure value on the driving wheel becomes zero, it is determined that the sweeping robot has been lifted off the ground.
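The four lift-off checks described above could be combined as in the sketch below; the sensor attributes and the infrared threshold are assumptions for illustration.

```python
# Illustrative combination of the lift-off checks described above: wheel switch
# sensor, wheel-arm infrared sensor, gravity (center-of-gravity) sensor and
# wheel pressure sensor. Attribute names and the IR threshold are assumed.

def is_lifted_off_ground(robot) -> bool:
    if robot.wheel_switch_sensor.state == "off":            # wheel moved away from the switch
        return True
    if robot.wheel_ir_sensor.distance() > robot.wheel_ir_threshold:
        return True
    if robot.gravity_sensor.center_of_gravity_rose():       # center of gravity became higher
        return True
    if robot.wheel_pressure_sensor.value() == 0:            # no load on the driving wheel
        return True
    return False
```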
When it is determined that the sweeping robot is off the ground, the robot re-activates the downward-looking sensor and re-acquires the map information of the environment. If the newly acquired map information has not changed, the downward-looking sensor is turned off again.
If the re-acquired map information of the environment has changed, the foregoing steps 201 to 203 are executed again. Alternatively, a reminder message may be shown to ask the user whether to turn off the downward-looking sensor.
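A sketch of the reaction to a detected lift-off follows, under the assumption that the stored and re-acquired maps can be compared directly; all names are illustrative.

```python
# Hypothetical reaction to a lift-off event: re-enable the downward-looking
# sensor, re-acquire the map, and either turn the sensor off again or redo
# steps 201-203 / ask the user. Names are assumptions.

def handle_lift_off(robot, stored_map):
    robot.downward_sensor.turn_on()
    new_map = robot.acquire_environment_map()
    if new_map == stored_map:
        robot.downward_sensor.turn_off()                # environment unchanged
        return stored_map
    robot.notify_user("Environment changed: turn the downward-looking sensor off again?")
    return robot.redo_mapping_and_marking(new_map)      # steps 201-203 again
```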
In this embodiment, the map information of the environment in which the sweeping robot is located is obtained, and if a target area is marked in the map information, the sweeping robot is controlled to turn off the downward-looking sensor. The downward-looking sensor therefore no longer needs to be used to detect and judge whether the area ahead is the target area, so misjudgment of the target area is avoided. The sweeping robot no longer misjudges while traveling, and the cleaning effect in the environment is improved.
Embodiment III
In order to implement the above control method of the sensor, an embodiment of the present invention further provides a control device of a sensor, whose structural composition is shown in Fig. 5. The control device includes: an acquisition module 51 and a control module 52; wherein:
the obtaining module 51 is configured to obtain map information of an environment where the electronic device is located;
the control module 52 is configured to control the electronic device to turn off the downward-looking sensor in response to a target area being marked in the map information of the environment in which the electronic device is located; the target area represents an area into which the electronic device could fall.
Here, the obtaining module 51 is specifically configured to receive map information of an environment where the electronic device is located, which is sent by another electronic device; or the electronic equipment acquires the environment information of the electronic equipment through at least one acquisition unit and establishes map information of the environment of the electronic equipment based on the environment information. The electronic device is a household appliance with a cleaning function, such as a sweeping robot. The other electronic equipment is intelligent electronic equipment with an image acquisition function or a camera shooting function in the environment where the electronic equipment is located. For example, security cameras, computers equipped with cameras, etc.
Specifically, in the environment where the electronic device is located, the other electronic devices upload the map information of the located environment to the cloud, and when the electronic device first enters the located environment, the obtaining module 51 of the electronic device directly downloads the map information of the located environment from the cloud.
Or, the obtaining module 51 obtains the environment information of the electronic device through at least one collecting unit of the electronic device, and establishes map information of the environment of the electronic device based on the environment information. Wherein, the acquisition unit can be a camera device, a laser scanning device and the like. The processing can be performed in this way when the electronic device enters the environment for the first time and the map information cannot be directly downloaded from the cloud.
Here, the apparatus further includes a receiving module 53, configured to receive, from a terminal, the position of the target area marked in the map information of the environment in which the electronic device is located.
Specifically, after acquiring map information of an environment where the electronic device is located, the terminal marks a target area in the map information, and sends the map information marked with the target area to the receiving module 53; or after acquiring the map information of the environment where the electronic device is located, the terminal marks the position of the target area in the map information, and sends the marked position information of the target area to the receiving module 53.
Further, the apparatus further includes a first determining module 54, configured to control the electronic device to be in a first operating mode, and determine a target area in the map information based on an area corresponding to a control instruction for adjusting a driving direction during driving along a boundary of an environment where the electronic device is located in the first operating mode.
Further, the apparatus further includes a second determining module 55, configured to detect a target area identification mark in an environment where the electronic device is located, so as to obtain a detection result; and determining a target area in the map information of the environment where the electronic equipment is located based on the detection result.
Further, the structural composition of the control device of the sensor in Fig. 5 can be extended to the structure shown in Fig. 6, which specifically includes: an acquisition module 51, a control module 52, a receiving module 53, a first determining module 54 and a second determining module 55.
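Purely as an illustration of the module composition in Fig. 6, the modules 51 to 55 could be wired together as below; the class and method names are assumptions and not the claimed implementation.

```python
# Illustrative composition of the apparatus of Fig. 6; the module objects and
# their method names are assumed for this sketch.

class SensorControlApparatus:
    def __init__(self, acquisition, control, receiving, first_determining, second_determining):
        self.acquisition = acquisition                  # module 51: obtains map information
        self.control = control                          # module 52: turns the downward-looking sensor off
        self.receiving = receiving                      # module 53: target positions from the terminal
        self.first_determining = first_determining      # module 54: edge-following marking
        self.second_determining = second_determining    # module 55: identification-mark detection

    def run(self, electronic_device):
        env_map = self.acquisition.obtain_map(electronic_device)
        if env_map.has_target_areas():
            self.control.turn_off_downward_sensor(electronic_device)
```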
In addition, the specific implementation process of this embodiment has been explained in detail in the foregoing technical solutions, and is not described herein again.
In practical applications, the obtaining module 51, the control module 52, the receiving module 53, the first determining module 54 and the second determining module 55 may be implemented by a Central Processing Unit (CPU), a microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like located in a server or a terminal.
It should be noted that: in the control device of the sensor provided in the above embodiment, only the division of the program modules is exemplified when performing the sensor control, and in practical applications, the processing may be distributed to different program modules according to needs, that is, the internal structure of the device may be divided into different program modules to complete all or part of the processing described above. In addition, the control device of the sensor and the control method of the sensor provided in the above embodiments belong to the same concept, and the specific implementation process thereof is described in the method embodiments, and is not described herein again.
In order to implement the method, an embodiment of the present invention further provides another sensor control apparatus, where the apparatus includes a memory, a processor, and an executable program stored on the memory and capable of being executed by the processor, and when the processor executes the executable program, the processor performs the following operations:
obtaining map information of an environment where the electronic equipment is located;
responding to the map information of the environment where the electronic equipment is located and marking a target area, and controlling the electronic equipment to close a downward-looking sensor; the target area is used for representing an area where the electronic equipment can be dropped.
The processor is further configured to, when running the executable program, perform the following:
receiving map information of the environment where the electronic equipment is located, which is sent by other electronic equipment;
or,
the electronic equipment acquires the environmental information of the electronic equipment through at least one acquisition unit and establishes map information of the environment of the electronic equipment based on the environmental information.
The processor is further configured to, when running the executable program, perform the following:
and receiving the position of the target area marked by the map information of the environment where the electronic equipment is located from the terminal.
The processor is further configured to, when running the executable program, perform the following:
and controlling the electronic equipment to be in a first working mode, and determining a target area in the map information based on an area corresponding to a control instruction for adjusting a driving direction in the process of driving along the boundary of the environment where the electronic equipment is located in the first working mode.
The processor is further configured to, when running the executable program, perform the following:
detecting a target area identification mark in the environment where the electronic equipment is located to obtain a detection result;
and determining a target area in the map information of the environment where the electronic equipment is located based on the detection result.
The processor is further configured to, when running the executable program, perform the following:
the map information of the environment where the electronic device is located sent by the other electronic device further includes: a location of at least one target area. The hardware configuration of the sensor control device will be further described below, taking as an example that the sensor control device is implemented as a server or a terminal for controlling the sensor.
Fig. 7 is a schematic diagram showing a hardware configuration of a control apparatus for a sensor according to an embodiment of the present invention. The control apparatus 700 for a sensor shown in Fig. 7 includes: at least one processor 701, a memory 702, a user interface 703, and at least one network interface 704. The various components in the control apparatus 700 are coupled together by a bus system 705. It is understood that the bus system 705 enables communication among these components. In addition to a data bus, the bus system 705 includes a power bus, a control bus and a status-signal bus, but for clarity of illustration the various buses are labeled in Fig. 7 as the bus system 705.
The user interface 703 may include, among other things, a display, a keyboard, a mouse, a trackball, a click wheel, a key, a button, a touch pad, or a touch screen.
It will be appreciated that the memory 702 can be either volatile memory or nonvolatile memory, and can include both volatile and nonvolatile memory.
The memory 702 in the embodiment of the present invention is used to store various types of data to support the operation of the control device 700 of the sensor. Examples of such data include: any computer program for operating on the control device 700 of the sensor, such as executable program 7021, a program implementing a method according to an embodiment of the invention may be included in executable program 7021.
The method disclosed in the above embodiments of the present invention may be applied to the processor 701, or implemented by the processor 701. The processor 701 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be implemented by integrated logic circuits of hardware or instructions in the form of software in the processor 701. The processor 701 described above may be a general purpose processor, a DSP, or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. The processor 701 may implement or perform the methods, steps, and logic blocks disclosed in embodiments of the present invention. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed by the embodiment of the invention can be directly implemented by a hardware decoding processor, or can be implemented by combining hardware and software modules in the decoding processor. The software modules may be located in a storage medium located in the memory 702, and the processor 701 may read the information in the memory 702 and perform the steps of the aforementioned methods in conjunction with its hardware.
In an exemplary embodiment, an embodiment of the present invention further provides a storage medium having an executable program stored thereon, which when executed by a processor, performs the foregoing method.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or executable program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of an executable program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention has been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and executable program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by executable program instructions. These executable program instructions may be provided to a general-purpose computer, a special-purpose computer, an embedded processor, or the processor of another programmable data processing apparatus to produce a machine, such that the instructions, which execute via the computer or the processor of the other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These executable program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These executable program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (11)

1. A method of controlling a sensor, the method comprising:
obtaining map information of an environment where the electronic equipment is located;
in response to a target area being marked in the map information of the environment where the electronic equipment is located, controlling the electronic equipment to close a downward-looking sensor; the target area is used for representing an area where the electronic equipment can fall;
the method further comprises the following steps:
detecting a target area identification mark in the environment where the electronic equipment is located to obtain a detection result;
and determining a target area in the map information of the environment where the electronic equipment is located based on the detection result.
2. The method of claim 1, wherein the obtaining map information of an environment in which the electronic device is located comprises:
receiving map information of the environment where the electronic equipment is located, which is sent by other electronic equipment;
or,
the electronic equipment acquires the environmental information of the electronic equipment through at least one acquisition unit and establishes map information of the environment of the electronic equipment based on the environmental information.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
and receiving the position of the target area marked by the map information of the environment where the electronic equipment is located from the terminal.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
and controlling the electronic equipment to be in a first working mode, and determining a target area in the map information based on an area corresponding to a control instruction for adjusting a driving direction in the process of driving along the boundary of the environment where the electronic equipment is located in the first working mode.
5. The method according to claim 2, wherein the map information of the environment where the electronic device is located sent from the other electronic device further includes: a location of at least one target area.
6. A control device for a sensor, the device comprising: an acquisition module and a control module; wherein:
the acquisition module is used for acquiring map information of the environment where the electronic equipment is located;
the control module is used for responding to a target area marked in the map information of the environment where the electronic equipment is located and controlling the electronic equipment to close the downward-looking sensor; the target area is used for representing an area where the electronic equipment can fall;
the device also comprises a second determining module, a first determining module and a second determining module, wherein the second determining module is used for detecting the target area identification mark in the environment where the electronic equipment is located to obtain a detection result; and determining a target area in the map information of the environment where the electronic equipment is located based on the detection result.
7. The apparatus according to claim 6, wherein the obtaining module is specifically configured to receive map information of an environment where the electronic device is located, which is sent by another electronic device; or the electronic equipment acquires the environment information of the electronic equipment through at least one acquisition unit and establishes map information of the environment of the electronic equipment based on the environment information.
8. The apparatus according to claim 6 or 7, further comprising a receiving module configured to receive a location of the target area marked by the map information about the environment where the electronic device is located from a terminal.
9. The apparatus according to claim 6 or 7, further comprising a first determining module, configured to control the electronic device to be in a first operating mode, and determine the target area in the map information based on an area corresponding to a control instruction for adjusting a driving direction during driving along a boundary of an environment in which the electronic device is located in the first operating mode.
10. A storage medium having stored thereon an executable program, the executable program when executed by a processor implementing the steps of the method of any one of claims 1 to 5.
11. A control device for a sensor, comprising a memory, a processor and an executable program stored on the memory and executable by the processor, characterized in that the steps of the method according to any one of claims 1 to 5 are performed when the executable program is executed by the processor.
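For illustration only, the following is a minimal Python sketch of one possible reading of the claims above: a grid map whose cells can be marked as target (fall-risk) areas, a controller that turns the downward-looking sensor off when the device is inside a marked area (claim 6), and a helper that marks areas from direction-adjust control instructions issued while driving along the environment boundary in the first working mode (claims 4 and 9). Every name used here (GridMap, DownLookingSensor, SensorController, learn_targets_by_boundary_run) is a hypothetical assumption for this sketch rather than terminology from the patent, and re-enabling the sensor outside target areas is an added assumption beyond what the claims recite.

from dataclasses import dataclass, field
from typing import List, Set, Tuple

Cell = Tuple[int, int]  # (x, y) grid coordinate in the environment map

@dataclass
class GridMap:
    # Map information of the environment; cells marked as target (fall-risk) areas.
    target_cells: Set[Cell] = field(default_factory=set)

    def mark_target(self, cell: Cell) -> None:
        self.target_cells.add(cell)

    def is_target(self, cell: Cell) -> bool:
        return cell in self.target_cells

@dataclass
class DownLookingSensor:
    enabled: bool = True

    def turn_off(self) -> None:
        self.enabled = False

    def turn_on(self) -> None:
        self.enabled = True

class SensorController:
    # Turns the downward-looking sensor off while the device is inside a marked target area.
    def __init__(self, env_map: GridMap, sensor: DownLookingSensor) -> None:
        self.env_map = env_map
        self.sensor = sensor

    def on_position_update(self, cell: Cell) -> None:
        if self.env_map.is_target(cell):
            # Claim 6 style behaviour: turn off the downward-looking sensor
            # in response to a marked target area.
            self.sensor.turn_off()
        else:
            # Re-enabling outside target areas is an assumption, not recited in the claims.
            self.sensor.turn_on()

def learn_targets_by_boundary_run(env_map: GridMap,
                                  trajectory: List[Cell],
                                  turn_commands: List[bool]) -> None:
    # Claim 4 / claim 9 style behaviour (first working mode): while driving along the
    # environment boundary, mark the cells where a direction-adjust control instruction
    # was issued as target areas in the map information.
    for cell, turned in zip(trajectory, turn_commands):
        if turned:
            env_map.mark_target(cell)

# Usage example (all values illustrative)
env_map = GridMap()
learn_targets_by_boundary_run(env_map,
                              trajectory=[(0, 0), (0, 1), (0, 2)],
                              turn_commands=[False, True, False])
controller = SensorController(env_map, DownLookingSensor())
controller.on_position_update((0, 1))  # inside a marked area -> sensor turned off
controller.on_position_update((0, 2))  # outside -> sensor re-enabled (assumption)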
CN201910806166.5A 2019-08-29 2019-08-29 Control method and device of sensor and storage medium Active CN112596508B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910806166.5A CN112596508B (en) 2019-08-29 2019-08-29 Control method and device of sensor and storage medium
PCT/CN2019/111732 WO2021035903A1 (en) 2019-08-29 2019-10-17 Control method and apparatus for sensor and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910806166.5A CN112596508B (en) 2019-08-29 2019-08-29 Control method and device of sensor and storage medium

Publications (2)

Publication Number Publication Date
CN112596508A CN112596508A (en) 2021-04-02
CN112596508B true CN112596508B (en) 2022-04-12

Family ID: 74684966

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910806166.5A Active CN112596508B (en) 2019-08-29 2019-08-29 Control method and device of sensor and storage medium

Country Status (2)

Country Link
CN (1) CN112596508B (en)
WO (1) WO2021035903A1 (en)

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100926783B1 (en) * 2008-02-15 2009-11-13 한국과학기술연구원 Method for self-localization of a robot based on object recognition and environment information around the recognized object
KR102020210B1 (en) * 2013-04-11 2019-11-05 삼성전자주식회사 Sensor module and robot cleaner having the same
US9410979B2 (en) * 2014-09-23 2016-08-09 Fitbit, Inc. Hybrid angular motion sensors
CN104501943B (en) * 2014-12-05 2016-10-26 广东美的制冷设备有限公司 The self-checking unit of light sensor and self checking method in air-conditioner and air-conditioner
CN106289359B (en) * 2015-05-19 2019-12-20 科沃斯机器人股份有限公司 Downward-looking sensor detection system and method and self-moving processing system and method thereof
SG11201804933SA (en) * 2015-12-16 2018-07-30 Mbl Ltd Robotic kitchen including a robot, a storage arrangement and containers therefor
CN106913289B (en) * 2015-12-25 2021-01-01 北京奇虎科技有限公司 Sweeping processing method and device of sweeping robot
WO2017218586A1 (en) * 2016-06-13 2017-12-21 Gamma2Robotics Methods and systems for reducing false alarms in a robotic device by sensor fusion
CN106272420B (en) * 2016-08-30 2019-07-02 北京小米移动软件有限公司 Robot and robot control method
DE102016125408A1 (en) * 2016-12-22 2018-06-28 RobArt GmbH AUTONOMOUS MOBILE ROBOT AND METHOD FOR CONTROLLING AN AUTONOMOUS MOBILE ROBOT
CN107102294A (en) * 2017-06-19 2017-08-29 安徽味唯网络科技有限公司 A kind of method in sweeping robot intelligent planning path
CN206856691U (en) * 2017-06-23 2018-01-09 南京工程学院 Anti- phenomenon of taking throttle as brake intelligence control system
CN111492403A (en) * 2017-10-19 2020-08-04 迪普迈普有限公司 Lidar to camera calibration for generating high definition maps
CN107608360A (en) * 2017-10-26 2018-01-19 深圳市银星智能科技股份有限公司 Mobile robot
CN108020844B (en) * 2017-11-27 2020-07-07 深圳市无限动力发展有限公司 Cliff detection method and robot
CN107775640A (en) * 2017-12-05 2018-03-09 深圳市银星智能科技股份有限公司 Mobile robot
WO2019113859A1 (en) * 2017-12-13 2019-06-20 广州艾若博机器人科技有限公司 Machine vision-based virtual wall construction method and device, map construction method, and portable electronic device
US10638906B2 (en) * 2017-12-15 2020-05-05 Neato Robotics, Inc. Conversion of cleaning robot camera images to floorplan for user interaction
CN108222605A (en) * 2018-01-16 2018-06-29 江苏海天科技有限公司 It is a kind of that there is the stereo garage for intercepting ultrahigh detection device
CN109375224A (en) * 2018-09-30 2019-02-22 小狗电器互联网科技(北京)股份有限公司 A kind of steep cliff detection method, device and sweeping robot
CN109602341B (en) * 2019-01-23 2020-09-15 珠海市一微半导体有限公司 Cleaning robot falling control method and chip based on virtual boundary

Also Published As

Publication number Publication date
WO2021035903A1 (en) 2021-03-04
CN112596508A (en) 2021-04-02

Similar Documents

Publication Publication Date Title
CN106541407B (en) Cleaning robot and control method thereof
CN108958490B (en) Electronic device, gesture recognition method thereof and computer-readable storage medium
EP3032369A2 (en) Methods for clearing garbage and devices for the same
CN105279898A (en) Alarm method and device
CN111787223B (en) Video shooting method and device and electronic equipment
CN104978133A (en) Screen capturing method and screen capturing device for intelligent terminal
CN104123520A (en) Two-dimensional code scanning method and device
US20160041632A1 (en) Contact detection system, information processing method, and information processing apparatus
CN103999020A (en) Method for gesture control, gesture server device and sensor input device
CN105338399A (en) Image acquisition method and device
CN105518560A (en) Location-based control method and apparatus, mobile machine and robot
CN105243350A (en) Code scanning method and code scanning device
JP2008525051A5 (en)
KR20190099642A (en) Microwave, display device and cooking system including the same
CN107479710B (en) Intelligent mirror and control method, device, equipment and storage medium thereof
CN103699260A (en) Method for starting terminal function module, and terminal equipment
CN104320591A (en) Method and device for controlling front-rear switching of camera and intelligent terminal
CN111431775A (en) Control method and device for household electrical appliance and range hood
US11029753B2 (en) Human computer interaction system and human computer interaction method
CN110874131A (en) Building intercom indoor unit and control method and storage medium thereof
CN111862371A (en) Express delivery automatic signing method based on intelligent visual doorbell and related equipment
CN112596508B (en) Control method and device of sensor and storage medium
CN110597081A (en) Method and device for sending control instruction based on smart home operating system
CN104601629A (en) Processing method and processing apparatus, control apparatus and working method thereof as well as control method and control system
CN104898485A (en) Positioning method and device for household electrical appliance, and server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant