CN114515124B - Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium - Google Patents

Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium

Info

Publication number
CN114515124B
CN114515124B
Authority
CN
China
Prior art keywords
cleaning
cleaning position
determining
target
related information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210419052.7A
Other languages
Chinese (zh)
Other versions
CN114515124A
Inventor
何世友
杭大明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Baseus Technology Co Ltd
Original Assignee
Shenzhen Baseus Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Baseus Technology Co Ltd filed Critical Shenzhen Baseus Technology Co Ltd
Priority to CN202210419052.7A
Publication of CN114515124A
Application granted
Publication of CN114515124B

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Abstract

The invention belongs to the technical field of intelligent equipment, and discloses a cleaning position determining method, a cleaning position determining device, cleaning position determining equipment and a storage medium. The method comprises the following steps: collecting a view field picture in real time; judging whether each collected view field picture contains a preset pattern, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal; and if the preset pattern is contained, determining a target coordinate corresponding to the cleaning position according to the view field picture. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, and the cleaning robot identifies the preset pattern in the view field picture to determine the coordinate of the cleaning position, so that the robot can automatically and quickly move to the user-designated cleaning position to clean it. The user does not need to control the robot's walking direction step by step, which improves the cleaning efficiency of the cleaning robot and the user experience.

Description

Cleaning position determining method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of intelligent equipment, in particular to a method, a device, equipment and a storage medium for determining a cleaning position.
Background
With the rapid development of smart device technology, cleaning robots having an automatic cleaning function are becoming more and more popular. A cleaning robot can automatically perform cleaning operations in a space to be cleaned, such as a home or a large venue, saving the user a large amount of cleaning time.
In addition to cleaning a space automatically, a cleaning robot can move forward, move backward, or turn left or right under the control of a remote controller. However, this mode requires the user to manually control the robot's walking direction step by step, so the robot cannot move easily and quickly to the position the user wants cleaned; cleaning efficiency is low and the user experience is poor.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide a cleaning position determining method, a cleaning position determining device, cleaning position determining equipment and a storage medium, and aims to solve the technical problem that an existing cleaning robot, whose walking direction must be controlled manually step by step, cannot move easily and quickly to the position a user needs cleaned, resulting in low cleaning efficiency and poor user experience.
To achieve the above object, the present invention provides a cleaning position determining method including the steps of:
collecting a view field picture in real time;
judging whether a preset pattern is contained in each collected view field picture or not, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal;
and if the preset pattern is contained, determining a target coordinate corresponding to the cleaning position according to the view field picture.
Optionally, before determining the target coordinate corresponding to the cleaning position according to the view field picture, the method further includes:
acquiring initial position related information corresponding to the current position of the terminal equipment in a target coordinate system;
the determining the target coordinates corresponding to the cleaning position according to the view field picture includes:
determining cleaning position related information corresponding to the cleaning position according to the view field picture;
and determining the target coordinate of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information.
Optionally, the start position related information includes a start coordinate and a start direction of the current position of the terminal device in the target coordinate system, and the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and the current position of the terminal device;
the determining the target coordinate of the cleaning position in the target coordinate system according to the start position-related information and the cleaning position-related information includes:
determining a reference angle of a target straight line between the cleaning position and the current position of the terminal device in the target coordinate system according to the starting direction and the cleaning angle, wherein the reference angle is an angle between the target straight line and a coordinate axis of the target coordinate system;
determining a relative coordinate of the cleaning position between the target coordinate system and the starting coordinate according to the cleaning distance and the reference angle;
and determining a target coordinate of the cleaning position in the target coordinate system according to the starting coordinate and the relative coordinate.
Optionally, the determining, according to the view field picture, a target coordinate corresponding to the cleaning position includes:
determining cleaning position related information corresponding to the cleaning position according to the view field picture;
and determining a target coordinate of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is a coordinate system established by taking the current position of the terminal equipment as an origin.
Optionally, the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and a position where the terminal device is currently located;
the determining the target coordinate of the cleaning position in the reference coordinate system according to the cleaning position related information comprises:
determining a first reference coordinate value and a second reference coordinate value according to the cleaning distance and the cleaning angle;
and determining the target coordinate of the cleaning position in the reference coordinate system according to the determined first reference coordinate value and second reference coordinate value.
Optionally, the field-of-view pictures are acquired using a binocular camera, the field-of-view pictures including a first reference image and a second reference image;
the determining the cleaning position related information corresponding to the cleaning position according to the view field picture includes:
determining a first pixel position corresponding to the preset pattern according to the first reference image, and determining a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image;
determining a second pixel position corresponding to the preset pattern according to the second reference image, and determining a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image;
and determining cleaning position related information corresponding to the cleaning position according to the first off-center value, the second off-center value and a preset binocular center distance.
Optionally, the handheld terminal is provided with a laser emitting hole matched with a preset pattern, and when receiving a trigger instruction, the handheld terminal emits laser forward so as to project a light spot shaped as the preset pattern at the pointed cleaning position.
Further, to achieve the above object, the present invention also provides a cleaning position determining apparatus including:
the acquisition module is used for acquiring a view field picture in real time;
the judging module is used for judging whether a preset pattern is contained in each acquired view field picture, and the preset pattern is generated when a user points to a cleaning position through the handheld terminal;
and the determining module is used for determining a target coordinate corresponding to the cleaning position according to the view field picture if the preset pattern is included.
Further, to achieve the above object, the present invention also proposes a cleaning position determining apparatus including: a memory, a processor, and a cleaning position determining program stored on the memory and executable on the processor, the cleaning position determining program configured to implement a cleaning position determining method as described above.
Furthermore, to achieve the above object, the present invention also proposes a storage medium having stored thereon a cleaning position determining program which, when executed by a processor, implements the cleaning position determining method as described above.
The invention collects a view field picture in real time; judges whether each collected view field picture contains a preset pattern, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal; and, if the preset pattern is contained, determines a target coordinate corresponding to the cleaning position according to the view field picture. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, and the cleaning robot identifies the preset pattern in the view field picture to determine the coordinate of the cleaning position, so that the robot can automatically and quickly move to the user-designated cleaning position to clean it. The user does not need to control the robot's walking direction step by step, which improves the cleaning efficiency of the cleaning robot and the user experience.
Drawings
FIG. 1 is a schematic diagram of a cleaning position determining device in a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a cleaning position determining method according to a first embodiment of the present invention;
FIG. 3 is a schematic flow chart of a cleaning position determining method according to a second embodiment of the present invention;
FIG. 4 is a first coordinate diagram of a cleaning position according to an embodiment of the cleaning position determining method of the present invention;
FIG. 5 is a schematic flow chart of a cleaning position determining method according to a third embodiment of the present invention;
FIG. 6 is a second coordinate diagram of a cleaning position according to an embodiment of the cleaning position determining method of the present invention;
fig. 7 is a block diagram showing the structure of the cleaning position determining apparatus according to the first embodiment of the present invention.
The implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, the terminal device may include: a processor 1001, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. Wherein a communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include an input unit such as a key. The network interface 1004 may optionally include a standard wired interface, a Wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The Memory 1005 may be a high speed Random Access Memory (RAM).
Those skilled in the art will appreciate that the configuration shown in fig. 1 does not constitute a limitation of the terminal device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1005, which is a storage medium, may include therein an operating system, a network communication module, a user interface module, and a cleaning position determining program.
In the terminal device shown in fig. 1, the network interface 1004 is mainly used for data communication with other devices in the cleaning system; the user interface 1003 is mainly used for data interaction with a user; the processor 1001 and the memory 1005 are provided in the terminal device, and the terminal device calls the cleaning position determining program stored in the memory 1005 through the processor 1001 and executes the cleaning position determining method provided by the embodiments of the present invention.
An embodiment of the present invention provides a cleaning position determining method, and referring to fig. 2, fig. 2 is a schematic flowchart of a first embodiment of the cleaning position determining method according to the present invention.
In this embodiment, the cleaning position determining method includes the steps of:
step S10: and collecting the view field picture in real time.
It is understood that the main execution body of the embodiment is a terminal device, and the terminal device may be a base station in a cleaning system or a cleaning robot, which is not limited in this embodiment.
It should be noted that a camera is installed on the terminal device and is used to acquire view field pictures within its field of view. Optionally, a picture acquisition frequency is set, and the terminal device acquires a view field picture at short intervals according to this frequency, thereby collecting view field pictures in real time.
Optionally, when the user projects the laser forward through the handheld terminal, the handheld terminal sends a working instruction to the terminal device, the terminal device starts the camera according to the working instruction, and collects the view field picture in the view field in real time according to the preset picture collection frequency.
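For illustration only, the following Python sketch (not part of the original disclosure) shows one way such a paced, real-time capture loop could look; the OpenCV capture API and the frame rate are assumptions.

```python
import time
import cv2  # OpenCV is an assumption; the patent does not name a capture library

def capture_fov_frames(camera_index=0, capture_hz=10):
    """Minimal sketch of the real-time view field acquisition described above.
    `capture_hz` stands in for the "preset picture acquisition frequency";
    its value is illustrative, not taken from the patent."""
    cap = cv2.VideoCapture(camera_index)
    period = 1.0 / capture_hz
    try:
        while True:
            ok, frame = cap.read()
            if not ok:            # camera disconnected or stream ended
                break
            yield frame           # hand the frame to the spot detection of step S20
            time.sleep(period)    # pace acquisition to the preset frequency
    finally:
        cap.release()
```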
Step S20: and judging whether the field-of-view picture acquired each time contains a preset pattern, wherein the preset pattern is generated when the user points to the cleaning position through the handheld terminal.
It should be understood that a target detection algorithm is used to detect whether a light spot is included in the view field picture, and once a light spot is determined to exist, it is checked whether the shape of the light spot is consistent with the preset pattern. In a specific implementation, the target detection algorithm is trained in advance on a large amount of laser spot data so that it learns the characteristic features of laser spots, and each view field picture is then detected with the trained algorithm to determine whether it contains a spot. When a light spot is detected in the view field picture, the spot image is cropped according to the target detection box output by the algorithm, the cropped image is converted to grey scale and binarized, contour recognition is performed on the converted image to determine the number of vertices, and the shape of the spot is determined from that number of vertices. If the shape corresponding to the light spot in the view field picture is consistent with the preset pattern, the collected view field picture is judged to contain the preset pattern.
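As an illustration of the shape check just described, the following Python/OpenCV sketch (an editorial assumption, not the patent's implementation) crops the detected spot, binarizes it and counts contour vertices:

```python
import cv2

def spot_matches_preset(frame_bgr, bbox, expected_vertices=3):
    """Hedged sketch of the shape check above. `bbox` is assumed to come from
    the pre-trained spot detector, and `expected_vertices` (3 = triangle) is
    only an illustrative preset pattern; the patent leaves the pattern open."""
    x, y, w, h = bbox
    roi = frame_bgr[y:y + h, x:x + w]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)                    # grey-level conversion
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # binary conversion
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    spot = max(contours, key=cv2.contourArea)                       # largest blob = laser spot
    approx = cv2.approxPolyDP(spot, 0.04 * cv2.arcLength(spot, True), True)
    return len(approx) == expected_vertices                         # vertex count vs. preset pattern
```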
Furthermore, the handheld terminal is provided with a laser emitting hole matched with a preset pattern, and when a trigger instruction is received, laser is emitted forwards so as to project a light spot with the shape of the preset pattern at a pointed cleaning position.
It should be noted that, in a specific application scenario, the user points at an area to be cleaned by triggering the handheld terminal within the field of view of the cleaning robot or the base station; the handheld terminal emits laser forward and projects a light spot of the preset pattern onto the dirty floor. When the cleaning robot detects the preset pattern in its field of view, it determines the target coordinate corresponding to the position of the preset pattern and moves there to perform the cleaning task. When the base station detects the preset pattern in its field of view, it determines the target coordinate corresponding to the position of the preset pattern and sends it to the cleaning robot, so that the robot moves to that position to perform the cleaning task. In this way the cleaning robot cleans wherever the user points, the user does not need to control the robot's direction step by step, and the cleaning efficiency of the cleaning robot is improved.
Step S30: and if the preset pattern is contained, determining a target coordinate corresponding to the cleaning position according to the view field picture.
It should be understood that the preset pattern in the view field picture is analyzed, the distance and the angle between the cleaning position where the preset pattern is located and the current position of the terminal device are determined, and the target coordinate corresponding to the cleaning position is determined from that distance and angle, where the target coordinate may be the coordinate of the preset pattern in a coordinate system established with the terminal device as the center, or the coordinate of the preset pattern in a known map coordinate system.
In one implementation, the base station or the cleaning robot determines the target coordinate of the cleaning position and the initial coordinate and pose of the cleaning robot, determines the distance and the pose adjustment angle between the cleaning robot and the cleaning position from the target coordinate and the initial coordinate, adjusts the robot's pose according to the pose adjustment angle, and moves it over that distance to the cleaning position to perform the cleaning task. When it is the base station that determines the distance and the pose adjustment angle, it transmits them to the cleaning robot, so that the robot adjusts its pose and moves to the cleaning position accordingly to perform the cleaning task.
Preferably, in order to prevent the cleaning robot from colliding with obstacles in the cleaning area, when the cleaning robot determines or receives the target coordinate corresponding to the cleaning position, it generates a moving track toward the target coordinate according to the obstacles in the known map coordinate system, moves to the cleaning position along this track, and executes the cleaning task.
It should be noted that if the cleaning robot or the base station detects several preset patterns in the view field picture (the user points to several cleaning positions, for example through several handheld terminals), the distances between those cleaning positions and the cleaning robot are determined from the target coordinates corresponding to the preset patterns and the initial coordinate of the cleaning robot. The target coordinate closest to the cleaning robot is selected as the first cleaning coordinate, the target coordinate closest to the first cleaning coordinate is selected from the remaining target coordinates as the second cleaning coordinate, and so on, yielding an ordered sequence of cleaning coordinates; the cleaning robot then visits the cleaning positions in this order to execute the cleaning tasks, as sketched below.
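The nearest-first ordering described above can be sketched as follows (illustrative Python; the function and variable names are not from the patent):

```python
import math

def order_cleaning_targets(robot_xy, target_coords):
    """Sketch of the ordering rule above: pick the target nearest the robot
    first, then repeatedly pick the target nearest the previously chosen one.
    Inputs are (x, y) tuples in a common coordinate system."""
    remaining = list(target_coords)
    ordered, current = [], robot_xy
    while remaining:
        nearest = min(remaining, key=lambda t: math.dist(current, t))
        remaining.remove(nearest)
        ordered.append(nearest)
        current = nearest
    return ordered

# e.g. order_cleaning_targets((0.0, 0.0), [(3.0, 4.0), (1.0, 1.0), (2.5, 3.0)])
# -> [(1.0, 1.0), (2.5, 3.0), (3.0, 4.0)]
```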
In a specific implementation, the terminal device of this embodiment is a cleaning robot or a base station:
if the terminal equipment is a cleaning robot, the cleaning robot collects view field pictures in real time, determines a target coordinate corresponding to a cleaning position according to the collected view field pictures, moves to the cleaning position according to the target coordinate and executes a cleaning task;
if the terminal equipment is a base station, the base station collects a view field picture in real time, and after a target coordinate corresponding to a cleaning position is determined according to the collected view field picture, the target coordinate is sent to the cleaning robot, so that the cleaning robot moves to the cleaning position according to the target coordinate, and a cleaning task is executed.
This embodiment collects view field pictures in real time; judges whether each collected picture contains a preset pattern, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal; and, if the preset pattern is contained, determines a target coordinate corresponding to the cleaning position according to the view field picture. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, and the cleaning robot identifies the preset pattern in the view field picture to determine the coordinate of the cleaning position, so that the robot can automatically and quickly move to the user-designated cleaning position to clean it. The user does not need to control the robot's walking direction step by step, which improves the cleaning efficiency of the cleaning robot and the user experience.
Referring to fig. 3, fig. 3 is a flowchart illustrating a cleaning position determining method according to a second embodiment of the present invention.
Based on the first embodiment, before the determining the target coordinates corresponding to the cleaning position according to the view field picture, the method for determining the cleaning position in this embodiment further includes:
step S301: and acquiring the relevant information of the initial position corresponding to the current position of the terminal equipment in the target coordinate system.
It will be appreciated that the target coordinate system may be a map coordinate system known to the cleaning robot and the base station, or another known coordinate system such as a world coordinate system. The terminal device locates, through a sensor arranged on it, the start position related information of its current position in the target coordinate system, where the start position related information includes a start coordinate and a start direction.
The determining the target coordinates corresponding to the cleaning position according to the view field picture includes:
step S302: and determining cleaning position related information corresponding to the cleaning position according to the view field picture.
It should be noted that the cleaning position related information may be a cleaning distance and a cleaning angle between the cleaning position and the current position of the terminal device. In one implementation, when the handheld terminal projects laser forward under the user's trigger, it obtains the distance between itself and the laser landing point and adjusts the size of the laser emitting hole according to that distance, so that the light spot projected on the ground has a preset size. The terminal device collects the view field picture through a monocular camera, performs perspective correction on the preset pattern in the picture, determines the corrected pixel size, converts the pixel size based on the pre-calibrated intrinsic and extrinsic camera parameters to determine the actual size corresponding to the preset pattern, and determines the cleaning distance and cleaning angle between the cleaning position and the current position of the terminal device from the actual size and the preset size.
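As a rough illustration of this monocular, size-based distance estimate (a simplified pinhole-camera sketch, not the patent's calibrated procedure; all parameter names are illustrative):

```python
def monocular_spot_distance(spot_pixel_size, spot_real_size_m, focal_length_px):
    """Pinhole-model sketch of the monocular variant above: if the handheld
    terminal keeps the projected spot at a known (preset) physical size, the
    distance follows from its apparent pixel size. The perspective correction
    and the cleaning angle computation mentioned in the text are omitted."""
    return focal_length_px * spot_real_size_m / spot_pixel_size

# e.g. a 0.05 m spot imaged 40 px wide by a camera with f = 800 px
# lies roughly 800 * 0.05 / 40 = 1.0 m away.
```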
Optionally, the field-of-view pictures are acquired using a binocular camera, the field-of-view pictures including a first reference image and a second reference image;
the step S302 includes: determining a first pixel position corresponding to the preset pattern according to the first reference image, and determining a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image; determining a second pixel position corresponding to the preset pattern according to the second reference image, and determining a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image; and determining cleaning position related information corresponding to the cleaning position according to the first deviation center value, the second deviation center value and a preset binocular center distance.
It should be understood that the binocular camera is calibrated in advance, and the focal length f, the binocular center distance T and the image center lines corresponding to the binocular camera are determined and stored. The binocular camera of the terminal device acquires a first reference image and a second reference image at a certain frequency, determines the first pixel position corresponding to a reference point of the preset pattern in the first reference image, and determines the second pixel position corresponding to the same reference point in the second reference image; optionally, the reference point of the preset pattern is an upper vertex, a lower vertex, a left vertex, a right vertex or the center point of the preset pattern. When analyzing the preset pattern in the view field picture, the focal length f, the binocular center distance T and the image center lines are read from the preset storage area, a first off-center value x_l and a second off-center value x_r are determined from the image center lines corresponding to the two cameras, and the cleaning distance Z corresponding to the cleaning position is determined from the first off-center value x_l, the second off-center value x_r, the focal length f and the preset binocular center distance T through the following formula:

Z = f · T / (x_l − x_r);
it should be noted that a conversion matrix between a pixel coordinate system corresponding to any one of the binocular cameras and a mobile terminal coordinate system is calibrated in advance, the first pixel position or the second pixel position is mapped to the mobile terminal coordinate system based on the conversion matrix and the cleaning distance Z, and a cleaning angle between the cleaning position and the current position of the mobile terminal is determined.
Step S303: and determining target coordinates of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information.
Specifically, the start position related information includes a start coordinate and a start direction of the current position of the terminal device in the target coordinate system, and the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and the current position of the terminal device;
the step S303 includes: determining a reference angle of a target straight line between the cleaning position and the current position of the terminal device in the target coordinate system according to the starting direction and the cleaning angle, wherein the reference angle is an angle between the target straight line and a coordinate axis of the target coordinate system; determining a relative coordinate of the cleaning position between the target coordinate system and the starting coordinate according to the cleaning distance and the reference angle; and determining a target coordinate of the cleaning position in the target coordinate system according to the starting coordinate and the relative coordinate.
It should be understood that, referring to fig. 4, fig. 4 is a first coordinate diagram of a cleaning position according to an embodiment of the cleaning position determining method of the present invention. The coordinate system in fig. 4 is the known target coordinate system (e.g. a map coordinate system), the start coordinate corresponding to the current position M of the mobile terminal is (a, b), and the corresponding start direction is the direction of the straight line MC. The mobile terminal detects the preset pattern in its field of view (the dotted sector area in fig. 4); the cleaning distance between the detected cleaning position A and the current position M of the mobile terminal is denoted |MA| and the cleaning angle is denoted θ. The reference angle β1 between the target straight line MA and a coordinate axis of the target coordinate system can be determined from the angle α corresponding to the start direction MC and the cleaning angle θ, and the relative coordinates of the cleaning position with respect to the start coordinate in the target coordinate system are determined by the following formulas:

|MA1| = |MA| · cos β1

|MA2| = |MA| · sin β1

According to the start coordinate (a, b) and the relative coordinates (|MA1|, |MA2|), the target coordinate of the cleaning position in the target coordinate system is determined as (a + |MA1|, b + |MA2|).
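A minimal sketch of this start-position-plus-offset computation, under the assumption that the reference angle is the sum of the start-direction angle and the cleaning angle:

```python
import math

def target_in_map_frame(start_xy, start_heading_rad, cleaning_distance, cleaning_angle_rad):
    """Combine the device's start coordinate (a, b) and start direction with
    the measured cleaning distance and angle to obtain the target coordinate
    in the known target (map) coordinate system. The sign convention for the
    cleaning angle is an assumption."""
    a, b = start_xy
    beta = start_heading_rad + cleaning_angle_rad      # reference angle of line M->A
    dx = cleaning_distance * math.cos(beta)            # |MA1|
    dy = cleaning_distance * math.sin(beta)            # |MA2|
    return a + dx, b + dy                              # (a + |MA1|, b + |MA2|)
```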
This embodiment collects view field pictures in real time; judges whether each collected picture contains a preset pattern, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal; if the preset pattern is contained, acquires the start position related information corresponding to the current position of the terminal device in the target coordinate system; determines the cleaning position related information corresponding to the cleaning position according to the view field picture; and determines the target coordinate of the cleaning position in the target coordinate system according to the start position related information and the cleaning position related information. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, the cleaning robot identifies the preset pattern in the view field picture to determine the cleaning position information, and the coordinate of the cleaning position is determined from the known start position and the cleaning position information, so the cleaning robot can locate both itself and the cleaning position in a known coordinate system and automatically and quickly move to the user-designated cleaning position to clean it. The user does not need to control the robot's walking direction step by step, which improves the cleaning efficiency of the cleaning robot and the user experience.
Referring to fig. 5, fig. 5 is a flowchart illustrating a cleaning position determining method according to a third embodiment of the present invention.
Based on the first embodiment, in the cleaning position determining method of this embodiment, the determining the target coordinate corresponding to the cleaning position according to the view field picture includes:
step S304: and determining cleaning position related information corresponding to the cleaning position according to the view field picture.
It can be understood that step S304 in this embodiment may be implemented with reference to step S302 in the second embodiment. In one implementation, a monocular camera is used to acquire the view field picture, perspective correction is performed on the preset pattern in the picture, the corrected pixel size is determined, the pixel size is converted based on the pre-calibrated intrinsic and extrinsic camera parameters to determine the actual size corresponding to the preset pattern, and the cleaning distance and cleaning angle between the cleaning position and the current position of the terminal device are determined from the actual size and the preset size.
Preferably, the view field pictures are acquired by using a binocular camera, and the view field pictures comprise a first reference image and a second reference image;
the step S304 includes: determining a first pixel position corresponding to the preset pattern according to the first reference image, and determining a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image; determining a second pixel position corresponding to the preset pattern according to the second reference image, and determining a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image; and determining cleaning position related information corresponding to the cleaning position according to the first deviation center value, the second deviation center value and a preset binocular center distance.
Step S305: and determining target coordinates of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is a coordinate system established by taking the current position of the terminal equipment as an origin.
Specifically, the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and a position where the terminal device is currently located;
the step S305 includes: determining a first reference coordinate value and a second reference coordinate value according to the cleaning distance and the cleaning angle; and determining the target coordinate of the cleaning position in a reference coordinate system according to the fixed first reference coordinate value and the second reference coordinate value.
Referring to fig. 6, fig. 6 is a second coordinate diagram of a cleaning position according to an embodiment of the cleaning position determining method of the present invention. The coordinate system in fig. 6 is established with the position of the terminal device as the origin O and the facing direction of the terminal device as a coordinate axis. The cleaning distance between the cleaning position B and the current position O of the terminal device is denoted |OB| and the cleaning angle is denoted θ. The first reference coordinate value |OB1| and the second reference coordinate value |OB2| are determined from the cleaning distance |OB| and the cleaning angle θ by the following formulas:

|OB1| = |OB| · cos θ

|OB2| = |OB| · sin θ

The target coordinate of the cleaning position in the reference coordinate system is thus determined as (|OB1|, |OB2|).
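A minimal sketch of this polar-to-Cartesian computation in the reference coordinate system:

```python
import math

def target_in_reference_frame(cleaning_distance, cleaning_angle_rad):
    """Third-embodiment computation above: with the terminal device at the
    origin and its facing direction as a coordinate axis, the target
    coordinate is the polar pair (|OB|, theta) converted to Cartesian form."""
    return (cleaning_distance * math.cos(cleaning_angle_rad),   # |OB1|
            cleaning_distance * math.sin(cleaning_angle_rad))   # |OB2|
```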
Further, when the mobile terminal is a base station, the base station determines the target coordinate of the cleaning position in the base station coordinate system according to the view field picture within its field of view, communicates with the cleaning robot through an ultra-wideband (UWB) positioning device to determine the distance and angle between the base station and the cleaning robot, converts the target coordinate into the cleaning robot's coordinate system according to that distance and angle to obtain the final cleaning coordinate, and sends the final cleaning coordinate to the cleaning robot, so that the robot moves to the cleaning position according to its own coordinate system and the final cleaning coordinate and executes the cleaning task.
This embodiment collects view field pictures in real time; judges whether each collected picture contains a preset pattern, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal; if the preset pattern is contained, determines the cleaning position related information corresponding to the cleaning position according to the view field picture; and determines the target coordinate of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is established with the current position of the terminal device as its origin. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, the cleaning robot identifies the preset pattern in the view field picture to determine the cleaning position information, and the coordinate of the cleaning position in the cleaning robot's own coordinate system is determined from that information, so the robot can automatically and quickly move to the user-designated cleaning position to clean it based on its own coordinate system. The user does not need to control the robot's walking direction step by step, which improves the cleaning efficiency of the cleaning robot and the user experience.
Furthermore, an embodiment of the present invention also provides a storage medium having a cleaning position determining program stored thereon, where the cleaning position determining program is executed by a processor to implement the cleaning position determining method as described above.
Since the storage medium adopts all technical solutions of all the above embodiments, at least all the beneficial effects brought by the technical solutions of the above embodiments are achieved, and details are not repeated herein.
Referring to fig. 7, fig. 7 is a block diagram showing the structure of the first embodiment of the cleaning position determining apparatus of the present invention.
As shown in fig. 7, a cleaning position determining apparatus according to an embodiment of the present invention includes:
and the acquisition module 10 is used for acquiring the view field pictures in real time.
The judging module 20 is configured to judge whether the field-of-view picture acquired each time includes a preset pattern, where the preset pattern is generated when the user points to the cleaning position through the handheld terminal.
A determining module 30, configured to determine, if the preset pattern is included, a target coordinate corresponding to the cleaning position according to the view picture.
It should be understood that the above is only an example, and the technical solution of the present invention is not limited in any way, and in a specific application, a person skilled in the art may set the technical solution as needed, and the present invention is not limited in this respect.
This embodiment collects view field pictures in real time; judges whether each collected picture contains a preset pattern, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal; and, if the preset pattern is contained, determines a target coordinate corresponding to the cleaning position according to the view field picture. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, and the cleaning robot identifies the preset pattern in the view field picture to determine the coordinate of the cleaning position, so that the robot can automatically and quickly move to the user-designated cleaning position to clean it. The user does not need to control the robot's walking direction step by step, which improves the cleaning efficiency of the cleaning robot and the user experience.
It should be noted that the above-described work flows are only exemplary, and do not limit the scope of the present invention, and in practical applications, a person skilled in the art may select some or all of them to achieve the purpose of the solution of the embodiment according to actual needs, and the present invention is not limited herein.
In addition, the technical details that are not elaborated in this embodiment can be referred to the cleaning position determining method provided by any embodiment of the present invention, and are not described herein again.
In an embodiment, the cleaning position determining apparatus further comprises an acquisition module;
the acquisition module is used for acquiring the relevant information of the initial position corresponding to the current position of the terminal equipment in the target coordinate system;
the determining module 30 is further configured to determine, according to the view field picture, cleaning position related information corresponding to the cleaning position; and determining target coordinates of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information.
In an embodiment, the start position related information includes a start coordinate and a start direction of the current position of the terminal device in the target coordinate system, and the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and the current position of the terminal device;
the determining module 30 is further configured to determine, according to the starting direction and the cleaning angle, a reference angle at which a target straight line between the cleaning position and a current position of the terminal device is located in the target coordinate system, where the reference angle is an angle between the target straight line and a coordinate axis of the target coordinate system; determining a relative coordinate of the cleaning position between the target coordinate system and the starting coordinate according to the cleaning distance and the reference angle; and determining target coordinates of the cleaning position in the target coordinate system according to the starting coordinates and the relative coordinates.
In an embodiment, the determining module 30 is further configured to determine, according to the view picture, cleaning position related information corresponding to the cleaning position; and determining a target coordinate of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is a coordinate system established by taking the current position of the terminal equipment as an origin.
In one embodiment, the cleaning position related information comprises a cleaning distance and a cleaning angle between the cleaning position and a position where the terminal device is currently located;
the determining module 30 is further configured to determine a first reference coordinate value and a second reference coordinate value according to the cleaning distance and the cleaning angle; and determine the target coordinate of the cleaning position in the reference coordinate system according to the determined first reference coordinate value and second reference coordinate value.
In one embodiment, the field of view picture is acquired by using a binocular camera, and the field of view picture comprises a first reference image and a second reference image;
the determining module 30 is further configured to determine a first pixel position corresponding to the preset pattern according to the first reference image, and determine a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image; determine a second pixel position corresponding to the preset pattern according to the second reference image, and determine a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image; and determine cleaning position related information corresponding to the cleaning position according to the first off-center value, the second off-center value and a preset binocular center distance.
In one embodiment, the handheld terminal is provided with a laser emitting hole matched with a preset pattern, and when a trigger instruction is received, laser is emitted forwards so as to project a light spot with the shape of the preset pattern at a pointed cleaning position.
Further, it is to be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solution of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g. Read Only Memory (ROM)/RAM, magnetic disk, optical disk), and includes several instructions for enabling a terminal device (e.g. a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all equivalent structures or equivalent processes performed by the present invention or directly or indirectly applied to other related technical fields are also included in the scope of the present invention.

Claims (7)

1. A cleaning position determining method, characterized in that the method comprises:
collecting a view field picture in real time;
judging whether a preset pattern is contained in each collected view field picture or not, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal;
if the preset pattern is contained, determining a target coordinate corresponding to the cleaning position according to the view field picture;
before determining the target coordinate corresponding to the cleaning position according to the view field picture, the method further includes:
acquiring initial position related information corresponding to the current position of the terminal equipment in a target coordinate system;
the determining the target coordinates corresponding to the cleaning position according to the view field picture comprises:
determining cleaning position related information corresponding to the cleaning position according to the view field picture;
determining a target coordinate of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information;
the cleaning position related information comprises a cleaning distance and a cleaning angle between a cleaning position and a position where the terminal equipment is located currently;
the determining the cleaning position related information corresponding to the cleaning position according to the view field picture includes:
performing perspective correction on a preset pattern in the view field picture and determining the corrected pixel size, wherein the view field picture is acquired through a monocular camera;
converting the pixel size based on the internal and external parameters of the camera calibrated in advance, and determining the actual size corresponding to the preset pattern;
determining a cleaning distance and a cleaning angle between the cleaning position and the current position of the terminal equipment according to the actual size and the preset size;
the handheld terminal is provided with a laser emitting hole matched with a preset pattern, laser is emitted forwards when a trigger instruction is received, the distance between the laser emitting hole and a laser drop point position is obtained, the size of the laser emitting hole is adjusted according to the distance, and a light spot with the shape of the preset pattern and the size of the preset size is projected at the pointed cleaning position.
2. The cleaning position determining method according to claim 1, wherein the start position-related information includes start coordinates and a start direction in the target coordinate system at which the terminal device is currently located;
the determining the target coordinate of the cleaning position in the target coordinate system according to the start position-related information and the cleaning position-related information includes:
determining a reference angle of a target straight line between the cleaning position and the current position of the terminal equipment in the target coordinate system according to the starting direction and the cleaning angle, wherein the reference angle is an angle between the target straight line and a coordinate axis of the target coordinate system;
determining a relative coordinate of the cleaning position between the target coordinate system and the starting coordinate according to the cleaning distance and the reference angle;
and determining target coordinates of the cleaning position in the target coordinate system according to the starting coordinates and the relative coordinates.
3. The method for determining the cleaning position according to claim 1, wherein the determining the target coordinates corresponding to the cleaning position according to the view field picture comprises:
determining cleaning position related information corresponding to the cleaning position according to the view field picture;
and determining target coordinates of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is a coordinate system established by taking the current position of the terminal equipment as an origin.
4. The cleaning position determining method according to claim 3, wherein the determining the target coordinates of the cleaning position in the reference coordinate system based on the cleaning position-related information includes:
determining a first reference coordinate value and a second reference coordinate value according to the cleaning distance and the cleaning angle;
and determining the target coordinate of the cleaning position in a reference coordinate system according to the determined first reference coordinate value and second reference coordinate value.
5. A cleaning position determining apparatus, characterized in that the apparatus comprises:
the collection module is used for collecting view field pictures in real time;
the judging module is used for judging whether a preset pattern is contained in each collected view field picture, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal;
the determining module is used for determining a target coordinate corresponding to the cleaning position according to the view field picture if the preset pattern is included;
wherein the cleaning position determining apparatus further comprises an acquisition module;
the acquisition module is used for acquiring the starting position related information corresponding to the current position of the terminal equipment in the target coordinate system;
the determining module is further configured to determine cleaning position related information corresponding to the cleaning position according to the view field picture; determining a target coordinate of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information;
the cleaning position related information comprises a cleaning distance and a cleaning angle between the cleaning position and the position where the terminal equipment is currently located; the determining module is further used for performing distortion correction on the preset pattern in the view field picture and determining the adjusted pixel size, wherein the view field picture is acquired through a monocular camera; converting the pixel size based on pre-calibrated intrinsic and extrinsic parameters of the camera, and determining the actual size corresponding to the preset pattern; and determining the cleaning distance and the cleaning angle between the cleaning position and the current position of the terminal equipment according to the actual size and the preset size;
the handheld terminal is provided with a laser emitting hole matched with the preset pattern; when a trigger instruction is received, the handheld terminal emits laser forwards, obtains the distance between the laser emitting hole and the laser drop point, adjusts the size of the laser emitting hole according to the distance, and projects, at the pointed cleaning position, a light spot having the shape of the preset pattern and the preset size.
6. A storage medium characterized in that the storage medium has stored thereon a cleaning position determination program that, when executed by a processor, implements the cleaning position determination method according to any one of claims 1 to 4.
7. A terminal device, characterized in that the device comprises: a memory, a processor, and a cleaning position determination program stored on the memory and executable on the processor, the cleaning position determination program configured to implement the cleaning position determination method of any one of claims 1 to 4;
wherein, if the terminal device is a cleaning robot, the cleaning robot collects view field pictures in real time, determines a target coordinate corresponding to the cleaning position according to the collected view field pictures, moves to the cleaning position according to the target coordinate, and executes a cleaning task;
and if the terminal device is a base station, the base station collects view field pictures in real time, determines a target coordinate corresponding to the cleaning position according to the collected view field pictures, and then sends the target coordinate to the cleaning robot, so that the cleaning robot moves to the cleaning position according to the target coordinate and executes the cleaning task.
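A hedged sketch of the two device roles described in claim 7; the role strings and callback parameters are hypothetical and only illustrate the dispatch, not an actual firmware API.

```python
def handle_detected_cleaning_position(device_role, target_coord,
                                      navigate, send_to_robot):
    """Illustrative dispatch for claim 7 (names are assumptions).

    device_role   -- "robot" or "base_station"
    target_coord  -- (x, y) target coordinate of the cleaning position
    navigate      -- callback that drives the robot to a coordinate and cleans
    send_to_robot -- callback that forwards a coordinate to the cleaning robot
    """
    if device_role == "robot":
        # The cleaning robot itself moves to the position and cleans.
        navigate(target_coord)
    elif device_role == "base_station":
        # The base station only computes the coordinate and hands it off.
        send_to_robot(target_coord)
    else:
        raise ValueError(f"unknown device role: {device_role}")
```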
CN202210419052.7A 2022-04-21 2022-04-21 Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium Active CN114515124B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210419052.7A CN114515124B (en) 2022-04-21 2022-04-21 Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114515124A CN114515124A (en) 2022-05-20
CN114515124B true CN114515124B (en) 2022-07-22

Family

ID=81600128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210419052.7A Active CN114515124B (en) 2022-04-21 2022-04-21 Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114515124B (en)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6629028B2 (en) * 2000-06-29 2003-09-30 Riken Method and system of optical guidance of mobile body
KR100520079B1 (en) * 2003-08-01 2005-10-12 삼성전자주식회사 robot system and control method thereof
JP4431446B2 (en) * 2004-06-08 2010-03-17 シャープ株式会社 Self-propelled robot
US8606404B1 (en) * 2009-06-19 2013-12-10 Bissell Homecare, Inc. System and method for controlling a cleaning apparatus
DE102011053386A1 (en) * 2011-06-28 2013-01-03 Vorwerk & Co. Interholding Gmbh Automatically movable device and method for the route guidance of such a device
CN106231971B (en) * 2014-02-28 2020-07-10 三星电子株式会社 Cleaning robot and remote controller including the same
KR101561921B1 (en) * 2014-05-20 2015-10-20 엘지전자 주식회사 A cleaner
CN108403009A (en) * 2018-03-08 2018-08-17 徐志强 A kind of sweeping robot and its control method
CN108888187A (en) * 2018-05-31 2018-11-27 四川斐讯信息技术有限公司 A kind of sweeping robot based on depth camera
CN108514389A (en) * 2018-06-04 2018-09-11 赵海龙 A kind of control method of intelligent cleaning equipment
CN112617674B (en) * 2019-10-18 2022-07-15 上海善解人意信息科技有限公司 Sweeping robot system and sweeping robot control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant