CN114515124A - Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium - Google Patents

Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium

Info

Publication number
CN114515124A
Authority
CN
China
Prior art keywords
cleaning
cleaning position
determining
field picture
target
Prior art date
Legal status
Granted
Application number
CN202210419052.7A
Other languages
Chinese (zh)
Other versions
CN114515124B (en)
Inventor
何世友
杭大明
Current Assignee
Shenzhen Baseus Technology Co Ltd
Original Assignee
Shenzhen Baseus Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Baseus Technology Co Ltd filed Critical Shenzhen Baseus Technology Co Ltd
Priority to CN202210419052.7A priority Critical patent/CN114515124B/en
Publication of CN114515124A publication Critical patent/CN114515124A/en
Application granted granted Critical
Publication of CN114515124B publication Critical patent/CN114515124B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 - Floor-sweeping machines, motor-driven
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 - Installations of electric equipment
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 - Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06 - Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

The invention belongs to the technical field of intelligent equipment, and discloses a cleaning position determining method, a cleaning position determining device, cleaning position determining equipment and a storage medium. The method comprises the following steps: collecting a view field picture in real time; judging whether each collected view field picture contains a preset pattern, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal; and if the preset pattern is contained, determining a target coordinate corresponding to the cleaning position according to the view field picture. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, and the cleaning robot identifies the preset pattern in the view field picture to determine the coordinate of the cleaning position, so that the cleaning robot can quickly and automatically move to the cleaning position designated by the user and clean it; the walking direction of the robot does not need to be controlled manually step by step, which improves the cleaning efficiency of the cleaning robot and enhances the user experience.

Description

Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium
Technical Field
The invention relates to the technical field of intelligent equipment, in particular to a cleaning position determining method, a cleaning position determining device, cleaning position determining equipment and a storage medium.
Background
With the rapid development of smart device technology, cleaning robots with an automatic cleaning function are becoming more and more popular. A cleaning robot can automatically perform cleaning operations in a space to be cleaned, such as a home or a large venue, thereby saving the user a large amount of cleaning time.
In addition to cleaning the space to be cleaned automatically, the cleaning robot can move straight, back up, or turn left or right under the control of a remote controller. In this mode, however, the walking direction of the cleaning robot has to be controlled manually step by step, so the robot cannot move easily and quickly to the position the user needs cleaned; the cleaning efficiency is low and the user experience is poor.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
The invention mainly aims to provide a cleaning position determining method, a cleaning position determining device, cleaning position determining equipment and a storage medium, and aims to solve the technical problems that an existing cleaning robot requires its walking direction to be controlled manually step by step, cannot move easily and quickly to the position the user needs cleaned, has low cleaning efficiency and provides a poor user experience.
To achieve the above object, the present invention provides a cleaning position determining method including the steps of:
collecting a view field picture in real time;
judging whether a preset pattern is contained in each collected view field picture or not, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal;
and if the preset pattern is contained, determining a target coordinate corresponding to the cleaning position according to the view field picture.
Optionally, before determining the target coordinate corresponding to the cleaning position according to the view field picture, the method further includes:
acquiring initial position related information corresponding to the current position of the terminal equipment in a target coordinate system;
the determining the target coordinates corresponding to the cleaning position according to the view field picture includes:
determining cleaning position related information corresponding to the cleaning position according to the view field picture;
and determining the target coordinate of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information.
Optionally, the start position related information includes a start coordinate and a start direction of the current position of the terminal device in the target coordinate system, and the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and the current position of the terminal device;
the determining the target coordinate of the cleaning position in the target coordinate system according to the start position-related information and the cleaning position-related information includes:
determining a reference angle of a target straight line between the cleaning position and the current position of the terminal device in the target coordinate system according to the starting direction and the cleaning angle, wherein the reference angle is an angle between the target straight line and a coordinate axis of the target coordinate system;
determining a relative coordinate of the cleaning position between the target coordinate system and the starting coordinate according to the cleaning distance and the reference angle;
and determining a target coordinate of the cleaning position in the target coordinate system according to the starting coordinate and the relative coordinate.
Optionally, the determining, according to the view field picture, a target coordinate corresponding to the cleaning position includes:
determining cleaning position related information corresponding to the cleaning position according to the view field picture;
and determining target coordinates of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is a coordinate system established by taking the current position of the terminal equipment as an origin.
Optionally, the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and a position where the terminal device is currently located;
the determining the target coordinate of the cleaning position in the reference coordinate system according to the cleaning position related information comprises:
determining a first reference coordinate value and a second reference coordinate value according to the cleaning distance and the cleaning angle;
and determining the target coordinate of the cleaning position in the reference coordinate system according to the determined first reference coordinate value and second reference coordinate value.
Optionally, the view field picture is acquired by using a binocular camera, and the view field picture comprises a first reference image and a second reference image;
the determining the cleaning position related information corresponding to the cleaning position according to the view field picture includes:
determining a first pixel position corresponding to the preset pattern according to the first reference image, and determining a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image;
determining a second pixel position corresponding to the preset pattern according to the second reference image, and determining a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image;
and determining cleaning position related information corresponding to the cleaning position according to the first deviation center value, the second deviation center value and a preset binocular center distance.
Optionally, the handheld terminal is provided with a laser emitting hole matching the preset pattern, and emits laser forwards when a trigger instruction is received, so as to project a light spot in the shape of the preset pattern at the pointed cleaning position.
Further, to achieve the above object, the present invention also provides a cleaning position determining apparatus including:
the acquisition module is used for acquiring a view field picture in real time;
the judging module is used for judging whether a preset pattern is contained in each collected view field picture, and the preset pattern is generated when a user points to a cleaning position through a handheld terminal;
and the determining module is used for determining a target coordinate corresponding to the cleaning position according to the view field picture if the preset pattern is included.
Further, to achieve the above object, the present invention also proposes a cleaning position determining apparatus including: a memory, a processor, and a cleaning position determining program stored on the memory and executable on the processor, the cleaning position determining program configured to implement a cleaning position determining method as described above.
Furthermore, to achieve the above object, the present invention also proposes a storage medium having stored thereon a cleaning position determining program which, when executed by a processor, implements the cleaning position determining method as described above.
The invention collects the view field picture in real time; judges whether each collected view field picture contains a preset pattern, the preset pattern being generated when a user points to a cleaning position through a handheld terminal; and, if the preset pattern is contained, determines a target coordinate corresponding to the cleaning position according to the view field picture. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, and the cleaning robot identifies the preset pattern in the view field picture to determine the coordinate of the cleaning position, so that the cleaning robot can quickly and automatically move to the cleaning position designated by the user and clean it; the walking direction of the robot does not need to be controlled manually step by step, the cleaning efficiency of the cleaning robot is improved, and the user experience is enhanced.
Drawings
FIG. 1 is a schematic diagram of a cleaning position determining device in a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a cleaning position determining method according to a first embodiment of the present invention;
FIG. 3 is a schematic flow chart of a cleaning position determining method according to a second embodiment of the present invention;
FIG. 4 is a first coordinate diagram of a cleaning position according to an embodiment of the cleaning position determining method of the present invention;
FIG. 5 is a schematic flow chart of a cleaning position determining method according to a third embodiment of the present invention;
FIG. 6 is a second coordinate diagram of a cleaning position according to an embodiment of the cleaning position determining method of the present invention;
fig. 7 is a block diagram showing the structure of the cleaning position determining apparatus according to the first embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, the terminal device may include: a processor 1001, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include an input unit such as a key. The network interface 1004 may optionally include a standard wired interface or a wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The memory 1005 may be a high-speed Random Access Memory (RAM).
Those skilled in the art will appreciate that the configuration shown in fig. 1 does not constitute a limitation of the terminal device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of storage medium, may include therein an operating system, a network communication module, a user interface module, and a cleaning position determining program.
In the terminal device shown in fig. 1, the network interface 1004 is mainly used for data communication with other devices in the cleaning system; the user interface 1003 is mainly used for data interaction with the user; the processor 1001 and the memory 1005 may be provided in the terminal device of the present invention, and the terminal device calls the cleaning position determining program stored in the memory 1005 through the processor 1001 and executes the cleaning position determining method provided by the embodiments of the present invention.
An embodiment of the present invention provides a cleaning position determining method, and referring to fig. 2, fig. 2 is a schematic flowchart of a first embodiment of the cleaning position determining method according to the present invention.
In this embodiment, the cleaning position determining method includes the steps of:
step S10: and collecting the view field picture in real time.
It can be understood that the main execution body of this embodiment is a terminal device, and the terminal device may be a base station in a cleaning system or a cleaning robot, which is not limited in this embodiment.
It should be noted that a camera is installed on the terminal device and is used for acquiring a view field picture within its field of view. Optionally, a picture acquisition frequency is set, and the terminal device acquires a view field picture at short, regular intervals according to the picture acquisition frequency, thereby realizing real-time acquisition of view field pictures.
Optionally, when the user projects the laser forward through the handheld terminal, the handheld terminal sends a working instruction to the terminal device, the terminal device starts the camera according to the working instruction, and collects the view field picture in the view field in real time according to the preset picture collection frequency.
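For illustration only, the real-time acquisition described above can be sketched roughly as follows (Python; the OpenCV capture API, the camera index and the 0.2 s interval are assumptions made for the sketch, not part of the disclosure):

```python
import time
import cv2  # OpenCV, assumed available on the terminal device

CAPTURE_INTERVAL_S = 0.2  # hypothetical preset picture acquisition frequency (5 pictures per second)

def capture_loop(camera_index, handle_frame):
    """Open the camera after a working instruction is received and pass every
    captured field-of-view picture to handle_frame() for pattern detection."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if ok:
                handle_frame(frame)  # e.g. the preset-pattern check of step S20
            time.sleep(CAPTURE_INTERVAL_S)
    finally:
        cap.release()
```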
Step S20: and judging whether the field-of-view picture acquired each time contains a preset pattern, wherein the preset pattern is generated when the user points to the cleaning position through the handheld terminal.
It should be understood that a target detection algorithm is used to detect whether a light spot is included in the view field picture, and after a light spot is determined to exist, whether the shape of the light spot is consistent with the preset pattern is checked. In a specific implementation, the target detection algorithm is trained in advance on a large amount of laser spot data so that it learns the characteristic information of laser spots, and the view field picture is detected with the trained algorithm to determine whether it contains a spot. When a light spot is detected in the view field picture, the light spot image is cropped according to the target detection frame output by the target detection algorithm, the cropped image is converted to grayscale and then binarized, contour recognition is performed on the converted image to determine the number of vertices, the shape of the light spot is determined based on the number of vertices, and if that shape is consistent with the preset pattern, the collected view field picture is judged to contain the preset pattern.
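As a rough, non-authoritative sketch of the shape check described above (the detection network that first locates the spot is not shown; the Otsu threshold and the 4% polygon-approximation tolerance are assumptions):

```python
import cv2
import numpy as np

def spot_matches_preset_pattern(spot_roi: np.ndarray, expected_vertices: int) -> bool:
    """Binarise the cropped light-spot image, extract its outer contour and
    compare the number of polygon vertices with the preset pattern."""
    gray = cv2.cvtColor(spot_roi, cv2.COLOR_BGR2GRAY)                 # grey-level conversion
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)    # binary conversion
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)           # contour recognition
    if not contours:
        return False
    contour = max(contours, key=cv2.contourArea)
    approx = cv2.approxPolyDP(contour, 0.04 * cv2.arcLength(contour, True), True)
    return len(approx) == expected_vertices                           # vertex count vs. preset pattern
```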
Furthermore, the handheld terminal is provided with a laser emitting hole matching the preset pattern, and emits laser forwards when a trigger instruction is received, so as to project a light spot with the shape of the preset pattern at the pointed cleaning position.
It should be noted that, in a specific application scenario, the user points at the area to be cleaned by triggering the handheld terminal within the field of view of the cleaning robot or the base station, and the handheld terminal emits laser forward so that a light spot of the preset pattern is projected onto the dirty floor. When the cleaning robot detects the preset pattern in its field of view, it determines the target coordinate corresponding to the position of the preset pattern and moves to that position to perform the cleaning task; when the base station detects the preset pattern in its field of view, it determines the target coordinate corresponding to the position of the preset pattern and sends the target coordinate to the cleaning robot, so that the cleaning robot moves to the position of the preset pattern to perform the cleaning task. In this way the cleaning robot cleans wherever the user points, the user does not need to steer the cleaning robot step by step, and the cleaning efficiency of the cleaning robot is improved.
Step S30: and if the preset pattern is contained, determining a target coordinate corresponding to the cleaning position according to the view field picture.
It should be understood that the preset pattern in the view field picture is analyzed, the distance and the angle between the cleaning position where the preset pattern is located and the current position of the terminal device are determined, and the target coordinate corresponding to the cleaning position is determined according to that distance and angle. The target coordinate may be the coordinate of the preset pattern in a coordinate system established with the terminal device as the center, or the coordinate of the preset pattern in a known map coordinate system.
In one implementation, the base station or the cleaning robot determines the target coordinate of the cleaning position and the initial coordinate and pose of the cleaning robot, determines the distance and the pose adjustment angle between the cleaning robot and the cleaning position according to the target coordinate and the initial coordinate, and the robot adjusts its pose according to the pose adjustment angle and moves over that distance to the cleaning position to perform the cleaning task. If it is the base station that determines the distance and the pose adjustment angle between the cleaning robot and the cleaning position, it transmits them to the cleaning robot, so that the cleaning robot adjusts its pose according to the pose adjustment angle and moves to the cleaning position according to the distance to perform the cleaning task.
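A minimal sketch of the distance and pose-adjustment computation mentioned above, assuming planar coordinates and a heading angle measured in the same map frame (the function and parameter names are illustrative, not from the patent):

```python
import math

def plan_move(start_xy, start_heading_rad, target_xy):
    """From the robot's initial coordinate/pose and the target coordinate of the
    cleaning position, derive the straight-line distance and the pose-adjustment angle."""
    dx = target_xy[0] - start_xy[0]
    dy = target_xy[1] - start_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)  # direction of the cleaning position in the map frame
    # pose-adjustment angle, normalised to [-pi, pi)
    turn = (bearing - start_heading_rad + math.pi) % (2 * math.pi) - math.pi
    return distance, turn
```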
Preferably, in order to prevent the cleaning robot from colliding with obstacles in the cleaning area, when the cleaning robot determines or acquires the target coordinate corresponding to the cleaning position, a movement track towards the target coordinate is generated according to the obstacles in the known map coordinate system, and the cleaning robot follows that track to the cleaning position to perform the cleaning task.
It should be noted that, if the cleaning robot or the base station detects a plurality of preset patterns in the view field picture (the user points to a plurality of cleaning positions through a plurality of handheld terminals), the distances between the cleaning positions and the cleaning robot are determined according to the target coordinates corresponding to the preset patterns and the initial coordinate of the cleaning robot; the target coordinate closest to the cleaning robot is selected as the first cleaning coordinate, the target coordinate closest to the first cleaning coordinate is selected from the remaining target coordinates as the second cleaning coordinate, and so on, yielding a sequence of cleaning coordinates, and the cleaning robot visits the cleaning positions in that order to perform the cleaning task, as sketched below.
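The ordering just described is a greedy nearest-neighbour sequence; a short sketch (names are illustrative) could look like this:

```python
import math

def order_cleaning_positions(robot_xy, targets):
    """Pick the target closest to the robot first, then the target closest to the
    previously chosen cleaning coordinate, and so on, as described above."""
    remaining = list(targets)
    ordered = []
    current = robot_xy
    while remaining:
        nearest = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nearest)
        ordered.append(nearest)
        current = nearest
    return ordered
```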
In a specific implementation, the terminal device of this embodiment is a cleaning robot or a base station:
if the terminal equipment is a cleaning robot, the cleaning robot collects a view field picture in real time, determines a target coordinate corresponding to a cleaning position according to the collected view field picture, moves to the cleaning position according to the target coordinate and executes a cleaning task;
if the terminal equipment is a base station, the base station collects a view field picture in real time, and after a target coordinate corresponding to a cleaning position is determined according to the collected view field picture, the target coordinate is sent to the cleaning robot, so that the cleaning robot moves to the cleaning position according to the target coordinate, and a cleaning task is executed.
This embodiment collects the view field picture in real time; judges whether each collected view field picture contains a preset pattern, the preset pattern being generated when the user points to a cleaning position through the handheld terminal; and, if the preset pattern is contained, determines the target coordinate corresponding to the cleaning position according to the view field picture. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, and the cleaning robot identifies the preset pattern in the view field picture to determine the coordinate of the cleaning position, so that the cleaning robot can quickly and automatically move to the cleaning position designated by the user and clean it; the walking direction of the robot does not need to be controlled manually step by step, the cleaning efficiency of the cleaning robot is improved, and the user experience is enhanced.
Referring to fig. 3, fig. 3 is a flowchart illustrating a cleaning position determining method according to a second embodiment of the present invention.
Based on the first embodiment, in the cleaning position determining method of this embodiment, before the determining the target coordinate corresponding to the cleaning position according to the view field picture, the method further includes:
step S301: and acquiring the relevant information of the initial position corresponding to the current position of the terminal equipment in the target coordinate system.
It will be appreciated that the target coordinate system may be a map coordinate system known to the cleaning robot and the base station, or another known coordinate system such as the world coordinate system. The terminal device determines the start position related information of its current position in the target coordinate system through sensors installed on it, wherein the start position related information includes a start coordinate and a start direction.
The determining the target coordinates corresponding to the cleaning position according to the view field picture includes:
step S302: and determining cleaning position related information corresponding to the cleaning position according to the view field picture.
It should be noted that the cleaning position related information may be a cleaning distance and a cleaning angle between the cleaning position and the current position of the terminal device. In one implementation, when the handheld terminal projects the laser forward under the user's trigger, it obtains the distance between the handheld terminal and the laser drop point and adjusts the size of the laser emitting hole according to that distance, so as to project a light spot of a certain size on the ground; this size of the light spot is taken as a preset size. The terminal device collects the view field picture through a monocular camera, performs perspective correction on the preset pattern in the view field picture, determines the corrected pixel size, converts the pixel size based on the pre-calibrated intrinsic and extrinsic parameters of the camera to determine the actual size corresponding to the preset pattern, and determines the cleaning distance and the cleaning angle between the cleaning position and the current position of the terminal device according to the actual size and the preset size.
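One simple way such a size-based range estimate could be realised is the pinhole relation below; this is only an illustrative approximation under the assumption that the spot's physical size and the camera focal length (in pixels) are known, and it is not presented as the patent's exact computation:

```python
def monocular_distance(preset_size_m: float, pixel_size_px: float,
                       focal_length_px: float) -> float:
    """Estimate the distance to the light spot from its known preset physical
    size and its apparent pixel size after perspective correction."""
    return focal_length_px * preset_size_m / pixel_size_px
```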
Optionally, the view field picture is acquired by using a binocular camera, and the view field picture comprises a first reference image and a second reference image;
the step S302 includes: determining a first pixel position corresponding to the preset pattern according to the first reference image, and determining a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image; determining a second pixel position corresponding to the preset pattern according to the second reference image, and determining a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image; and determining cleaning position related information corresponding to the cleaning position according to the first deviation center value, the second deviation center value and a preset binocular center distance.
It should be understood that the binocular camera is calibrated in advance, and the focal length f, the binocular center distance T and the image center lines corresponding to the binocular camera are determined and stored. The binocular camera of the terminal device acquires a first reference image and a second reference image at a certain frequency, determines a first pixel position corresponding to a reference point of the preset pattern in the first reference image, and determines a second pixel position corresponding to the reference point of the preset pattern in the second reference image; optionally, the reference point of the preset pattern is the upper, lower, left or right vertex or the center point of the preset pattern. When analyzing the preset pattern in the view field picture, the focal length f, the binocular center distance T and the image center lines are read from the preset storage area, the first off-center value x_l and the second off-center value x_r are determined from the image center lines respectively corresponding to the two cameras, and the cleaning distance Z corresponding to the cleaning position is determined from the first off-center value x_l, the second off-center value x_r, the focal length f and the preset binocular center distance T through the following formula:

Z = (f × T) / (x_l − x_r)
it should be noted that a conversion matrix between a pixel coordinate system corresponding to any one of the binocular cameras and a mobile terminal coordinate system is calibrated in advance, the first pixel position or the second pixel position is mapped to the mobile terminal coordinate system based on the conversion matrix and the cleaning distance Z, and a cleaning angle between the cleaning position and the current position of the mobile terminal is determined.
Step S303: and determining the target coordinate of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information.
Specifically, the start position related information includes a start coordinate and a start direction of the current position of the terminal device in the target coordinate system, and the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and the current position of the terminal device;
the step S303 includes: determining a reference angle of a target straight line between the cleaning position and the current position of the terminal device in the target coordinate system according to the starting direction and the cleaning angle, wherein the reference angle is an angle between the target straight line and a coordinate axis of the target coordinate system; determining a relative coordinate of the cleaning position between the target coordinate system and the starting coordinate according to the cleaning distance and the reference angle; and determining a target coordinate of the cleaning position in the target coordinate system according to the starting coordinate and the relative coordinate.
It should be understood that, referring to fig. 4, fig. 4 is a first coordinate diagram of a cleaning position according to an embodiment of the cleaning position determining method of the present invention, and the coordinate system in fig. 4 is a known target coordinate system (e.g. a map coordinate system). The start coordinate corresponding to the current position M of the terminal device is (a, b), and the corresponding start direction is the direction of the straight line MC. The terminal device detects the preset pattern in its field of view (the dotted sector area in fig. 4); the cleaning distance between the cleaning position A and the current position M of the terminal device is |MA|, and the cleaning angle is θ. From the angle α corresponding to the start direction MC and the cleaning angle θ, the reference angle ∠AMA1 between the target straight line MA and the coordinate axis of the target coordinate system can be determined, and the relative coordinates of the cleaning position with respect to the start coordinate in the target coordinate system are determined based on the following formulas:

|MA1| = |MA| × cos(∠AMA1)

|MA2| = |MA| × sin(∠AMA1)

According to the start coordinate (a, b) and the relative coordinates (|MA1|, |MA2|), the target coordinate of the cleaning position in the target coordinate system is determined as (a + |MA1|, b + |MA2|).
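The projection just described amounts to a polar-to-Cartesian conversion followed by a translation; a small sketch, with the sign convention for combining α and θ treated as an assumption:

```python
import math

def target_in_map_frame(start_xy, start_dir_rad, cleaning_angle_rad, cleaning_dist):
    """Combine the start direction and the cleaning angle into the reference angle
    of line MA, project the cleaning distance onto the map axes, then add the
    start coordinate (a, b)."""
    reference_angle = start_dir_rad + cleaning_angle_rad   # sign convention is an assumption
    rel_x = cleaning_dist * math.cos(reference_angle)      # |MA1|
    rel_y = cleaning_dist * math.sin(reference_angle)      # |MA2|
    return start_xy[0] + rel_x, start_xy[1] + rel_y        # (a + |MA1|, b + |MA2|)
```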
This embodiment collects the view field picture in real time; judges whether each collected view field picture contains a preset pattern, the preset pattern being generated when the user points to a cleaning position through the handheld terminal; if the preset pattern is contained, acquires the start position related information corresponding to the current position of the terminal device in the target coordinate system; determines the cleaning position related information corresponding to the cleaning position according to the view field picture; and determines the target coordinate of the cleaning position in the target coordinate system according to the start position related information and the cleaning position related information. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, the cleaning robot identifies the preset pattern in the view field picture to determine the cleaning position information and determines the coordinate of the cleaning position from the known start position and the cleaning position information, so that the cleaning robot can locate both itself and the cleaning position in a known coordinate system and quickly and automatically move to the cleaning position designated by the user to clean it; the walking direction of the robot does not need to be controlled manually step by step, the cleaning efficiency of the cleaning robot is improved, and the user experience is enhanced.
Referring to fig. 5, fig. 5 is a flowchart illustrating a cleaning position determining method according to a third embodiment of the present invention.
Based on the first embodiment, in the cleaning position determining method of this embodiment, the determining the target coordinate corresponding to the cleaning position according to the view field picture includes:
step S304: and determining cleaning position related information corresponding to the cleaning position according to the view field picture.
It can be understood that one implementation of step S304 in this embodiment can refer to step S302 in the second embodiment. In one implementation, a monocular camera is used to acquire the view field picture, perspective correction is performed on the preset pattern in the view field picture, the corrected pixel size is determined, the pixel size is converted based on the pre-calibrated intrinsic and extrinsic parameters of the camera to determine the actual size corresponding to the preset pattern, and the cleaning distance and the cleaning angle between the cleaning position and the current position of the terminal device are determined according to the actual size and the preset size.
Preferably, the view field picture is acquired by using a binocular camera, and the view field picture comprises a first reference image and a second reference image;
the step S304 includes: determining a first pixel position corresponding to the preset pattern according to the first reference image, and determining a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image; determining a second pixel position corresponding to the preset pattern according to the second reference image, and determining a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image; and determining cleaning position related information corresponding to the cleaning position according to the first deviation center value, the second deviation center value and a preset binocular center distance.
Step S305: and determining target coordinates of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is a coordinate system established by taking the current position of the terminal equipment as an origin.
Specifically, the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and a position where the terminal device is currently located;
the step S305 includes: determining a first reference coordinate value and a second reference coordinate value according to the cleaning distance and the cleaning angle; and determining the target coordinate of the cleaning position in a reference coordinate system according to the determined first reference coordinate value and second reference coordinate value.
It should be noted that, referring to fig. 6, fig. 6 is a second coordinate diagram of a cleaning position according to an embodiment of the cleaning position determining method of the present invention. The coordinate system in fig. 6 is established by taking the current position O of the terminal device as the origin and the direction of the terminal device as a coordinate axis. The cleaning distance between the cleaning position B and the current position O of the terminal device is |OB|, and the cleaning angle is θ. The first reference coordinate value |OB1| and the second reference coordinate value |OB2| are determined from the cleaning distance |OB| and the cleaning angle θ through the following formulas:

|OB1| = |OB| × cos(θ)

|OB2| = |OB| × sin(θ)

The target coordinate of the cleaning position in the reference coordinate system is thus (|OB1|, |OB2|).
Further, when the terminal device is a base station, the base station determines the target coordinate of the cleaning position in the base-station coordinate system according to the view field picture within its field of view, communicates with the cleaning robot through a positioning device (e.g. ultra-wideband, UWB) to determine the distance and the angle between the base station and the cleaning robot, converts the target coordinate into the cleaning robot coordinate system according to that distance and angle to obtain a final cleaning coordinate, and sends the final cleaning coordinate to the cleaning robot, so that the cleaning robot moves to the cleaning position according to its own coordinate system and the final cleaning coordinate and performs the cleaning task.
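As an illustrative sketch only, the base-station-to-robot conversion could be a planar rigid transform built from the UWB-measured range and bearing plus the robot's heading; how the heading is obtained is not specified here and is an assumption of the sketch:

```python
import math

def base_to_robot_frame(target_in_base, robot_range_m, robot_bearing_rad, robot_heading_rad):
    """Re-express a cleaning coordinate given in the base-station frame in the
    cleaning robot's own frame."""
    # robot position in the base-station frame, from the UWB range and bearing
    rx = robot_range_m * math.cos(robot_bearing_rad)
    ry = robot_range_m * math.sin(robot_bearing_rad)
    # translate so the robot becomes the origin
    dx = target_in_base[0] - rx
    dy = target_in_base[1] - ry
    # rotate by the robot's heading so the axes align with the robot frame
    cos_h, sin_h = math.cos(robot_heading_rad), math.sin(robot_heading_rad)
    return cos_h * dx + sin_h * dy, -sin_h * dx + cos_h * dy
```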
This embodiment collects the view field picture in real time; judges whether each collected view field picture contains a preset pattern, the preset pattern being generated when the user points to a cleaning position through the handheld terminal; if the preset pattern is contained, determines the cleaning position related information corresponding to the cleaning position according to the view field picture; and determines the target coordinate of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is established by taking the current position of the terminal device as the origin. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, the cleaning robot identifies the preset pattern in the view field picture to determine the cleaning position information and determines the coordinate of the cleaning position in its own coordinate system from that information, so that the cleaning robot can quickly and automatically move, based on its own coordinate system, to the cleaning position designated by the user and clean it; the walking direction of the robot does not need to be controlled manually step by step, the cleaning efficiency of the cleaning robot is improved, and the user experience is enhanced.
Furthermore, an embodiment of the present invention also provides a storage medium having a cleaning position determining program stored thereon, where the cleaning position determining program is executed by a processor to implement the cleaning position determining method as described above.
Since the storage medium adopts all technical solutions of all the embodiments, at least all the beneficial effects brought by the technical solutions of the embodiments are achieved, and no further description is given here.
Referring to fig. 7, fig. 7 is a block diagram showing the structure of the first embodiment of the cleaning position determining apparatus of the present invention.
As shown in fig. 7, a cleaning position determining apparatus according to an embodiment of the present invention includes:
and the acquisition module 10 is used for acquiring the view field pictures in real time.
The judging module 20 is configured to judge whether the field-of-view picture acquired each time includes a preset pattern, where the preset pattern is generated when the user points to the cleaning position through the handheld terminal.
A determining module 30, configured to determine, if the preset pattern is included, a target coordinate corresponding to the cleaning position according to the view picture.
It should be understood that the above is only an example, and the technical solution of the present invention is not limited in any way, and in a specific application, a person skilled in the art may set the technical solution as needed, and the present invention is not limited thereto.
This embodiment collects the view field picture in real time; judges whether each collected view field picture contains a preset pattern, the preset pattern being generated when the user points to a cleaning position through the handheld terminal; and, if the preset pattern is contained, determines the target coordinate corresponding to the cleaning position according to the view field picture. In this way, the user marks the cleaning position with the preset pattern through the handheld terminal, and the cleaning robot identifies the preset pattern in the view field picture to determine the coordinate of the cleaning position, so that the cleaning robot can quickly and automatically move to the cleaning position designated by the user and clean it; the walking direction of the robot does not need to be controlled manually step by step, the cleaning efficiency of the cleaning robot is improved, and the user experience is enhanced.
It should be noted that the above-described work flows are only exemplary, and do not limit the scope of the present invention, and in practical applications, a person skilled in the art may select some or all of them to achieve the purpose of the solution of the embodiment according to actual needs, and the present invention is not limited herein.
In addition, the technical details that are not elaborated in this embodiment may refer to the cleaning position determining method provided by any embodiment of the present invention, and are not described herein again.
In an embodiment, the cleaning position determining apparatus further comprises an acquisition module;
the acquisition module is used for acquiring the relevant information of the initial position corresponding to the current position of the terminal equipment in the target coordinate system;
the determining module 30 is further configured to determine, according to the view field picture, cleaning position related information corresponding to the cleaning position; and determining the target coordinate of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information.
In an embodiment, the start position related information includes start coordinates and a start direction of a position where the terminal device is currently located in the target coordinate system, and the cleaning position related information includes a cleaning distance and a cleaning angle between the cleaning position and the position where the terminal device is currently located;
the determining module 30 is further configured to determine, according to the starting direction and the cleaning angle, a reference angle at which a target straight line between the cleaning position and a current position of the terminal device is located in the target coordinate system, where the reference angle is an angle between the target straight line and a coordinate axis of the target coordinate system; determining a relative coordinate of the cleaning position between the target coordinate system and the starting coordinate according to the cleaning distance and the reference angle; and determining a target coordinate of the cleaning position in the target coordinate system according to the starting coordinate and the relative coordinate.
In an embodiment, the determining module 30 is further configured to determine, according to the view picture, cleaning position related information corresponding to the cleaning position; and determining target coordinates of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is a coordinate system established by taking the current position of the terminal equipment as an origin.
In one embodiment, the cleaning position related information comprises a cleaning distance and a cleaning angle between the cleaning position and a position where the terminal device is currently located;
the determining module 30 is further configured to determine a first reference coordinate value and a second reference coordinate value according to the cleaning distance and the cleaning angle; and determine the target coordinate of the cleaning position in a reference coordinate system according to the determined first reference coordinate value and second reference coordinate value.
In one embodiment, the field of view picture is acquired by using a binocular camera, and the field of view picture comprises a first reference image and a second reference image;
the determining module 30 is further configured to determine a first pixel position corresponding to the preset pattern according to the first reference image, and determine a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image; determining a second pixel position corresponding to the preset pattern according to the second reference image, and determining a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image; and determining cleaning position related information corresponding to the cleaning position according to the first deviation center value, the second deviation center value and a preset binocular center distance.
In one embodiment, the handheld terminal is provided with a laser emitting hole matched with a preset pattern, and when a trigger instruction is received, laser is emitted forwards so as to project a light spot with the shape of the preset pattern at a pointed cleaning position.
Further, it is to be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or system in which the element is included.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g. Read Only Memory (ROM)/RAM, magnetic disk, optical disk), and includes several instructions for enabling a terminal device (e.g. a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A cleaning position determining method, characterized in that the method comprises:
collecting a view field picture in real time;
judging whether a preset pattern is contained in each collected view field picture or not, wherein the preset pattern is generated when a user points to a cleaning position through a handheld terminal;
and if the preset pattern is contained, determining a target coordinate corresponding to the cleaning position according to the view field picture.
2. The cleaning position determining method according to claim 1, wherein before determining the target coordinates corresponding to the cleaning position from the view field picture, the method further comprises:
acquiring initial position related information corresponding to the current position of the terminal equipment in a target coordinate system;
the determining the target coordinates corresponding to the cleaning position according to the view field picture includes:
determining cleaning position related information corresponding to the cleaning position according to the view field picture;
and determining the target coordinate of the cleaning position in the target coordinate system according to the starting position related information and the cleaning position related information.
3. The cleaning position determining method according to claim 2, wherein the start position-related information includes a start coordinate and a start direction, in the target coordinate system, of the position at which the terminal device is currently located, and the cleaning position-related information includes a cleaning distance and a cleaning angle between the cleaning position and the position at which the terminal device is currently located;
the determining the target coordinate of the cleaning position in the target coordinate system according to the start position-related information and the cleaning position-related information includes:
determining a reference angle of a target straight line between the cleaning position and the current position of the terminal device in the target coordinate system according to the starting direction and the cleaning angle, wherein the reference angle is an angle between the target straight line and a coordinate axis of the target coordinate system;
determining a relative coordinate of the cleaning position between the target coordinate system and the starting coordinate according to the cleaning distance and the reference angle;
and determining a target coordinate of the cleaning position in the target coordinate system according to the starting coordinate and the relative coordinate.
4. The method for determining the cleaning position according to claim 1, wherein the determining the target coordinates corresponding to the cleaning position according to the view field picture comprises:
determining cleaning position related information corresponding to the cleaning position according to the view field picture;
and determining target coordinates of the cleaning position in a reference coordinate system according to the cleaning position related information, wherein the reference coordinate system is a coordinate system established by taking the current position of the terminal equipment as an origin.
5. The cleaning position determination method according to claim 4, wherein the cleaning position-related information includes a cleaning distance and a cleaning angle between the cleaning position and a position at which the terminal device is currently located;
the determining the target coordinate of the cleaning position in the reference coordinate system according to the cleaning position related information comprises:
determining a first reference coordinate value and a second reference coordinate value according to the cleaning distance and the cleaning angle;
and determining the target coordinate of the cleaning position in a reference coordinate system according to the determined first reference coordinate value and second reference coordinate value.
6. The cleaning position determining method according to claim 2 or 4, wherein the field-of-view pictures are acquired using a binocular camera, the field-of-view pictures including a first reference image and a second reference image;
the determining the cleaning position related information corresponding to the cleaning position according to the view field picture includes:
determining a first pixel position corresponding to the preset pattern according to the first reference image, and determining a first off-center value corresponding to the first pixel position according to an image center line corresponding to the first reference image;
determining a second pixel position corresponding to the preset pattern according to the second reference image, and determining a second off-center value corresponding to the second pixel position according to an image center line corresponding to the second reference image;
and determining the cleaning position related information corresponding to the cleaning position according to the first off-center value, the second off-center value and a preset binocular center distance.
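For illustration, the binocular geometry of claim 6 can be sketched with the standard stereo relations; the focal length parameter and the sign conventions for the two off-center values are assumptions, since the claim does not spell them out.

import math

def distance_and_angle_from_stereo(first_off_center_px, second_off_center_px,
                                   binocular_center_distance_m, focal_length_px):
    """Sketch of claim 6: recover the cleaning distance and cleaning angle
    from the horizontal offsets of the preset pattern relative to the image
    center line of the first and second reference images."""
    disparity = first_off_center_px - second_off_center_px
    if disparity <= 0:
        raise ValueError("pattern not resolvable from this image pair")

    # Standard stereo relation: depth = focal length * baseline / disparity.
    depth = focal_length_px * binocular_center_distance_m / disparity

    # Lateral offset of the pattern relative to the midpoint between the two cameras.
    lateral = depth * (first_off_center_px + second_off_center_px) / (2.0 * focal_length_px)

    cleaning_distance = math.hypot(depth, lateral)
    cleaning_angle = math.degrees(math.atan2(lateral, depth))
    return cleaning_distance, cleaning_angle

In this sketch the preset binocular center distance plays the role of the stereo baseline and the two off-center values together give the disparity; the rest is ordinary triangulation.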
7. The cleaning position determination method according to any one of claims 1 to 5, wherein the handheld terminal is provided with a laser emitting hole matching the preset pattern, and emits laser light forward upon receiving a trigger instruction, so as to project a spot in the shape of the preset pattern at the cleaning position pointed to.
8. A cleaning position determining apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring a view field picture in real time;
the judging module is used for judging whether a preset pattern is contained in each collected view field picture, and the preset pattern is generated when a user points to a cleaning position through a handheld terminal;
and the determining module is used for determining a target coordinate corresponding to the cleaning position according to the view field picture if the preset pattern is included.
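As a structural illustration of the apparatus in claim 8, the three modules can be sketched as methods of one class. Treating the laser-projected pattern as a bright blob, the brightness threshold, and the grayscale frame format are all assumptions, not details from the claim.

import numpy as np

class CleaningPositionDeterminer:
    """Sketch of claim 8: acquisition, judging and determining modules."""

    def __init__(self, camera, brightness_threshold=240):
        self.camera = camera                        # any source of grayscale frames (assumed)
        self.brightness_threshold = brightness_threshold

    def acquire(self):
        """Acquisition module: grab one view field picture in real time."""
        return self.camera.read()                   # assumed to return a 2-D ndarray

    def contains_preset_pattern(self, frame):
        """Judging module: does the frame contain the pattern projected when
        the user points the handheld terminal at a cleaning position?"""
        return bool((frame >= self.brightness_threshold).any())

    def determine_target_coordinate(self, frame):
        """Determining module: locate the pattern and return its pixel centroid,
        which the conversions of claims 3 to 6 would turn into a coordinate."""
        mask = frame >= self.brightness_threshold
        if not mask.any():
            return None
        ys, xs = np.nonzero(mask)
        return float(xs.mean()), float(ys.mean())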
9. A storage medium characterized in that the storage medium has stored thereon a cleaning position determination program that, when executed by a processor, implements a cleaning position determination method according to any one of claims 1 to 7.
10. A terminal device, characterized in that the device comprises: a memory, a processor, and a cleaning position determination program stored on the memory and executable on the processor, the cleaning position determination program configured to implement the cleaning position determination method of any one of claims 1 to 7;
wherein if the terminal device is a cleaning robot, the cleaning robot collects a view field picture in real time, determines a target coordinate corresponding to a cleaning position according to the collected view field picture, moves to the cleaning position according to the target coordinate, and executes a cleaning task;
and if the terminal device is a base station, the base station collects a view field picture in real time, determines a target coordinate corresponding to a cleaning position according to the collected view field picture, and then sends the target coordinate to the cleaning robot, so that the cleaning robot moves to the cleaning position according to the target coordinate and executes the cleaning task.
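The two deployments in claim 10 differ only in what happens once a target coordinate is available, which a short dispatch sketch can make concrete. The device_role values and the robot interface (move_to, perform_cleaning_task, send_target) are illustrative names, and determiner is assumed to expose a determine_target_coordinate method such as the one sketched after claim 8, followed by the coordinate conversions of claims 3 to 6.

def handle_view_field_picture(device_role, frame, determiner, robot):
    """Sketch of claim 10: the same detection pipeline runs on either device."""
    target = determiner.determine_target_coordinate(frame)
    if target is None:
        return  # no preset pattern in this frame

    if device_role == "cleaning_robot":
        # The robot itself moves to the cleaning position and cleans there.
        robot.move_to(target)
        robot.perform_cleaning_task()
    elif device_role == "base_station":
        # The base station only forwards the target coordinate to the robot.
        robot.send_target(target)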
CN202210419052.7A 2022-04-21 2022-04-21 Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium Active CN114515124B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210419052.7A CN114515124B (en) 2022-04-21 2022-04-21 Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114515124A (en) 2022-05-20
CN114515124B (en) 2022-07-22

Family

ID=81600128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210419052.7A Active CN114515124B (en) 2022-04-21 2022-04-21 Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114515124B (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020027652A1 (en) * 2000-06-29 2002-03-07 Paromtchik Igor E. Method for instructing target position for mobile body, method for controlling transfer thereof, and method as well as system of optical guidance therefor
CN1579714A (en) * 2003-08-01 2005-02-16 三星电子株式会社 Robot system and control method thereof
JP2005349497A (en) * 2004-06-08 2005-12-22 Sharp Corp Self-propelled robot
US8606404B1 (en) * 2009-06-19 2013-12-10 Bissell Homecare, Inc. System and method for controlling a cleaning apparatus
CN102846280A (en) * 2011-06-28 2013-01-02 德国福维克控股公司 Self-propelled device and method for guiding such a device
CN106231971A (en) * 2014-02-28 2016-12-14 三星电子株式会社 Sweeping robot and the remote controllers being incorporated herein
CN105078364A (en) * 2014-05-20 2015-11-25 Lg电子株式会社 Cleaner
CN108403009A (en) * 2018-03-08 2018-08-17 徐志强 A kind of sweeping robot and its control method
CN108888187A (en) * 2018-05-31 2018-11-27 四川斐讯信息技术有限公司 A kind of sweeping robot based on depth camera
CN108514389A (en) * 2018-06-04 2018-09-11 赵海龙 A kind of control method of intelligent cleaning equipment
CN112617674A (en) * 2019-10-18 2021-04-09 上海善解人意信息科技有限公司 Sweeping robot system and sweeping robot control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张立川 et al., "自主水下航行器导航与控制技术" [Navigation and Control Technology of Autonomous Underwater Vehicles], 31 October 2020, 上海科学技术出版社 *

Also Published As

Publication number Publication date
CN114515124B (en) 2022-07-22

Similar Documents

Publication Publication Date Title
CN108885459B (en) Navigation method, navigation system, mobile control system and mobile robot
WO2021103987A1 (en) Control method for sweeping robot, sweeping robot, and storage medium
CN108297115B (en) Autonomous repositioning method for robot
KR100703692B1 (en) System, apparatus and method for improving readability of a map representing objects in space
JP5736622B1 (en) Detection device and operation control of manipulator equipped with the device
US20100245767A1 (en) Eye-tracking method and eye-tracking system for implementing the same
CN109528089B (en) Method, device and chip for continuously walking trapped cleaning robot
TWI502956B (en) Systems and methods for resuming capture of a base image of an object by a mobile scanner
JPWO2003102706A1 (en) Remote operation robot and robot self-position identification method
CN114160507B (en) Laser cleaning path automatic planning method based on multiple sensing detection
JP5775965B2 (en) Stereo camera system and moving body
JP7507964B2 (en) Method and apparatus for adjusting shelf position and orientation by a mobile robot
EP3836084B1 (en) Charging device identification method, mobile robot and charging device identification system
EP3127586B1 (en) Interactive system, remote controller and operating method thereof
CN112716401B (en) Obstacle-detouring cleaning method, device, equipment and computer-readable storage medium
US20200193698A1 (en) Robotic 3d scanning systems and scanning methods
CN111761159B (en) Automatic control method and system for welding equipment, welding robot and storage medium
CN113284178A (en) Object stacking method and device, computing equipment and computer storage medium
CN112748721A (en) Visual robot and cleaning control method, system and chip thereof
CN113760131B (en) Projection touch processing method and device and computer readable storage medium
CN114515124B (en) Cleaning position determining method, cleaning position determining device, cleaning position determining equipment and storage medium
CN112014830A (en) Radar laser reflection and filtering method, sweeping robot, equipment and storage medium
JP2000254883A (en) Tool device, tool recognizing method, and automatic operation device
EP4177837A1 (en) Marker detection apparatus and robot teaching system
CN110946512A (en) Sweeping robot control method and device based on laser radar and camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant