CN113440050B - Cleaning method and device for interaction of AR equipment and sweeper and computer equipment - Google Patents

Cleaning method and device for interaction of AR equipment and sweeper, and computer equipment

Info

Publication number
CN113440050B
CN113440050B (application CN202110523060.1A)
Authority
CN
China
Prior art keywords
target
sweeper
cleaning
user
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110523060.1A
Other languages
Chinese (zh)
Other versions
CN113440050A (en)
Inventor
杨滨豪
Current Assignee
Shenzhen Water World Co Ltd
Original Assignee
Shenzhen Water World Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Water World Co Ltd
Priority to CN202110523060.1A
Publication of CN113440050A
Application granted
Publication of CN113440050B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002: Installations of electric equipment
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01: Indexing scheme relating to G06F3/01
    • G06F2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The invention relates to the technical field of sweeping-robot cleaning control and provides a cleaning method for interaction between an AR device and a sweeper, comprising the following steps: when the AR device is started, it accesses an edge device; the AR device scans the current environment, acquires a three-dimensional map and its own position, recorded as absolute position B, and acquires the sweepers to be selected that are communicatively connected to the edge device; the three-dimensional map is uploaded to the edge device so that each sweeper to be selected can obtain its absolute position A through the edge device; the visual position of the user wearing the AR device is calculated and, based on absolute position B, matched against each absolute position A to obtain the target sweeper corresponding to the successfully matched absolute position A; and the target sweeper is controlled to travel to the target cleaning area and clean it. The invention thereby solves the technical problems of the prior art that the user must operate manually, depends on selecting the corresponding sweeper by hand, and must follow cumbersome steps, so that the wrong sweeper is easily selected.

Description

Cleaning method and device for interaction of AR equipment and sweeper and computer equipment
Technical Field
The invention relates to the technical field of sweeping-robot cleaning control, and in particular to a cleaning method and device for interaction between an AR device and a sweeper, and to computer equipment.
Background
With rising living standards and rapid technological development, sweeping robots have brought convenience to daily life, freeing people's hands to some extent and reducing their workload. At present, however, most sweeping robots are driven through an APP client that steers the robot forward, backward, left and right: the user opens the APP and manually drives the robot to the area that needs cleaning. This usually requires manual operation and is very cumbersome. Moreover, in actual use the user must manually select the corresponding sweeper in the APP and interact through a two-dimensional map on the mobile phone, a process that is complicated, requires tedious steps, is neither direct nor convenient, and easily leads to repeated misoperation.
Disclosure of Invention
The invention mainly aims to provide a cleaning method and device for interaction between an AR device and a sweeper, and computer equipment, so as to solve the prior-art problems that the user must operate manually, select the corresponding sweeper by hand, and follow cumbersome steps, which is neither direct nor convenient and easily leads to misoperation.
The invention provides a cleaning method for interaction of AR equipment and a sweeper, which is applied to the AR equipment and comprises the following steps:
when the AR device is started, accessing the edge device;
the AR device scans the current environment, acquires the three-dimensional map generated by the scan and the position of the AR device, records that position as absolute position B, and acquires the sweepers to be selected that are communicatively connected to the edge device;
uploading the three-dimensional map to the edge device, so that each sweeper to be selected can acquire the three-dimensional map from the edge device and relocalize within it to obtain its own position in the three-dimensional map, or acquire its real-time position in the three-dimensional map while moving, and record that position as absolute position A;
calculating the visual position of a user wearing the AR equipment, and matching the visual position with the absolute position A based on the absolute position B to obtain a target sweeper corresponding to the absolute position A which is successfully matched;
and controlling the target sweeper to run to a target sweeping area for sweeping.
Further, the step of calculating the visual position of the user wearing the AR device, and matching the visual position with the absolute position a based on the absolute position B to obtain the target sweeper corresponding to the absolute position a successfully matched includes:
acquiring the relative position of the visual position of the user relative to the absolute position B, and recording as a relative position C;
calculating an absolute position A' according to the absolute position B and the relative position C;
matching the absolute position A' with each absolute position A one by one; wherein matching means judging whether the absolute position A' and the absolute position A are within a preset distance range of each other;
and if the absolute position A which is within the preset distance range with the absolute position A' is matched, obtaining the target sweeper corresponding to the absolute position A which is successfully matched.
Further, when there are multiple target sweepers, after the step of calculating the visual position of the user wearing the AR device and matching the visual position with the absolute position A based on the absolute position B to obtain the target sweeper corresponding to the successfully matched absolute position A, the method further includes:
acquiring the one sweeper among the target sweepers that is selected by the user.
Further, before the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method further includes:
acquiring the intersection point formed on the ground between the user's visual line of sight and the connecting line, and taking the intersection point as the midpoint; wherein the connecting line is the line between a finger and the ground;
and acquiring the figure selected by the user and the size of the figure relative to the midpoint, and obtaining the target cleaning area from the midpoint, the selected figure and its size relative to the midpoint.
Further, the step of obtaining a figure selected by a user and a size of the figure relative to the midpoint and obtaining a target cleaning area according to the midpoint, the figure selected by the user and the size of the figure relative to the midpoint includes:
acquiring, within each set duration, the figure selected by the user and the size of the figure relative to the midpoint, and obtaining a target range from the midpoint, the selected figure and its size relative to the midpoint; the midpoint is the intersection formed on the ground between the connecting line and the visual line of sight acquired within that set duration;
calculating the areas of the plurality of target ranges and sorting them from high to low or from low to high;
and taking the sorted target ranges as the target cleaning areas.
Further, before the step of obtaining an intersection point formed by a visual line of a user and a connecting line on the ground and taking the intersection point as a midpoint, the method further comprises:
acquiring the length of time for which the user visually observes the target sweeper, and judging whether that duration exceeds a first time length;
if the duration does not exceed the first time length, the target sweeper is not selected;
if the duration exceeds the first time length, broadcasting that the user has selected the target sweeper and broadcasting a prompt to start selecting the target cleaning area.
Further, the step of controlling the target sweeper to move to the target sweeping area for sweeping includes:
acquiring the cleaning mode and strength level identified for the target cleaning area;
and uploading the target cleaning area together with its cleaning mode and strength level to the edge device, so that the target sweeper can obtain them from the edge device, travel to the target cleaning area, and clean it with the corresponding cleaning mode and strength level.
Further, before the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method further includes:
recording a cleaning picture before cleaning is started;
after the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method further comprises the following steps:
recording the cleaning picture again after cleaning;
and acquiring the cleaning picture recorded before the target sweeper started cleaning and the cleaning picture recorded after cleaning finished, and displaying both pictures in front of the user's eyes.
Further, after the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method includes:
acquiring a signal, sent by the target sweeper, indicating that it cannot determine the target cleaning area;
reacquiring the intersection point formed on the ground between the user's visual line of sight and the connecting line, and taking the intersection point as the midpoint; wherein the connecting line is the line between a finger and the ground;
reacquiring the figure selected by the user and the size of the figure relative to the midpoint, and obtaining a new target cleaning area from the midpoint, the selected figure and its size relative to the midpoint;
and controlling the target sweeper to run to a new target sweeping area for sweeping.
The invention also provides a cleaning device for interaction between the AR equipment and the sweeper, which is applied to the AR equipment and comprises the following components:
the access module is used for accessing the edge device when the AR device is started;
the acquisition module is used for scanning the current environment with the AR device, acquiring the three-dimensional map generated by the scan and the position of the AR device, recording that position as absolute position B, and acquiring the sweepers to be selected connected to the edge device;
the uploading module is used for uploading the three-dimensional map to the edge device, so that each sweeper to be selected can acquire the three-dimensional map from the edge device and relocalize within it to obtain its own position in the three-dimensional map, or acquire its real-time position in the three-dimensional map while moving, and record that position as absolute position A;
the calculation module is used for calculating the visual position of a user wearing the AR equipment, and matching the visual position with the absolute position A based on the absolute position B to obtain a target sweeper corresponding to the absolute position A which is successfully matched;
and the cleaning module is used for controlling the target sweeper to run to a target cleaning area for cleaning.
The invention has the following beneficial effects. In the cleaning method, device and computer equipment for interaction between an AR device and a sweeper, when the user wears or otherwise uses the AR device, it accesses the edge device and scans the current environment to generate a three-dimensional map that the sweepers to be selected can reuse. Wearing the AR device, the user obtains the sweepers to be selected that are communicatively connected to the edge device, so that the visual position can be matched against the sweeper positions to obtain the target sweeper. The AR device uploads the three-dimensional map to the edge device, and each sweeper to be selected acquires the map from the edge device and relocalizes to obtain its position in the map; when the user selects a sweeper visually, that position is tracked in real time, and the current position of the sweeper to be selected is recorded as absolute position A. The AR device can therefore accurately select the target sweeper from the visual position and the sweeper positions, which is more direct and convenient, avoids misoperation and wrong or overlapping selections, and finally controls the target sweeper to travel to the target area for cleaning. With the invention, the user does not need to select the sweeper manually in an APP: the sweeper's position is tracked visually in real time, misoperation and wrong selection are unlikely, the operation is simple and convenient, and no manual APP operation is relied upon.
Drawings
Fig. 1 is a schematic flow chart of a method according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of an apparatus according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an internal structure of a computer device according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, the present invention provides a cleaning method for interaction between an AR device and a sweeper, which is applied to the AR device and includes:
s1, when the AR equipment is started, the edge equipment is accessed;
s2, the AR equipment scans the current environment, acquires a three-dimensional map generated by scanning the current environment and the position of the AR equipment, records the three-dimensional map as an absolute position B, and acquires a sweeper to be selected, which is in communication connection with the edge equipment;
s3, uploading the three-dimensional map to edge equipment, so that the sweeper to be selected can acquire the three-dimensional map from the edge equipment and reposition the three-dimensional map to obtain the position of the sweeper to be selected in the three-dimensional map, or so that the sweeper to be selected can acquire the real-time position of the sweeper to be selected in the three-dimensional map in the motion process and mark the real-time position as an absolute position A;
s4, acquiring an absolute position A of the sweeper to be selected in the edge equipment;
s5, calculating the visual position of a user wearing the AR equipment, and matching the visual position with the absolute position A based on the absolute position B to obtain a target sweeper corresponding to the absolute position A which is successfully matched;
and S6, controlling the target sweeper to run to a target sweeping area for sweeping.
As described in the foregoing steps S1-S2, when the user wears the AR (Augmented Reality) device, it accesses the edge device (or a cloud computing module), which is broadcast-controlled by the router. The started sweepers to be selected also access the edge device, which assigns each one an ID number (anything that conveniently and uniquely identifies a sweeper counts as an ID number here), so that multiple target sweepers can later be clearly distinguished. When the user wears the AR device, it scans the current environment to generate a three-dimensional map that the sweepers to be selected can reuse. The current environment is the room the user is in: the user scans the whole room by turning in place or walking around it, and for larger areas walks through all regions as far as possible to cover a wider range. The AR device worn by the user can acquire the ID numbers of the sweepers to be selected from the edge device, so that multiple target sweepers can be distinguished and cleaning areas assigned by ID number.
As described in the above steps S3-S4, the AR device uploads the three-dimensional map to the edge device. Each sweeper to be selected can then obtain the map from the edge device and relocalize to obtain its position in the map. If a sweeper to be selected starts to move, its position can be acquired in real time, either forming a trajectory with its previous positions or overwriting the previous position so that the sweeper always has a unique position; when the user selects a sweeper visually, its position must therefore be tracked in real time. The current position of the sweeper to be selected is recorded as absolute position A, which is necessarily its current position. When there are multiple sweepers to be selected, each one acquires its own absolute position A and associates that position with its assigned ID number, so that every sweeper to be selected carries an association between ID number and position. The AR device acquires these associations for all sweepers to be selected in real time, so it can accurately select the target sweeper from the visual position and the sweeper positions, obtain the target sweeper's ID number, and avoid wrong or overlapping selections.
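The ID and position bookkeeping described above can be pictured as a small registry held by the edge device: each candidate sweeper is assigned an ID on connection and reports its latest absolute position A, with newer reports overwriting older ones so every sweeper keeps a unique position. This is an illustrative sketch under those assumptions, not the patent's implementation; all names (`EdgeRegistry`, `report_position`) are invented for illustration, and positions are simplified to 2-D tuples.

```python
class EdgeRegistry:
    """Hypothetical sketch of the edge device's ID/position registry."""

    def __init__(self):
        self._next_id = 1
        self._positions = {}  # sweeper ID -> (x, y) absolute position A

    def register(self):
        """Assign an ID number to a newly connected sweeper to be selected."""
        sweeper_id = self._next_id
        self._next_id += 1
        return sweeper_id

    def report_position(self, sweeper_id, position):
        """Overwrite the stored position so each sweeper keeps a unique one."""
        self._positions[sweeper_id] = position

    def positions(self):
        """The ID-to-position associations, as read by the AR device."""
        return dict(self._positions)
```

A moving sweeper simply calls `report_position` repeatedly; the AR device polls `positions()` to track all sweepers in real time.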
As described in step S5, AR is a technology that skillfully fuses virtual information with the real world. It widely applies technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction and sensing, and superimposes computer-generated virtual information such as text, images, three-dimensional models, music and video onto the real world after simulation, so that the two kinds of information complement each other and the real world is "augmented".
In its initial started state a sweeper to be selected is stationary, and before the matching operation ID numbers can be assigned, so that when there are several sweepers to be selected, the one the user picks can be distinguished. Each ID number is bound to the sweeper's initial position or its real-time moving position: in the initial state the sweeper obtains its position through relocalization; while moving it acquires its current position in real time (which can be bound to the ID number) and either forms a trajectory with the previously stored positions or overwrites them, so that the sweeper keeps a unique position. After the user puts on the AR device, the device's recognition yields the user's visual position; the user's visual range generally reaches about 10 m. The visual position is matched against the positions of the sweepers to be selected to obtain the target sweeper, whose ID number is also obtained through the binding between position and ID number.
As described in step S6, before the target sweeper operates, a cleaning mode and strength level can be chosen for it. Once the AR device has acquired the target cleaning area together with its cleaning mode and strength level, it transmits them to the target sweeper for cleaning. Target sweepers can likewise be distinguished by ID number; after receiving the corresponding data, the target sweeper travels to the target cleaning area and cleans it with the corresponding cleaning mode and strength level.
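The data handed over in this step can be pictured as a simple task record that the AR device uploads to the edge device for the target sweeper (identified by ID number) to fetch. The field names and value formats below are assumptions for illustration, not part of the patent.

```python
def make_cleaning_task(sweeper_id, target_area, mode, strength):
    """Bundle the target cleaning area, cleaning mode and strength level
    for one target sweeper, addressed by its ID number (illustrative)."""
    return {
        "sweeper_id": sweeper_id,   # ID assigned by the edge device
        "target_area": target_area, # e.g. {"shape": "circle", "center": (1.0, 2.0), "radius": 0.5}
        "mode": mode,               # e.g. "sweep" or "mop" (assumed values)
        "strength": strength,       # e.g. "low", "medium", "high" (assumed values)
    }
```

The sweeper would poll the edge device for a task matching its own ID, then travel to `target_area` and clean with the given mode and strength.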
In one embodiment, the step of calculating the visual position of the user wearing the AR device, and matching the visual position with the absolute position a based on the absolute position B to obtain the target sweeper corresponding to the successfully matched absolute position a includes:
s51, acquiring the relative position of the visual position of the user relative to the absolute position B, and recording as a relative position C;
s52, calculating an absolute position A' according to the absolute position B and the relative position C;
s53, matching the absolute position A' with the absolute position A one by one; wherein, the matching is to judge whether the absolute position A' and the absolute position A are within a preset distance range;
and S54, if the absolute position A within the preset distance range with the absolute position A' is matched, obtaining the target sweeper corresponding to the successfully matched absolute position A.
As described in the above steps S51-S52, when the user wears the AR device and scans the current environment to generate the three-dimensional map, the user's position relative to the map can be obtained and recorded as absolute position B. The user faces a certain sweeper to be selected and gazes at it with both eyes; eye-tracking technology (using the pupil directions of the left and right eyes and triangulation) yields the gaze direction, from which the position of the visual target relative to the user, i.e. its relative position (direction and magnitude), is recognized and recorded as relative position C. The user's visual position can then be calculated from absolute position B and relative position C and recorded as absolute position A'.
As described in the above steps S53-S54, the calculated absolute position A' is matched one by one against all absolute positions A, judging whether A' and A lie within a preset distance range of each other (the range is set according to the user's needs; a circular range with the preset distance as radius is preferred). If they do, the user's visual position has a selectable sweeper, and that sweeper to be selected becomes the target sweeper. In addition, the target sweeper's ID number can be obtained through the association between the ID numbers of the sweepers to be selected and their positions.
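Steps S51-S54 can be sketched as follows: compute absolute position A' as absolute position B plus relative position C, then keep every candidate whose absolute position A lies within the preset (circular) distance range of A'. Coordinates are simplified to 2-D tuples and the function names are assumptions, not the patent's terms.

```python
import math

def visual_absolute_position(position_b, relative_c):
    """Absolute position A' = absolute position B + relative position C."""
    return (position_b[0] + relative_c[0], position_b[1] + relative_c[1])

def match_target_sweepers(position_a_prime, candidates, preset_distance):
    """Return IDs of sweepers to be selected whose absolute position A lies
    within the preset radius of A' (the matched target sweepers)."""
    return [
        sweeper_id
        for sweeper_id, pos_a in candidates.items()
        if math.dist(position_a_prime, pos_a) <= preset_distance
    ]
```

If the returned list has more than one entry, this corresponds to the multiple-target case handled in step S55, where the user is prompted to pick one by ID number.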
In one embodiment, when there are multiple target sweepers, after the step of calculating the visual position of the user wearing the AR device and matching the visual position with the absolute position A based on the absolute position B to obtain the target sweeper corresponding to the successfully matched absolute position A, the method further includes:
S55, acquiring the one sweeper among the target sweepers that is selected by the user.
As described in step S55, when several sweepers to be selected have an absolute position A within the preset distance range of the user's visual absolute position A', more than one target sweeper is matched. The ID numbers of these target sweepers are obtained through the ID-to-position associations, and the user only needs to be prompted to select one of them (the selection can be made by ID number).
In one embodiment, before the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method further includes:
s061, acquiring an intersection point formed by visual sight of a user and the connecting line on the ground, and taking the intersection point as a midpoint; wherein the connecting line is a connecting line between a finger and the ground;
and S062, acquiring a graph selected by a user and the size of the graph relative to the midpoint, and obtaining a target cleaning area according to the midpoint, the graph selected by the user and the size of the graph relative to the midpoint.
As described in step S061, the user's line of sight (from the eyes) forms a straight line towards the ground, and the user's finger likewise forms a straight line with the ground. When the intersection of the two lines lies on the ground, that intersection is taken as the midpoint of the target cleaning area for the next operation. When the intersection of the gaze line and the finger line does not lie on the ground, the user is prompted by voice broadcast or on-screen display that the spot is not floor and cannot be cleaned, the next figure selection cannot proceed, and the intersection must be adjusted.
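The midpoint computation above amounts to intersecting a ray with the ground plane. A minimal sketch, assuming 3-D coordinates with the ground at z = 0 and the line of sight given by an eye position and a gaze direction; the patent itself does not specify this representation, and the same check would apply to the finger line.

```python
def ray_ground_intersection(origin, direction, eps=1e-9):
    """Intersection of the ray origin + t*direction (t > 0) with the ground
    plane z = 0, or None if the ray never reaches the floor."""
    oz, dz = origin[2], direction[2]
    if abs(dz) < eps:
        return None  # line of sight parallel to the floor
    t = -oz / dz
    if t <= 0:
        return None  # intersection behind the eye (gaze points upward)
    return (origin[0] + t * direction[0], origin[1] + t * direction[1], 0.0)
```

A `None` result corresponds to the case where the intersection is not on the ground and the user is prompted to adjust.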
When the intersection of the two lines lies on the ground, it is taken as the midpoint and the user selects the figure of the target area and its size relative to the midpoint. If the user selects a circle, the radius is chosen with the midpoint as centre, giving a circular target cleaning area; if the user selects a rectangle, the distances from the midpoint to the four vertices of the rectangle are chosen, giving a rectangular target cleaning area. The target cleaning area is not limited to circles and rectangles; other shapes are possible and are not repeated here.
The user wearing the AR equipment can select a target sweeper visually. Each target sweeper may carry its own ID number, so that multiple target sweepers can be clearly distinguished. After a target sweeper is selected, the AR equipment broadcasts the selection; the user then selects a target cleaning area by gaze and gesture. After the target cleaning area is obtained, the user may also choose the cleaning mode and strength degree for that area, and the target cleaning area, cleaning mode and strength degree are transmitted to the target sweeper, which moves to the target cleaning area and cleans with the corresponding mode and strength degree. Because AR glasses and object recognition technology are used throughout, what the user sees is what the user gets: the user directs the target sweeper to clean the intended place by gaze or gesture, and the operation is simple and convenient.
In one embodiment, the step of obtaining a user-selected figure and a size of the figure relative to the midpoint and obtaining a target sweeping area based on the midpoint, the user-selected figure and the size of the figure relative to the midpoint includes:
s0621, in each set duration, obtaining a graph selected by a user and the size of the graph relative to the midpoint, and obtaining a target range according to the midpoint, the graph selected by the user and the size of the graph relative to the midpoint; the midpoint is an intersection point formed by the visual sight line and the connecting line on the ground, wherein the visual sight line is acquired within each set time length;
s0622, calculating the areas of a plurality of target ranges, and sorting the areas according to the sequence from high to low or from low to high;
and S0623, taking the sequenced target range as a target cleaning area.
As described in step S0621, within each set time period (usually set to 1 minute, adjustable according to specific circumstances), a straight line is formed by the user's eyes and the ground, and another by the user's finger and the ground. When the intersection point formed between the two straight lines lies on the ground, it is taken as the midpoint, and the user selects the figure of the target area and the size of the figure relative to the midpoint (as described above, details omitted), obtaining one target range. In the next minute another target range is obtained, and repeating these steps yields a plurality of target ranges. When the two intersection points do not coincide on the ground, the user is prompted by voice broadcast or interface display that the indicated location is not ground and cannot be cleaned; the next graphic selection cannot be performed, and the intersection point needs to be adjusted.
As described in the foregoing steps S0622-S0623, after a plurality of target ranges are obtained, their areas are calculated from the figures and sizes the user selected in the process, and the target ranges are sorted by area from high to low or from low to high. The sorted target ranges are then used as the target cleaning areas, so that the target sweeper cleans them one by one in order, avoiding the operational complexity of controlling the target sweeper repeatedly.
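The sorting of steps S0622-S0623 can be sketched as a one-shot ordering of the collected target ranges into a cleaning queue. The dict layout and the `"area"` key are illustrative assumptions, not defined by the patent.

```python
def order_cleaning_queue(target_ranges, largest_first=True):
    """Sort the collected target ranges by area so the target sweeper
    can clean them one by one from a single command.

    target_ranges: list of dicts, each with an "area" value (e.g. m^2),
    one per set duration. largest_first selects high-to-low ordering;
    pass False for low-to-high.
    """
    return sorted(target_ranges, key=lambda r: r["area"],
                  reverse=largest_first)
```

Issuing the sorted list once, rather than one area at a time, is what avoids the repeated control of the sweeper mentioned above.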
In one embodiment, before the step of acquiring an intersection formed by a line of sight of a user's eyes and a connecting line on the ground, and taking the intersection as a midpoint, the method further comprises:
s01, acquiring the time length of the user for visually observing the target sweeper, and judging whether the time length exceeds a first time length;
s02, if the duration does not exceed the first time length, the target sweeper is not selected;
and S03, if the duration exceeds the first time length, broadcasting that the target sweeper is selected by the user, and broadcasting a prompt for starting to select a target sweeping area.
As described in the foregoing steps S01-S03, after the target sweeper is selected (its ID number is obtained when it has one), the duration for which the user visually observes the target sweeper is acquired, and it is judged whether this duration exceeds a first time length (generally set to 5 seconds, or set according to actual conditions); the elapsed gaze duration may be displayed in front of the user so that the user can track it. If the duration does not exceed 5 seconds, the target sweeper is not selected. If the duration exceeds 5 seconds, the target sweeper is selected (and its ID number obtained, when present); the AR equipment then broadcasts that the user has selected the target sweeper and that selection of the target cleaning area may begin. The broadcast is generally by voice, but the completion of the selection and the start of area selection may also be displayed on screen. The purpose is to remind the user which step the current operation has reached, avoiding confusion and facilitating overall management and command of the sweepers.
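The dwell-time rule of steps S01-S03 can be sketched as a simple threshold check. The function name, the returned announcement string, and the timestamp convention are assumptions made for illustration; the 5-second default follows the description.

```python
FIRST_TIME_LENGTH = 5.0  # seconds; the default named above, adjustable

def check_gaze_selection(gaze_start, gaze_end, threshold=FIRST_TIME_LENGTH):
    """Decide whether a target sweeper is selected by gaze dwell time.

    gaze_start / gaze_end: timestamps (seconds) of when the user's gaze
    settled on and left the sweeper. Returns (selected, announcement);
    the announcement string stands in for the voice broadcast or
    on-screen prompt described above.
    """
    duration = gaze_end - gaze_start
    if duration <= threshold:      # "does not exceed" -> not selected
        return False, None
    return True, ("Target sweeper selected. "
                  "Please begin selecting the target cleaning area.")
```

A real AR runtime would feed this from its eye-tracking event stream; here the timestamps are passed in directly.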
In one embodiment, the step of controlling the target sweeper to move to the target sweeping area for sweeping includes:
s61, acquiring the cleaning mode and the strength degree of the identified target cleaning area;
and S62, uploading the target cleaning area, the cleaning mode and the strength degree thereof to the edge equipment, so that the target sweeper can obtain the target cleaning area, the cleaning mode and the strength degree thereof from the edge equipment, running to the target cleaning area, and cleaning by adopting the corresponding cleaning mode and the strength degree.
As described in step S61, while identifying the target cleaning area, the AR equipment can also identify its degree of soiling and display that degree, as a percentage or as text, in front of the user, so that the user can choose accordingly; alternatively, the soiling degree can be transmitted directly to the target sweeper for cleaning. After the user selects the target cleaning area, the user is prompted to select the cleaning mode and strength degree for it. The user may choose them according to the soiling degree recognized by the AR equipment, or judge the target cleaning area by eye and choose the corresponding cleaning mode and strength degree. The cleaning mode may be sweeping with simultaneous disinfection, sweeping only, disinfection only, and so on; the strength degree may be distinguished by different rolling-brush currents of the target sweeper. The cleaning modes and strength degrees are not limited to those listed above; other schemes may be adopted, and no limitation is imposed here.
As described in step S62, after the target cleaning area, its cleaning mode and its strength degree are obtained, and when the target sweeper has an ID number, the target cleaning area, cleaning mode and strength degree are associated with that ID number, so that the target sweeper can accurately distinguish and receive its own data according to the ID number.
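The association of task data with the sweeper's ID before upload to the edge device can be sketched as below. The message schema, field names, and JSON encoding are illustrative assumptions; the patent only requires that the task be keyed so the sweeper can pick out its own data.

```python
import json

def build_cleaning_task(sweeper_id, area, mode, strength):
    """Package the task the AR device uploads to the edge device.

    When the sweeper has an ID number, keying the message by that ID
    lets each sweeper filter the edge device's data for its own task.
    """
    return json.dumps({
        "sweeper_id": sweeper_id,   # the target sweeper's ID number
        "target_area": area,        # e.g. the area dict selected earlier
        "cleaning_mode": mode,      # e.g. "sweep_and_disinfect"
        "strength": strength,       # e.g. a rolling-brush current level
    })
```

On the sweeper side, the matching step would be to fetch tasks from the edge device and keep only those whose `sweeper_id` equals its own ID.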
In one embodiment, before the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method further includes:
s001, recording a cleaning picture before cleaning is started;
after the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method further comprises the following steps:
s002, recording the cleaning picture again after cleaning is finished;
and S003, acquiring a cleaning picture recorded before the target sweeper starts cleaning and a cleaning picture recorded after the target sweeper finishes cleaning, and displaying each cleaning picture in front of eyes of a user.
As described in the foregoing steps S001-S003, the target cleaning area, its cleaning mode and strength degree are uploaded to the edge device (together with the ID number, when there is one). Since the target sweeper is connected to the edge device, it can acquire this information from the edge device and begin executing the task (retrieving it by ID number, when there is one). A picture of the area is recorded before the target sweeper moves to the target cleaning area to clean, and the picture is recorded again after cleaning is completed. Both pictures, the one recorded before cleaning started and the one recorded after cleaning finished, are then presented before the user's eyes, so that the user can judge from the pictures recorded before and after cleaning whether the target sweeper has actually cleaned the area.
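The before/after recording of steps S001-S003 amounts to bracketing the cleaning run with two captures. The sketch below uses callables as stand-ins for the AR device's frame capture and the sweeper's cleaning run; both names are assumptions for illustration.

```python
def record_cleaning_session(capture_frame, run_cleaning):
    """Record one picture before and one after cleaning, so both can be
    displayed to the user for comparison.

    capture_frame: zero-argument callable returning a picture/frame.
    run_cleaning:  zero-argument callable that blocks until the sweeper
                   has finished cleaning the target area.
    """
    before = capture_frame()   # S001: picture before cleaning starts
    run_cleaning()             # sweeper runs to the area and cleans
    after = capture_frame()    # S002: picture after cleaning finishes
    return {"before": before, "after": after}  # S003: present both
```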
In one embodiment, after the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method includes:
s7, acquiring a signal, sent by the target sweeper, indicating that the target cleaning area is difficult to determine;
s8, re-acquiring an intersection point formed by the visual sight of the user and the connecting line on the ground, and taking the intersection point as a midpoint; wherein the connecting line is a connecting line between a finger and the ground;
and S9, re-acquiring the graph selected by the user and the size of the graph relative to the midpoint, and obtaining a new target cleaning area according to the midpoint, the graph selected by the user and the size of the graph relative to the midpoint;
and S10, controlling the target sweeper to move to a new target sweeping area for sweeping.
As described in the above steps S7-S8, the target sweeper may be unable to determine the target cleaning area during operation: for example, when the area is determined from the midpoint and size, it may include the interior of a wall or an obstacle. In that case the target sweeper sends a signal indicating that the target cleaning area is difficult to determine (the signal may also serve as a distress signal; for example, when the target sweeper is trapped and cannot escape, it sends this signal so that the AR equipment can assist by providing the environment map of the cleaning area). The AR equipment acquires the signal sent by the target sweeper and, when the signal carries an ID number, can use the ID number to retrieve the target cleaning area corresponding to the sweeper that sent it, and then judge whether that target cleaning area can actually be cleaned or whether it was selected in error.
As described in the above steps S9-S10, after the target cleaning area corresponding to the signalling sweeper is acquired, the intersection point formed on the ground by the user's line of sight and the connecting line (the line between the finger and the ground) is acquired again and taken as the midpoint, and the figure selected by the user and its size relative to the midpoint are acquired again; that is, the user selects a new target cleaning area for the target sweeper. The new target cleaning area is uploaded to the edge device, from which the signalling sweeper acquires it and starts cleaning.
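The recovery path of steps S7-S10 can be sketched as a small signal handler: when the "area undetermined" signal arrives, the user reselects an area and the result is re-uploaded to the edge device. The signal dict layout and both callables are illustrative assumptions.

```python
def handle_area_signal(signal, reselect_area, upload_to_edge):
    """React to the sweeper's 'cannot determine target area' signal.

    signal:          dict from the sweeper, e.g. {"type": ..., "sweeper_id": ...}
    reselect_area:   stand-in for the gaze/gesture selection flow (S8-S9)
    upload_to_edge:  stand-in for uploading the new area, keyed by ID (S10)
    Returns the new area, or None when the signal needs no re-selection.
    """
    if signal.get("type") != "area_undetermined":
        return None                        # other signals handled elsewhere
    new_area = reselect_area()             # user picks a new midpoint/figure
    upload_to_edge(signal.get("sweeper_id"), new_area)
    return new_area
```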
As shown in fig. 2, this embodiment also provides a cleaning apparatus for interaction between an AR device and a sweeper, which is applied to the AR device, and includes:
the access module 1 is used for accessing the edge device when the AR device is started;
the acquisition module 2 is used for the AR equipment to scan the current environment, acquire a three-dimensional map generated by scanning the current environment and the position of the AR equipment, record that position as an absolute position B, and acquire a sweeper to be selected in communication connection with the edge equipment;
the uploading module 3 is used for uploading the three-dimensional map to edge equipment so that the sweeper to be selected can acquire the three-dimensional map from the edge equipment and reposition the three-dimensional map to obtain the position of the sweeper to be selected in the three-dimensional map, or so that the sweeper to be selected can acquire the real-time position of the sweeper to be selected in the three-dimensional map in the motion process and record the real-time position as an absolute position A;
the calculation module 4 is used for calculating the visual position of a user wearing the AR equipment, and matching the visual position with the absolute position A based on the absolute position B to obtain a target sweeper corresponding to the absolute position A which is successfully matched;
and the cleaning module 5 is used for controlling the target sweeper to run to a target cleaning area for cleaning.
In one embodiment, the calculation module 4 comprises:
the position acquisition unit is used for acquiring the relative position of the visual position of the user relative to the absolute position B and recording the relative position as a relative position C;
a position calculation unit for calculating an absolute position A' from the absolute position B and the relative position C;
the range judging unit is used for matching the absolute position A' with the absolute position A one by one; wherein, the matching is to judge whether the absolute position A' and the absolute position A are within a preset distance range;
and the first target sweeper acquisition unit is used for acquiring a target sweeper corresponding to the absolute position A which is successfully matched when the absolute position A which is within the preset distance range with the absolute position A' is matched.
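The matching performed by the position calculation and range judging units — computing A' = B + C and keeping candidates whose absolute position A lies within a preset distance of A' — can be sketched as below. The function name, the candidate dictionary keyed by sweeper ID, and the 0.5 m default threshold are illustrative assumptions.

```python
import math

def match_target_sweepers(absolute_b, relative_c, candidate_positions,
                          max_dist=0.5):
    """Match the user's visual position against candidate sweepers.

    absolute_b:          AR device position in the 3D map (x, y, z)
    relative_c:          visual position relative to absolute_b
    candidate_positions: {sweeper_id: absolute_position_A} per candidate
    Returns the IDs of sweepers whose absolute position A lies within
    max_dist of the computed A' = B + C.
    """
    a_prime = tuple(b + c for b, c in zip(absolute_b, relative_c))
    return [sid for sid, pos in candidate_positions.items()
            if math.dist(a_prime, pos) <= max_dist]
```

When the returned list contains more than one ID, the second target sweeper acquisition unit described next lets the user pick one of them.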
In one embodiment, further comprising:
and the second target sweeper acquisition unit is used for acquiring one sweeper selected by the user from the plurality of target sweepers.
In one embodiment, further comprising:
the intersection point acquisition unit is used for acquiring an intersection point formed by a visual sight of a user and the connecting line on the ground and taking the intersection point as a middle point; wherein the connecting line is a connecting line between a finger and the ground;
and the graph acquisition unit is used for acquiring a graph selected by a user and the size of the graph relative to the midpoint, and acquiring a target cleaning area according to the midpoint, the graph selected by the user and the size of the graph relative to the midpoint.
In one embodiment, the graphics capture unit comprises:
the target range unit is used for acquiring a graph selected by a user and the size of the graph relative to the midpoint in each set duration, and acquiring a target range according to the midpoint, the graph selected by the user and the size of the graph relative to the midpoint; the midpoint is an intersection point formed by the visual sight line and the connecting line on the ground, wherein the visual sight line is acquired within each set time length;
the area calculation unit is used for calculating the areas of the target ranges and sorting the areas from high to low or from low to high;
and the target cleaning area unit is used for taking the sequenced target range as a target cleaning area.
In one embodiment, further comprising:
the duration acquisition unit is used for acquiring the duration for which the user visually observes the target sweeper and judging whether the duration exceeds a first time length;
the unselected unit is used for indicating that the target sweeper is not selected when the duration does not exceed a first time length;
and the broadcasting unit is used for broadcasting that the user selects the target sweeper and broadcasting a prompt for starting to select the target sweeping area when the duration exceeds the first time length.
In one embodiment, the purge module comprises:
a mode acquisition unit for acquiring a cleaning mode and a strength degree of the identified target cleaning area;
the first cleaning unit is used for uploading the target cleaning area, the cleaning mode and the strength degree thereof to the edge equipment, so that the target sweeper can obtain the target cleaning area, the cleaning mode and the strength degree thereof from the edge equipment, operate to the target cleaning area and perform cleaning by adopting the corresponding cleaning mode and the strength degree.
In one embodiment, further comprising:
a first recording unit for recording a cleaning screen before starting cleaning;
the second recording unit is used for recording the cleaning picture again after the cleaning is finished;
and the presentation unit is used for acquiring a cleaning picture recorded before the target sweeper starts cleaning and a cleaning picture recorded after the cleaning is finished, and presenting each cleaning picture in front of eyes of a user.
In one embodiment, the device further comprises:
the determining unit is used for acquiring a signal, sent by the target sweeper, indicating that the target cleaning area is difficult to determine;
the reacquiring intersection point unit is used for reacquiring an intersection point formed by a visual sight line of a user and the connecting line on the ground, and taking the intersection point as a midpoint; wherein the connecting line is a connecting line between a finger and the ground;
the re-acquisition graphic unit is used for re-acquiring a graphic selected by a user and the size of the graphic relative to the midpoint, and acquiring a new target cleaning area according to the midpoint, the graphic selected by the user and the size of the graphic relative to the midpoint;
and the second cleaning unit is used for controlling the target sweeper to run to a new target cleaning area for cleaning.
The above units are used for correspondingly executing the steps of the cleaning method for interaction of the AR equipment and the sweeper.
As shown in fig. 3, the present invention also provides a computer device, which may be a server, whose internal structure may be as shown in fig. 3. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is used to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store all the data required by the cleaning method for interaction of the AR device with the sweeper. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements the cleaning method for interaction of the AR device with the sweeper.
Those skilled in the art will appreciate that the architecture shown in fig. 3 is only a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects may be applied.
An embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the cleaning method for interaction between any one of the AR devices and the sweeper is implemented.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by instructing the relevant hardware through a computer program, which may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the above method embodiments. Any reference to memory, storage, database, or other medium provided herein and used in the embodiments may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, apparatus, article, or method that includes the element.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A cleaning method for interaction of AR equipment and a sweeper is applied to the AR equipment and is characterized by comprising the following steps:
when the AR equipment is started, the AR equipment is accessed to the edge equipment;
the AR equipment scans the current environment, acquires a three-dimensional map generated by scanning the current environment and the position of the AR equipment, records the position as an absolute position B, and acquires a sweeper to be selected, which is in communication connection with the edge equipment;
uploading the three-dimensional map to edge equipment so that the sweeper to be selected can acquire the three-dimensional map from the edge equipment and reposition the three-dimensional map to obtain the position of the sweeper to be selected in the three-dimensional map, or so that the sweeper to be selected can acquire the real-time position of the sweeper to be selected in the three-dimensional map in the motion process and record the real-time position as an absolute position A;
calculating the visual position of a user wearing the AR equipment, and matching the visual position with the absolute position A based on the absolute position B to obtain a target sweeper corresponding to the absolute position A which is successfully matched;
and controlling the target sweeper to run to a target sweeping area for sweeping.
2. The method of claim 1, wherein the step of calculating the visual position of the user wearing the AR device and matching the visual position with the absolute position a based on the absolute position B to obtain the target sweeper corresponding to the successfully matched absolute position a comprises:
acquiring the relative position of the visual position of the user relative to the absolute position B, and recording as a relative position C;
calculating an absolute position A' according to the absolute position B and the relative position C;
matching the absolute position A' with the absolute position A one by one; wherein, the matching is to judge whether the absolute position A' and the absolute position A are within a preset distance range;
and if the absolute position A which is within the preset distance range with the absolute position A' is matched, obtaining the target sweeper corresponding to the absolute position A which is successfully matched.
3. The method of claim 2, wherein when the number of the target sweeper is multiple, the step of calculating a visual position of the user wearing the AR device, and matching the visual position with the absolute position a based on the absolute position B to obtain a target sweeper corresponding to the successfully matched absolute position a further comprises:
and acquiring one sweeper in the target sweepers selected by the user.
4. The method of claim 1, wherein prior to the step of controlling the target sweeper to move to a target cleaning area for cleaning, the method further comprises:
acquiring an intersection point formed by a visual sight of a user and a connecting line on the ground, and taking the intersection point as a midpoint; wherein the connecting line is a connecting line between a finger and the ground;
and acquiring a graph selected by a user and the size of the graph relative to the midpoint, and acquiring a target cleaning area according to the midpoint, the graph selected by the user and the size of the graph relative to the midpoint.
5. The method of claim 4, wherein the step of obtaining a user-selected graphic and a size of the graphic relative to the midpoint, and deriving the target cleaning area from the midpoint, the user-selected graphic and the size of the graphic relative to the midpoint, comprises:
acquiring a graph selected by a user and the size of the graph relative to the midpoint in each set duration, and obtaining a target range according to the midpoint, the graph selected by the user and the size of the graph relative to the midpoint; the midpoint is an intersection point formed by the visual sight line and the connecting line on the ground, wherein the visual sight line is acquired within each set time length;
calculating the areas of a plurality of target ranges and sorting the areas from high to low or from low to high;
and taking the sorted target range as a target cleaning area.
6. The method of claim 4, wherein before the step of acquiring an intersection point formed by the visual sight of the user and the connecting line on the ground and taking the intersection point as a midpoint, the method further comprises:
acquiring the time length of a user for visually observing the target sweeper, and judging whether the time length exceeds a first time length;
if the duration does not exceed the first time length, the target sweeper is not selected;
if the duration exceeds the first time length, the broadcasting user selects the target sweeper and broadcasts a prompt for starting to select a target sweeping area.
7. The method of claim 1, wherein the step of controlling the target sweeper to move to a target cleaning area for cleaning comprises:
acquiring a cleaning mode and a strength degree of the identified target cleaning area;
and uploading the target cleaning area, the cleaning mode and the strength degree thereof to the edge equipment, so that the target sweeper can obtain the target cleaning area, the cleaning mode and the strength degree thereof from the edge equipment, operate to the target cleaning area, and perform cleaning by adopting the corresponding cleaning mode and the strength degree.
8. The method of claim 1, wherein prior to the step of controlling the target sweeper to move to a target cleaning area for cleaning, the method further comprises:
recording a cleaning picture before cleaning is started;
after the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method further comprises the following steps:
recording the cleaning picture again after cleaning;
and acquiring a cleaning picture recorded before the target sweeper starts cleaning and a cleaning picture recorded after the cleaning is finished, and displaying each cleaning picture in front of eyes of a user.
9. The method of claim 4, wherein after the step of controlling the target sweeper to move to the target sweeping area for sweeping, the method comprises:
acquiring a signal, sent by the target sweeper, indicating that the target cleaning area is difficult to determine;
reacquiring an intersection point formed by a visual sight line of a user and the connecting line on the ground, and taking the intersection point as a midpoint; wherein the connecting line is a connecting line between a finger and the ground;
re-acquiring a graph selected by a user and the size of the graph relative to the midpoint, and obtaining a new target cleaning area according to the midpoint, the graph selected by the user and the size of the graph relative to the midpoint;
and controlling the target sweeper to run to a new target sweeping area for sweeping.
10. A cleaning device for interaction of AR equipment and a sweeper, applied to the AR equipment, characterized by comprising:
the access module is used for accessing the edge device when the AR device is started;
the acquisition module is used for scanning the current environment by the AR equipment, acquiring a three-dimensional map generated by scanning the current environment and the position of the AR equipment, recording the position as an absolute position B, and acquiring a sweeper to be selected connected with the edge equipment;
the uploading module is used for uploading the three-dimensional map to edge equipment so that the sweeper to be selected can acquire the three-dimensional map from the edge equipment and reposition the three-dimensional map to obtain the position of the sweeper to be selected in the three-dimensional map, or so that the sweeper to be selected can acquire the real-time position of the sweeper to be selected in the three-dimensional map in the motion process and mark the real-time position as an absolute position A;
the calculation module is used for calculating the visual position of a user wearing the AR equipment, and matching the visual position with the absolute position A based on the absolute position B to obtain a target sweeper corresponding to the absolute position A which is successfully matched;
and the cleaning module is used for controlling the target sweeper to run to a target cleaning area for cleaning.
CN202110523060.1A 2021-05-13 2021-05-13 Cleaning method and device for interaction of AR equipment and sweeper and computer equipment Active CN113440050B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110523060.1A CN113440050B (en) 2021-05-13 2021-05-13 Cleaning method and device for interaction of AR equipment and sweeper and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110523060.1A CN113440050B (en) 2021-05-13 2021-05-13 Cleaning method and device for interaction of AR equipment and sweeper and computer equipment

Publications (2)

Publication Number Publication Date
CN113440050A CN113440050A (en) 2021-09-28
CN113440050B true CN113440050B (en) 2022-04-22

Family

ID=77809762

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110523060.1A Active CN113440050B (en) 2021-05-13 2021-05-13 Cleaning method and device for interaction of AR equipment and sweeper and computer equipment

Country Status (1)

Country Link
CN (1) CN113440050B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113934307B (en) * 2021-12-16 2022-03-18 佛山市霖云艾思科技有限公司 Method for starting electronic equipment according to gestures and scenes
CN117336354A (en) * 2022-06-27 2024-01-02 华为技术有限公司 Anti-collision method of Virtual Reality (VR) equipment and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016189390A2 (en) * 2015-05-28 2016-12-01 Eyesight Mobile Technologies Ltd. Gesture control system and method for smart home
WO2019209878A1 (en) * 2018-04-23 2019-10-31 Purdue Research Foundation Robot navigation and robot-iot interactive task planning using augmented reality
CN112739244B * 2018-07-13 2024-02-09 iRobot Corporation (US) Mobile robot cleaning system
CN109408234A (en) * 2018-10-19 2019-03-01 国云科技股份有限公司 A kind of augmented reality data-optimized systems and method based on edge calculations
CN109324693A (en) * 2018-12-04 2019-02-12 塔普翊海(上海)智能科技有限公司 AR searcher, the articles search system and method based on AR searcher
US20230230374A1 (en) * 2020-05-15 2023-07-20 Beijing Xiaomi Mobile Software Co., Ltd. Method and device for positioning internet of things devices

Also Published As

Publication number Publication date
CN113440050A (en) 2021-09-28

Similar Documents

Publication Publication Date Title
CN113440050B (en) Cleaning method and device for interaction of AR equipment and sweeper and computer equipment
CN112215843B (en) Ultrasonic intelligent imaging navigation method and device, ultrasonic equipment and storage medium
CN105391970B (en) The method and system of at least one image captured by the scene camera of vehicle is provided
KR101652311B1 (en) System and method for storing information of vision image, and the recording media storing the program performing the said method
WO2019204118A1 (en) System and method for detecting human gaze and gesture in unconstrained environments
CN107004275A (en) For determining that at least one of 3D in absolute space ratio of material object reconstructs the method and system of the space coordinate of part
CN102846339B (en) Method and ultrasonic image-forming system for image bootstrap
CN112907751B (en) Virtual decoration method, system, equipment and medium based on mixed reality
KR101949261B1 (en) method for generating VR video, method for processing VR video, and system for processing VR video
CN111248815B (en) Method, device and equipment for generating working map and storage medium
CN101542532A (en) A method, an apparatus and a computer program for data processing
CN115100742A (en) Meta-universe exhibition and display experience system based on air-separating gesture operation
CN109558004A (en) A kind of control method and device of human body auxiliary robot
CN108829233A (en) A kind of exchange method and device
CN116416518A (en) Intelligent obstacle avoidance method and device
CN117333644A (en) Virtual reality display picture generation method, device, equipment and medium
CN111487980B (en) Control method of intelligent device, storage medium and electronic device
CN112220405A (en) Self-moving tool cleaning route updating method, device, computer equipment and medium
CN111984017A (en) Cleaning equipment control method, device and system and computer readable storage medium
KR20180054380A (en) Golf system and Image acquisition device for use therein
JP7133971B2 (en) 3D model generation device and 3D model generation method
CN115904188A (en) Method and device for editing house-type graph, electronic equipment and storage medium
CN113093907B (en) Man-machine interaction method, system, equipment and storage medium
CN115174857A (en) Intelligent examination room monitoring method, system, equipment and storage medium for experimental examination
IL308975A (en) Using slam 3d information to optimize training and use of deep neural networks for recognition and tracking of 3d object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant