CN108297099A - A kind of self-rescue method and robot of robot - Google Patents

A kind of self-rescue method and robot of robot

Info

Publication number
CN108297099A
Authority
CN
China
Prior art keywords
user
robot
self
preset
network database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810097406.4A
Other languages
Chinese (zh)
Inventor
洪帆
周能文
刘雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Wind Communication Technologies Co Ltd
Original Assignee
Shanghai Wind Communication Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Wind Communication Technologies Co Ltd filed Critical Shanghai Wind Communication Technologies Co Ltd
Priority to CN201810097406.4A priority Critical patent/CN108297099A/en
Publication of CN108297099A publication Critical patent/CN108297099A/en
Pending legal-status Critical Current

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 — Programme-controlled manipulators
    • B25J9/16 — Programme controls
    • B25J9/1679 — Programme controls characterised by the tasks executed
    • B25J9/1694 — Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

Embodiments of the present invention relate to the field of smart devices and disclose a self-rescue method for a robot, and a robot. In the present invention, the self-rescue method of the robot includes: when contact with a first user is lost, sending a help request to a preset second user; when consent feedback for the help request is received, providing control authority for the second user; and receiving and executing a control instruction of the second user; wherein the preset second user comes from a relational network database, so that the robot can seek help by itself when it loses contact with a specific user, thereby avoiding the loss of the robot as far as possible.

Description

Self-rescue method for a robot, and robot
Technical Field
Embodiments of the invention relate to the field of intelligent devices, and in particular to a self-rescue technique for an intelligent device.
Background
A robot (Robot) is a machine device that performs work automatically. An intelligent robot can accept human commands, run pre-programmed programs, and act according to principles formulated with artificial intelligence technology; its task is to assist or replace human work, such as production, construction or dangerous work. In modern society, robots have entered the home to serve family members and can provide various types of services there, such as nursing, errand-running and cooking services, and thanks to the rapid development of artificial intelligence, robots can perform more and more complex work.
The inventor finds that at least the following problems exist in the prior art: at present, the work executed by a robot is generally driven by instructions sent by a person, and once those instructions are lost, the robot cannot know what work it should perform. Further, if the robot loses contact with the person who has control authority over it, it cannot perform work according to new instructions, or even return to its "home"; the robot may then wander or even become lost. In addition, when executing an "out-of-home task", the robot may need to leave its "owner" and carry out the task by itself; once a fault occurs midway, the robot cannot recover on its own, or it may be too far from the "owner" to stay connected, so it cannot complete the task normally and may be lost.
Disclosure of Invention
Embodiments of the invention aim to provide a self-rescue method for a robot, and a robot, so that the robot can seek help by itself when it loses contact with a specific user, and the loss of the robot is avoided as much as possible.
In order to solve the technical problem, an embodiment of the present invention provides a robot self-rescue method, including: when contact with a first user is lost, sending a help request to a preset second user; providing control authority for the second user upon receiving consent feedback for the help request; and receiving and executing a control instruction of the second user; wherein the preset second user comes from a relational network database.
Embodiments of the present invention also provide a robot including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the robot self-rescue method as described above.
Embodiments of the present invention also provide a computer-readable storage medium storing a computer program, which when executed by a processor implements the robot self-rescue method described above.
Compared with the prior art, the main differences and effects of embodiments of the invention are as follows: after losing contact with the predetermined user, the robot does not passively wait for the user to contact it, but actively seeks help from a specific user in the user's relational network database, so that it can reach its owner as soon as possible or be brought back to a safe zone; and because the robot only accepts control from a user who has been granted control authority, the safety of the robot itself can be ensured. The control authority can limit the range of operations the robot will accept, avoiding other safety problems. Therefore, the self-rescue method of the robot in embodiments of the invention enables the robot to seek help by itself when it loses contact with a specific user, thereby avoiding the loss of the robot as much as possible.
As a further improvement, sending a help request to the preset second user specifically includes: sending the help request to a user terminal bound to the preset second user. Using the user terminal makes it convenient to contact the second user directly.
As a further improvement, the preset second user is a user belonging to a preset group. With the preset grouping, it is convenient to identify users who can provide help.
As a further improvement, the method further includes: when contact with the bound user terminal is lost, moving to a preset place. Using the preset place, the robot can go to a safe position by itself, which facilitates its self-rescue.
As a further improvement, before receiving and executing the control instruction of the user terminal that has been granted control authority, the method further includes: identifying the received biometric feature; and when the received biometric feature belongs to the second user, receiving and executing the control instruction of that user terminal. Requiring the other party's biometric feature to be verified again before a control instruction is accepted allows the authorized user to control the robot on the spot and makes the robot safer.
As a further improvement, the relational network database includes correspondences between users and biometric features, and whether the biometric feature belongs to the second user is verified using the correspondences in the relational network database. Limiting the validity check of the biometric feature to the correspondences in the relational network database allows the robot to verify automatically, making the verification process accurate and fast.
As a further improvement, the biometric feature includes at least one of: a fingerprint, a face image, and an iris. The biometric feature can thus be a fingerprint, a face image, an iris, or a combination of them, which ensures accuracy during verification.
As a further improvement, the relational network database is built by the following method: collecting voice data and/or image data; identifying human body objects present in the collected voice data and/or image data; extracting identification features of the identified human body objects; searching the user's relational network database according to the extracted identification features; and if no data record matching the identification feature is found in the relational network database, updating the identified human body object into the relational network database as a newly added data record. This further defines how the relational network database is established: human body objects encountered by the user are identified from real-time voice data and image data and stored in the user's relational network database as the user's communication objects. The relational network database established in this way contains the various people the user has met, whether or not the user has conversed with them, and through autonomous recognition of voice and images the social contacts around the user can be collected quickly and broadly.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, in which like reference numerals refer to similar elements and which are not drawn to scale unless otherwise specified.
Fig. 1 is a flow chart of a self-rescue method of a robot according to a first embodiment of the present invention;
fig. 2 is a flowchart of a method of establishing a relationship network used in the self-rescue method of the robot according to the first embodiment of the present invention;
fig. 3 is a flow chart of a self-rescue method of a robot according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a smart device according to a fourth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, embodiments of the present invention are described in detail below with reference to the accompanying drawings. It will be appreciated by those of ordinary skill in the art that numerous technical details are set forth in the various embodiments in order to provide a better understanding of the present application; however, the technical solutions claimed in the present application can be implemented without these technical details, and with various changes and modifications based on the following embodiments.
A first embodiment of the present invention relates to a self-rescue method for a robot.
The self-rescue method of the robot in the present embodiment is applied to a robot, which may be a nursing robot, an errand-running robot, a cooking robot, or another type of robot, which is not enumerated here. The flow of the robot self-rescue method in the present embodiment is shown in fig. 1 and is specifically as follows:
Step 101: detecting whether contact with the first user has been lost; if yes, go to step 102; if not, the robot self-rescue method in this embodiment ends.
Specifically, in this embodiment, when no new instruction has been received for a period of time, the robot may try to reach the first user to determine whether contact has been lost; if contact cannot be established, the subsequent steps are executed, and if it can, the process ends directly without executing the subsequent steps.
More specifically, the first user may contact the robot through a terminal (e.g., a mobile phone, a controller, etc.) of the user, such as periodically checking a status of the robot or sending an instruction to the robot.
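Purely for illustration, the paragraph above can be read as a timeout-based check. The following minimal sketch assumes the robot records the time of the last message from the first user; the names ContactMonitor, last_contact and LOST_CONTACT_TIMEOUT are hypothetical and not taken from the patent.

```python
import time

# Hypothetical timeout after which the first user is considered lost (seconds).
LOST_CONTACT_TIMEOUT = 30 * 60


class ContactMonitor:
    """Tracks when the first user last contacted the robot."""

    def __init__(self):
        self.last_contact = time.monotonic()

    def record_contact(self):
        # Called whenever a status check or instruction arrives from the first user.
        self.last_contact = time.monotonic()

    def first_user_lost(self) -> bool:
        # Contact is considered lost once no message has arrived within the timeout.
        return time.monotonic() - self.last_contact > LOST_CONTACT_TIMEOUT


if __name__ == "__main__":
    monitor = ContactMonitor()
    print("lost contact:", monitor.first_user_lost())  # False right after start-up
```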
Step 102: sending a help request to a preset second user.
It should be noted that the help request may be sent to a user terminal to which the preset second user is bound. The help request may include the location and status of the robot, the help content required, and the like, and may also include the time allowed for the second user to decide whether he or she is able or willing to provide help.
Specifically, the preset second user comes from a relational network database and, more specifically, may be a user belonging to a preset group, such as a group of relatives or important persons. It is worth mentioning that the second user may also be a specific group of people, such as the police, or other robots, for example robots providing maintenance services, which are not listed one by one here.
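As an illustrative sketch only, a help request carrying the fields mentioned above (location, status, help content, reply deadline) might look as follows; the field names and the send_to_terminal helper are assumptions, not part of the patent.

```python
import json
import time


def build_help_request(robot_id, location, status, help_needed, reply_deadline_s=600):
    """Assemble a help request with the robot's location, status and the help wanted."""
    return {
        "robot_id": robot_id,
        "location": location,            # e.g. latitude/longitude or a named place
        "status": status,                # e.g. "low battery", "lost contact with owner"
        "help_needed": help_needed,      # what the second user is asked to do
        "reply_deadline": time.time() + reply_deadline_s,  # time allowed to accept
    }


def send_to_terminal(terminal_id, message):
    """Placeholder transport: a real system would push this through a server."""
    print(f"-> terminal {terminal_id}: {json.dumps(message)}")


if __name__ == "__main__":
    request = build_help_request(
        robot_id="robot-001",
        location={"lat": 31.23, "lon": 121.47},
        status="lost contact with first user",
        help_needed="guide the robot back to a safe location",
    )
    send_to_terminal("second-user-phone", request)
```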
It should be noted that, as shown in fig. 2, the relational network database in this embodiment is established by the following method:
step 201, voice data and/or image data are collected.
Specifically, the smart device carried by the user includes a camera, a microphone, or both. The camera is used to collect image data and the microphone is used to collect voice data: when the smart device only has a camera, it can collect image data; when it only has a microphone, it can collect voice data; and when it has both, it can collect voice data and image data at the same time. Of course, even when the smart device has both a camera and a microphone in practical applications, their working modes may be defined separately, for example turning on only the camera or only the microphone, which is not listed one by one here.
Because the smart device is carried by the user, the information it collects comes from the user's surroundings, and in real life the user's communication objects generally appear nearby; these two ranges coincide, which makes the information collection more accurate.
In step 202, the presence of human objects is identified from the captured voice data and/or image data.
Specifically, different speakers in the voice data can be accurately distinguished using voiceprint information and the like, so that the human bodies present can be identified; objects in the image data can be accurately distinguished using face recognition and the like, so that the human bodies present can be identified.
Step 203, extracting the identification features of the identified human body object.
Specifically, the identification feature may include one of: a face image, voiceprint data, a name, and combinations thereof. More specifically, the face image may be extracted from the image data, the voiceprint data from the voice data, and the name from the voice data via semantic recognition.
Step 204: searching the user's relational network database according to the extracted identification features; if a match is found, go to step 205; if not, go to step 207.
Specifically, the user's relational network database is used to store the people around the user who can become communication objects; the related information of each communication object is stored as a data record. A plurality of groups are preset in the relational network database, and each data record corresponds to at least one group. When the user views the relational network database, the user can clearly see the group of each person in the database.
More specifically, when the data records are stored, each data record corresponds to an identification feature, and the identification features are unique; data records matching a given identification feature are therefore found by searching on that unique feature, and records sharing the same identification feature can be regarded as the same person.
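A minimal sketch of such records keyed by a unique identification feature, for illustration only; the class names ContactRecord and RelationalNetworkDatabase are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass, field


@dataclass
class ContactRecord:
    """One data record in the relational network database."""
    identification_feature: str                 # unique key, e.g. a voiceprint or face hash
    name: str = ""
    groups: set = field(default_factory=set)    # e.g. {"relatives"}, {"friends"}


class RelationalNetworkDatabase:
    """Stores data records keyed by their unique identification feature."""

    def __init__(self):
        self._records = {}

    def find(self, identification_feature):
        # Records sharing the same identification feature are treated as the same person.
        return self._records.get(identification_feature)

    def add(self, record: ContactRecord):
        self._records[record.identification_feature] = record


if __name__ == "__main__":
    db = RelationalNetworkDatabase()
    db.add(ContactRecord("face-hash-123", name="Zhang San", groups={"friends"}))
    print(db.find("face-hash-123"))   # existing record -> step 205
    print(db.find("face-hash-999"))   # None -> would become a new record in step 207
```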
Step 205, confirming whether the group to which the identified human body object belongs is the same as the group to which the matched data record belongs; if yes, ending the robot self-rescue method in the embodiment; if not, go to step 206.
Specifically, when the data record matching the identification feature is found from the relational network database, it is possible to further confirm whether the group to which the identified human body object belongs is the same as the group to which the matched data record belongs, using the collected voice data. The grouping may be divided into a relative group, a neighbor group, a co-worker group, a friend group, etc., and other grouping methods may also be adopted in practical applications, which are not listed one by one.
Specifically, during the confirmation, specific vocabulary can be searched for in the voice data; if such words are found, the group to which the recognized human body object belongs is determined according to the found vocabulary. The specific vocabulary can be set by the user or preset by the system, for example a set of terms of address.
For example, during confirmation, if terms of address indicating family relations such as "grandpa", "brother" or "uncle" are found, the other party can be considered to be the user's grandpa/brother/uncle and automatically identified as a relative. As another example, recognized speech that addresses the other party directly by name, such as "Zhang San" or "Li Si", may be identified as belonging to the friends group.
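For illustration, a sketch of inferring the group from specific vocabulary found in the recognized speech; the keyword table and the default friends group are assumptions chosen to mirror the examples above, not part of the patent.

```python
# Hypothetical keyword-to-group table; the actual vocabulary may be user-defined.
GROUP_KEYWORDS = {
    "relatives": ["grandpa", "brother", "uncle", "aunt"],
    "colleagues": ["manager", "colleague", "boss"],
    "neighbors": ["neighbor"],
}


def infer_group(transcribed_speech: str, default_group: str = "friends") -> str:
    """Guess the group of the recognized person from words found in the conversation."""
    text = transcribed_speech.lower()
    for group, keywords in GROUP_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return group
    # No kinship or role word found: fall back to the default group,
    # e.g. speech that only uses a personal name such as "Zhang San".
    return default_group


if __name__ == "__main__":
    print(infer_group("Hi grandpa, how are you today?"))   # relatives
    print(infer_group("Zhang San, long time no see"))      # friends (default)
```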
Step 206, updating the group to which the matched data record belongs.
Specifically, upon confirming that the identified group is not the same as the existing group in the relational network database, the group to which the matched data record belongs is updated. The update may replace the original group with the newly identified one, or keep both and wait for the next identification result: when the user comes into contact with this person many times and the smart device recognizes him or her each time, the group can be identified repeatedly, and the final group is then judged from these multiple identification results, which improves the accuracy of automatic grouping.
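One possible reading of the update strategy described above, keeping every observed group and judging the final group from repeated recognition results, sketched here with hypothetical names as an illustration only.

```python
from collections import Counter


class GroupHistory:
    """Keeps every group inferred for one person and reports the most frequent one."""

    def __init__(self):
        self._observations = Counter()

    def observe(self, group: str):
        # Each time the person is recognized, the newly inferred group is recorded
        # instead of immediately overwriting the previous result.
        self._observations[group] += 1

    def current_group(self):
        if not self._observations:
            return None
        # The final group is judged from the accumulated recognition results.
        return self._observations.most_common(1)[0][0]


if __name__ == "__main__":
    history = GroupHistory()
    for observed in ["friends", "colleagues", "colleagues"]:
        history.observe(observed)
    print(history.current_group())  # colleagues
```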
It should be noted that, in practical application, the grouping may also be set by the user and updated according to the setting instruction of the user.
Step 207: updating the identified human body object into the relational network database as a newly added data record.
Specifically, if the data record matching the identification feature is not found from the relational network database, the identified human body object is updated into the relational network database as a new data record.
It should be noted that, while or after the data record is added, the group to which the added data record belongs may also be identified, the identification method may be similar to that mentioned in step 205, and the group to which the data record belongs may be identified by searching for the specific vocabulary from the voice data through semantic analysis, which is not described herein again.
In addition, during the update, the name by which the user addresses the human body object, pictures taken in real time, voiceprint features collected in real time, and the like can also be updated at the same time.
Steps 201 to 207 describe the specific method of building the relational network database; after the second user has been determined and a help request has been sent to the second user, the subsequent step 103 is executed.
Step 103: providing the control authority for the second user when consent feedback is received.
Specifically, this step provides the second user with control authority upon receiving consent feedback for the help request. More specifically, when the second user can and is willing to provide help, the second user sends feedback of agreement to the help request, and after receiving the feedback, the robot provides control authority for the second user.
It should be noted that the control authority provided may be limited to a certain control range, for example allowing the robot to be moved or to perform basic operations, but not complex operations. The control range can be set according to actual requirements and is not described in detail here.
Step 104: receiving and executing a control instruction of the second user.
Specifically, when a control instruction of the second user is received, it can then be executed. In practical applications, it can also be determined whether the received control instruction conforms to the provided control authority; if so, the control instruction is executed, otherwise it is discarded.
More specifically, there are various ways of receiving the instruction: the robot may receive the control instruction through an interactive panel on the robot body, or receive a control instruction transmitted remotely by the user through a server.
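Purely as a sketch, checking a received instruction against the granted control range before executing or discarding it might look as follows; the command names and the BASIC_SCOPE set are illustrative assumptions, not taken from the patent.

```python
# Hypothetical authority scope; the patent only says the control range can be restricted.
BASIC_SCOPE = {"move", "stop", "return_home"}


class ControlAuthority:
    """Authority granted to the second user, limited to a set of allowed commands."""

    def __init__(self, user_id, allowed_commands):
        self.user_id = user_id
        self.allowed_commands = set(allowed_commands)

    def permits(self, command: str) -> bool:
        return command in self.allowed_commands


def handle_instruction(authority: ControlAuthority, command: str):
    """Execute the instruction if it fits the granted authority, otherwise discard it."""
    if authority.permits(command):
        print(f"executing '{command}' for user {authority.user_id}")
    else:
        print(f"discarding '{command}': outside the granted control range")


if __name__ == "__main__":
    authority = ControlAuthority("second-user", BASIC_SCOPE)
    handle_instruction(authority, "move")        # allowed
    handle_instruction(authority, "open_door")   # discarded
```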
Compared with the prior art, the main differences and effects of this embodiment are as follows: after losing contact with the predetermined user, the robot does not passively wait for the user to contact it, but actively seeks help from a specific user in the user's relational network database, so that it can reach its owner as soon as possible or be brought back to a safe zone; and because the robot only accepts control from a user who has been granted control authority, the safety of the robot itself can be ensured. The control authority can limit the range of operations the robot will accept, avoiding other safety problems. Therefore, the self-rescue method of the robot in this embodiment enables the robot to seek help by itself when it loses contact with a specific user, thereby avoiding the loss of the robot as much as possible. In addition, human body objects encountered by the user are identified from real-time voice data and image data and stored in the user's relational network database as the user's communication objects. The relational network database established in this way contains the various people the user has met, whether or not the user has conversed with them, and through autonomous recognition of voice and images the social contacts around the user can be collected quickly and broadly. Therefore, the relational network database used by the robot self-rescue method in this embodiment is built automatically for the user, and the established relational network better matches the user's actual situation.
A second embodiment of the present invention relates to a self-rescue method of a robot. The second embodiment is a further improvement on the first embodiment, the main improvement being that a step of active movement is added, so that the robot can move to a preset place by itself, which facilitates its self-rescue.
Specifically, in this embodiment, after determining that contact with the first user has been lost, the robot may move to a preset place. The preset place may be the robot's "home", the "nearest police station", or some other place preset by the first user.
It should be noted that the moving timing may be before the help request is sent or after the help request is sent, which is not limited herein.
In this way, using the preset place, the robot can go to a safe position by itself, which facilitates its self-rescue.
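As an illustration of the second embodiment, a sketch of picking the nearest preset place once contact is lost; the place names, coordinates and the navigate_to helper are placeholders, and the planar distance is a deliberate simplification.

```python
import math

# Hypothetical preset places the robot may retreat to, e.g. its "home"
# or the nearest police station; the coordinates are placeholders.
PRESET_PLACES = {
    "home": (31.2304, 121.4737),
    "nearest_police_station": (31.2330, 121.4800),
}


def nearest_preset_place(current_position):
    """Pick the preset place closest to the robot's current position."""
    def distance(place):
        lat, lon = PRESET_PLACES[place]
        # Rough planar distance, good enough for choosing between nearby places.
        return math.hypot(lat - current_position[0], lon - current_position[1])
    return min(PRESET_PLACES, key=distance)


def navigate_to(place_name):
    # Placeholder for the robot's navigation stack.
    print(f"navigating to preset place: {place_name}")


if __name__ == "__main__":
    navigate_to(nearest_preset_place((31.2310, 121.4745)))
```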
A third embodiment of the present invention relates to a self-rescue method of a robot. The third embodiment is a further improvement on the first embodiment, the main improvement being that verification of the second user's biometric feature is added, and a received instruction is accepted and executed only when the verification passes, thereby increasing the security of controlling the robot.
Specifically, a flow chart of a self-rescue method of the robot in the embodiment is shown in fig. 3, which specifically includes the following steps:
steps 301 to 303 of this embodiment are similar to steps 101 to 103 of the first embodiment, and are not described again here.
At step 304, the received biometric characteristic is identified.
Specifically, when the second user sends a control instruction, he or she needs to send a biometric feature to the robot in advance; the biometric feature may be one of a fingerprint, a face image, an iris or a voiceprint, a combination of several of them, or another biometric feature.
There are various ways of receiving the biometric feature. For example, the second user may capture his or her own face image through a real-time image acquisition function and send it to the robot for verification; or the second user may input information such as a fingerprint using the interactive panel on the robot for the robot to verify.
Step 305: judging whether the identified biometric feature belongs to the second user; if yes, continue to step 306; if not, return to step 304.
Specifically, when the identified biological identification features are verified, the corresponding relationship in the relational network database can be used for verification, and since the identification features of each person are generally pre-stored in the relational network database, the verification by directly using the relational network database is simple and convenient.
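For illustration only, verifying a received biometric feature against the correspondences stored in the relational network database might be sketched as follows; the stored hash values and user identifier are hypothetical, not part of the patent.

```python
# Hypothetical stored correspondences between users and their biometric features,
# as kept in the relational network database (values are placeholder hashes).
STORED_BIOMETRICS = {
    "second-user": {"fingerprint": "fp-hash-abc", "face": "face-hash-def"},
}


def verify_biometric(user_id: str, feature_type: str, feature_value: str) -> bool:
    """Check a received biometric feature against the stored correspondence."""
    stored = STORED_BIOMETRICS.get(user_id, {})
    return stored.get(feature_type) == feature_value


if __name__ == "__main__":
    # Only when the feature matches would the robot go on to accept instructions.
    print(verify_biometric("second-user", "fingerprint", "fp-hash-abc"))  # True
    print(verify_biometric("second-user", "face", "face-hash-zzz"))       # False
```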
Step 306: receiving and executing the control instruction of the user terminal with the control authority.
Specifically, when the received biometric feature belongs to the second user, the control instruction of the user terminal with the control authority is received and executed.
Therefore, the self-rescue method of the robot in this embodiment further requires that the other party's biometric feature be verified again before a control instruction is accepted, so that the authorized user can control the robot on the spot and the robot is safer. At the same time, since the validity of the biometric feature is verified using the correspondences in the relational network database, the robot can verify automatically, and the verification process is accurate and fast.
The steps of the above methods are divided as they are only for clarity of description; in implementation they may be combined into a single step, or a step may be split into multiple steps, and as long as the same logical relationship is preserved, such variants are all within the protection scope of this patent. Adding insignificant modifications to the algorithms or processes, or introducing insignificant design changes without changing the core design of the algorithms or processes, also falls within the scope of the patent.
A fourth embodiment of the present invention relates to a robot, as shown in fig. 4, including:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the self-rescue method of any one of the robots as mentioned in the first to third embodiments.
Where the memory and processor are connected by a bus, the bus may comprise any number of interconnected buses and bridges, the buses connecting together one or more of the various circuits of the processor and the memory. The bus may also connect various other circuits such as peripherals, voltage regulators, power management circuits, and the like, which are well known in the art, and therefore, will not be described any further herein. A bus interface provides an interface between the bus and the transceiver. The transceiver may be one element or a plurality of elements, such as a plurality of receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. The data processed by the processor is transmitted over a wireless medium via an antenna, which further receives the data and transmits the data to the processor.
The processor is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. And the memory may be used to store data used by the processor in performing operations.
A fifth embodiment of the present invention relates to a computer-readable storage medium storing a computer program. The computer program realizes the above-described method embodiments when executed by a processor.
That is, as can be understood by those skilled in the art, all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware; the program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.

Claims (10)

1. A self-rescue method of a robot is characterized by comprising the following steps:
when contact with a first user is lost, sending a help request to a preset second user;
providing control rights for the second user upon receiving consent feedback for the help request;
receiving and executing a control instruction of the second user;
and the preset second user comes from a relational network database.
2. The robot self-rescue method according to claim 1, wherein sending the help request to the preset second user specifically comprises: sending the help request to a user terminal bound to the preset second user.
3. The robot self-rescue method according to claim 2, wherein the preset second user is a user belonging to a preset group.
4. The robot self-rescue method according to claim 1, further comprising: when contact with the bound user terminal is lost, moving to a preset place.
5. The robot self-rescue method according to claim 1, wherein before receiving and executing the control command of the user terminal with the control authority, the method further comprises: identifying the received biometric characteristic;
and when the received biometric feature belongs to the second user, receiving and executing the control instruction of the user terminal that has been granted the control authority.
6. The robot self-rescue method according to claim 5, wherein the relational network database includes correspondences of users and biometric features;
and whether the biometric feature belongs to the second user is verified using the correspondences in the relational network database.
7. The robot self-rescue method according to claim 5, wherein the biometric feature includes at least one of: a fingerprint, a face image, and an iris.
8. A robot self-rescue method according to claim 1, wherein the relational network database is built by:
collecting voice data and/or image data;
identifying human body objects present in the collected voice data and/or image data;
extracting identification features of the identified human body objects;
searching the user's relational network database according to the extracted identification features;
and if no data record matching the identification feature is found in the relational network database, updating the identified human body object into the relational network database as a newly added data record.
9. A robot, comprising:
at least one processor; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method of self-rescue of a robot as claimed in any one of claims 1 to 8.
10. A computer-readable storage medium, storing a computer program, characterized in that the computer program, when being executed by a processor, implements the self-rescue method of a robot according to any one of claims 1 to 8.
CN201810097406.4A 2018-01-31 2018-01-31 A kind of self-rescue method and robot of robot Pending CN108297099A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810097406.4A CN108297099A (en) 2018-01-31 2018-01-31 A kind of self-rescue method and robot of robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810097406.4A CN108297099A (en) 2018-01-31 2018-01-31 A kind of self-rescue method and robot of robot

Publications (1)

Publication Number Publication Date
CN108297099A true CN108297099A (en) 2018-07-20

Family

ID=62850903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810097406.4A Pending CN108297099A (en) 2018-01-31 2018-01-31 A kind of self-rescue method and robot of robot

Country Status (1)

Country Link
CN (1) CN108297099A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112566076A (en) * 2020-11-27 2021-03-26 深圳优地科技有限公司 Robot request assistance method, device, robot and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2716417A2 (en) * 2011-05-25 2014-04-09 Future Robot Co., Ltd. System and method for operating a smart service robot
CN105182828A (en) * 2015-09-18 2015-12-23 深圳前海达闼科技有限公司 Method and equipment for requesting assistance by equipment and responding to request for assistance by equipment
CN107263440A (en) * 2017-08-15 2017-10-20 湖州佳创自动化科技有限公司 A kind of outdoor self-powered mobile robot
CN107454866A (en) * 2016-05-23 2017-12-08 达闼科技(北京)有限公司 A kind of three-dimension modeling method and apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2716417A2 (en) * 2011-05-25 2014-04-09 Future Robot Co., Ltd. System and method for operating a smart service robot
CN105182828A (en) * 2015-09-18 2015-12-23 深圳前海达闼科技有限公司 Method and equipment for requesting assistance by equipment and responding to request for assistance by equipment
CN107454866A (en) * 2016-05-23 2017-12-08 达闼科技(北京)有限公司 A kind of three-dimension modeling method and apparatus
CN107263440A (en) * 2017-08-15 2017-10-20 湖州佳创自动化科技有限公司 A kind of outdoor self-powered mobile robot

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112566076A (en) * 2020-11-27 2021-03-26 深圳优地科技有限公司 Robot request assistance method, device, robot and storage medium
CN112566076B (en) * 2020-11-27 2023-05-16 深圳优地科技有限公司 Robot request assistance method, device, robot and storage medium

Similar Documents

Publication Publication Date Title
JP5218991B2 (en) Biometric authentication system and biometric authentication method using multiple types of templates
CN107170068B (en) Movable attendance checking method based on scene and image recognition
CN108537929B (en) Remote unlocking system and remote unlocking method
CN109389712B (en) Unlocking method of intelligent lock, mobile terminal, server and readable storage medium
KR20180042802A (en) Method and system for tracking an object in a defined area
CN108122314B (en) Doorbell call processing method, cloud server, medium and system
WO2019148491A1 (en) Human-computer interaction method and device, robot, and computer readable storage medium
CN108351707A (en) Man-machine interaction method and device, terminal equipment and computer readable storage medium
JP6303141B2 (en) Biometric authentication method and biometric authentication system
CN204990444U (en) Intelligent security controlgear
WO2020034645A1 (en) Facial recognition method, facial recognition system, and electronic device
CN107124463A (en) A kind of data-sharing systems applied to intelligent plant
CN108297099A (en) A kind of self-rescue method and robot of robot
CN110796770A (en) Access control method and device, storage medium and electronic device
CN110838196B (en) Intelligent door lock control method, intelligent door lock control system and intelligent door lock
CN111970369A (en) Contactless equipment control method and device
JP2019003296A (en) Acceptance inspection robot system
CN111047761A (en) Voice interactive visitor identity recognition method and system based on intelligent terminal
CN108256099A (en) The method for building up of network of personal connections, based reminding method and smart machine based on network of personal connections
CN112669501B (en) Access control method, device and computer readable storage medium
CN115527241A (en) Fingerprint template updating method and device, embedded equipment and storage medium
CN113233267A (en) Intelligent elevator stop management method and system based on machine vision feature recognition and computer storage medium
WO2017016027A1 (en) Method for establishing connection, apparatus for establishing connection, and communication system
CN112422504A (en) Working method for carrying out remote safety information authentication and identification through cloud platform
CN110908289A (en) Smart home control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180720
