CN111324129B - Navigation method and device based on face recognition - Google Patents

Navigation method and device based on face recognition

Info

Publication number
CN111324129B
CN111324129B (application CN202010196330.8A)
Authority
CN
China
Prior art keywords
navigation
client
distance
face recognition
interval distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010196330.8A
Other languages
Chinese (zh)
Other versions
CN111324129A (en
Inventor
翁伟东
郭敏鸿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CCB Finetech Co Ltd
Original Assignee
CCB Finetech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CCB Finetech Co Ltd filed Critical CCB Finetech Co Ltd
Priority to CN202010196330.8A priority Critical patent/CN111324129B/en
Publication of CN111324129A publication Critical patent/CN111324129A/en
Application granted granted Critical
Publication of CN111324129B publication Critical patent/CN111324129B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • G01S13/08Systems for measuring distance only
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The invention provides a navigation method and a navigation device based on face recognition, wherein the method comprises the following steps: acquiring a face photograph and destination information of a client; generating a first navigation walking route according to the destination information; while navigating based on the first navigation walking route, calculating in real time the interval distance between itself and the client according to the client's face photograph; and adjusting its own navigation walking speed based on the interval distance. The invention can guide clients, cope with complex environments, and improve navigation efficiency and reliability, thereby also improving the movement efficiency and safety of guided navigation.

Description

Navigation method and device based on face recognition
Technical Field
The invention relates to the technical field of navigation robots, in particular to a navigation method and device based on face recognition.
Background
In large indoor public places such as airports, exhibition halls and office halls, the public frequently finds it difficult to locate an intended destination accurately and quickly; even with guidance from a reception desk or an attendant, visitors often take wrong turns or detours.
With the development of intelligent technology, attempts have been made to use intelligent robots for road guidance and navigation tasks, and some robots can autonomously lead clients to specified destinations. After receiving a client's destination information, the robot calculates a walking route from its current location to the destination using indoor positioning and navigation technology (map information is generally built in, so a walking route between two points on the map can be calculated; the actual walking route and speed are determined from sensor signals, which also control walking and obstacle avoidance), and then starts the navigation walking function to lead the client to the destination.
The existing robot navigation function solves the problem of the robot moving to a destination and can objectively provide guiding services for clients, but it cannot detect whether the client is keeping up, or has been delayed along the way by other matters or obstacles. As a result, the robot may reach the destination along the planned route while the client remains behind or gets lost.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a navigation method and a navigation device based on face recognition, which can cope with complex environments and improve navigation efficiency and reliability.
In order to solve the technical problems, the invention provides the following technical scheme:
in a first aspect, the present invention provides a navigation method based on face recognition, including:
acquiring face photos and destination information of a client;
generating a first navigation walking route according to the destination information;
when navigating based on the first navigation walking route, calculating in real time the interval distance between itself and the client according to the client's face photograph;
and adjusting its own navigation walking speed based on the interval distance.
The calculating in real time the interval distance between itself and the client according to the client's face photograph comprises the following steps:
performing face recognition according to the client's face photograph, and determining the direction of the client;
measuring the interval distance between itself and the client by radar ranging in the direction of the client.
The adjusting its own navigation walking speed based on the interval distance comprises the following steps:
when the interval distance is greater than or equal to a preset distance, reducing its own navigation walking speed and continuing navigation;
and when the interval distance is greater than the preset distance and has remained greater than the preset distance for longer than a preset time, generating a second navigation walking route according to the client's position and navigating based on the second navigation walking route until the interval distance is smaller than the preset distance.
Further, after the first navigation walking route is generated according to the destination information, the method further includes:
the customers following themselves are detected by means of infrared detection.
Further, the method further comprises the following steps:
and stopping navigation according to the stopping instruction input by the client.
In a second aspect, the present invention provides a navigation device based on face recognition, including:
the acquisition unit is used for acquiring face photos and destination information of the clients;
the route unit is used for generating a first navigation walking route according to the destination information;
the searching unit is used for calculating in real time, when navigating based on the first navigation walking route, the interval distance between itself and the client according to the client's face photograph;
and the adjusting unit is used for adjusting its own navigation walking speed based on the interval distance.
Wherein the search unit includes:
the face recognition subunit is used for performing face recognition according to the client's face photograph and determining the direction of the client;
and the distance measurement subunit is used for measuring the interval distance between itself and the client by radar ranging in the direction of the client.
Wherein the adjusting unit includes:
the first adjusting subunit is used for reducing its own navigation walking speed and continuing navigation when the interval distance is greater than or equal to a preset distance;
and the second adjusting subunit is used for generating a second navigation walking route according to the client's position, and navigating based on the second navigation walking route, when the interval distance is greater than the preset distance and has remained greater than the preset distance for longer than a preset time, until the interval distance is smaller than the preset distance.
Further, the method further comprises the following steps:
an infrared unit for detecting, by means of infrared detection, whether the client is following.
Further, the method further comprises the following steps:
and the termination unit is used for stopping navigation according to the stopping instruction input by the client.
In a third aspect, the present invention provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the face recognition based navigation method when executing the program.
In a fourth aspect, the present invention provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the face recognition based navigation method.
According to the technical scheme above, the invention provides a navigation method and a navigation device based on face recognition, implemented by acquiring a face photograph and destination information of a client; generating a first navigation walking route according to the destination information; while navigating based on the first navigation walking route, calculating in real time the interval distance between itself and the client according to the client's face photograph; and adjusting its own navigation walking speed based on the interval distance. The device can thus guide clients, cope with complex environments, and improve navigation efficiency and reliability, thereby also improving the movement efficiency and safety of guided navigation.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for the embodiments or the description of the prior art are briefly described below. It is obvious that the drawings in the following description show some embodiments of the present invention, and that other drawings may be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic flow chart of a first method for face recognition-based navigation according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a second flow of a navigation method based on face recognition in an embodiment of the present invention.
Fig. 3 is a schematic diagram of a third flow of a navigation method based on face recognition in an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a navigation device based on face recognition in an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are evidently some, but not all, embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
The invention provides an embodiment of a navigation method based on face recognition, referring to fig. 1, the navigation method based on face recognition specifically comprises the following contents:
S101: acquiring a face photograph and destination information of a client;
In this step, the client's destination information is acquired in order to generate a navigation walking route and lead the client along it, thereby realizing guided navigation. The client's face photograph is acquired for face recognition, which in turn determines whether the guided client is following.
In a particular implementation, the client's destination information is obtained by at least one of voice input, text input, and text recognition, and the client's face photograph is collected by a camera.
S102: generating a first navigation walking route according to the destination information;
In this step, the location the client intends to reach is known from the destination information, and a walking route between the two points can be calculated from the built-in map information according to that location and the device's own location. This route is the first navigation walking route.
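The patent does not specify the route-planning algorithm beyond computing a walking route between two points of the map. One common realization is a shortest-path search over a waypoint graph of the venue; a minimal sketch (the graph, node names and distances are illustrative assumptions, not from the patent):

```python
import heapq

def plan_route(graph, start, goal):
    """Dijkstra shortest-path search over a weighted waypoint graph.
    graph: {node: [(neighbor, distance_m), ...]}; returns the waypoint
    list from start to goal, or None if the goal is unreachable."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            # Walk the predecessor chain back to the start.
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return path[::-1]
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    return None

# Illustrative indoor map: a short route via the lobby and a longer detour.
indoor_map = {
    "entrance": [("lobby", 5.0), ("corridor", 2.0)],
    "corridor": [("counter", 12.0)],
    "lobby": [("counter", 4.0)],
}
route = plan_route(indoor_map, "entrance", "counter")
```

In a real robot the graph edges would come from the built-in map, and the planned waypoints would feed the walking and obstacle-avoidance controller mentioned in the Background section.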
S103: when navigating based on the first navigation walking route, calculating in real time the interval distance between itself and the client according to the client's face photograph;
In this step, while navigating according to the first navigation walking route, surrounding video or images are collected by the camera, the specific target is identified from them, and the distance to that target is calculated.
It should be noted that the specific target is the customer corresponding to the face photo in step S101.
In a particular implementation, face recognition is performed according to the client's face photograph: the collected surrounding video or images are searched, the client's face is recognized in them, and the client's direction can be determined by taking the recognized face as the center. The interval distance between itself and the client is then measured by radar ranging in the direction of the client.
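The patent gives no formulas for converting the recognized face into a direction or for combining it with radar ranging. A hedged sketch of one plausible approach, using a pinhole-camera approximation to turn the matched face's pixel position into a bearing and then selecting the radar return nearest that bearing (the function names, the field-of-view model and the scan format are all assumptions for illustration):

```python
import math

def face_bearing_deg(bbox_center_x, image_width, horizontal_fov_deg):
    """Convert the horizontal pixel position of the matched face into a
    bearing relative to the camera's optical axis (pinhole approximation)."""
    # Focal length in pixels, derived from the horizontal field of view.
    focal_px = (image_width / 2) / math.tan(math.radians(horizontal_fov_deg / 2))
    return math.degrees(math.atan((bbox_center_x - image_width / 2) / focal_px))

def client_distance(radar_scan, bearing_deg):
    """Pick the radar range reading closest to the face's bearing.
    radar_scan: list of (bearing_deg, range_m) pairs."""
    return min(radar_scan, key=lambda s: abs(s[0] - bearing_deg))[1]
```

In practice the camera and the radar must share a calibrated coordinate frame for the bearing match to be meaningful; that calibration is outside the scope of this sketch.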
S104: adjusting its own navigation walking speed based on the interval distance.
In this step, the interval distance between itself and the client is determined and used to judge whether the client has been lost: a preset distance is set according to usage requirements, and the interval distance is compared against it. If tracking loss appears likely, the device adjusts its own travelling speed so that the client can catch up in time, realizing accurate guided navigation.
In practice, if the interval distance is greater than or equal to the preset distance, it is determined that tracking loss may occur.
When the interval distance is greater than or equal to the preset distance, the device reduces its own navigation walking speed and continues navigation; and when the interval distance is greater than the preset distance and has remained greater than the preset distance for longer than the preset time, a second navigation walking route is generated according to the client's position and navigation proceeds based on the second navigation walking route until the interval distance is smaller than the preset distance.
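The deceleration and re-routing rule above can be sketched as a small decision function. The threshold values, speeds and return convention are illustrative defaults, not values from the patent:

```python
def adjust_speed(distance_m, exceeded_since, now,
                 preset_distance_m=2.0, preset_time_s=5.0,
                 normal_speed=1.0, slow_speed=0.4):
    """Return (speed, action, exceeded_since), where action is 'continue',
    'slow', or 'replan' (generate a second route toward the client).
    exceeded_since is the timestamp at which the interval distance first
    exceeded the preset distance, or None if it is currently within it."""
    if distance_m < preset_distance_m:
        return normal_speed, "continue", None       # client is keeping up
    if exceeded_since is None:
        exceeded_since = now                        # start timing the excess
    if now - exceeded_since > preset_time_s:
        return 0.0, "replan", exceeded_since        # likely lost: replan route
    return slow_speed, "slow", exceeded_since       # slow down, keep navigating
```

The caller feeds the returned `exceeded_since` back in on the next cycle, so the "greater than the preset distance for longer than the preset time" condition is tracked across successive distance measurements.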
As can be seen from the above description, the navigation method based on face recognition according to the embodiments of the present invention acquires a face photograph and destination information of a client; generates a first navigation walking route according to the destination information; while navigating based on the first navigation walking route, calculates in real time the interval distance between itself and the client according to the client's face photograph; and adjusts its own navigation walking speed based on the interval distance. The device can thus guide clients, cope with complex environments, and improve navigation efficiency and reliability, thereby also improving the movement efficiency and safety of guided navigation.
In an embodiment of the present invention, referring to fig. 2, after step S102 in the embodiment of the navigation method based on face recognition, step S105 is further included, which specifically includes the following:
S105: detecting, by means of infrared detection, whether the client is following.
During guided navigation, the direction of the human body is located by infrared detection, and the camera acquires the face for comparison, ensuring that the client is following at all times. This further improves the accuracy of determining the client's direction, and hence the efficiency of guided navigation.
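One way to fuse the two signals described here, infrared body detection and camera face comparison, is a simple conjunction: the client counts as following only when both agree. The similarity scale and threshold below are assumptions for illustration, not values from the patent:

```python
def client_following(ir_detections, face_match_score, match_threshold=0.6):
    """Fuse infrared body detection with face comparison.
    ir_detections: number of human bodies found by the infrared sensor;
    face_match_score: similarity in [0, 1] against the stored face photograph.
    Returns True only if a body is present AND the face matches."""
    return ir_detections > 0 and face_match_score >= match_threshold
```

A match without an infrared body (a stale frame) or a body without a face match (a stranger walking behind the robot) both yield False, which is what triggers the pause-and-search behaviour in the full-flow embodiment below.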
In an embodiment of the present invention, referring to fig. 3, in the embodiment of the navigation method based on face recognition, step S106 is further included, which specifically includes the following:
S106: stopping navigation according to a stop instruction input by the client.
In a particular implementation, when the client can reach the destination alone and guidance is no longer needed, navigation is stopped by the input stop instruction.
The client can decide whether to continue being guided according to his or her own needs, and if not, can input a stop instruction to stop navigation.
In order to further explain the scheme, the invention provides a full-flow embodiment of a navigation method based on face recognition, which specifically comprises the following contents:
1. Acquire the client's guided navigation requirement, and calculate a navigation walking route according to the destination information and the device's own positioning information;
2. Call the front camera, take a photograph of the client's face, store it in the temporary navigation task information, and start guided navigation;
3. Call infrared detection to detect whether a human body is following;
4. Call radar ranging to detect whether the distance to the human body is within the set threshold; decelerate or stop if the distance exceeds the threshold, and re-accelerate once it returns within the threshold;
5. Call the rear camera, capture the face of the following client, and send it to the face recognition module;
6. Compare the received photograph with the face stored when the navigation task was started, to confirm that the nearest follower is the original client;
7. If the photographs match, continue walking as usual; if not, pause walking;
8. Perform an omni-directional search with the camera, looking for the client by face;
9. After the client is found, walk toward the client, and resume the navigation route once the distance returns within the threshold;
10. End the guided navigation task upon arriving at the destination or upon receiving the client's instruction to cancel guidance.
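Steps 1 to 10 above can be tied together in a control-loop skeleton. The `sensors` interface (`now()`, `client_distance()`, `face_matches()`) is hypothetical, and the loop is a sketch of the decision logic only, not the patent's implementation:

```python
def guide_loop(sensors, route, preset_distance_m=2.0, preset_time_s=5.0):
    """One decision per waypoint of the planned route. `sensors` is any
    object providing now(), client_distance() and face_matches()
    (an assumed interface). Returns a log of (action, waypoint) pairs."""
    log = []
    exceeded_since = None
    for waypoint in route:
        d = sensors.client_distance()           # step 4: radar ranging
        if not sensors.face_matches():          # steps 5-7: wrong follower
            log.append(("pause_and_search", waypoint))
            continue
        if d >= preset_distance_m:              # distance over threshold
            if exceeded_since is None:
                exceeded_since = sensors.now()  # start timing the excess
            if sensors.now() - exceeded_since > preset_time_s:
                log.append(("replan_toward_client", waypoint))
                exceeded_since = None
                continue
            log.append(("decelerate", waypoint))
        else:
            exceeded_since = None
            log.append(("walk", waypoint))      # client within threshold
    return log
```

A real controller would run this on a timer rather than per waypoint and would feed the actions to the motion subsystem; the log here is only to make the branching visible.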
The embodiment of the invention provides a specific implementation manner of a human face recognition-based navigation device capable of realizing all contents in the human face recognition-based navigation method, and referring to fig. 4, the human face recognition-based navigation device specifically comprises the following contents:
an acquisition unit 10 for acquiring a face photograph and destination information of a customer;
a route unit 20 for generating a first navigation walking route according to the destination information;
a searching unit 30, configured to calculate in real time, when navigating based on the first navigation walking route, the interval distance between itself and the client according to the client's face photograph;
an adjusting unit 40, configured to adjust its own navigation walking speed based on the interval distance.
Wherein the search unit 30 includes:
the face recognition subunit is used for performing face recognition according to the client's face photograph and determining the direction of the client;
and the distance measurement subunit is used for measuring the interval distance between itself and the client by radar ranging in the direction of the client.
Wherein the adjusting unit 40 includes:
the first adjusting subunit is used for reducing its own navigation walking speed and continuing navigation when the interval distance is greater than or equal to a preset distance;
and the second adjusting subunit is used for generating a second navigation walking route according to the client's position, and navigating based on the second navigation walking route, when the interval distance is greater than the preset distance and has remained greater than the preset distance for longer than a preset time, until the interval distance is smaller than the preset distance.
Further, the method further comprises the following steps:
an infrared unit 50 for detecting, by means of infrared detection, whether the client is following.
Further, the method further comprises the following steps:
and a termination unit 60 for stopping navigation according to a stop instruction inputted by the client.
The embodiment of the navigation device based on face recognition provided by the invention can be used in particular to execute the processing flow of the method embodiment above; its functions are not repeated here, and reference may be made to the detailed description of the method embodiment.
As can be seen from the above description, the navigation device based on face recognition provided by the embodiment of the present invention acquires a face photograph and destination information of a client; generates a first navigation walking route according to the destination information; while navigating based on the first navigation walking route, calculates in real time the interval distance between itself and the client according to the client's face photograph; and adjusts its own navigation walking speed based on the interval distance. The device can thus guide clients, cope with complex environments, and improve navigation efficiency and reliability, thereby also improving the movement efficiency and safety of guided navigation.
The application provides an embodiment of an electronic device for implementing all or part of content in the navigation method based on face recognition, wherein the electronic device specifically comprises the following contents:
a processor (processor), a memory (memory), a communication interface (Communications Interface), and a bus; the processor, the memory and the communication interface complete communication with each other through the bus; the communication interface is used for realizing information transmission between related devices; the electronic device may be a desktop computer, a tablet computer, a mobile terminal, etc., and the embodiment is not limited thereto. In this embodiment, the electronic device may be implemented with reference to an embodiment for implementing the face recognition-based navigation method and an embodiment for implementing the face recognition-based navigation apparatus, and the contents thereof are incorporated herein, and are not repeated here.
Fig. 5 is a schematic block diagram of a system configuration of an electronic device 9600 of an embodiment of the present application. As shown in fig. 5, the electronic device 9600 may include a central processor 9100 and a memory 9140; the memory 9140 is coupled to the central processor 9100. Notably, this fig. 5 is exemplary; other types of structures may also be used in addition to or in place of the structures to implement telecommunications functions or other functions.
In one embodiment, the face recognition based navigation function may be integrated into the central processor 9100. The central processor 9100 may be configured to perform the following control:
acquiring face photos and destination information of a client;
generating a first navigation walking route according to the destination information;
when navigating based on the first navigation walking route, calculating in real time the interval distance between itself and the client according to the client's face photograph;
and adjusting its own navigation walking speed based on the interval distance.
As can be seen from the above description, the electronic device provided in the embodiments of the present application acquires a face photograph and destination information of a client; generates a first navigation walking route according to the destination information; while navigating based on the first navigation walking route, calculates in real time the interval distance between itself and the client according to the client's face photograph; and adjusts its own navigation walking speed based on the interval distance. The device can thus guide clients, cope with complex environments, and improve navigation efficiency and reliability, thereby also improving the movement efficiency and safety of guided navigation.
In another embodiment, the navigation device based on face recognition may be configured separately from the central processor 9100, for example, the navigation device based on face recognition may be configured as a chip connected to the central processor 9100, and the navigation function based on face recognition is implemented by control of the central processor.
As shown in fig. 5, the electronic device 9600 may further include: a communication module 9110, an input unit 9120, an audio processor 9130, a display 9160, and a power supply 9170. It is noted that the electronic device 9600 need not include all of the components shown in fig. 5; in addition, the electronic device 9600 may further include components not shown in fig. 5, and reference may be made to the related art.
As shown in fig. 5, the central processor 9100, sometimes referred to as a controller or operational control, may include a microprocessor or other processor device and/or logic device, which central processor 9100 receives inputs and controls the operation of the various components of the electronic device 9600.
The memory 9140 may be, for example, one or more of a buffer, a flash memory, a hard drive, removable media, a volatile memory, a non-volatile memory, or other suitable devices. It may store the relevant information, together with programs for processing that information, and the central processor 9100 can execute the programs stored in the memory 9140 to realize information storage, processing, and the like.
The input unit 9120 provides input to the central processor 9100. The input unit 9120 is, for example, a key or a touch input device. The power supply 9170 is used to provide power to the electronic device 9600. The display 9160 is used for displaying display objects such as images and characters. The display may be, for example, but not limited to, an LCD display.
The memory 9140 may be a solid-state memory such as read-only memory (ROM), random-access memory (RAM), a SIM card, etc. It may also be a memory that holds information even when powered down, and that can be selectively erased and provided with further data; an example of such a memory is sometimes referred to as an EPROM. The memory 9140 may also be some other type of device. The memory 9140 includes a buffer memory 9141 (sometimes referred to as a buffer), and may include an application/function storage portion 9142 that stores application programs and function programs, or the flow for executing operations of the electronic device 9600 by the central processor 9100.
The memory 9140 may also include a data store 9143, the data store 9143 for storing data, such as contacts, digital data, pictures, sounds, and/or any other data used by an electronic device. The driver storage portion 9144 of the memory 9140 may include various drivers of the electronic device for communication functions and/or for performing other functions of the electronic device (e.g., messaging applications, address book applications, etc.).
The communication module 9110 is a transmitter/receiver 9110 that transmits and receives signals via an antenna 9111. A communication module (transmitter/receiver) 9110 is coupled to the central processor 9100 to provide input signals and receive output signals, as in the case of conventional mobile communication terminals.
Based on different communication technologies, a plurality of communication modules 9110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, etc., may be provided in the same electronic device. The communication module (transmitter/receiver) 9110 is also coupled to a speaker 9131 and a microphone 9132 via an audio processor 9130 to provide audio output via the speaker 9131 and to receive audio input from the microphone 9132 to implement usual telecommunications functions. The audio processor 9130 can include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 9130 is also coupled to the central processor 9100 so that sound can be recorded locally through the microphone 9132 and sound stored locally can be played through the speaker 9131.
An embodiment of the present invention also provides a computer-readable storage medium capable of implementing all of the steps of the face recognition-based navigation method in the above embodiment. The computer-readable storage medium stores a computer program which, when executed by a processor, implements all of those steps; for example, the processor implements the following steps when executing the computer program:
acquiring a face photo and destination information of a client;
generating a first navigation walking route according to the destination information;
when navigation is performed based on the first navigation walking route, calculating in real time, according to the face photo of the client, the interval distance between the client and the device itself;
and adjusting the device's own navigation walking speed based on the interval distance.
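The real-time distance-calculation step above (face photo to client azimuth, then azimuth to radar range, as claim 2 elaborates) can be sketched in Python. This is a hedged illustration, not the patent's implementation: the linear mapping from face position to azimuth, the 60-degree horizontal field of view, and the radar scan format (a 360-entry list with one range reading per whole degree, index 0 straight ahead, negative bearings wrapping around) are all hypothetical assumptions.

```python
def client_azimuth(face_center_x, image_width, horizontal_fov_deg=60.0):
    """Estimate the client's azimuth (degrees) relative to the camera's
    optical axis from the horizontal position of the detected face.
    The linear pinhole-style approximation and the 60-degree field of
    view are assumptions for illustration, not stated in the patent."""
    # Normalized offset in [-1, 1]: -1 = left image edge, +1 = right edge.
    offset = (face_center_x - image_width / 2) / (image_width / 2)
    return offset * (horizontal_fov_deg / 2)

def interval_distance(face_center_x, image_width, radar_scan,
                      horizontal_fov_deg=60.0):
    """Combine the face-derived azimuth with a radar scan (hypothetical
    format: one range reading per whole degree, index 0 straight ahead,
    negative bearings wrapping around) to obtain the interval distance
    between the client and the device."""
    azimuth = client_azimuth(face_center_x, image_width, horizontal_fov_deg)
    return radar_scan[int(round(azimuth)) % 360]
```

In practice the azimuth would come from a face-detection bounding box and the scan from a real ranging sensor; the point of the sketch is only the fusion order that claim 2 describes: face recognition fixes the bearing, and ranging is sampled at that bearing.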
As can be seen from the above description, the computer-readable storage medium provided by the embodiments of the present invention acquires the face photo and destination information of a client; generates a first navigation walking route according to the destination information; when navigating based on the first navigation walking route, calculates in real time, according to the face photo, the interval distance between the client and the device; and adjusts the device's own navigation walking speed based on that interval distance. The device can thus guide guests, cope with complex environments, and improve the efficiency and reliability of navigation walking, thereby further improving the movement efficiency and safety of navigation.
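The speed-adjustment behavior described above (slow down when the client falls behind, and replan the route from the client's position when the lag persists) can be sketched as a small state machine. This is a sketch under assumptions: the 0.5 slowdown factor, the 0.2 m/s floor, the default thresholds, the merging of the two claim conditions into a single comparison, and the injected clock and callback functions are all hypothetical and not specified by the patent.

```python
import time

class SpeedAdjuster:
    """Sketch of the distance-based speed adjustment. Thresholds,
    the slowdown factor, and the callback interfaces are hypothetical."""

    def __init__(self, base_speed=1.0, preset_distance=2.0,
                 preset_duration=5.0, clock=time.monotonic):
        self.base_speed = base_speed          # normal walking speed (m/s)
        self.preset_distance = preset_distance  # lag threshold (m)
        self.preset_duration = preset_duration  # persistence threshold (s)
        self.clock = clock                    # injectable for testing
        self.speed = base_speed
        self.lag_started = None               # when the client first fell behind

    def update(self, interval_distance, replan_route, locate_client):
        """Process one distance measurement; return (new_speed, replanned)."""
        replanned = False
        if interval_distance >= self.preset_distance:
            # Client is lagging: reduce own walking speed, keep navigating.
            self.speed = max(0.2, self.speed * 0.5)
            now = self.clock()
            if self.lag_started is None:
                self.lag_started = now
            elif now - self.lag_started > self.preset_duration:
                # Lag has persisted: generate a second route from the
                # client's current position and restart the lag timer.
                replan_route(locate_client())
                replanned = True
                self.lag_started = now
        else:
            # Client has caught up: resume base speed, clear the timer.
            self.speed = self.base_speed
            self.lag_started = None
        return self.speed, replanned
```

Injecting the clock keeps the persistence logic deterministic and testable; a real device would feed `update` from its ranging loop and wire `replan_route`/`locate_client` to its path planner and face-tracking module.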
Although the invention provides the method operation steps described in the embodiments or flowcharts, more or fewer operation steps may be included based on conventional practice or without inventive effort. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. When implemented by an actual device or client product, the steps may be executed sequentially or in parallel (e.g., in a parallel-processor or multi-threaded environment) according to the methods shown in the embodiments or figures.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, apparatus (system) or computer program product. Accordingly, the present specification embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In this specification, the embodiments are described in a progressive manner; identical and similar parts of the embodiments may be referred to across embodiments, and each embodiment mainly describes its differences from the others. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant details, see the corresponding parts of the description of the method embodiments. In this document, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. The orientations or positional relationships indicated by terms such as "upper" and "lower" are based on the orientations or positional relationships shown in the drawings; they are used merely for convenience and simplicity of description, and do not indicate or imply that the apparatus or elements in question must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present invention. Unless otherwise specifically stated or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or an internal communication between two elements.
The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances. It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The present invention is not limited to any single aspect, nor to any single embodiment, nor to any combination and/or permutation of these aspects and/or embodiments. Moreover, each aspect and/or embodiment of the invention may be used alone or in combination with one or more other aspects and/or embodiments.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solution of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention and are intended to be included within the scope of the appended claims and description.

Claims (10)

1. A navigation method based on face recognition, comprising:
acquiring face photos and destination information of a client;
generating a first navigation walking route according to the destination information;
when navigation is performed based on the first navigation walking route, calculating in real time, according to the face photo of the client, the interval distance between the client and the device itself;
adjusting the device's own navigation walking speed based on the interval distance;
wherein the adjusting of the device's own navigation walking speed based on the interval distance comprises:
when the interval distance is greater than or equal to a preset distance, reducing the device's own navigation walking speed and continuing navigation;
and when the interval distance is greater than the preset distance and the duration for which the interval distance remains greater than the preset distance exceeds a preset time, generating a second navigation walking route according to the client's position and navigating based on the second navigation walking route until the interval distance is smaller than the preset distance.
2. The navigation method based on face recognition according to claim 1, wherein the calculating in real time, according to the face photo of the client, of the interval distance between the client and the device comprises:
performing face recognition according to the face photo of the client, and determining the azimuth of the client;
and measuring the interval distance between the client and the device by radar ranging, based on the azimuth of the client.
3. The face recognition-based navigation method of claim 1, further comprising, after the generating the first navigation walking route according to the destination information:
the customers following themselves are detected by means of infrared detection.
4. The face recognition-based navigation method of claim 1, further comprising:
and stopping navigation according to a stop instruction input by the client.
5. A navigation device based on face recognition, comprising:
the acquisition unit is used for acquiring face photos and destination information of the clients;
the route unit is used for generating a first navigation walking route according to the destination information;
the searching unit is used for calculating in real time, according to the face photo of the client, the interval distance between the client and the device itself when navigation is performed based on the first navigation walking route;
the adjusting unit is used for adjusting the device's own navigation walking speed based on the interval distance;
wherein the adjusting unit comprises:
a first adjusting subunit, used for reducing the device's own navigation walking speed and continuing navigation when the interval distance is greater than or equal to a preset distance;
and a second adjusting subunit, used for generating a second navigation walking route according to the client's position, and navigating based on the second navigation walking route, when the interval distance is greater than the preset distance and the duration for which the interval distance remains greater than the preset distance exceeds a preset time, until the interval distance is smaller than the preset distance.
6. The face recognition-based navigation apparatus of claim 5, wherein the searching unit comprises:
a face recognition subunit, used for performing face recognition according to the face photo of the client and determining the azimuth of the client;
and a distance measurement subunit, used for measuring the interval distance between the client and the device by radar ranging, based on the azimuth of the client.
7. The face recognition-based navigation apparatus of claim 5, further comprising:
an infrared unit for detecting, by means of infrared detection, the client following the device.
8. The face recognition-based navigation apparatus of claim 5, further comprising:
and a termination unit for stopping navigation according to a stop instruction input by the client.
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the face recognition-based navigation method of any one of claims 1 to 4.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the face recognition based navigation method of any one of claims 1 to 4.
CN202010196330.8A 2020-03-19 2020-03-19 Navigation method and device based on face recognition Active CN111324129B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010196330.8A CN111324129B (en) 2020-03-19 2020-03-19 Navigation method and device based on face recognition


Publications (2)

Publication Number Publication Date
CN111324129A CN111324129A (en) 2020-06-23
CN111324129B true CN111324129B (en) 2023-07-18

Family

ID=71171682

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010196330.8A Active CN111324129B (en) 2020-03-19 2020-03-19 Navigation method and device based on face recognition

Country Status (1)

Country Link
CN (1) CN111324129B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106426213A (en) * 2016-11-23 2017-02-22 深圳哈乐派科技有限公司 Accompanying and nursing robot

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4072033B2 (en) * 2002-09-24 2008-04-02 本田技研工業株式会社 Reception guidance robot device
JP4088237B2 (en) * 2003-10-23 2008-05-21 株式会社ナビタイムジャパン Navigation device, navigation method, navigation program
CN107390721B (en) * 2017-07-26 2021-05-18 歌尔科技有限公司 Robot following control method and device and robot
CN108734083B (en) * 2018-03-21 2023-04-25 北京猎户星空科技有限公司 Control method, device, equipment and storage medium of intelligent equipment
CN109333535B (en) * 2018-10-25 2022-05-20 同济大学 Guiding method of autonomous mobile robot
CN109571499A (en) * 2018-12-25 2019-04-05 广州天高软件科技有限公司 A kind of intelligent navigation leads robot and its implementation
CN109935310A (en) * 2019-03-08 2019-06-25 弗徕威智能机器人科技(上海)有限公司 A kind of robot guidance system and method applied to medical institutions
CN110405767B (en) * 2019-08-01 2022-06-17 深圳前海微众银行股份有限公司 Leading method, device, equipment and storage medium for intelligent exhibition hall




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220930

Address after: 12 / F, 15 / F, 99 Yincheng Road, Pudong New Area pilot Free Trade Zone, Shanghai, 200120

Applicant after: Jianxin Financial Science and Technology Co.,Ltd.

Address before: 25 Financial Street, Xicheng District, Beijing 100033

Applicant before: CHINA CONSTRUCTION BANK Corp.

Applicant before: Jianxin Financial Science and Technology Co.,Ltd.

GR01 Patent grant