CN108733420A - Wake-up method and apparatus for smart device, smart device, and storage medium - Google Patents
- Publication number
- CN108733420A (application CN201810236921.6A)
- Authority
- CN
- China
- Prior art keywords
- smart device
- target
- wake
- condition
- state parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/4401—Bootstrapping
- G06F9/4418—Suspend and resume; Hibernate and awake
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3231—Monitoring the presence, absence or movement of users
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Computer Security & Cryptography (AREA)
- Manipulator (AREA)
- Image Analysis (AREA)
Abstract
The present invention provides a wake-up method and apparatus for a smart device, a smart device, and a storage medium. The method includes: when the smart device is in a dormant state, acquiring an environment image of the smart device's monitoring range and performing recognition on the environment image; when a target is identified in the environment image, acquiring state parameters of the target; judging, according to the state parameters, whether a wake-up condition is met; and waking up the smart device when the wake-up condition is determined to be met. After a target is recognized, the method judges whether the wake-up condition is met according to the target's state parameters, and wakes the smart device autonomously only when the condition is met. Because the device is not woken directly upon detecting a person but also considers the target's state parameters, wake-up is more accurate and false wake-ups are reduced.
Description
Technical field
The present invention relates to the field of smart devices, and in particular to a wake-up method and apparatus for a smart device, a smart device, and a storage medium.
Background technology
With the popularization of smart devices, people use them more and more frequently, and the power consumption of these devices keeps growing, which makes their standby time shorter and shorter. To improve battery life, the related services on a smart device can be stopped or suspended when the user is not using it, placing the device in a dormant state that effectively conserves power. When the user uses the smart device again, it is woken from the dormant state.
Currently, smart devices have evolved from being woken by a wake word or a wake button to waking autonomously. Specifically, a smart device can perform face detection within its monitoring range and wake itself when a face is detected, reducing the effort required of the user.
However, waking autonomously whenever a face is detected often causes false wake-ups. For example, when a person merely walks past a robot, the autonomous wake-up is wasted.
Invention content
The present invention aims to solve at least one of the technical problems in the related art.
The present invention provides a wake-up method for a smart device that, after recognizing a target, judges whether a wake-up condition is met according to the target's state parameters and autonomously wakes the device only when the condition is met. Because the device is not woken directly upon detecting a person but also considers the target's state parameters, wake-up is more accurate and false wake-ups are reduced.
An embodiment of one aspect of the present invention provides a wake-up method for a smart device, including:
when the smart device is in a dormant state, acquiring an environment image of the smart device's monitoring range and performing recognition on the environment image;
when a target is identified in the environment image, acquiring state parameters of the target;
judging, according to the state parameters, whether a wake-up condition is met;
and waking up the smart device when the wake-up condition is determined to be met.
In a possible implementation of this aspect, the state parameters of the target include the distance between the target and the smart device and the face angle of the target;
judging, according to the state parameters, whether the wake-up condition is met includes:
determining that the wake-up condition is met if the distance between the target and the smart device is less than a preset distance threshold and the face angle of the target is within a preset angular range.
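The distance-plus-face-angle condition can be written as a one-line predicate. The 3-meter threshold and [0°, 45°] range below are the example values used later in this description, not values fixed by the claims.

```python
def meets_wake_condition(distance_m, face_angle_deg,
                         distance_threshold=3.0,
                         angle_range=(0.0, 45.0)):
    """True when the target is closer than the preset distance threshold
    AND its face angle lies within the preset angular range."""
    low, high = angle_range
    return distance_m < distance_threshold and low <= face_angle_deg <= high
```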
In a possible implementation of this aspect, the state parameters of the target further include a lip-state parameter of the target;
before determining that the wake-up condition is met, the method further includes:
determining, from the lip-state parameter of the target, that the target's lips are moving.
In a possible implementation of this aspect, the state parameters of the target further include the dwell time of the target within the distance-threshold range;
before determining that the wake-up condition is met, the method further includes:
determining that the dwell time of the target within the distance-threshold range reaches a preset duration threshold.
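The dwell-time requirement can be sketched as a small tracker that resets whenever the target leaves the distance-threshold range; the threshold values below are illustrative.

```python
class DwellTimer:
    """Tracks how long a target has stayed inside the distance-threshold
    range; the wake-up check additionally requires this dwell time to
    reach a preset duration threshold."""
    def __init__(self, distance_threshold=3.0, duration_threshold=2.0):
        self.distance_threshold = distance_threshold
        self.duration_threshold = duration_threshold
        self.entered_at = None

    def update(self, distance_m, now_s):
        """Feed one distance observation; return True once the target has
        stayed within range for at least the duration threshold."""
        if distance_m < self.distance_threshold:
            if self.entered_at is None:
                self.entered_at = now_s    # target just entered the range
            return (now_s - self.entered_at) >= self.duration_threshold
        self.entered_at = None             # target left the range: reset
        return False
```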
In a possible implementation of this aspect, before the environment image of the smart device's monitoring range is acquired, the method further includes:
acquiring voice information from the environment;
when a preset keyword is present in the voice information, determining the sound-source direction of the voice information;
and controlling the smart device to turn toward the sound-source direction.
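A minimal sketch of the keyword check and steering decision, assuming the speech pipeline already yields a transcript and a sound-source direction. The keyword list is hypothetical, since the description does not name the preset keywords.

```python
def handle_voice(transcript, source_direction_deg,
                 keywords=("hello robot",)):
    """If a preset keyword appears in the recognized voice information,
    return the heading the device should turn to (the sound-source
    direction); otherwise return None, meaning no turn is made."""
    text = transcript.lower()
    if any(k in text for k in keywords):
        return source_direction_deg
    return None
```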
In a possible implementation of this aspect, after the smart device is woken, the method further includes:
continuing to acquire the target's state parameters and judging whether the wake-up condition is still met;
keeping the smart device awake while the wake-up condition remains met.
In a possible implementation of this aspect, after the smart device is woken, the method further includes:
continuing to acquire environment images of the smart device's monitoring range and performing recognition on them;
controlling the smart device to enter the dormant state when the target is not identified in the environment images for a continuous preset duration; alternatively,
receiving a task instruction and executing it when it is received.
In a possible implementation of this aspect, if a wake-up instruction is received during the process of judging, according to the state parameters, whether the wake-up condition is met, the judgment process is stopped and the wake-up flow is executed according to the wake-up instruction.
With the wake-up method of the embodiment of the present invention, when the smart device is in a dormant state, an environment image of the device's monitoring range is acquired and recognition is performed on it; when a target is identified in the environment image, the target's state parameters are acquired and used to judge whether a wake-up condition is met; and when the condition is met, the smart device is woken. In this embodiment, after a target is recognized, whether the wake-up condition is met is judged according to the target's state parameters, and the smart device wakes autonomously only when the condition is met. Because the device is not woken directly upon detecting a person but also considers the target's state parameters, wake-up is more accurate and false wake-ups are reduced.
An embodiment of another aspect of the present invention provides a wake-up apparatus for a smart device, including:
a first acquisition module, configured to acquire an environment image of the smart device's monitoring range and perform recognition on it when the smart device is in a dormant state;
a second acquisition module, configured to acquire the state parameters of a target when the target is identified in the environment image;
a judgment module, configured to judge, according to the state parameters, whether a wake-up condition is met;
and a wake-up module, configured to wake up the smart device when the wake-up condition is determined to be met.
In a possible implementation of this aspect, the state parameters of the target include the distance between the target and the smart device and the face angle of the target, and the judgment module is further configured to:
determine that the wake-up condition is met if the distance between the target and the smart device is less than a preset distance threshold and the face angle of the target is within a preset angular range.
In a possible implementation of this aspect, the state parameters of the target further include a lip-state parameter of the target, and the apparatus further includes:
a first determining module, configured to determine, before the wake-up condition is determined to be met, that the lip-state parameter of the target indicates that the target's lips are moving.
In a possible implementation of this aspect, the state parameters of the target further include the dwell time of the target within the distance-threshold range, and the apparatus further includes:
a second determining module, configured to determine, before the wake-up condition is determined to be met, that the dwell time of the target within the distance-threshold range reaches a preset duration threshold.
In a possible implementation of this aspect, the apparatus further includes:
a third acquisition module, configured to acquire voice information from the environment before the environment image of the monitoring range is acquired;
a third determining module, configured to determine the sound-source direction of the voice information when a preset keyword is present in the voice information;
and a first control module, configured to control the smart device to turn toward the sound-source direction.
In a possible implementation of this aspect, the second acquisition module is further configured to continue acquiring the target's state parameters after the smart device is woken; the judgment module is further configured to judge whether the wake-up condition is still met; and the wake-up module is further configured to keep the smart device awake while the wake-up condition remains met.
In a possible implementation of this aspect, the first acquisition module is further configured to continue acquiring environment images of the smart device's monitoring range after the smart device is woken and to perform recognition on them, and the apparatus further includes:
a second control module, configured to control the smart device to enter the dormant state when the target is not identified in the environment images for a continuous preset duration; alternatively,
an execution module, configured to receive a task instruction and execute it when it is received.
In a possible implementation of this aspect, the wake-up module is further configured to:
stop the judgment process when a wake-up instruction is received during the process of judging, according to the state parameters, whether the wake-up condition is met, and execute the wake-up flow according to the wake-up instruction.
With the wake-up apparatus of the embodiment of the present invention, when the smart device is in a dormant state, an environment image of the device's monitoring range is acquired and recognition is performed on it; when a target is identified in the environment image, the target's state parameters are acquired and used to judge whether a wake-up condition is met; and when the condition is met, the smart device is woken. In this embodiment, after a target is recognized, whether the wake-up condition is met is judged according to the target's state parameters, and the smart device wakes autonomously only when the condition is met. Because the device is not woken directly upon detecting a person but also considers the target's state parameters, wake-up is more accurate and false wake-ups are reduced.
An embodiment of another aspect of the present invention provides a smart device, including: a housing, a processor, a memory, a circuit board, and a power supply circuit, wherein the circuit board is arranged inside the space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power supply circuit is configured to supply power to each circuit or component of the smart device; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to implement the wake-up method for a smart device described in the above aspect.
An embodiment of another aspect of the present invention provides a non-transitory computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the wake-up method for a smart device described in the above aspect is implemented.
An embodiment of another aspect of the present invention provides a computer program product; when the instructions in the computer program product are executed by a processor, the wake-up method for a smart device described in the above aspect is implemented.
Additional aspects and advantages of the present invention will be set forth in part in the following description, will in part become apparent from the following description, or will be learned through practice of the present invention.
Description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic flowchart of a wake-up method for a smart device provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the principle of distance calculation by binocular vision provided in an embodiment of the present invention;
Fig. 3 is a schematic flowchart of another wake-up method for a smart device provided in an embodiment of the present invention;
Fig. 4 is a schematic flowchart of another wake-up method for a smart device provided in an embodiment of the present invention;
Fig. 5 is a schematic flowchart of another wake-up method for a smart device provided in an embodiment of the present invention;
Fig. 6 is a schematic flowchart of another wake-up method for a smart device provided in an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a wake-up apparatus for a smart device provided in an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of an embodiment of the smart device of the present invention.
Specific implementation mode
Embodiments of the present invention are described in detail below; examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary and are intended to explain the present invention, not to limit it.
The wake-up method and apparatus for a smart device, the smart device, and the storage medium of the embodiments of the present invention are described below with reference to the accompanying drawings.
Since waking autonomously whenever a face is detected often causes false wake-ups, an embodiment of the present invention proposes a wake-up method for a smart device.
The wake-up method of the embodiment, after recognizing a target, judges whether a wake-up condition is met according to the target's state parameters and autonomously wakes the smart device only when the condition is met. Because the device is not woken directly upon detecting a person but also considers the target's state parameters, wake-up is more accurate and false wake-ups are reduced.
Fig. 1 is a schematic flowchart of a wake-up method for a smart device provided by an embodiment of the present invention.
As shown in Fig. 1, the wake-up method includes:
Step 101: when the smart device is in a dormant state, acquire an environment image of the smart device's monitoring range and perform recognition on the environment image.
The smart device may be a robot, a smart appliance, a smartphone, or the like.
In this embodiment, the smart device is equipped with a camera. While dormant, the device captures images of its monitoring range in real time through the camera to obtain environment images. After an environment image is obtained, it is analyzed to identify any target entering the monitoring range; here, a target is understood to be a person.
Specifically, the smart device can identify people in the environment image by face detection or human-body detection. More specifically, object contours are extracted from the environment image and compared with pre-stored face or body contours; when the similarity between an extracted contour and a preset contour exceeds a preset threshold, a person is considered to have been recognized in the environment image. In this way all people in the environment image are identified.
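The contour-comparison step can be illustrated with a crude similarity measure. A real system would use shape descriptors (for example Hu moments via OpenCV's `matchShapes`), but intersection-over-union on contour pixels is enough to show the threshold test; the 0.8 threshold is made up for illustration.

```python
def contour_similarity(contour_a, contour_b):
    """Crude similarity between two extracted contours, each given as an
    iterable of (x, y) pixel coordinates: intersection over union."""
    a, b = set(contour_a), set(contour_b)
    return len(a & b) / len(a | b)


def is_person(contour, stored_contours, threshold=0.8):
    """A contour is taken to be a person when its similarity to any
    pre-stored face or body contour exceeds the preset threshold."""
    return any(contour_similarity(contour, s) > threshold
               for s in stored_contours)
```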
Step 102: when a target is identified in the environment image, obtain the state parameters of the target.
In this embodiment, when a target is recognized in the environment image, its state parameters can be obtained. The state parameters may include: the distance between the target and the smart device, the face angle of the target, the dwell time of the target within the distance-threshold range, and so on.
When multiple targets (referred to here as candidate targets) are recognized in the environment image, one candidate target is chosen as the target in order to avoid false wake-ups and improve wake-up accuracy. In one possible implementation, when multiple candidate targets are detected, a candidate target is chosen from among those nearest to the smart device.
In this embodiment, the distance between a candidate target and the smart device can be obtained with a depth camera, a binocular vision camera, or a laser radar.
In one possible implementation, the smart device is equipped with a depth camera, through which a depth map of the candidate target is obtained. In a specific implementation, a structured-light projector can project controllable light points, light stripes, or a light plane onto the surface of the candidate target; an image is obtained by the image sensor in the depth camera; and the three-dimensional coordinates of the candidate target are computed from the geometric relationship by triangulation, yielding the distance between the candidate target and the smart device.
In another possible implementation, the smart device is equipped with a binocular vision camera, which photographs the candidate target. The disparity between the two images captured by the binocular camera is then computed, and the distance between the candidate target and the smart device is calculated from the disparity.
Fig. 2 is a schematic diagram of the principle of distance calculation by binocular vision. Fig. 2 depicts, in real space, the positions O_l and O_r of the two cameras and the optical axes of the left and right cameras; the distance from the focal plane of the two cameras to the plane in which the cameras lie is the focal length f.
As shown in Fig. 2, p and p' are the positions of the same candidate target P in the two captured images. The distance from p to the left border of its image is x_l, and the distance from p' to the left border of its image is x_r. O_l and O_r are the two cameras; they lie in the same plane, a distance Z apart.
Based on the principle of triangulation, the distance b between P and the plane of the two cameras satisfies the following relation:
(Z - (x_l - x_r)) / Z = (b - f) / b
From this it follows that b = Z * f / d, where d = x_l - x_r is the disparity of the same candidate target between the two images of the binocular camera. Since Z and f are constants, the distance b between the candidate target and the camera plane, i.e. the distance between the candidate target and the smart device, can be determined from the disparity d.
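The derivation reduces to the one-line computation b = Z * f / d; the baseline and focal-length values in the test are made up for illustration.

```python
def stereo_distance(x_l, x_r, baseline_z, focal_f):
    """Distance from binocular triangulation: b = Z * f / d, with
    disparity d = x_l - x_r (positions of the same point in the left
    and right images). Z (baseline) and f (focal length) are fixed by
    the camera rig, so distance follows directly from the disparity."""
    d = x_l - x_r
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return baseline_z * focal_f / d
```

Note the inverse relationship the formula encodes: a nearer target produces a larger disparity.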
In another possible implementation, the smart device is equipped with a laser radar, which emits laser pulses into the monitoring range; a pulse that hits an obstacle in the monitoring range is reflected back. The smart device receives the laser returned by each obstacle in the monitoring range and generates a binary map of each obstacle from the returns. Each binary map is then fused with the environment image, and the binary map corresponding to the candidate target is identified among all the binary maps. Specifically, the contour or size of each obstacle can be identified from its binary map and matched against the contour or size of each target in the environment image, yielding the binary map corresponding to the candidate target. Finally, the laser return time for the candidate target's binary map is multiplied by the speed of light and divided by 2, giving the distance between the candidate target and the smart device.
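The time-of-flight computation described above (return time times the speed of light, divided by 2, because the pulse travels to the obstacle and back) in code:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s


def lidar_distance(return_time_s):
    """One-way distance to the obstacle from the round-trip time of the
    laser pulse: distance = return_time * c / 2."""
    return return_time_s * C / 2.0
```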
After the distance between each candidate target and the smart device has been obtained, the distances of the candidate targets to the smart device are compared, the candidate target(s) nearest to the smart device are selected, and one of the nearest candidate targets is chosen as the target.
It should be understood that there may be one candidate target nearest to the smart device, or several. When only one candidate target is nearest, that candidate target can be taken as the target. When several candidate targets are equally nearest, one of them is chosen as the target.
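The nearest-candidate selection, including the tie case the next paragraphs deal with, can be sketched as:

```python
def nearest_candidates(candidates, eps=1e-6):
    """candidates: list of (candidate_id, distance_m) pairs. Returns the
    ids of every candidate tied (within eps) for the smallest distance.
    A single id means that candidate is the target; several ids mean a
    tie-breaking rule (e.g. registration status) must be applied."""
    if not candidates:
        return []
    best = min(d for _, d in candidates)
    return [cid for cid, d in candidates if d - best <= eps]
```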
Take a robot as an example. Suppose a robot is placed at a company's front desk. When a user needs to enter the company, the user can register their information with the robot. Alternatively, the face images of registered users can be downloaded from the company website and stored in the robot, so that users registered on the website are synchronously registered with the robot. In general, a user registered with the robot has a stronger intention to interact with it than an unregistered user. Therefore, when several candidate targets are equally nearest to the smart device, the target can be determined according to whether each candidate target is registered.
In its daily reception work, the robot can collect the face images of visitors and company staff and use them to build a registered-user face image library; the face images of website-registered users can also be used to build the library.
In one possible implementation, the smart device can query locally whether the candidate target(s) nearest to it are registered with the device. Specifically, the smart device can pre-store a registered-user face image library holding the face images of users registered with the device. When several candidate targets are equally nearest to the smart device, their face images can be compared with the face images in the library.
If the library contains the face image of exactly one of the nearest candidate targets, that candidate target is registered and is taken as the target.
If the library contains the face image of none of the nearest candidate targets, none of them is registered, and one of them can be chosen at random as the target.
If the library contains the face images of several of the nearest candidate targets, several of them are registered; the first registered nearest candidate target found by the query can be taken as the target, or one can be chosen at random from among the registered nearest candidate targets.
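The registration-based tie-breaking rule can be sketched as follows. `registered_ids` stands in for a lookup against the face image library, and falling back to the first candidate stands in for "randomly select one".

```python
def choose_among_nearest(nearest_ids, registered_ids):
    """Among the candidates nearest to the device, prefer the first one
    found in the registered-user library; if none is registered, fall
    back to the first nearest candidate."""
    for cid in nearest_ids:        # first registered match found wins
        if cid in registered_ids:
            return cid
    return nearest_ids[0]          # none registered: pick one anyway
```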
In another possible implementation, when several candidate targets are equally nearest to the smart device, the face images of all of them can be sent to a server, which returns the query result to the smart device, and the smart device determines the interaction target according to the result.
Specifically, the server stores the registered-user face image library. When several candidate targets are equally nearest, the smart device sends their face images to the server. The server receives the face images, queries the registered-user face image library for each of them, and sends the query result back to the smart device. The smart device then determines its interaction target according to the query result; the specific determination method is the same as described above and is not repeated here.
For example, suppose A is merely passing in front of the robot and has no intention to interact, while B is a frequent visitor to the company who has completed registration. When A and B are both less than the 3-meter distance threshold from the robot and at the same distance from it, the robot can choose the registered B as the target and greet B.
In this embodiment, when multiple candidate targets are detected in the environment image, a candidate target is chosen from among those nearest to the smart device; when several are equally nearest, the target is chosen according to the registered face image library, so that the chosen target is the one most likely to intend to interact with the smart device. Compared with waking whenever a face is detected, this makes wake-up more accurate and reduces false wake-ups.
After the target is chosen from the candidate targets, its state parameters are obtained.
Step 103: judge, according to the state parameters, whether the wake-up condition is met.
After the target's state parameters are obtained, whether the wake-up condition is met is judged according to them. Specifically, a corresponding condition can be set for each state parameter; when all state parameters satisfy their corresponding conditions, the smart device is considered to meet the wake-up condition.
In this embodiment, setting a wake-up condition prevents the smart device from waking whenever a person is detected and reduces false wake-ups.
Step 104: wake up the smart device when the wake-up condition is determined to be met.
In this embodiment, when the smart device meets the wake-up condition, it is woken and enters its normal working state. Compared with waking the device as soon as a face is detected, also considering the target's state parameters and waking only when they satisfy the wake-up condition greatly improves wake-up accuracy, reduces false wake-ups, and disturbs people less.
To make the above embodiment clearer, the following explains the wake-up method when the state parameters include the distance between the target and the smart device and the face angle of the target. Fig. 3 is a schematic flowchart of another wake-up method for a smart device provided in an embodiment of the present invention.
As shown in Fig. 3, the wake-up method includes:
Step 201: when the smart device is in a dormant state, acquire an environment image of the smart device's monitoring range and perform recognition on the environment image.
In this embodiment, step 201 is similar to step 101 of the above embodiment and is not repeated here.
Step 202: when a target is identified in the environment image, obtain the distance between the target and the smart device and the face angle of the target.
It should be understood that the closer the target is to the smart device, the more likely the target intends to interact with it; therefore the distance between the target and the smart device is obtained in this embodiment. For the method of obtaining this distance, see the relevant description in the above embodiment, which is not repeated here.
In practice, when people passes by robot, if people's rotary head is seen to robot, or work as face face robot
When, illustrate that the attention rate of Human-to-Machine people is higher, people exists interacts intention with robot interactive.Target can be also obtained as a result,
Facial angle.
Here, the facial angle may be the angle by which the central axes of the face deviate from the central axes of the image. The central axes of the face include a horizontal central axis and a vertical central axis, and the central axes of the image likewise include a horizontal central axis and a vertical central axis. The horizontal and vertical central axes of the face can be identified from the environment image, and the angles by which they deviate from the horizontal and vertical central axes of the image, respectively, are the facial angle.
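One way to make the axis-deviation computation concrete is sketched below. This is a hedged sketch under stated assumptions: the two-point representation of the face axis and the point names (mid-forehead, chin) are illustrative, and a real system would obtain such points from an upstream face-landmark detector:

```python
import math

def axis_deviation_deg(p_top, p_bottom):
    """Angle, in degrees, between the face's vertical central axis,
    represented here by two points along it (e.g. mid-forehead and chin
    in image (x, y) coordinates), and the image's vertical central axis."""
    dx = p_bottom[0] - p_top[0]
    dy = p_bottom[1] - p_top[1]
    # The image's vertical axis points straight down, direction (0, 1);
    # atan2(dx, dy) measures the deviation of the face axis from it.
    return abs(math.degrees(math.atan2(dx, dy)))
```

An upright face yields 0°, and a face axis tilted halfway toward horizontal yields 45°; the deviation from the horizontal central axis can be computed symmetrically.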
Step 203: if the distance between the target and the smart device is less than a preset distance threshold and the facial angle of the target is within a preset angular range, it is determined that the wake-up condition is met.
In the present embodiment, the distance threshold and the angular range can be set in advance. The distance between the target and the smart device is compared with the distance threshold, and the facial angle is compared with the upper and lower limits of the angular range.
When the distance between the target and the smart device is less than the preset distance threshold and the facial angle of the target is within the preset angular range, the target is considered to have an intention to interact with the smart device, and it is determined that the smart device meets the wake-up condition.
Suppose the distance threshold is 3 meters and the angular range is [0°, 45°]. When the person is less than 3 meters from the robot and the facial angle is within [0°, 45°], it can be determined that the wake-up condition is met.
In the present embodiment, the distance between the target and the smart device and the facial angle of the target serve as state parameters for judging whether the target intends to interact with the smart device. Only when such an interaction intention exists is the wake-up condition considered met, which avoids waking the smart device directly upon detecting a face and reduces its false wake-ups.
Step 204: when it is determined that the wake-up condition is met, the smart device is woken up.
In the present embodiment, when the smart device meets the wake-up condition, it is woken up and placed in its normal working state. For example, when a robot meets the wake-up condition, it may announce "Hello" by voice, i.e., the robot is woken up.
In the wake-up method of the smart device of this embodiment of the present invention, whether the wake-up condition is met is judged from the state parameters of the target, namely the distance between the target and the smart device and the facial angle of the target, and the smart device wakes itself autonomously when the condition is met. Compared with waking directly upon detecting a face, this makes the wake-up of the smart device more accurate and reduces its false wake-ups.
On the basis of the above embodiments, if a change in a person's lips is detected, it can also be considered that the person intends to interact with the smart device. Therefore, to further improve the accuracy of wake-up, the target's state parameters may also include a lip state parameter of the target, which may include the arc state of the lips and the degree to which the mouth is open. Fig. 4 is a flow diagram of another wake-up method of a smart device provided by an embodiment of the present invention.
As shown in Fig. 4, the wake-up method of the smart device includes:
Step 301: when the smart device is in a sleep state, an environment image of the smart device's monitoring range is acquired, and the environment image is recognized.
In the present embodiment, step 301 is similar to step 101 of the above embodiment, and is therefore not repeated here.
Step 302: when a target is identified from the environment image, the distance between the target and the smart device, the facial angle of the target, and the lip state parameter of the target are acquired.
Here, the lip state parameter indicates whether the target exhibits lip movement, and may include the arc state of the lips and the degree to which the mouth is open.
In the present embodiment, the methods of acquiring the distance between the target and the smart device and the facial angle of the target are similar to those described in the above embodiments, and are not repeated here.
When acquiring the lip state parameter, as one possible implementation, an image of the lips in their closed state can be stored in advance. The facial image of the target is cropped from the environment image, and the lip region of the facial image is compared with the pre-stored lip image to determine whether the arc of the lips has changed. When the lip arc changes, for example when the corners of the mouth turn up in a smile, the target is considered to exhibit lip movement.
As another possible implementation, the lip image of the target can be cropped from the acquired environment image to detect whether the mouth is open. When the mouth is in an open state, the target is considered to exhibit lip movement.
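The second implementation above (detecting an open mouth) can be sketched with a simple openness ratio. The landmark coordinates, the ratio definition, and the threshold value are illustrative assumptions of this sketch, not values from the patent:

```python
def mouth_open_ratio(upper_lip_y, lower_lip_y, mouth_left_x, mouth_right_x):
    """Vertical mouth opening divided by mouth width, using landmark
    coordinates assumed to come from an upstream face-landmark detector."""
    width = abs(mouth_right_x - mouth_left_x)
    if width == 0:
        return 0.0
    return abs(lower_lip_y - upper_lip_y) / width

def lip_movement_present(ratio, open_threshold=0.35):
    """The target is considered to exhibit lip movement when the mouth
    openness ratio exceeds the (assumed) threshold."""
    return ratio > open_threshold
```

Normalizing by mouth width makes the ratio roughly scale-invariant, so the same threshold can apply to faces at different distances from the camera.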
Step 303: if the distance between the target and the smart device is less than the preset distance threshold, the facial angle of the target is within the preset angular range, and the lip state parameter of the target indicates that lip movement is present, it is determined that the wake-up condition is met.
In the present embodiment, when the distance between the target and the smart device is less than the preset distance threshold, the facial angle of the target is within the preset angular range, and lip movement of the target is determined to be present, the target is considered to intend to interact with the robot, and it is determined that the wake-up condition is met.
On the basis of the state parameters of distance and facial angle, the additional condition of whether the target exhibits lip movement further improves the accuracy of wake-up and reduces false wake-ups.
Step 304: when it is determined that the wake-up condition is met, the smart device is woken up.
In the present embodiment, when the smart device meets the wake-up condition, it is woken up and placed in its normal working state. For example, when a robot meets the wake-up condition, it may announce "Hello" by voice, i.e., the robot is woken up.
In the wake-up method of the smart device of this embodiment of the present invention, whether the wake-up condition is met is judged from three conditions: the distance between the target and the smart device, the facial angle of the target, and whether the target exhibits lip movement. This further improves the accuracy of waking the smart device and avoids erroneous activation.
Since a person who stays longer within a close range of the smart device can be considered to have a stronger wish to interact with it, on the basis of the embodiment shown in Fig. 3, the dwell duration of the target within the distance-threshold range may also be used as a state parameter. Fig. 5 is a flow diagram of another wake-up method of a smart device provided by an embodiment of the present invention.
Step 401: when the smart device is in a sleep state, an environment image of the smart device's monitoring range is acquired, and the environment image is recognized.
In the present embodiment, step 401 is similar to step 101 of the above embodiment, and is therefore not repeated here.
Step 402: when a target is identified from the environment image, the distance between the target and the smart device, the facial angle of the target, and the dwell duration of the target within the distance-threshold range are acquired.
In the present embodiment, the methods of acquiring the distance between the target and the smart device and the facial angle of the target are similar to those described in the above embodiments, and are not repeated here.
When the target is far from the smart device, it may have no intention to interact with it; even when the distance is close, a target that stays only briefly may likewise have no such intention. To further improve the accuracy of wake-up, the dwell duration of the target within the distance-threshold range is therefore also acquired. Specifically, from the moment the target is first detected within the distance range, the duration for which it remains within that range is counted.
Step 403: if the distance between the target and the smart device is less than the preset distance threshold, the facial angle of the target is within the preset angular range, and the dwell duration of the target within the distance-threshold range reaches a preset duration threshold, it is determined that the wake-up condition is met.
In the present embodiment, the distance between the target and the smart device is compared with the distance threshold, the facial angle is compared with the upper and lower limits of the angular range, and the dwell duration is compared with the preset duration threshold.
When the distance between the target and the smart device is less than the preset distance threshold, the facial angle of the target is within the preset angular range, and the dwell duration of the target within the distance-threshold range reaches the duration threshold, the target is considered to intend to interact with the robot, and it is determined that the wake-up condition is met.
Suppose the distance threshold is 3 meters, the angular range is [0°, 45°], and the duration threshold is 3 seconds. When the person is less than 3 meters from the robot, the facial angle is within [0°, 45°], and the person's dwell duration within the 3-meter range exceeds 3 seconds, it can be determined that the wake-up condition is met.
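The dwell-duration counting in steps 402 and 403 can be sketched as a small timer that starts when the target is first detected within the distance range and resets when the target leaves it. The class name, method names, and default threshold are illustrative:

```python
class DwellTimer:
    """Tracks how long a target has stayed within the distance-threshold
    range, counting from the first frame it is detected in range."""
    def __init__(self, duration_threshold_s=3.0):
        self.duration_threshold_s = duration_threshold_s
        self.entered_at = None

    def update(self, in_range, now_s):
        """Returns True once the dwell duration reaches the threshold."""
        if not in_range:
            self.entered_at = None       # target left the range: reset
            return False
        if self.entered_at is None:
            self.entered_at = now_s      # first frame detected in range
        return (now_s - self.entered_at) >= self.duration_threshold_s
```

In a running system, `update` would be called once per recognized frame with a monotonic timestamp (e.g. `time.monotonic()`).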
On the basis of the state parameters of distance and facial angle, the additional condition of whether the dwell duration of the target within the distance-threshold range reaches the preset duration threshold further improves the accuracy of wake-up and reduces false wake-ups.
Step 404: when it is determined that the wake-up condition is met, the smart device is woken up.
In the present embodiment, when the smart device meets the wake-up condition, it is woken up and placed in its normal working state. For example, when a robot meets the wake-up condition, it may announce "Hello" by voice, i.e., the robot is woken up.
In the wake-up method of the smart device of this embodiment of the present invention, whether the wake-up condition is met is judged from three conditions: the distance between the target and the smart device, the facial angle of the target, and whether the dwell duration of the target within the distance-threshold range reaches the duration threshold. This further improves the accuracy of waking the smart device and avoids erroneous activation.
Taking a robot as an example, in real life a person looking at the robot may not yet be within its monitoring range, and may actively greet it with "Hello" or "Excuse me" to start an exchange with it.
Based on this, an embodiment of the present invention further proposes that, before the environment image of the smart device's monitoring range is acquired, voice information is monitored; after a preset keyword is detected, the target is further judged by its state parameters to determine whether the smart device meets the wake-up condition. Fig. 6 is a flow diagram of another wake-up method of a smart device provided by an embodiment of the present invention.
As shown in Fig. 6, before the environment image of the smart device's monitoring range is acquired, the wake-up method of the smart device may further include:
Step 501: voice information in the environment is acquired.
In the present embodiment, the smart device is equipped with a microphone array, through which voice information in the environment can be captured in real time.
Step 502: when a preset keyword is present in the voice information, the sound source direction of the voice information is determined.
Specifically, keywords can be set in advance and may include common words of greeting or inquiry, such as "Hello", "Robot", or "Excuse me".
The smart device can extract keywords from the captured voice information and compare them with the preset keywords to determine whether the voice information contains a keyword. When a preset keyword is present in the voice information, the sound source is localized based on the sound captured by each microphone in the microphone array, and the direction of the sound source is determined.
Step 503: the smart device is controlled to turn toward the sound source direction.
After the sound source direction is determined, the smart device can be controlled to turn toward it. Taking a robot as an example, after the sound source direction is determined, the gimbal and chassis are controlled to turn toward it.
After turning toward the sound source direction, the environment image is acquired and recognized. Once a target is recognized, its state parameters are acquired and used to judge whether the wake-up condition is met; when the condition is met, the smart device is woken up. Any of the three methods described in the above embodiments may be used to determine whether the smart device meets the wake-up condition.
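Steps 501 through 503 can be sketched as the following control flow. `PRESET_KEYWORDS`, `locate_source`, and `steer_to` are placeholder names assumed for illustration, standing in for the preset keyword list, the microphone-array localization, and the gimbal/chassis control:

```python
PRESET_KEYWORDS = ("hello", "robot", "excuse me")  # illustrative keyword list

def handle_voice(text, locate_source, steer_to):
    """If the captured speech contains a preset keyword, locate the sound
    source from the microphone array and steer the device toward it.
    Returns True when the device should proceed to acquire the environment
    image and run the state-parameter judgment."""
    if any(kw in text.lower() for kw in PRESET_KEYWORDS):
        direction = locate_source()   # e.g. bearing in degrees from the array
        steer_to(direction)           # turn gimbal/chassis toward the source
        return True
    return False
```

Note that hearing a keyword does not by itself wake the device; it only triggers the steering, after which the image-based wake-up condition is still checked.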
In the wake-up method of the smart device of this embodiment of the present invention, before the environment image of the smart device's monitoring range is acquired, the voice information in the environment is monitored; when a preset keyword is present in the voice information, the sound source direction is determined so that the smart device turns toward it, and whether the wake-up condition is met is then further judged from the state parameters. In the present embodiment, the smart device is not woken directly upon hearing a keyword; instead, the state parameters are used to further determine whether the wake-up condition is met, which substantially improves the accuracy of wake-up. Moreover, actively turning toward the person upon hearing a keyword makes the person's interaction with the smart device closer to an exchange between people and improves the intelligence of the smart device.
At present, whether a smart device is woken by a wake word, a wake button, or face detection, the wake-up lasts only a short time: before the user has even approached to interact with it, the wake-up is already switched off, and the user must wake the device again to continue communicating, which increases the user's cost of use.
On the basis of the above embodiments, after the smart device is woken up, the state parameters of the target can also be acquired continuously, and whether the smart device meets the wake-up condition is judged from them, using any of the judgment methods described in the above embodiments. If the wake-up condition is continuously met, the smart device can be kept awake, thereby reducing the number of wake-ups and the user's cost of use.
When a person has finished interacting with the smart device and leaves, keeping the device in the awake state wastes resources. Therefore, after the smart device is woken up, the environment image within its monitoring range can be acquired and recognized continuously. If no target is recognized from the environment image for a continuous preset duration, the person is considered to have finished interacting with the smart device, and the smart device can be controlled to enter the sleep state, saving power and improving the battery life of the smart device.
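The sleep-on-absence behavior can be sketched as a monitor that reports when no target has been recognized for a continuous preset duration. The class name and default threshold are illustrative assumptions:

```python
class AbsenceMonitor:
    """After wake-up, returns True (device should sleep) once no target
    has been recognized in the environment image for a continuous
    preset duration."""
    def __init__(self, absence_threshold_s=10.0):
        self.absence_threshold_s = absence_threshold_s
        self.last_seen_s = None

    def update(self, target_visible, now_s):
        if target_visible or self.last_seen_s is None:
            self.last_seen_s = now_s     # target present: restart the clock
            return False
        # Target absent: sleep once the absence has lasted long enough.
        return (now_s - self.last_seen_s) >= self.absence_threshold_s
```

As with the dwell timer, `update` would be driven once per recognition frame with a monotonic timestamp.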
In addition, after the smart device is woken up, it can receive a task instruction and execute it upon receipt. For example, after being woken, a reception robot receives a reception task instruction and then receives the guest and guides them to a designated place.
Further, if the smart device receives a wake instruction while it is judging from the state parameters of the target whether the wake-up condition is met, it stops the judgment process and executes the wake-up flow according to the wake instruction, thereby reducing the time needed to switch from the sleep state to the awake state.
Here, the wake instruction includes a wake-word instruction, a wake-operation instruction, and the like.
Taking a robot as an example, suppose a person presses the wake button while the robot is judging from the person's state parameters whether the wake-up condition is met. The robot then receives a wake-operation instruction, stops the judgment process, and wakes according to the wake-button operation instruction.
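The interruptible judgment process can be sketched with an event flag that short-circuits the state-parameter loop when an explicit wake instruction arrives. The function name, return values, and event-based signalling are assumptions of this sketch:

```python
import threading

def judge_or_interrupt(state_stream, meets_condition, wake_event):
    """Runs the state-parameter judgment loop, but aborts immediately
    and wakes the device if an explicit wake instruction (wake word or
    wake button, signalled via `wake_event`) arrives mid-judgment."""
    for state in state_stream:
        if wake_event.is_set():
            return "woken_by_instruction"   # skip the remaining judgment
        if meets_condition(state):
            return "woken_by_condition"
    return "still_asleep"
```

A wake-button handler running in another thread would call `wake_event.set()`, which the judgment loop observes on its next iteration.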
In the wake-up method of the smart device of this embodiment of the present invention, when the smart device is in a sleep state, the environment image of its monitoring range is acquired and recognized; when a target is identified from the environment image, the state parameters of the target are acquired; whether the wake-up condition is met is judged from the state parameters; and when it is determined that the wake-up condition is met, the smart device is woken up. In the present embodiment, after the target is recognized, whether the wake-up condition is met is judged from the state parameters of the target, and the smart device wakes itself autonomously when the condition is met. Compared with waking directly upon detecting a person, the state parameters of the target are also taken into account, which makes the wake-up of the smart device more accurate and reduces false wake-ups.
To implement the above embodiments, an embodiment of the present invention proposes a wake-up apparatus for a smart device. Fig. 7 is a structural diagram of a wake-up apparatus for a smart device provided by an embodiment of the present invention.
As shown in Fig. 7, the wake-up apparatus of the smart device includes: a first acquisition module 610, a second acquisition module 620, a judgment module 630, and a wake-up module 640.
The first acquisition module 610 is configured to, when the smart device is in a sleep state, acquire an environment image of the smart device's monitoring range and recognize the environment image.
The second acquisition module 620 is configured to, when a target is identified from the environment image, acquire the state parameters of the target.
The judgment module 630 is configured to judge, from the state parameters, whether the wake-up condition is met.
The wake-up module 640 is configured to wake the smart device when it is determined that the wake-up condition is met.
In one possible implementation of the present embodiment, the state parameters of the target include the distance between the target and the smart device and the facial angle of the target, and the judgment module 630 is further configured to:
determine that the wake-up condition is met if the distance between the target and the smart device is less than a preset distance threshold and the facial angle of the target is within a preset angular range.
In one possible implementation of the present embodiment, the state parameters of the target further include a lip state parameter of the target, and the apparatus further includes:
a first determining module, configured to determine, before it is determined that the wake-up condition is met, that the lip state parameter of the target indicates that the target exhibits lip movement.
In one possible implementation of the present embodiment, the state parameters of the target further include the dwell duration of the target within the distance-threshold range, and the apparatus further includes:
a second determining module, configured to determine, before it is determined that the wake-up condition is met, that the dwell duration of the target within the distance-threshold range reaches a preset duration threshold.
In one possible implementation of the present embodiment, the apparatus further includes:
a third acquisition module, configured to acquire voice information in the environment before the environment image of the smart device's monitoring range is acquired;
a third determining module, configured to determine the sound source direction of the voice information when a preset keyword is present in the voice information; and
a first control module, configured to control the smart device to turn toward the sound source direction.
In one possible implementation of the present embodiment, the second acquisition module 620 is further configured to continuously acquire the state parameters of the target after the smart device is woken up;
the judgment module 630 is further configured to judge whether the wake-up condition is continuously met; and
the wake-up module 640 is further configured to keep the smart device awake when the wake-up condition is continuously met.
In one possible implementation of the present embodiment, the apparatus further includes:
the first acquisition module 610, further configured to continuously acquire the environment image of the smart device's monitoring range after the smart device is woken up, and to recognize the environment image;
a second control module, configured to control the smart device to enter the sleep state when no target is identified from the environment image for a continuous preset duration; or
an execution module, configured to receive a task instruction and, upon receiving it, execute the task instruction.
In one possible implementation of the present embodiment, the wake-up module 640 is further configured to:
during the process of judging from the state parameters whether the wake-up condition is met, stop the judgment process when a wake instruction is received, and execute the wake-up flow according to the wake instruction.
It should be noted that the foregoing explanation of the wake-up method embodiments of the smart device also applies to the wake-up apparatus of the smart device of the present embodiment, and is therefore not repeated here.
In the wake-up apparatus of the smart device of this embodiment of the present invention, when the smart device is in a sleep state, the environment image of its monitoring range is acquired and recognized; when a target is identified from the environment image, the state parameters of the target are acquired; whether the wake-up condition is met is judged from the state parameters; and when it is determined that the wake-up condition is met, the smart device is woken up. In the present embodiment, after the target is recognized, whether the wake-up condition is met is judged from the state parameters of the target, and the smart device wakes itself autonomously when the condition is met. Compared with waking directly upon detecting a person, the state parameters of the target are also taken into account, which makes the wake-up of the smart device more accurate and reduces false wake-ups.
To implement the above embodiments, an embodiment of the present invention also proposes a smart device. Fig. 8 is a structural diagram of one embodiment of the smart device of the present invention.
As shown in Fig. 8, the smart device includes: a housing 710, a processor 720, a memory 730, a circuit board 740, and a power supply circuit 750. The circuit board 740 is disposed inside the space enclosed by the housing 710, and the processor 720 and the memory 730 are disposed on the circuit board 740. The power supply circuit 750 supplies power to each circuit or component of the smart device. The memory 730 stores executable program code, and the processor 720 runs a program corresponding to the executable program code by reading the executable program code stored in the memory 730, so as to execute the wake-up method of the smart device described in the above embodiments.
To implement the above embodiments, an embodiment of the present invention also proposes a non-transitory computer-readable storage medium having a computer program stored thereon; when the program is executed by a processor, the wake-up method of the smart device described in the above embodiments is implemented.
To implement the above embodiments, an embodiment of the present invention also proposes a computer program product; when the instructions in the computer program product are executed by a processor, the wake-up method of the smart device described in the above embodiments is implemented.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", "some examples", and the like means that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic expressions of these terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, provided there is no mutual contradiction, those skilled in the art may combine the different embodiments or examples described in this specification, and the features of those different embodiments or examples.
In addition, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features referred to. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "plurality" means at least two, for example two or three, unless otherwise specifically defined.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code comprising one or more executable instructions for implementing the steps of a custom logic function or process; and the scope of the preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
The logic and/or steps represented in a flowchart or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transport a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection having one or more wires (an electronic device), a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber-optic device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner where necessary, and then stored in a computer memory.
It should be understood that each part of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques, which are well known in the art, may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
Those of ordinary skill in the art can understand that all or part of the steps carried by the method of the above embodiments may be completed by instructing relevant hardware through a program; the program may be stored in a computer-readable storage medium and, when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in each embodiment of the present invention may be integrated in one processing module, or each unit may exist physically alone, or two or more units may be integrated in one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. Although embodiments of the present invention have been shown and described above, it can be understood that the above embodiments are exemplary and shall not be understood as limiting the present invention; those of ordinary skill in the art may change, modify, replace, and vary the above embodiments within the scope of the present invention.
Claims (10)
1. A wake-up method of a smart device, characterized by comprising the following steps:
when the smart device is in a sleep state, acquiring an environment image of the smart device's monitoring range, and recognizing the environment image;
when a target is identified from the environment image, acquiring a state parameter of the target;
judging, according to the state parameter, whether a wake-up condition is met; and
when it is determined that the wake-up condition is met, waking the smart device.
2. The method according to claim 1, characterized in that the state parameter of the target comprises: a distance between the target and the smart device and a facial angle of the target;
judging, according to the state parameter, whether the wake-up condition is met comprises:
determining that the wake-up condition is met if the distance between the target and the smart device is less than a preset distance threshold and the facial angle of the target is within a preset angular range.
3. The method according to claim 2, characterized in that the state parameter of the target further comprises: a lip state parameter of the target;
and before determining that the wake-up condition is met, the method further comprises:
determining, according to the lip state parameter of the target, that the target exhibits lip movement.
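One hypothetical way to realize the lip-movement check of claim 3 (the patent does not specify a metric) is to compare a mouth-openness measurement across consecutive frames; the metric and threshold here are assumptions:

```python
def lips_moving(lip_openness_series, motion_threshold=0.02):
    """Claim-3 refinement sketch: report lip movement when mouth openness
    varies by more than a threshold between consecutive frames.
    Both the openness metric and the 0.02 threshold are illustrative."""
    pairs = zip(lip_openness_series, lip_openness_series[1:])
    return any(abs(b - a) > motion_threshold for a, b in pairs)
```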
4. The method according to claim 2, characterized in that the state parameter of the target further comprises: a stay duration of the target within the distance threshold range;
and before determining that the wake-up condition is met, the method further comprises:
determining that the stay duration of the target within the distance threshold range reaches a preset duration threshold.
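The dwell-time condition of claim 4 can be sketched as a small state machine; the 2-second duration threshold and caller-supplied timestamps are assumptions for illustration:

```python
class StayTimer:
    """Claim-4 refinement sketch: track how long a target stays inside the
    distance-threshold range. The 2.0 s threshold is illustrative only."""

    def __init__(self, duration_threshold_s=2.0):
        self.duration_threshold_s = duration_threshold_s
        self.entered_at = None

    def update(self, in_range, now_s):
        """Return True once the dwell time reaches the threshold."""
        if not in_range:
            self.entered_at = None           # target left: reset the dwell timer
            return False
        if self.entered_at is None:
            self.entered_at = now_s          # target just entered the range
        return now_s - self.entered_at >= self.duration_threshold_s
```

Resetting on exit ensures a passer-by who briefly crosses the range never accumulates dwell time.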
5. The method according to claim 1, characterized in that before acquiring the environment image of the monitoring range of the smart device, the method further comprises:
acquiring voice information in the environment;
when a preset keyword is present in the voice information, determining the sound source direction of the voice information; and
controlling the smart device to turn toward the sound source direction.
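The pre-step of claim 5 (keyword spotting, then steering toward the sound source) can be sketched as follows; the keyword list and the `turn_device` callback are hypothetical names introduced for this illustration:

```python
def maybe_turn_to_speaker(voice_text, source_direction_deg, turn_device,
                          keywords=("hello robot",)):
    """Claim-5 pre-step sketch: if a preset keyword appears in the recognized
    voice information, steer the device toward the sound source direction.
    The keyword and the turn callback are illustrative assumptions."""
    if any(kw in voice_text.lower() for kw in keywords):
        turn_device(source_direction_deg)   # turn toward the sound source
        return True
    return False
```

In practice `source_direction_deg` would come from microphone-array localization and `turn_device` from the motor controller.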
6. The method according to claim 1, characterized in that after waking up the smart device, the method further comprises:
continuously acquiring the state parameter of the target, and judging whether the wake-up condition remains met; and
if the wake-up condition remains met, keeping the smart device awake.
7. The method according to claim 1, characterized in that after waking up the smart device, the method further comprises:
continuously acquiring the environment image of the monitoring range of the smart device, and recognizing the environment image;
when the target is not identified from the environment image for a continuous preset duration, controlling the smart device to enter the dormant state; or
receiving a task instruction, and executing the task instruction when the task instruction is received.
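The return-to-sleep behavior of claim 7 amounts to an absence timeout. A minimal sketch, assuming a caller-supplied clock and a 10-second timeout (both illustrative):

```python
class AbsenceMonitor:
    """Claim-7 sketch: return the device to the dormant state when the target
    has not been identified for a continuous preset duration.
    The 10.0 s timeout is an illustrative assumption."""

    def __init__(self, timeout_s=10.0):
        self.timeout_s = timeout_s
        self.last_seen_s = 0.0

    def update(self, target_visible, now_s, device):
        """Per-frame update; returns the device's dormant flag."""
        if target_visible:
            self.last_seen_s = now_s                 # absence clock restarts
        elif now_s - self.last_seen_s >= self.timeout_s:
            device["dormant"] = True                 # continuous absence: sleep
        return device["dormant"]
```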
8. A wake-up apparatus for a smart device, characterized by comprising:
a first acquisition module, configured to acquire an environment image of the monitoring range of the smart device when the smart device is in a dormant state, and to recognize the environment image;
a second acquisition module, configured to acquire a state parameter of a target when the target is identified from the environment image;
a judgment module, configured to judge, according to the state parameter, whether a wake-up condition is met; and
a wake-up module, configured to wake up the smart device when it is determined that the wake-up condition is met.
9. A smart device, characterized by comprising: a housing, a processor, a memory, a circuit board, and a power supply circuit, wherein the circuit board is arranged inside a space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power supply circuit is configured to supply power to each circuit or component of the smart device; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to implement the wake-up method for a smart device according to any one of claims 1-7.
10. A non-transitory computer-readable storage medium having a computer program stored thereon, characterized in that the program, when executed by a processor, implements the wake-up method for a smart device according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810236921.6A CN108733420B (en) | 2018-03-21 | 2018-03-21 | Awakening method and device of intelligent equipment, intelligent equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108733420A true CN108733420A (en) | 2018-11-02 |
CN108733420B CN108733420B (en) | 2022-04-29 |
Family
ID=63941039
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810236921.6A Active CN108733420B (en) | 2018-03-21 | 2018-03-21 | Awakening method and device of intelligent equipment, intelligent equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108733420B (en) |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103312869A (en) * | 2012-03-09 | 2013-09-18 | 盛乐信息技术(上海)有限公司 | Method and system for mobile terminal screen to be unlocked and standby state conversion |
CN103324276A (en) * | 2012-03-22 | 2013-09-25 | 华为终端有限公司 | Method and device for controlling standby operation |
CN104461717A (en) * | 2013-09-18 | 2015-03-25 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN104934033A (en) * | 2015-04-21 | 2015-09-23 | 深圳市锐曼智能装备有限公司 | Control method of robot sound source positioning and awakening identification and control system of robot sound source positioning and awakening identification |
CN105096956A (en) * | 2015-08-05 | 2015-11-25 | 百度在线网络技术(北京)有限公司 | Artificial-intelligence-based intelligent robot multi-sound-source judgment method and device |
CN205588317U (en) * | 2016-04-11 | 2016-09-21 | 旗瀚科技股份有限公司 | Sound localization can be carried out and robot head pivoted device is linked |
CN106292732A (en) * | 2015-06-10 | 2017-01-04 | 上海元趣信息技术有限公司 | Intelligent robot rotating method based on sound localization and Face datection |
CN106339219A (en) * | 2016-08-19 | 2017-01-18 | 北京光年无限科技有限公司 | Robot service awakening method and device |
US20170060599A1 (en) * | 2015-08-28 | 2017-03-02 | Xiaomi Inc. | Method and apparatus for awakening electronic device |
CN106897713A (en) * | 2017-03-13 | 2017-06-27 | 宇龙计算机通信科技(深圳)有限公司 | A kind of method and mobile terminal for waking up mobile terminal screen |
CN107102540A (en) * | 2016-02-23 | 2017-08-29 | 芋头科技(杭州)有限公司 | A kind of method and intelligent robot for waking up intelligent robot |
CN107146614A (en) * | 2017-04-10 | 2017-09-08 | 北京猎户星空科技有限公司 | A kind of audio signal processing method, device and electronic equipment |
CN107297745A (en) * | 2017-06-28 | 2017-10-27 | 上海木爷机器人技术有限公司 | voice interactive method, voice interaction device and robot |
CN107679506A (en) * | 2017-10-12 | 2018-02-09 | Tcl通力电子(惠州)有限公司 | Awakening method, intelligent artifact and the computer-readable recording medium of intelligent artifact |
Non-Patent Citations (1)
Title |
---|
Zhong Qiubo et al.: "Robot Program Design: Competitive and Entertainment Motion Design for Humanoid Robots", 31 August 2013, Xidian University Press *
Cited By (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109189211A (en) * | 2018-08-09 | 2019-01-11 | 北京云迹科技有限公司 | Intelligent awakening method and device for robot |
WO2020098421A1 (en) * | 2018-11-13 | 2020-05-22 | 北京猎户星空科技有限公司 | Smart device control method and apparatus, and storage medium |
CN109543578A (en) * | 2018-11-13 | 2019-03-29 | 北京猎户星空科技有限公司 | Smart machine control method, device and storage medium |
US11247342B2 (en) | 2018-11-13 | 2022-02-15 | Beijing Orion Star Technology Co., Ltd. | Smart device control method and apparatus, and storage medium |
CN111230891B (en) * | 2018-11-29 | 2021-07-27 | 深圳市优必选科技有限公司 | Robot and voice interaction system thereof |
CN111230891A (en) * | 2018-11-29 | 2020-06-05 | 深圳市优必选科技有限公司 | Robot and voice interaction system thereof |
CN109683708A (en) * | 2018-12-14 | 2019-04-26 | 北京猎户星空科技有限公司 | Smart machine control method, device, electronic equipment and storage medium |
CN109683610A (en) * | 2018-12-14 | 2019-04-26 | 北京猎户星空科技有限公司 | Smart machine control method, device and storage medium |
CN109640224A (en) * | 2018-12-26 | 2019-04-16 | 北京猎户星空科技有限公司 | A kind of sound pick-up method and device |
CN109885164A (en) * | 2019-02-22 | 2019-06-14 | 北京猎户星空科技有限公司 | Smart machine control method, device and storage medium |
CN109960537A (en) * | 2019-03-29 | 2019-07-02 | 北京金山安全软件有限公司 | Interaction method and device and electronic equipment |
CN110187921A (en) * | 2019-04-23 | 2019-08-30 | 阿里巴巴集团控股有限公司 | The method and device of wake-up device |
CN110188179A (en) * | 2019-05-30 | 2019-08-30 | 浙江远传信息技术股份有限公司 | Speech-oriented identifies exchange method, device, equipment and medium |
CN110188179B (en) * | 2019-05-30 | 2020-06-19 | 浙江远传信息技术股份有限公司 | Voice directional recognition interaction method, device, equipment and medium |
CN110187766A (en) * | 2019-05-31 | 2019-08-30 | 北京猎户星空科技有限公司 | A kind of control method of smart machine, device, equipment and medium |
CN110297665B (en) * | 2019-06-10 | 2022-07-19 | 青岛小鸟看看科技有限公司 | Switching method of equipment working modes and camera |
CN110297665A (en) * | 2019-06-10 | 2019-10-01 | 青岛小鸟看看科技有限公司 | A kind of switching method and camera of equipment operating mode |
CN110196900A (en) * | 2019-06-13 | 2019-09-03 | 三星电子(中国)研发中心 | Exchange method and device for terminal |
CN110253595B (en) * | 2019-06-21 | 2022-01-21 | 北京猎户星空科技有限公司 | Intelligent equipment control method and device |
CN110253595A (en) * | 2019-06-21 | 2019-09-20 | 北京猎户星空科技有限公司 | A kind of smart machine control method and device |
CN110730115B (en) * | 2019-09-11 | 2021-11-09 | 北京小米移动软件有限公司 | Voice control method and device, terminal and storage medium |
US11335345B2 (en) | 2019-09-11 | 2022-05-17 | Beijing Xiaomi Mobile Software Co., Ltd. | Method for voice control, terminal, and non-transitory computer-readable storage medium |
CN110730115A (en) * | 2019-09-11 | 2020-01-24 | 北京小米移动软件有限公司 | Voice control method and device, terminal and storage medium |
CN112949327A (en) * | 2019-12-11 | 2021-06-11 | 苏涛 | Projection font automatic adjustment platform and method |
CN111314187A (en) * | 2020-01-21 | 2020-06-19 | 星络智能科技有限公司 | Storage medium, smart home device and awakening method thereof |
CN111332252A (en) * | 2020-02-19 | 2020-06-26 | 上海商汤临港智能科技有限公司 | Vehicle door unlocking method, device, system, electronic equipment and storage medium |
CN111332252B (en) * | 2020-02-19 | 2022-11-29 | 上海商汤临港智能科技有限公司 | Vehicle door unlocking method, device, system, electronic equipment and storage medium |
CN111399627A (en) * | 2020-03-09 | 2020-07-10 | 宁波视睿迪光电有限公司 | Energy-saving method and system for 3D display device |
CN111399627B (en) * | 2020-03-09 | 2021-09-28 | 宁波视睿迪光电有限公司 | Energy-saving method and system for 3D display device |
CN111432128A (en) * | 2020-04-13 | 2020-07-17 | 深圳市华冠科讯电子有限公司 | Video monitoring system with high detection precision |
CN112739507A (en) * | 2020-04-22 | 2021-04-30 | 南京阿凡达机器人科技有限公司 | Interactive communication implementation method, equipment and storage medium |
WO2021212388A1 (en) * | 2020-04-22 | 2021-10-28 | 南京阿凡达机器人科技有限公司 | Interactive communication implementation method and device, and storage medium |
CN113556649B (en) * | 2020-04-23 | 2023-08-04 | 百度在线网络技术(北京)有限公司 | Broadcasting control method and device of intelligent sound box |
CN113556649A (en) * | 2020-04-23 | 2021-10-26 | 百度在线网络技术(北京)有限公司 | Broadcasting control method and device of intelligent sound box |
CN111583937A (en) * | 2020-04-30 | 2020-08-25 | 珠海格力电器股份有限公司 | Voice control awakening method, storage medium, processor, voice equipment and intelligent household appliance |
CN113626778A (en) * | 2020-05-08 | 2021-11-09 | 百度在线网络技术(北京)有限公司 | Method, apparatus, electronic device, and computer storage medium for waking up device |
CN113626778B (en) * | 2020-05-08 | 2024-04-02 | 百度在线网络技术(北京)有限公司 | Method, apparatus, electronic device and computer storage medium for waking up device |
CN112230569A (en) * | 2020-09-24 | 2021-01-15 | 广州爱孕记信息科技有限公司 | Control method and device of detection equipment |
CN112235882A (en) * | 2020-09-25 | 2021-01-15 | 深圳星标科技股份有限公司 | Method and device for removing ice and snow of visual navigation aid equipment, computer equipment and storage medium |
CN112235882B (en) * | 2020-09-25 | 2022-09-06 | 深圳星标科技股份有限公司 | Method and device for removing ice and snow of visual navigation aid equipment, computer equipment and storage medium |
CN112130918A (en) * | 2020-09-25 | 2020-12-25 | 深圳市欧瑞博科技股份有限公司 | Intelligent device awakening method, device and system and intelligent device |
CN112256534A (en) * | 2020-10-13 | 2021-01-22 | 武汉朋鼎科技有限公司 | Method for transmitting information in building system |
CN113759732A (en) * | 2021-04-30 | 2021-12-07 | 北京字节跳动网络技术有限公司 | Intelligent household appliance control method and device, electronic equipment and storage medium |
CN113286259A (en) * | 2021-05-26 | 2021-08-20 | 深圳康佳电子科技有限公司 | Terminal control method and device based on UWB positioning and positioning equipment |
CN113672288A (en) * | 2021-07-02 | 2021-11-19 | 浪潮金融信息技术有限公司 | Method, system and medium for terminal equipment dormancy and awakening |
CN114253611A (en) * | 2021-11-25 | 2022-03-29 | 上海齐感电子信息科技有限公司 | Control method and control system |
CN114268337A (en) * | 2021-11-30 | 2022-04-01 | 歌尔科技有限公司 | Intelligent security control method, intelligent security equipment and controller |
CN114463903A (en) * | 2021-12-17 | 2022-05-10 | 广州新科佳都科技有限公司 | Ticket machine interaction method and device, ticket selling terminal and storage medium |
WO2023160000A1 (en) * | 2022-02-25 | 2023-08-31 | 上海商汤智能科技有限公司 | Sleep wakeup method and related product |
CN115200155A (en) * | 2022-07-26 | 2022-10-18 | 珠海格力电器股份有限公司 | Air conditioner control method and device, air conditioner and storage medium |
CN115200155B (en) * | 2022-07-26 | 2023-11-14 | 珠海格力电器股份有限公司 | Air conditioner control method and device, air conditioner and storage medium |
CN117389186A (en) * | 2023-10-16 | 2024-01-12 | 广州星际悦动股份有限公司 | Method, device, equipment and storage medium for controlling oral care equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108733420A (en) | Awakening method, device, smart machine and the storage medium of smart machine | |
CN108733419A (en) | Lasting awakening method, device, smart machine and the storage medium of smart machine | |
CN108733208A (en) | The I-goal of smart machine determines method and apparatus | |
US10860882B2 (en) | Apparatus and methods for tracking salient features | |
TWI705382B (en) | Intelligent device focus following method, device, intelligent equipment and storage media | |
CN105409212B (en) | The electronic equipment with depth sense is caught with multi-view image | |
CN105408938B (en) | System for the processing of 2D/3D space characteristics | |
TWI684136B (en) | Robot, control system and method for operating the robot | |
CN108733417A (en) | The work pattern selection method and device of smart machine | |
CN105074691A (en) | Context aware localization, mapping, and tracking | |
CN113116224B (en) | Robot and control method thereof | |
CN107480129A (en) | A kind of article position recognition methods and the system of view-based access control model identification and speech recognition | |
CN106991378A (en) | Facial orientation detection method, detection means and electronic installation based on depth | |
CN110383289A (en) | Device, method and the electronic equipment of recognition of face | |
CN107016348A (en) | With reference to the method for detecting human face of depth information, detection means and electronic installation | |
CN107111764A (en) | By the event of depth triggering of the object in the visual field of imaging device | |
US20180005024A1 (en) | Monitoring | |
Cordea et al. | Real-time 2 (1/2)-D head pose recovery for model-based video-coding | |
CN109697444A (en) | Object identifying method and device, equipment, storage medium based on depth image | |
CN109344685A (en) | A kind of wisdom pallet and its intelligent positioning method for tracing | |
Muñoz-Salinas et al. | People detection and tracking with multiple stereo cameras using particle filters | |
CN110187921A (en) | The method and device of wake-up device | |
Munoz-Salinas et al. | A fuzzy system for visual detection of interest in human-robot interaction | |
CN108833766A (en) | Control method, device, smart machine and the storage medium of smart machine | |
Liu | Comprehensive Analysis of Mobile Robot Target Tracking Technology Based on Computer Vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||