CN107704914A - Intelligent interactive robot - Google Patents

Intelligent interactive robot

Info

Publication number
CN107704914A
CN107704914A
Authority
CN
China
Prior art keywords
module
noise
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711081271.4A
Other languages
Chinese (zh)
Inventor
龚土婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201711081271.4A
Publication of CN107704914A
Status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N 3/008 Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Robotics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an intelligent interactive robot comprising an identity recognition module, an identification device, an interaction module, an intelligent control module, a data storage module, and a power module. The identity recognition module is used for authorized user login; the identification device acquires images of the user who has logged in with authorization and recognizes user behavior; the interaction module generates interaction content with the user according to the recognition result and sends the generated interaction content to the data storage module through the intelligent control module; the intelligent control module controls the interaction module to carry out the interaction content; the data storage module stores the interaction content; and the power module is connected with the identity recognition module, identification device, interaction module, intelligent control module, and data storage module to supply power to them. Beneficial effect: the user can interact with the intelligent robot.

Description

Intelligent interactive robot
Technical field
The present invention relates to the field of intelligent robot technology, and in particular to an intelligent interactive robot.
Background technology
With the continuous development of science and technology, robot technology has advanced continuously, and intelligent robots have gradually entered thousands of households. Many intelligent robots on the market bring convenience and enjoyment to people's lives. Among them, the interactive robot, as one kind of intelligent robot, can interact with people and has added much enjoyment to their lives.
Realizing accurate human-machine interaction depends on the robot accurately recognizing user images. However, real images almost always contain noise, and the influence of noise on an image has two main aspects. On the one hand, noise degrades the visual effect: in an image polluted by noise, the visual quality often becomes very poor, and if the noise intensity is high, some details in the image become difficult to recognize. On the other hand, noise lowers the quality and precision with which the information layer and knowledge layer of the image can be processed from the image data layer onward. For some image processing procedures, noise often produces a certain local ambiguity. For example, under noise interference the performance of many edge detection algorithms declines, with a large number of false detections and missed detections, making subsequent target extraction and recognition difficult. Efficiently filtering out noise is therefore the basis for making effective use of an image.
Summary of the invention
In view of the above problems, the present invention is intended to provide an intelligent interactive robot.
The object of the present invention is achieved by the following technical scheme:
An intelligent interactive robot is provided, including an identity recognition module, an identification device, an interaction module, an intelligent control module, a data storage module, and a power module. The identity recognition module is used for authorized user login; the identification device is used to acquire images of the user who has logged in with authorization and to recognize user behavior; the interaction module is used to generate interaction content with the user according to the recognition result and to send the generated interaction content to the data storage module through the intelligent control module; the intelligent control module is used to control the interaction module to carry out the interaction content; the data storage module is used to store the interaction content; and the power module is connected with the identity recognition module, the identification device, the interaction module, the intelligent control module, and the data storage module to supply power to them.
The beneficial effect of the present invention is that the user can interact with the intelligent robot.
Brief description of the drawings
The accompanying drawing is used to further describe the invention, but the embodiment shown in the drawing does not constitute any limitation of the present invention. For those of ordinary skill in the art, other drawings can be obtained from the following drawing without creative work.
Fig. 1 is a structural schematic diagram of the present invention.
Reference numerals:
identity recognition module 1, identification device 2, interaction module 3, intelligent control module 4, data storage module 5, power module 6.
Detailed description of the embodiments
The invention will be further described with the following embodiments.
Referring to Fig. 1, the intelligent interactive robot of this embodiment includes an identity recognition module 1, an identification device 2, an interaction module 3, an intelligent control module 4, a data storage module 5, and a power module 6. The identity recognition module 1 is used for authorized user login; the identification device 2 acquires images of the user who has logged in with authorization and recognizes user behavior; the interaction module 3 generates interaction content with the user according to the recognition result and sends the generated interaction content to the data storage module 5 through the intelligent control module 4; the intelligent control module 4 controls the interaction module 3 to carry out the interaction content; the data storage module 5 stores the interaction content; and the power module 6 is connected with the identity recognition module 1, the identification device 2, the interaction module 3, the intelligent control module 4, and the data storage module 5 to supply power to them.
This embodiment enables the user to interact with the intelligent robot.
Preferably, the identification device 2 includes an image acquisition module, a model building module, a filtering module, an effect evaluation module, and a recognition module. The image acquisition module is used to acquire images of the user who has logged in with authorization; the model building module is used to establish an image noise model; the filtering module is used to filter out image noise; the effect evaluation module is used to evaluate the noise filtering effect; and the recognition module is used to recognize user behavior from the noise-filtered image.
This preferred embodiment achieves accurate filtering of the image and evaluation of the filtering effect, which guarantees the recognition performance and helps improve the level of interaction.
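For illustration only, the pipeline of the identification device 2 described above can be sketched in Python as follows; the class and callable names are hypothetical stand-ins for the sub-modules named in this preferred embodiment, not part of the patent.

    class IdentificationDevice:
        """Hypothetical sketch of identification device 2: image acquisition ->
        noise model building -> filtering -> effect evaluation -> recognition."""

        def __init__(self, acquire, build_noise_model, filter_noise, evaluate, recognize):
            # Each argument is a callable standing in for one sub-module.
            self.acquire = acquire
            self.build_noise_model = build_noise_model
            self.filter_noise = filter_noise
            self.evaluate = evaluate
            self.recognize = recognize

        def identify(self):
            image = self.acquire()                   # image acquisition module
            model = self.build_noise_model(image)    # model building module
            clean = self.filter_noise(image, model)  # filtering module
            score = self.evaluate(image, clean)      # effect evaluation module
            return self.recognize(clean), score      # recognition module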
Preferably, the model building module is used to establish the image noise model, specifically:

The image noise model is expressed as $I(i,j) = I_0(i,j) + N(i,j)$, where $I(i,j)$, $I_0(i,j)$ and $N(i,j)$ respectively denote the observed image, the noise-free original image, and white Gaussian noise with mean 0 and variance $\sigma^2$.

The gradient fields of the observed image are computed:

$$\nabla I^1(i,j) = I(i,j-1) - I(i,j)$$
$$\nabla I^2(i,j) = I(i,j+1) - I(i,j)$$
$$\nabla I^3(i,j) = I(i+1,j) - I(i,j)$$
$$\nabla I^4(i,j) = I(i-1,j) - I(i,j)$$

where $\nabla I^u(i,j)$, $u = 1,2,3,4$, are the differences between the pixel at position $(i,j)$ and its four neighbors (up, down, left, right).

The gradient factor of the image $I(i,j)$ is then computed from the gradient fields:

$$T_{\nabla I^u(i,j)} = \frac{3\,\nabla I^u(i,j) + \sqrt{[\nabla I^u(i,j)]^2 + 1}}{4}$$

where $T_{\nabla I^u(i,j)}$ denotes the gradient factor of image $I(i,j)$, $u = 1,2,3,4$.
This preferred embodiment lays the foundation for the subsequent noise filtering: by establishing the noise model of the image and computing the gradient factor of the image, noise filtering can conveniently be carried out in the gradient field.
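A minimal numerical sketch of the gradient fields and gradient factor defined above, assuming a grayscale image stored as a 2-D NumPy array; boundary pixels are handled by edge replication, which is an assumption the patent does not specify.

    import numpy as np

    def gradient_fields(img):
        """Four directional gradient fields of the observed image I(i, j)."""
        padded = np.pad(img.astype(float), 1, mode="edge")  # boundary handling: assumed
        center = padded[1:-1, 1:-1]
        return np.stack([
            padded[1:-1, :-2] - center,  # grad I^1: I(i, j-1) - I(i, j)
            padded[1:-1, 2:] - center,   # grad I^2: I(i, j+1) - I(i, j)
            padded[2:, 1:-1] - center,   # grad I^3: I(i+1, j) - I(i, j)
            padded[:-2, 1:-1] - center,  # grad I^4: I(i-1, j) - I(i, j)
        ])

    def gradient_factor(grads):
        """Gradient factor T = (3*grad + sqrt(grad^2 + 1)) / 4 for each field."""
        return (3.0 * grads + np.sqrt(grads ** 2 + 1.0)) / 4.0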
Preferably, the filtering module includes a gradient factor estimation unit and a filtering unit. The gradient factor estimation unit is used to estimate the gradient factor of the noise, and the filtering unit is used to filter the observed image.

The gradient factor estimation unit estimates the gradient factor of the noise using the following formulas:

$$\hat{T}_{\nabla N^u(i,j)} = \frac{\sigma^2_{\nabla N^u}}{\sigma^2_{\nabla N^u} + \max\left(Q^2_{\nabla I^u}(i,j) - \sigma^2_{\nabla N^u},\ 0\right)} \cdot T_{\nabla I^u(i,j)}$$

$$Q^2_{\nabla I^u}(i,j) = \frac{1}{B \times C}\left\{\sum_{i',j' \in \omega}\left[\nabla I^u(i',j')\right]^2 - \sum_{i',j' \in \omega}\nabla I^u(i',j')\right\}$$

where $\hat{T}_{\nabla N^u(i,j)}$ denotes the estimate of the gradient factor of the noise, $\sigma^2_{\nabla N^u}$ is a constant denoting the gradient-field noise variance to be filtered out in each gradient field, and $\omega$ denotes the local neighborhood window of size $B \times C$ centered at $(i,j)$.

The filtering unit filters the observed image as follows:

$$I_1(i,j) = I(i,j) - 0.25\sum_{u=1}^{4}\hat{T}_{\nabla N^u(i,j)}$$

$$I_k(i,j) = I_{k-1}(i,j) - 0.25\sum_{u=1}^{4}\hat{T}_{\nabla N^u(i,j)},\quad k \ge 2$$

where $I_1(i,j)$ denotes the image after the first filtering pass and $I_k(i,j)$ the image after the $k$-th pass; setting the number of filtering passes $k$ yields the noise-filtered image.
This preferred embodiment achieves accurate filtering of the image through the filtering module. Specifically, the gradient factor estimation unit lays the foundation for the subsequent filtering by computing the gradient factor of the noise, and the filtering unit removes the noise by filtering repeatedly, obtaining a good filtering effect.
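Continuing the sketch above, the noise gradient factor estimate and the iterative filtering might be written as follows. The window size B x C and the per-field noise variance sigma2 are free parameters; the local sums follow the Q^2 expression of claim 4 literally, and the gradients are recomputed on each pass, which the patent does not state explicitly.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def estimate_noise_factor(grads, sigma2, window=(3, 3)):
        """Estimated noise gradient factor T_hat for each of the four fields."""
        B, C = window
        T = gradient_factor(grads)  # from the sketch above
        # uniform_filter returns window means; multiply by B*C to get window sums,
        # then form Q^2 = (sum of squared gradients - sum of gradients) / (B*C).
        sum_sq = uniform_filter(grads ** 2, size=(1, B, C)) * (B * C)
        sum_g = uniform_filter(grads, size=(1, B, C)) * (B * C)
        q2 = (sum_sq - sum_g) / (B * C)
        gain = sigma2 / (sigma2 + np.maximum(q2 - sigma2, 0.0))
        return gain * T

    def filter_image(img, sigma2=25.0, k=3, window=(3, 3)):
        """Iteratively subtract 0.25 * sum over u of T_hat, k passes in total."""
        out = img.astype(float)
        for _ in range(k):
            grads = gradient_fields(out)  # recomputed each pass (assumption)
            out = out - 0.25 * estimate_noise_factor(grads, sigma2, window).sum(axis=0)
        return out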
Preferably, the effect evaluation module includes an objective evaluation unit, a subjective evaluation unit, and a comprehensive evaluation unit. The objective evaluation unit is used to obtain an objective evaluation value of the noise filtering effect, the subjective evaluation unit is used to obtain a subjective evaluation value of the noise filtering effect, and the comprehensive evaluation unit is used to comprehensively evaluate the noise filtering effect according to the objective evaluation value and the subjective evaluation value.
This preferred embodiment achieves a comprehensive evaluation of the filtering effect that combines subjective and objective measures.
Preferably, the objective evaluation unit obtains the objective evaluation value of the noise filtering effect using the following formula:

$$P_1 = e^{\frac{1}{U \times V}\sum_{i=1}^{U}\sum_{j=1}^{V}\left[I(i,j) - I'(i,j)\right]^2} + \frac{1}{U \times V}\sum_{i=1}^{U}\sum_{j=1}^{V}\left[I_0(i,j) - I'(i,j)\right]^2$$

where $P_1$ denotes the objective evaluation value, $U \times V$ is the image size, $I_0(i,j)$ denotes the noise-free original image, and $I'(i,j)$ denotes the noise-filtered image; the smaller the objective evaluation value, the better the noise filtering effect.
The subjective evaluation unit obtains the subjective evaluation value of the noise filtering effect as follows:

The score of the noise-free original image is taken as a full mark of 100. A group of observers views the noise-free original image and the noise-filtered image and gives a score to the noise-filtered image.

The subjective evaluation value is then calculated as $P_2 = \frac{1}{n}\sum_{i=1}^{n} F_i$, where $P_2$ denotes the subjective evaluation value, $n$ the number of observers, and $F_i$ the score given by the $i$-th observer to the noise-filtered image; the larger the subjective evaluation value, the better the noise filtering effect.
The comprehensive evaluation unit comprehensively evaluates the noise filtering effect according to the objective evaluation value and the subjective evaluation value by means of the comprehensive evaluation factor, calculated as:

$$P = e^{-P_1} + e^{P_2} + \left(\frac{1}{P_1} + P_2\right)^2$$

where $P$ denotes the comprehensive evaluation factor; the larger the comprehensive evaluation factor, the better the noise filtering effect.
This preferred embodiment evaluates the noise filtering effect through the effect evaluation module, which guarantees the filtering quality. Specifically, the noise filtering effect is evaluated by combining subjective and objective measures. For the objective evaluation, the noise-free original image and the noise-filtered image are compared to obtain the objective evaluation value; for the subjective evaluation, different observers score the image to obtain the subjective evaluation value. The evaluation thus combines the advantages of subjective and objective evaluation and yields a more accurate result.
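A sketch of the three evaluation quantities under the formulas above. The subjective value is taken as the mean of the observers' scores, an assumed form since the patent's own expression is not recoverable from the text; note also that the exponential of a raw mean squared error grows very quickly, so in practice pixel intensities would be normalized before computing P1.

    import numpy as np

    def objective_value(noisy, original, filtered):
        """Objective evaluation value P1 (smaller is better):
        exp(MSE(noisy, filtered)) + MSE(original, filtered)."""
        def mse(a, b):
            return float(np.mean((a.astype(float) - b.astype(float)) ** 2))
        return np.exp(mse(noisy, filtered)) + mse(original, filtered)

    def subjective_value(scores):
        """Subjective evaluation value P2 (larger is better): mean observer
        score out of 100 -- an assumed form, see the note above."""
        return float(np.mean(scores))

    def comprehensive_factor(p1, p2):
        """Comprehensive evaluation factor P (larger is better)."""
        return np.exp(-p1) + np.exp(p2) + (1.0 / p1 + p2) ** 2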
Man-machine interaction was carried out using the intelligent interactive robot of the present invention. Five users were selected for testing, namely user 1, user 2, user 3, user 4, and user 5. Interaction efficiency and user satisfaction were counted and compared with an existing interaction robot, yielding the beneficial effects shown in the following table:
         Interaction efficiency improvement   User satisfaction improvement
User 1   29%                                  27%
User 2   27%                                  26%
User 3   26%                                  26%
User 4   25%                                  24%
User 5   24%                                  22%
Finally, it should be noted that the above embodiments merely illustrate the technical solution of the present invention and do not limit its protection scope. Although the present invention has been explained with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution of the present invention can be modified or equivalently substituted without departing from the substance and scope of the technical solution of the present invention.

Claims (7)

1. An intelligent interactive robot, characterized by comprising an identity recognition module, an identification device, an interaction module, an intelligent control module, a data storage module, and a power module, wherein the identity recognition module is used for authorized user login, the identification device is used to acquire images of the user who has logged in with authorization and to recognize user behavior, the interaction module is used to generate interaction content with the user according to the recognition result and to send the generated interaction content to the data storage module through the intelligent control module, the intelligent control module is used to control the interaction module to carry out the interaction content, the data storage module is used to store the interaction content, and the power module is connected with the identity recognition module, the identification device, the interaction module, the intelligent control module, and the data storage module to supply power to them.
2. The intelligent interactive robot according to claim 1, characterized in that the identification device comprises an image acquisition module, a model building module, a filtering module, an effect evaluation module, and a recognition module, wherein the image acquisition module is used to acquire images of the user who has logged in with authorization, the model building module is used to establish an image noise model, the filtering module is used to filter out image noise, the effect evaluation module is used to evaluate the noise filtering effect, and the recognition module is used to recognize user behavior from the noise-filtered image.
3. The intelligent interactive robot according to claim 2, characterized in that the model building module is used to establish the image noise model, specifically:
the image noise model is expressed as $I(i,j) = I_0(i,j) + N(i,j)$, where $I(i,j)$, $I_0(i,j)$ and $N(i,j)$ respectively denote the observed image, the noise-free original image, and white Gaussian noise with mean 0 and variance $\sigma^2$;
the gradient fields of the observed image are computed:
$$\nabla I^1(i,j) = I(i,j-1) - I(i,j)$$
$$\nabla I^2(i,j) = I(i,j+1) - I(i,j)$$
$$\nabla I^3(i,j) = I(i+1,j) - I(i,j)$$
$$\nabla I^4(i,j) = I(i-1,j) - I(i,j)$$
where $\nabla I^u(i,j)$, $u = 1,2,3,4$, are the differences between the pixel at position $(i,j)$ and its four neighbors (up, down, left, right);
the gradient factor of the image $I(i,j)$ is computed from the gradient fields:
$$T_{\nabla I^u(i,j)} = \frac{3\,\nabla I^u(i,j) + \sqrt{[\nabla I^u(i,j)]^2 + 1}}{4}$$
where $T_{\nabla I^u(i,j)}$ denotes the gradient factor of image $I(i,j)$, $u = 1,2,3,4$.
4. The intelligent interactive robot according to claim 3, characterized in that the filtering module comprises a gradient factor estimation unit and a filtering unit, wherein the gradient factor estimation unit is used to estimate the gradient factor of the noise and the filtering unit is used to filter the observed image;
the gradient factor estimation unit estimates the gradient factor of the noise using the following formulas:
$$\hat{T}_{\nabla N^u(i,j)} = \frac{\sigma^2_{\nabla N^u}}{\sigma^2_{\nabla N^u} + \max\left(Q^2_{\nabla I^u}(i,j) - \sigma^2_{\nabla N^u},\ 0\right)} \cdot T_{\nabla I^u(i,j)}$$
$$Q^2_{\nabla I^u}(i,j) = \frac{1}{B \times C}\left\{\sum_{i',j' \in \omega}\left[\nabla I^u(i',j')\right]^2 - \sum_{i',j' \in \omega}\nabla I^u(i',j')\right\}$$
where $\hat{T}_{\nabla N^u(i,j)}$ denotes the estimate of the gradient factor of the noise, $\sigma^2_{\nabla N^u}$ is a constant denoting the gradient-field noise variance to be filtered out in each gradient field, and $\omega$ denotes the local neighborhood window of size $B \times C$ centered at $(i,j)$.
5. The intelligent interactive robot according to claim 4, characterized in that the filtering unit filters the observed image as follows:
$$I_1(i,j) = I(i,j) - 0.25\sum_{u=1}^{4}\hat{T}_{\nabla N^u(i,j)}$$
$$I_k(i,j) = I_{k-1}(i,j) - 0.25\sum_{u=1}^{4}\hat{T}_{\nabla N^u(i,j)},\quad k \ge 2$$
where $I_1(i,j)$ denotes the image after the first filtering pass and $I_k(i,j)$ the image after the $k$-th pass; setting the number of filtering passes $k$ yields the noise-filtered image.
6. The intelligent interactive robot according to claim 5, characterized in that the effect evaluation module comprises an objective evaluation unit, a subjective evaluation unit, and a comprehensive evaluation unit, wherein the objective evaluation unit is used to obtain an objective evaluation value of the noise filtering effect, the subjective evaluation unit is used to obtain a subjective evaluation value of the noise filtering effect, and the comprehensive evaluation unit is used to comprehensively evaluate the noise filtering effect according to the objective evaluation value and the subjective evaluation value.
7. The intelligent interactive robot according to claim 6, characterized in that the objective evaluation unit obtains the objective evaluation value of the noise filtering effect using the following formula:
$$P_1 = e^{\frac{1}{U \times V}\sum_{i=1}^{U}\sum_{j=1}^{V}\left[I(i,j) - I'(i,j)\right]^2} + \frac{1}{U \times V}\sum_{i=1}^{U}\sum_{j=1}^{V}\left[I_0(i,j) - I'(i,j)\right]^2$$
where $P_1$ denotes the objective evaluation value, $U \times V$ is the image size, $I_0(i,j)$ denotes the noise-free original image, and $I'(i,j)$ denotes the noise-filtered image; the smaller the objective evaluation value, the better the noise filtering effect;
the subjective evaluation unit obtains the subjective evaluation value of the noise filtering effect as follows: the score of the noise-free original image is taken as a full mark of 100, a group of observers views the noise-free original image and the noise-filtered image and gives a score to the noise-filtered image, and the subjective evaluation value is calculated as $P_2 = \frac{1}{n}\sum_{i=1}^{n} F_i$, where $P_2$ denotes the subjective evaluation value, $n$ the number of observers, and $F_i$ the score given by the $i$-th observer to the noise-filtered image; the larger the subjective evaluation value, the better the noise filtering effect;
the comprehensive evaluation unit comprehensively evaluates the noise filtering effect according to the objective evaluation value and the subjective evaluation value by means of the comprehensive evaluation factor, calculated as:
$$P = e^{-P_1} + e^{P_2} + \left(\frac{1}{P_1} + P_2\right)^2$$
where $P$ denotes the comprehensive evaluation factor; the larger the comprehensive evaluation factor, the better the noise filtering effect.
CN201711081271.4A 2017-11-07 2017-11-07 Intelligent interactive robot Pending CN107704914A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711081271.4A CN107704914A (en) 2017-11-07 2017-11-07 Intelligent interactive robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711081271.4A CN107704914A (en) 2017-11-07 2017-11-07 Intelligent interactive robot

Publications (1)

Publication Number Publication Date
CN107704914A true CN107704914A (en) 2018-02-16

Family

ID=61177483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711081271.4A Pending CN107704914A (en) Intelligent interactive robot

Country Status (1)

Country Link
CN (1) CN107704914A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1604139A (en) * 2004-10-28 2005-04-06 上海交通大学 Method for constructing image fusion estimation system
CN104202640A (en) * 2014-08-28 2014-12-10 深圳市国华识别科技开发有限公司 Intelligent television interaction control system and method based on image identification
CN106127729A (en) * 2016-06-08 2016-11-16 浙江传媒学院 Image noise level estimation method based on gradient
CN106327364A (en) * 2016-09-18 2017-01-11 长沙军鸽软件有限公司 Service method of reception robot
CN106584480A (en) * 2016-12-31 2017-04-26 天津菲戈博特智能科技有限公司 Robot and facial recognition method and voice control method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
黄立慧 et al., "基于方向梯度计算的图像椒盐噪声滤除算法" [Image salt-and-pepper noise filtering algorithm based on directional gradient calculation], 《福建电脑》 [Fujian Computer] *

Similar Documents

Publication Publication Date Title
CN110348445A (en) A kind of example dividing method merging empty convolution sum marginal information
CN104143079B (en) The method and system of face character identification
CN106228528B (en) A kind of multi-focus image fusing method based on decision diagram and rarefaction representation
CN106709950A (en) Binocular-vision-based cross-obstacle lead positioning method of line patrol robot
CN106778856A (en) A kind of object identification method and device
CN106874914A (en) A kind of industrial machinery arm visual spatial attention method based on depth convolutional neural networks
CN110135282B (en) Examinee return plagiarism cheating detection method based on deep convolutional neural network model
CN107103277B (en) Gait recognition method based on depth camera and 3D convolutional neural network
CN107633511A A blower-fan visual detection system based on an auto-encoder neural network
CN105740780A (en) Method and device for human face in-vivo detection
CN106934455B (en) Remote sensing image optics adapter structure choosing method and system based on CNN
CN107730519A (en) A kind of method and system of face two dimensional image to face three-dimensional reconstruction
CN101908153B (en) Method for estimating head postures in low-resolution image treatment
CN107808376A A kind of detection method of raising one's hand based on deep learning
CN104077580A (en) Pest image automatic recognition method based on high-reliability network
CN104853182B (en) Based on amplitude and the objective evaluation method for quality of stereo images of phase place
CN107134008A (en) A kind of method and system of the dynamic object identification based under three-dimensional reconstruction
CN106096015A (en) A kind of degree of depth learning method recommended based on big data double-way and two-way recommendation apparatus
CN106022227A (en) Gesture identification method and apparatus
CN106991428A (en) Insect image-recognizing method based on adaptive pool model
CN106446833B (en) A kind of bionical visible sensation method of multichannel for complex scene image recognition
CN107766864A (en) Extract method and apparatus, the method and apparatus of object identification of feature
CN107895503A (en) A kind of unattended parking farm monitoring system
CN111046213B (en) Knowledge base construction method based on image recognition
CN106203428A (en) The image significance detection method merged based on blur estimation

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20180216)