CN104899251A - Information processing method and electronic device - Google Patents


Info

Publication number
CN104899251A
Authority
CN
China
Prior art keywords
information
feature information
attribute information
first feature
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510233084.8A
Other languages
Chinese (zh)
Other versions
CN104899251B (en)
Inventor
卢德山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201510233084.8A
Publication of CN104899251A
Application granted
Publication of CN104899251B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60 Information retrieval of audio data
    • G06F16/63 Querying
    • G06F16/635 Filtering based on additional data, e.g. user or group profiles

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present invention discloses an information processing method. The method comprises: acquiring first feature information of a subject, wherein the first feature information is used for characterizing first attribute information of the subject; based on the first feature information, determining second feature information, wherein the second feature information is used for characterizing second attribute information of media data, and the second attribute information is the same as or opposite to the first attribute information; acquiring first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information; and outputting the first media data. An embodiment of the present invention further discloses an electronic device.

Description

Information processing method and electronic device
Technical field
The present invention relates to information processing technologies, and in particular to an information processing method and an electronic device.
Background
In today's fast-paced cities, people often choose to listen to soothing music to relax. In the prior art, however, an electronic device cannot output music suited to the user's current emotional state. A method is therefore urgently needed to solve this problem and improve the user experience.
Summary of the invention
To solve the above technical problem, embodiments of the present invention provide an information processing method and an electronic device.
The technical solutions of the embodiments of the present invention are implemented as follows.
An embodiment of the present invention provides an information processing method, including:
acquiring first feature information of a subject, the first feature information being used for characterizing first attribute information of the subject;
determining second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information;
acquiring first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information; and
outputting the first media data.
An embodiment of the present invention further provides an electronic device, including:
a first acquiring unit, configured to acquire first feature information of a subject, the first feature information being used for characterizing first attribute information of the subject;
a determining unit, configured to determine second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information;
a second acquisition unit, configured to acquire first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information; and
an output unit, configured to output the first media data.
With the information processing method and electronic device described in the embodiments of the present invention, the electronic device can acquire first feature information of a subject, determine second feature information based on the first feature information, and then acquire first media data based on the second feature information. Because the first attribute information characterized by the first feature information is the same as or opposite to the second attribute information characterized by the second feature information, the embodiments of the present invention can output first media data in a targeted manner based on the current state of the subject, i.e., the first attribute information, so that the output first media data matches the first attribute information of the subject in its current state. The embodiments of the present invention therefore broaden the application scenarios of the electronic device and enrich the user experience.
Brief description of the drawings
Fig. 1 is a first schematic flowchart of an information processing method according to an embodiment of the present invention;
Fig. 2 is a second schematic flowchart of an information processing method according to an embodiment of the present invention;
Fig. 3 is a third schematic flowchart of an information processing method according to an embodiment of the present invention;
Fig. 4 is a fourth schematic flowchart of an information processing method according to an embodiment of the present invention;
Fig. 5 is a fifth schematic flowchart of an information processing method according to an embodiment of the present invention;
Fig. 6 is a first schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 7 is a second schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 8 is a third schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 9 is a fourth schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed description of the embodiments
To provide a more complete understanding of the features and technical content of the present invention, implementations of the present invention are described in detail below with reference to the accompanying drawings. The accompanying drawings are for reference and illustration only and are not intended to limit the present invention.
Embodiment one
Fig. 1 is a first schematic flowchart of an information processing method according to an embodiment of the present invention. The method is applied to an electronic device. As shown in Fig. 1, the method includes the following steps.
Step 101: acquire first feature information of a subject, the first feature information being used for characterizing first attribute information of the subject.
Because the method described in this embodiment is applied to an electronic device, the electronic device is the entity that executes step 101; step 101 may therefore also be described as: the electronic device acquires first feature information of a subject.
In this embodiment, the electronic device may specifically be a smartphone, a computer, a tablet computer, or the like.
In this embodiment, the subject may be the user of the electronic device, or may be another individual other than the user of the electronic device.
In this embodiment, the first attribute information may be emotion information or state information. Specifically, the electronic device may acquire emotion information of the subject, for example of the user of the electronic device, or may acquire state information of the user of the electronic device. The emotion information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger. The state information may be, for example, sleepy or focused. The emotion information and state information given above are only intended to explain the embodiments of the present invention and not to limit them; in practical applications, the first attribute information may be set arbitrarily according to actual requirements.
Step 102: determine second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information.
In this embodiment, after the electronic device acquires the first feature information of the subject, the electronic device may determine, based on the first feature information, second feature information corresponding to the first feature information. In this way, the embodiment of the present invention can determine, based on the second feature information, first media data corresponding to the subject in its current state, so that the first media data matches the determined first feature information of the subject.
In this embodiment, second feature information whose attribute is the same as that of the first feature information may be determined according to a preset rule, for example, so that the second attribute information characterized by the second feature information is the same as the first attribute information; alternatively, second feature information whose attribute is opposite to that of the first feature information may be determined according to the preset rule, for example, so that the second attribute information characterized by the second feature information is opposite to the first attribute information. In practical applications, the rule may be set arbitrarily according to user requirements, so as to meet the needs of different users in different states and improve the user experience.
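As an illustration of such a preset rule, the following Python sketch maps a first attribute (an emotion label) to a second attribute that is either the same as or opposite to it. The emotion labels and the opposite-emotion table are invented for illustration and are not values defined by this embodiment.

```python
# A minimal sketch of the "same or opposite" preset rule described above.
# The OPPOSITE_EMOTION table is an illustrative assumption, not part of the patent.

OPPOSITE_EMOTION = {
    "happy": "grieved",
    "grieved": "happy",
    "angry": "happy",
    "frightened": "happy",
    "surprised": "happy",
    "disgusted": "happy",
    "contemptuous": "happy",
}

def determine_second_attribute(first_attribute: str, rule: str = "opposite") -> str:
    """Map the subject's first attribute information (an emotion label)
    to the second attribute information used to select media data."""
    if rule == "same":
        return first_attribute
    # "opposite" rule: fall back to the same emotion if no opposite is defined
    return OPPOSITE_EMOTION.get(first_attribute, first_attribute)

print(determine_second_attribute("grieved"))             # -> happy
print(determine_second_attribute("happy", rule="same"))  # -> happy
```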
Step 103: acquire first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information.
In this embodiment, the electronic device may acquire the first media data based on the second feature information and then output the first media data, so that the subject can perceive the first media data, and the third attribute information of the first media data perceived by the subject corresponds to the second attribute information and matches the first attribute information; that is, according to the preset rule, the third attribute information is the same as or opposite to the first attribute information.
In this embodiment, the first attribute information, the second attribute information, and the third attribute information are emotion information. Specifically, in this embodiment, each of the first attribute information, the second attribute information, and the third attribute information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger.
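The following sketch illustrates step 103 under the assumption that media items carry an emotion tag as their third attribute information; the in-memory library, the MediaItem type, and the tags are hypothetical examples rather than part of this embodiment.

```python
# A minimal sketch of step 103: pick a first media item whose third attribute
# (its emotion tag) corresponds to the second attribute information. A real
# device would query its local storage or a network resource instead.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MediaItem:
    title: str
    kind: str          # "audio", "video" or "image"
    emotion_tag: str   # the item's third attribute information

LIBRARY = [
    MediaItem("Morning Walk", "audio", "happy"),
    MediaItem("Rainy Window", "video", "grieved"),
    MediaItem("Sunset Photo", "image", "happy"),
]

def acquire_first_media(second_attribute: str) -> Optional[MediaItem]:
    """Return the first media item whose emotion tag matches the second attribute."""
    for item in LIBRARY:
        if item.emotion_tag == second_attribute:
            return item
    return None

print(acquire_first_media("happy"))  # -> MediaItem(title='Morning Walk', ...)
```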
Step 104: output the first media data.
In this embodiment, the first media data may specifically be at least one of the following: audio data, video data, and an image.
In this embodiment, because the first attribute information, the second attribute information, and the third attribute information can characterize emotion information, the electronic device described in the embodiment of the present invention can determine, based on the emotion information of the subject, for example of the user of the electronic device, second feature information for characterizing emotion information, and then determine the emotion information of the first media data to be output according to the second feature information. In this way, the emotion information of the first media data to be output can match the emotion information of the user of the electronic device in the current state, achieving personalized output and enriching the user experience.
In addition, in this embodiment, the second attribute information of the second feature information determined from the first feature information of the subject may be the same as or different from the first attribute information of the first feature information, and both the first attribute information and the second attribute information can characterize emotion information. The information processing method described in the embodiment of the present invention can therefore determine, according to the user's current emotion information, first media data to be output whose emotion information is the same as the user's current emotion information, or first media data to be output whose emotion information is opposite to the user's current emotion information. In this way, the user's emotion can be adjusted through media data in different states.
Embodiment two
Fig. 2 is a second schematic flowchart of an information processing method according to an embodiment of the present invention. The method is applied to an electronic device. As shown in Fig. 2, the method includes the following steps.
Step 201: acquire first feature information of a subject, the first feature information being used for characterizing first attribute information of the subject.
Because the method described in this embodiment is applied to an electronic device, the electronic device is the entity that executes step 201; step 201 may therefore also be described as: the electronic device acquires first feature information of a subject.
In this embodiment, the electronic device may specifically be a smartphone, a computer, a tablet computer, or the like.
In this embodiment, the subject may be the user of the electronic device, or may be another individual other than the user of the electronic device.
In this embodiment, the first attribute information may be emotion information or state information. Specifically, the electronic device may acquire emotion information of the subject, for example of the user of the electronic device, or may acquire state information of the user of the electronic device. The emotion information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger. The state information may be, for example, sleepy or focused. The emotion information and state information given above are only intended to explain the embodiments of the present invention and not to limit them; in practical applications, the first attribute information may be set arbitrarily according to actual requirements.
Step 202: determine second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information.
In this embodiment, when the media data is audio data, the electronic device acquires first audio data from its own audio resources according to the second feature information. For example, the electronic device may acquire, according to the subject's current emotion information, first audio data that is the same as or opposite to the current emotion information, and then output the first audio data, so as to adjust the emotional state of the subject.
Alternatively, when the electronic device is connected to a network, the electronic device acquires the first audio data from a data resource of the connected network. For example, according to the subject's current emotion information, the electronic device acquires the first audio data from a network data resource, for example by caching or downloading it, and then outputs the first audio data, so as to adjust the emotional state of the subject.
Here, the third attribute information of the first audio data acquired by the electronic device corresponds to the second attribute information; further, because the second attribute information is the same as or opposite to the first attribute information, the third attribute information is the same as or opposite to the first attribute information.
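A minimal sketch of the two acquisition paths described above: the device first looks for matching first audio data in its own audio resources and, if none is found and a network connection exists, obtains it from a network data resource. The helper functions local_audio_library, is_connected, and fetch_from_network are placeholders invented for illustration.

```python
# Local-first audio acquisition with a network fallback, as described in this embodiment.

from typing import Optional

def local_audio_library(emotion_tag: str) -> Optional[str]:
    # stand-in for the device's own audio resources, indexed by emotion tag
    tracks = {"happy": "upbeat_song.mp3", "grieved": "slow_ballad.mp3"}
    return tracks.get(emotion_tag)

def is_connected() -> bool:
    # stand-in for a real connectivity check
    return True

def fetch_from_network(emotion_tag: str) -> Optional[str]:
    # stand-in for caching or downloading a track from an online data resource
    return f"downloaded_{emotion_tag}_track.mp3"

def acquire_first_audio(second_attribute: str) -> Optional[str]:
    track = local_audio_library(second_attribute)
    if track is not None:
        return track
    if is_connected():
        return fetch_from_network(second_attribute)
    return None

print(acquire_first_audio("happy"))  # -> upbeat_song.mp3
```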
In this embodiment, after the electronic device acquires the first feature information of the subject, the electronic device may determine, based on the first feature information, second feature information corresponding to the first feature information. In this way, the embodiment of the present invention can determine, based on the second feature information, first audio data corresponding to the subject in its current state, so that the first audio data matches the determined first feature information of the subject.
In this embodiment, second feature information whose attribute is the same as that of the first feature information may be determined according to a preset rule, for example, so that the second attribute information characterized by the second feature information is the same as the first attribute information; alternatively, second feature information whose attribute is opposite to that of the first feature information may be determined according to the preset rule, for example, so that the second attribute information characterized by the second feature information is opposite to the first attribute information. In practical applications, the rule may be set arbitrarily according to user requirements, so as to meet the needs of different users in different states and improve the user experience.
Step 203: acquire first audio data from an audio resource in the electronic device according to the second feature information, or acquire first audio data from a data resource according to the second feature information, so that third attribute information corresponding to the first audio data corresponds to the second attribute information.
In this embodiment, the electronic device may acquire the first audio data based on the second feature information and then output the first audio data, so that the subject can perceive the first audio data, and the third attribute information of the first audio data perceived by the subject corresponds to the second attribute information and matches the first attribute information; that is, according to the preset rule, the third attribute information is the same as or opposite to the first attribute information.
In this embodiment, the first attribute information, the second attribute information, and the third attribute information are emotion information. Specifically, in this embodiment, each of the first attribute information, the second attribute information, and the third attribute information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger.
Step 204: output the first audio data.
In this embodiment, because the first attribute information, the second attribute information, and the third attribute information can characterize emotion information, the electronic device described in the embodiment of the present invention can determine, based on the emotion information of the subject, for example of the user of the electronic device, second feature information for characterizing emotion information, and then determine the emotion information of the first audio data to be output according to the second feature information. In this way, the emotion information of the first audio data to be output can match the emotion information of the user of the electronic device in the current state, achieving personalized output and enriching the user experience.
In addition, in this embodiment, the second attribute information of the second feature information determined from the first feature information of the subject may be the same as or different from the first attribute information of the first feature information, and both the first attribute information and the second attribute information can characterize emotion information. The information processing method described in the embodiment of the present invention can therefore determine, according to the user's current emotion information, first audio data to be output whose emotion information is the same as the user's current emotion information, or first audio data to be output whose emotion information is opposite to the user's current emotion information. In this way, the user's emotion can be adjusted through audio data in different states.
Embodiment three
Fig. 3 is a third schematic flowchart of an information processing method according to an embodiment of the present invention. The method is applied to an electronic device. As shown in Fig. 3, the method includes the following steps.
Step 301: acquire at least one piece of first sub-feature information of a subject, the first sub-feature information being used for characterizing collected information of the subject, the collected information including at least one of the following: image information, audio information, and video information.
Because the method described in this embodiment is applied to an electronic device, the electronic device is the entity that executes step 301; step 301 may therefore also be described as: the electronic device acquires at least one piece of first sub-feature information of a subject.
In this embodiment, the electronic device may specifically be a smartphone, a computer, a tablet computer, or the like.
In this embodiment, the electronic device is connected to or includes a collecting unit, and the collecting unit may be implemented by a collecting device such as a camera.
In this embodiment, the subject may be the user of the electronic device, or may be another individual other than the user of the electronic device.
In this embodiment, when the collected information characterizes image information, the first sub-feature information may characterize at least one of the following: facial expression information, body information, and gesture information of the subject.
In this embodiment, after the electronic device acquires the at least one piece of first sub-feature information of the subject, it may determine first feature information according to the at least one piece of first sub-feature information. For example, the electronic device may determine, according to the acquired facial expression information, body information, gesture information, audio information, video information, and the like of the subject, first feature information that can characterize the first attribute information of the subject. This lays the foundation for the electronic device to determine the first media data based on the first attribute information of the subject.
Step 302: determine first feature information according to the at least one piece of first sub-feature information, the first feature information being used for characterizing first attribute information of the subject.
In this embodiment, the first attribute information may be emotion information or state information. Specifically, the electronic device may acquire emotion information of the subject, for example of the user of the electronic device, or may acquire state information of the user of the electronic device. The emotion information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger. The state information may be, for example, sleepy or focused. The emotion information and state information given above are only intended to explain the embodiments of the present invention and not to limit them; in practical applications, the first attribute information may be set arbitrarily according to actual requirements.
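The following sketch illustrates how steps 301 and 302 might be combined: a collected facial-expression label (first sub-feature information) is mapped to first feature information characterizing an emotion or state. The expression-to-emotion table and capture_expression are assumptions made for illustration only.

```python
# A minimal sketch of steps 301-302: turn collected information about the subject
# (here, a facial-expression label produced by the collecting unit) into first
# feature information characterizing an emotion or state.

EXPRESSION_TO_EMOTION = {
    "smile": "happy",
    "frown": "grieved",
    "wide_eyes": "surprised",
    "closed_eyes": "sleepy",   # a state rather than an emotion
}

def capture_expression() -> str:
    # stand-in for the camera-based collecting unit mentioned above
    return "frown"

def determine_first_feature() -> str:
    sub_feature = capture_expression()                       # first sub-feature information
    return EXPRESSION_TO_EMOTION.get(sub_feature, "happy")   # first feature information

print(determine_first_feature())  # -> grieved
```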
Step 303: determine second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information.
In this embodiment, after the electronic device acquires the first feature information of the subject, the electronic device may determine, based on the first feature information, second feature information corresponding to the first feature information. In this way, the embodiment of the present invention can determine, based on the second feature information, first media data corresponding to the subject in its current state, so that the first media data matches the determined first feature information of the subject.
In this embodiment, second feature information whose attribute is the same as that of the first feature information may be determined according to a preset rule, for example, so that the second attribute information characterized by the second feature information is the same as the first attribute information; alternatively, second feature information whose attribute is opposite to that of the first feature information may be determined according to the preset rule, for example, so that the second attribute information characterized by the second feature information is opposite to the first attribute information. In practical applications, the rule may be set arbitrarily according to user requirements, so as to meet the needs of different users in different states and improve the user experience.
Step 304: acquire first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information.
In this embodiment, the electronic device may acquire the first media data based on the second feature information and then output the first media data, so that the subject can perceive the first media data, and the third attribute information of the first media data perceived by the subject corresponds to the second attribute information and matches the first attribute information; that is, according to the preset rule, the third attribute information is the same as or opposite to the first attribute information.
In this embodiment, the first attribute information, the second attribute information, and the third attribute information are emotion information. Specifically, in this embodiment, each of the first attribute information, the second attribute information, and the third attribute information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger.
Step 305: output the first media data.
In this embodiment, the first media data may specifically be at least one of the following: audio data, video data, and an image.
In this embodiment, because the first attribute information, the second attribute information, and the third attribute information can characterize emotion information, the electronic device described in the embodiment of the present invention can determine, based on the emotion information of the subject, for example of the user of the electronic device, second feature information for characterizing emotion information, and then determine the emotion information of the first media data to be output according to the second feature information. In this way, the emotion information of the first media data to be output can match the emotion information of the user of the electronic device in the current state, achieving personalized output and enriching the user experience.
In addition, in this embodiment, the second attribute information of the second feature information determined from the first feature information of the subject may be the same as or different from the first attribute information of the first feature information, and both the first attribute information and the second attribute information can characterize emotion information. The information processing method described in the embodiment of the present invention can therefore determine, according to the user's current emotion information, first media data to be output whose emotion information is the same as the user's current emotion information, or first media data to be output whose emotion information is opposite to the user's current emotion information. In this way, the user's emotion can be adjusted through media data in different states.
Embodiment four
Fig. 4 is a fourth schematic flowchart of an information processing method according to an embodiment of the present invention. The method is applied to an electronic device. As shown in Fig. 4, the method includes the following steps.
Step 401: acquire interaction data in historical data corresponding to a subject.
Because the method described in this embodiment is applied to an electronic device, the electronic device is the entity that executes step 401; step 401 may therefore also be described as: the electronic device acquires interaction data in historical data corresponding to a subject.
In this embodiment, the electronic device may specifically be a smartphone, a computer, a tablet computer, or the like.
In this embodiment, when the subject is the user of the electronic device, the electronic device may acquire, from its own storage unit or from a data resource, interaction data in the historical data corresponding to the subject over a period of time, such as short messages, so that the electronic device can determine the first feature information of the subject based on the interaction data.
Step 402: determine first feature information of the subject according to the interaction data, the first feature information being used for characterizing first attribute information of the subject.
In this embodiment, the first attribute information may be emotion information or state information. Specifically, the electronic device may acquire emotion information of the subject, for example of the user of the electronic device, or may acquire state information of the user of the electronic device. The emotion information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger. The state information may be, for example, sleepy or focused. The emotion information and state information given above are only intended to explain the embodiments of the present invention and not to limit them; in practical applications, the first attribute information may be set arbitrarily according to actual requirements.
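A minimal sketch of steps 401 and 402 under the assumption that the interaction data consists of recent short messages and that a simple keyword count stands in for a real analysis of the subject's emotion. The keyword lists and sample messages are invented.

```python
# Estimate the subject's first attribute information from recent interaction data
# such as short messages; a real device might use a proper sentiment model instead.

HAPPY_WORDS = {"great", "awesome", "love", "haha"}
SAD_WORDS = {"tired", "sad", "miss", "sorry"}

def first_feature_from_messages(messages: list[str]) -> str:
    happy = sum(any(w in m.lower() for w in HAPPY_WORDS) for m in messages)
    sad = sum(any(w in m.lower() for w in SAD_WORDS) for m in messages)
    return "happy" if happy >= sad else "grieved"

recent = ["So tired today...", "I miss the weekend", "haha that was great"]
print(first_feature_from_messages(recent))  # -> grieved
```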
Step 403: determine second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information.
In this embodiment, after the electronic device acquires the first feature information of the subject, the electronic device may determine, based on the first feature information, second feature information corresponding to the first feature information. In this way, the embodiment of the present invention can determine, based on the second feature information, first media data corresponding to the subject in its current state, so that the first media data matches the determined first feature information of the subject.
In this embodiment, second feature information whose attribute is the same as that of the first feature information may be determined according to a preset rule, for example, so that the second attribute information characterized by the second feature information is the same as the first attribute information; alternatively, second feature information whose attribute is opposite to that of the first feature information may be determined according to the preset rule, for example, so that the second attribute information characterized by the second feature information is opposite to the first attribute information. In practical applications, the rule may be set arbitrarily according to user requirements, so as to meet the needs of different users in different states and improve the user experience.
Step 404: acquire first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information.
In this embodiment, the electronic device may acquire the first media data based on the second feature information and then output the first media data, so that the subject can perceive the first media data, and the third attribute information of the first media data perceived by the subject corresponds to the second attribute information and matches the first attribute information; that is, according to the preset rule, the third attribute information is the same as or opposite to the first attribute information.
In this embodiment, the first attribute information, the second attribute information, and the third attribute information are emotion information. Specifically, in this embodiment, each of the first attribute information, the second attribute information, and the third attribute information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger.
Step 405: output the first media data.
In this embodiment, the first media data may specifically be at least one of the following: audio data, video data, and an image.
In this embodiment, because the first attribute information, the second attribute information, and the third attribute information can characterize emotion information, the electronic device described in the embodiment of the present invention can determine, based on the emotion information of the subject, for example of the user of the electronic device, second feature information for characterizing emotion information, and then determine the emotion information of the first media data to be output according to the second feature information. In this way, the emotion information of the first media data to be output can match the emotion information of the user of the electronic device in the current state, achieving personalized output and enriching the user experience.
In addition, in this embodiment, the second attribute information of the second feature information determined from the first feature information of the subject may be the same as or different from the first attribute information of the first feature information, and both the first attribute information and the second attribute information can characterize emotion information. The information processing method described in the embodiment of the present invention can therefore determine, according to the user's current emotion information, first media data to be output whose emotion information is the same as the user's current emotion information, or first media data to be output whose emotion information is opposite to the user's current emotion information. In this way, the user's emotion can be adjusted through media data in different states.
Embodiment five
Fig. 5 is a fifth schematic flowchart of an information processing method according to an embodiment of the present invention. The method is applied to an electronic device. As shown in Fig. 5, the method includes the following steps.
Step 501: acquire first feature information of a subject, the first feature information being used for characterizing first attribute information of the subject.
Because the method described in this embodiment is applied to an electronic device, the electronic device is the entity that executes step 501; step 501 may therefore also be described as: the electronic device acquires first feature information of a subject.
In this embodiment, the electronic device may specifically be a smartphone, a computer, a tablet computer, or the like.
In this embodiment, the subject may be the user of the electronic device, or may be another individual other than the user of the electronic device.
In this embodiment, the first attribute information may be emotion information or state information. Specifically, the electronic device may acquire emotion information of the subject, for example of the user of the electronic device, or may acquire state information of the user of the electronic device. The emotion information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger. The state information may be, for example, sleepy or focused. The emotion information and state information given above are only intended to explain the embodiments of the present invention and not to limit them; in practical applications, the first attribute information may be set arbitrarily according to actual requirements.
In a specific embodiment, acquiring the first feature information of the subject includes:
acquiring at least one piece of first sub-feature information of the subject, the first sub-feature information being used for characterizing collected information of the subject, the collected information including at least one of the following: image information, audio information, and video information; and
determining the first feature information according to the at least one piece of first sub-feature information.
In another specific embodiment, acquiring the first feature information of the subject includes:
acquiring interaction data in historical data corresponding to the subject; and
determining the first feature information of the subject according to the interaction data.
Step 502: judge whether the first feature information meets a preset rule, to obtain a first judgment result.
In this embodiment, second feature information whose attribute is the same as that of the first feature information may be determined according to the preset rule, for example, so that the second attribute information characterized by the second feature information is the same as the first attribute information; alternatively, second feature information whose attribute is opposite to that of the first feature information may be determined according to the preset rule, for example, so that the second attribute information characterized by the second feature information is opposite to the first attribute information. In practical applications, the rule may be set arbitrarily according to user requirements, so as to meet the needs of different users in different states and improve the user experience. Specifically, after the electronic device acquires the first feature information, it judges whether the first feature information meets the preset rule; when the preset rule is met, second feature information whose attribute information is the same as the first attribute information is determined, and when the preset rule is not met, second feature information whose attribute information is opposite to the first attribute information is determined.
In this embodiment, after the electronic device acquires the first feature information of the subject, the electronic device may determine, based on the first feature information, second feature information corresponding to the first feature information. In this way, the embodiment of the present invention can determine, based on the second feature information, first media data corresponding to the subject in its current state, so that the first media data matches the determined first feature information of the subject.
Step 503: when the first judgment result indicates that the first feature information meets the preset rule, determine the second feature information so that the second attribute information characterized by the second feature information is the same as the first attribute information characterized by the first feature information, the second feature information being used for characterizing second attribute information of media data.
Step 504: when the first judgment result indicates that the first feature information does not meet the preset rule, determine the second feature information so that the second attribute information characterized by the second feature information is opposite to the first attribute information characterized by the first feature information, the second feature information being used for characterizing second attribute information of media data.
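The following sketch illustrates the branch of steps 502 to 504: the first judgment result decides whether the second attribute information is the same as or opposite to the first attribute information. The particular rule shown (only certain emotions trigger the opposite branch) is an invented example of a preset rule.

```python
# A minimal sketch of steps 502-504: check the first feature information against
# a preset rule and choose the same or the opposite second attribute accordingly.

OPPOSITE_EMOTION = {"grieved": "happy", "angry": "happy"}
ADJUST_WHEN = {"grieved", "angry"}   # emotions this example rule tries to counter

def determine_second_feature(first_attribute: str) -> str:
    meets_rule = first_attribute not in ADJUST_WHEN   # the first judgment result
    if meets_rule:
        return first_attribute                        # step 503: same attribute
    return OPPOSITE_EMOTION[first_attribute]          # step 504: opposite attribute

print(determine_second_feature("happy"))    # -> happy
print(determine_second_feature("grieved"))  # -> happy (opposite)
```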
Step 505: acquire first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information.
In this embodiment, the electronic device may acquire the first media data based on the second feature information and then output the first media data, so that the subject can perceive the first media data, and the third attribute information of the first media data perceived by the subject corresponds to the second attribute information and matches the first attribute information; that is, according to the preset rule, the third attribute information is the same as or opposite to the first attribute information.
In this embodiment, the first attribute information, the second attribute information, and the third attribute information are emotion information. Specifically, in this embodiment, each of the first attribute information, the second attribute information, and the third attribute information may be at least one of the following: happiness, surprise, grief, fear, disgust, contempt, and anger.
Step 506: output the first media data.
In this embodiment, the first media data may specifically be at least one of the following: audio data, video data, and an image.
In this embodiment, because the first attribute information, the second attribute information, and the third attribute information can characterize emotion information, the electronic device described in the embodiment of the present invention can determine, based on the emotion information of the subject, for example of the user of the electronic device, second feature information for characterizing emotion information, and then determine the emotion information of the first media data to be output according to the second feature information. In this way, the emotion information of the first media data to be output can match the emotion information of the user of the electronic device in the current state, achieving personalized output and enriching the user experience.
In addition, in this embodiment, the second attribute information of the second feature information determined from the first feature information of the subject may be the same as or different from the first attribute information of the first feature information, and both the first attribute information and the second attribute information can characterize emotion information. The information processing method described in the embodiment of the present invention can therefore determine, according to the user's current emotion information, first media data to be output whose emotion information is the same as the user's current emotion information, or first media data to be output whose emotion information is opposite to the user's current emotion information. In this way, the user's emotion can be adjusted through media data in different states.
Embodiment six
Fig. 6 is a first schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in Fig. 6, the electronic device includes:
a first acquiring unit 61, configured to acquire first feature information of a subject, the first feature information being used for characterizing first attribute information of the subject;
a determining unit 62, configured to determine second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information;
a second acquisition unit 63, configured to acquire first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information; and
an output unit 64, configured to output the first media data.
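As an illustration of how the four units of Fig. 6 could cooperate in software, the following sketch composes four callables corresponding to units 61 to 64; the concrete emotion detector, mapping rule, media lookup, and player passed in are hypothetical stand-ins, not the patent's implementation.

```python
# A minimal sketch composing the four units of Fig. 6 as injected callables.

from typing import Callable, Optional

class ElectronicDevice:
    def __init__(self,
                 first_acquiring_unit: Callable[[], str],                   # unit 61
                 determining_unit: Callable[[str], str],                    # unit 62
                 second_acquisition_unit: Callable[[str], Optional[str]],   # unit 63
                 output_unit: Callable[[str], None]):                       # unit 64
        self.acquire_first_feature = first_acquiring_unit
        self.determine_second_feature = determining_unit
        self.acquire_first_media = second_acquisition_unit
        self.output = output_unit

    def run(self) -> None:
        first_feature = self.acquire_first_feature()
        second_feature = self.determine_second_feature(first_feature)
        media = self.acquire_first_media(second_feature)
        if media is not None:
            self.output(media)

device = ElectronicDevice(
    first_acquiring_unit=lambda: "grieved",
    determining_unit=lambda e: {"grieved": "happy"}.get(e, e),
    second_acquisition_unit=lambda tag: f"{tag}_song.mp3",
    output_unit=print,
)
device.run()  # prints "happy_song.mp3"
```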
Those skilled in the art should understand that the functions of the processing units in the electronic device of the embodiment of the present invention can be understood with reference to the foregoing description of the information processing method. The processing units in the electronic device of the embodiment of the present invention may be implemented by analog circuits that realize the functions described in the embodiment of the present invention, or by software that performs the functions described in the embodiment of the present invention running on an intelligent terminal.
Embodiment seven
Fig. 7 is a second schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in Fig. 7, the electronic device includes:
a first acquiring unit 61, configured to acquire first feature information of a subject, the first feature information being used for characterizing first attribute information of the subject;
a determining unit 62, configured to determine second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information;
a second acquisition unit 63, configured to acquire first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information; and
an output unit 64, configured to output the first media data.
In this embodiment, the second acquisition unit 63 is further configured to acquire first audio data from an audio resource in the electronic device according to the second feature information, or to acquire first audio data from a data resource according to the second feature information.
In this embodiment, the first acquiring unit 61 includes:
a first sub-acquiring unit 611, configured to acquire at least one piece of first sub-feature information of the subject, the first sub-feature information being used for characterizing collected information of the subject, the collected information including at least one of the following: image information, audio information, and video information; and
a first sub-determining unit 612, configured to determine the first feature information according to the at least one piece of first sub-feature information.
Those skilled in the art should understand that the functions of the processing units in the electronic device of the embodiment of the present invention can be understood with reference to the foregoing description of the information processing method. The processing units in the electronic device of the embodiment of the present invention may be implemented by analog circuits that realize the functions described in the embodiment of the present invention, or by software that performs the functions described in the embodiment of the present invention running on an intelligent terminal.
Embodiment eight
Fig. 8 is a third schematic structural diagram of an electronic device according to an embodiment of the present invention. As shown in Fig. 8, the electronic device includes:
a first acquiring unit 61, configured to acquire first feature information of a subject, the first feature information being used for characterizing first attribute information of the subject;
a determining unit 62, configured to determine second feature information based on the first feature information, the second feature information being used for characterizing second attribute information of media data, the second attribute information being the same as or opposite to the first attribute information;
a second acquisition unit 63, configured to acquire first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information; and
an output unit 64, configured to output the first media data.
In this embodiment, the second acquisition unit 63 is further configured to acquire first audio data from an audio resource in the electronic device according to the second feature information, or to acquire first audio data from a data resource according to the second feature information.
In this embodiment, the first acquiring unit 61 includes:
a second sub-acquiring unit 613, configured to acquire interaction data in historical data corresponding to the subject; and
a second sub-determining unit 614, configured to determine the first feature information of the subject according to the interaction data.
Those skilled in the art should understand that the functions of the processing units in the electronic device of the embodiment of the present invention can be understood with reference to the foregoing description of the information processing method. The processing units in the electronic device of the embodiment of the present invention may be implemented by analog circuits that realize the functions described in the embodiment of the present invention, or by software that performs the functions described in the embodiment of the present invention running on an intelligent terminal.
Embodiment nine
Fig. 9 is the structural representation four of embodiment of the present invention electronic equipment; As shown in Figure 9, described electronic equipment comprises:
First acquiring unit 61, for obtaining the fisrt feature information of main body; Described fisrt feature information is for characterizing the first attribute information of described main body;
Determining unit 62, for based on described fisrt feature information, determines second feature information; Described second feature information is used for the second attribute information of characterizing media data; Described second attribute information is identical or contrary with described first attribute information;
Second acquisition unit 63 is for according to described second feature acquisition of information first media data, corresponding with described second attribute information with the 3rd attribute information making described first media data corresponding;
Output unit 64, for exporting described first media data.
In the present embodiment, described first acquiring unit 61, comprising:
First sub-acquiring unit 611, for obtaining at least one first subcharacter information of main body; Described first subcharacter information is for characterizing the Information Monitoring of the described main body collected; Described Information Monitoring comprises at least one in following information: image information, audio-frequency information and video information;
First sub-determining unit 612, for determining fisrt feature information according at least one first subcharacter information described.
In the present embodiment, described first acquiring unit 61, also comprises:
Second sub-acquiring unit 613, for obtaining the interaction data in historical data corresponding to main body;
Second sub-determining unit 614, for determining the fisrt feature information of described main body according to described interaction data.
In the present embodiment, the determining unit 62 comprises:
Judgment sub-unit 621, for judging whether the first feature information meets a preset rule, to obtain a first judgment result;
Third sub-determining unit 622, for determining the second feature information based on the first judgment result.
In the present embodiment, the third sub-determining unit 622 is further configured to, when the first judgment result indicates that the first feature information meets the preset rule, determine the second feature information so that the second attribute information characterized by the second feature information is identical to the first attribute information characterized by the first feature information.
In the present embodiment, the third sub-determining unit 622 is further configured to, when the first judgment result indicates that the first feature information does not meet the preset rule, determine the second feature information so that the second attribute information characterized by the second feature information is opposite to the first attribute information characterized by the first feature information.
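The behaviour of the judgment sub-unit 621 and the third sub-determining unit 622 can be summarised by the following sketch, which assumes the preset rule is a whitelist of emotions worth reinforcing and that an opposite-attribute table is available; both the rule and the table are implementation choices not fixed by the patent.

```python
# Illustrative preset rule and opposite-attribute table; the real rule and
# mapping are implementation choices the embodiment leaves open.
PRESET_RULE = {"happy", "calm", "excited"}          # emotions to reinforce
OPPOSITE = {"sad": "happy", "angry": "calm", "anxious": "calm",
            "happy": "sad", "calm": "angry", "excited": "calm"}

def determine_second_feature(first_feature):
    """Units 621/622: same attribute if the rule is met, opposite otherwise."""
    meets_rule = first_feature in PRESET_RULE      # first judgment result
    if meets_rule:
        return first_feature                       # second attribute == first attribute
    return OPPOSITE.get(first_feature, "neutral")  # second attribute opposite to first

print(determine_second_feature("happy"))  # -> "happy" (rule met, keep the mood)
print(determine_second_feature("sad"))    # -> "happy" (rule not met, invert the mood)
```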
In the present embodiment, the second acquisition unit 63 is further configured to obtain first audio data from an audio resource in the electronic device according to the second feature information; or,
is further configured to obtain the first audio data from a data resource according to the second feature information.
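A hedged sketch of the second acquisition unit 63 follows: under the assumption that the alternative above is read as a local-first lookup with a network fallback, it first searches an audio resource on the device for a track whose mood tag matches the second feature information and otherwise queries a data resource; the library layout and the `fetch_from_data_resource` helper are illustrative assumptions, not an API defined by the patent.

```python
LOCAL_AUDIO_RESOURCE = [
    {"title": "Sunny Day", "mood": "happy", "path": "/music/sunny_day.mp3"},
    {"title": "Blue Rain", "mood": "sad",   "path": "/music/blue_rain.mp3"},
]

def fetch_from_data_resource(mood_tag):
    """Placeholder for a data-resource lookup (e.g. an online library).
    A real implementation would issue a query keyed by the mood tag."""
    return {"title": f"streamed track tagged '{mood_tag}'", "mood": mood_tag, "path": None}

def acquire_first_audio(second_feature):
    """Unit 63 (illustrative): local audio resource first, data resource as fallback."""
    for track in LOCAL_AUDIO_RESOURCE:
        if track["mood"] == second_feature:
            return track
    return fetch_from_data_resource(second_feature)

print(acquire_first_audio("happy")["title"])  # local hit
print(acquire_first_audio("calm")["title"])   # falls back to the data resource
```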
Those skilled in the art should understand that the functions of the processing units in the electronic device of the embodiment of the present invention can be understood with reference to the related description of the aforementioned information processing method, and that each processing unit in the electronic device of the embodiment of the present invention can be implemented by an analog circuit realizing the functions described in the embodiment of the present invention, or by software performing the functions described in the embodiment of the present invention running on an intelligent terminal.
In the several embodiments provided in the present application, it should be understood that the disclosed device and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a division by logical function, and other division manners are possible in actual implementation, for example: multiple units or components may be combined, or may be integrated into another system, or some features may be ignored or not performed. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or in other forms.
The units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the object of the solution of the present embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing unit, or each unit may individually serve as one unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions; the aforementioned program may be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk or an optical disc.
Alternatively, when the above integrated unit of the present invention is implemented in the form of a software functional module and is sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present invention, in essence, or the part contributing to the prior art, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a removable storage device, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk or an optical disc.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can readily be conceived by those skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (15)

1. An information processing method, comprising:
obtaining first feature information of a subject; the first feature information is used for characterizing first attribute information of the subject;
determining second feature information based on the first feature information; the second feature information is used for characterizing second attribute information of media data; the second attribute information is identical or opposite to the first attribute information;
obtaining first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information; and
outputting the first media data.
2. The method according to claim 1, characterized in that the first attribute information, the second attribute information and the third attribute information are emotion information.
3. The method according to claim 1, characterized in that obtaining the first feature information of the subject comprises:
obtaining at least one piece of first sub-feature information of the subject; the first sub-feature information is used for characterizing collected information of the subject; the collected information comprises at least one of the following: image information, audio information and video information; and
determining the first feature information according to the at least one piece of first sub-feature information.
4. The method according to claim 1, characterized in that obtaining the first feature information of the subject comprises:
obtaining interaction data in historical data corresponding to the subject; and
determining the first feature information of the subject according to the interaction data.
5. The method according to any one of claims 1 to 4, characterized in that determining the second feature information based on the first feature information comprises:
judging whether the first feature information meets a preset rule, to obtain a first judgment result; and
determining the second feature information based on the first judgment result.
6. The method according to claim 5, characterized in that determining the second feature information based on the first judgment result comprises:
when the first judgment result indicates that the first feature information meets the preset rule, determining the second feature information so that the second attribute information characterized by the second feature information is identical to the first attribute information characterized by the first feature information.
7. The method according to claim 5, characterized in that determining the second feature information based on the first judgment result comprises:
when the first judgment result indicates that the first feature information does not meet the preset rule, determining the second feature information so that the second attribute information characterized by the second feature information is opposite to the first attribute information characterized by the first feature information.
8. The method according to claim 1, characterized in that obtaining the first media data according to the second feature information comprises:
obtaining first audio data from an audio resource in the electronic device according to the second feature information; or
obtaining the first audio data from a data resource according to the second feature information.
9. An electronic device, comprising:
a first acquiring unit, for obtaining first feature information of a subject; the first feature information is used for characterizing first attribute information of the subject;
a determining unit, for determining second feature information based on the first feature information; the second feature information is used for characterizing second attribute information of media data; the second attribute information is identical or opposite to the first attribute information;
a second acquisition unit, for obtaining first media data according to the second feature information, so that third attribute information corresponding to the first media data corresponds to the second attribute information; and
an output unit, for outputting the first media data.
10. The electronic device according to claim 9, characterized in that the first acquiring unit comprises:
a first sub-acquiring unit, for obtaining at least one piece of first sub-feature information of the subject; the first sub-feature information is used for characterizing collected information of the subject; the collected information comprises at least one of the following: image information, audio information and video information; and
a first sub-determining unit, for determining the first feature information according to the at least one piece of first sub-feature information.
11. The electronic device according to claim 9, characterized in that the first acquiring unit comprises:
a second sub-acquiring unit, for obtaining interaction data in historical data corresponding to the subject; and
a second sub-determining unit, for determining the first feature information of the subject according to the interaction data.
12. The electronic device according to any one of claims 9 to 11, characterized in that the determining unit comprises:
a judgment sub-unit, for judging whether the first feature information meets a preset rule, to obtain a first judgment result; and
a third sub-determining unit, for determining the second feature information based on the first judgment result.
13. The electronic device according to claim 12, characterized in that the third sub-determining unit is further configured to, when the first judgment result indicates that the first feature information meets the preset rule, determine the second feature information so that the second attribute information characterized by the second feature information is identical to the first attribute information characterized by the first feature information.
14. The electronic device according to claim 12, characterized in that the third sub-determining unit is further configured to, when the first judgment result indicates that the first feature information does not meet the preset rule, determine the second feature information so that the second attribute information characterized by the second feature information is opposite to the first attribute information characterized by the first feature information.
15. The electronic device according to claim 9, characterized in that the second acquisition unit is further configured to obtain first audio data from an audio resource in the electronic device according to the second feature information; or
is further configured to obtain the first audio data from a data resource according to the second feature information.
CN201510233084.8A 2015-05-08 2015-05-08 A kind of information processing method and electronic equipment Active CN104899251B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510233084.8A CN104899251B (en) 2015-05-08 2015-05-08 A kind of information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN104899251A (en) 2015-09-09
CN104899251B CN104899251B (en) 2019-04-26

Family

ID=54031914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510233084.8A Active CN104899251B (en) 2015-05-08 2015-05-08 A kind of information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN104899251B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101261646A (en) * 2008-04-11 2008-09-10 北京中星微电子有限公司 Image selection method and device
JP2009266005A (en) * 2008-04-25 2009-11-12 Clarion Co Ltd Image retrieval method, image retrieval program, music player, and article for music retrieval
CN101836219A (en) * 2007-11-01 2010-09-15 索尼爱立信移动通讯有限公司 Generating music playlist based on facial expression
CN103024710A (en) * 2012-11-26 2013-04-03 广东欧珀移动通信有限公司 Online music playing method
CN103019369A (en) * 2011-09-23 2013-04-03 富泰华工业(深圳)有限公司 Electronic device and method for playing documents based on facial expressions
CN103873512A (en) * 2012-12-13 2014-06-18 深圳市赛格导航科技股份有限公司 Method for vehicle-mounted wireless music transmission based on face recognition technology

Also Published As

Publication number Publication date
CN104899251B (en) 2019-04-26

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant