CN110209264A - Behavioral data processing system and method - Google Patents

Behavioral data processing system and method

Info

Publication number
CN110209264A
Authority
CN
China
Prior art keywords
information
user
interaction
sensor
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910242486.2A
Other languages
Chinese (zh)
Other versions
CN110209264B (en)
Inventor
钟炜凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201910242486.2A
Publication of CN110209264A
Application granted
Publication of CN110209264B
Legal status: Active
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H13/00 Toy figures with self-moving parts, with or without movement of the toy as a whole
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application provides a behavioral data processing system and method. The system comprises a smart figure and a virtual reality (VR) device, wherein: the smart figure detects, through built-in sensors, the interaction feature information of actions the user performs on the smart figure, and sends the interaction feature information to the VR device; the VR device receives the interaction feature information, determines the user's interaction behavior based on the interaction feature information and a prestored interaction determination database, and outputs corresponding voice information and/or image information based on the user's interaction behavior. By processing the user's behavioral data through the interplay between the smart figure and the VR display device, the above method recognizes the user's behavioral data with higher accuracy and thereby improves the user's simulated social experience.

Description

Behavioral data processing system and method
Technical field
This application relates to the field of mixed reality technology, and in particular to a behavioral data processing system and method.
Background
The pressures of work in today's society are steadily increasing, and the time people have to accompany the elderly and children, and for their own social lives, is correspondingly shrinking. Meanwhile, technologies such as virtual reality (VR) and mixed reality (MR) are developing rapidly, and simulated social interaction is increasingly becoming a function people require.
In current simulated social interaction, the criteria for judging a user's interaction behavior are simplistic: the distinguishing features are few, recognition accuracy is low, and user behavior is easily misread or misjudged, leading to many erroneous interactions and a poor user experience. It is difficult to meet the user's requirement for highly humanized simulated social interaction.
Summary of the invention
In view of this, embodiments of the present application aim to provide a behavioral data processing system and method that determine the user's interaction intent through the cooperation of multiple elements, reducing misreadings and misjudgments and improving the user's simulated social experience.
In a first aspect, an embodiment of the present application provides a behavioral data processing system, comprising a smart figure and a virtual reality (VR) device, wherein:
the smart figure is configured to detect, through built-in sensors, the interaction feature information of actions the user performs on the smart figure, and to send the interaction feature information to the VR device;
the VR device is configured to receive the interaction feature information; to determine the user's interaction behavior based on the interaction feature information and a prestored interaction determination database; and to output corresponding voice information and/or image information based on the user's interaction behavior.
In a possible embodiment, the built-in sensors include at least one of the following:
a pressure sensor; a temperature sensor; a humidity sensor;
the smart figure being configured to obtain the interaction feature information by the following steps:
receiving pressure data uploaded by the built-in pressure sensor; and/or receiving temperature data uploaded by the built-in temperature sensor; and/or receiving humidity data uploaded by the built-in humidity sensor.
In a possible embodiment, the interaction feature information further includes interaction location information indicating where the user performs an interactive action on the smart figure;
the smart figure being further configured to obtain the interaction location information by the following steps:
determining the identifiers of the sensors that uploaded the interaction feature information, and determining the location of each triggered sensor based on the determined identifiers;
determining, from the locations of the triggered sensors, the interaction location information of the user's interactive action on the smart figure.
In a possible embodiment, the interaction determination database stores interaction feature information corresponding to preset behaviors;
the VR device being configured to determine the user's interaction behavior in the following manner:
matching the received interaction feature information against the interaction feature information stored in the interaction determination database, and determining the target interaction feature information with the highest matching degree in the interaction determination database;
determining the preset behavior corresponding to the target interaction feature information as the user's interaction behavior.
In a possible embodiment, the VR device is configured to determine the voice information and/or image information to output in the following manner:
based on prestored mapping relations between preset user interaction behaviors and preset output information, determining the preset output information corresponding to the user's interaction behavior, the preset output information including voice information and/or image information.
In a possible embodiment, the VR device is further configured to:
capture the user's voice to be recognized;
match the user's voice to be recognized against the voices stored in a prestored voice determination database, and determine the target voice with the highest matching degree in the voice determination database;
determine the target voice as the user's voice interaction;
based on prestored mapping relations between preset user voice interactions and preset output information, determine the preset output information corresponding to the user's voice interaction, the preset output information including voice information and/or image information.
In a possible embodiment, the VR device is further configured to:
capture the user's limb movement information to be recognized;
when the user's limbs are detected to be in contact with the smart figure, match the user's limb movement information to be recognized against the limb movement feature information stored in a prestored limb movement determination database, and determine the target limb movement with the highest matching degree in the limb movement determination database, the user's limb movement information to be recognized being the user's limb trajectory information and limb movement velocity information within a preset time period before contact;
determine the target limb movement as the user's limb movement interaction;
based on prestored mapping relations between preset user limb movement interactions and preset output information, determine the preset output information corresponding to the user's limb movement interaction, the preset output information including voice information and/or image information.
In a possible embodiment, the smart figure is further configured to:
send the smart figure's current terminal status information to the VR device;
and the VR device is further configured to:
receive the smart figure's current terminal status information;
generate the virtual image of the smart figure based on the terminal status information and the smart figure's preset virtual appearance information.
In a second aspect, an embodiment of the present application further provides a behavioral data processing method, comprising:
detecting, by a smart figure through built-in sensors, the interaction feature information of actions the user performs on the smart figure;
sending the interaction feature information to a VR device, so that the VR device can determine the user's interaction behavior based on the interaction feature information and a prestored interaction determination database, and output corresponding voice information and/or image information based on the user's interaction behavior.
In a possible embodiment, detecting, by the smart figure through built-in sensors, the interaction feature information of actions the user performs comprises:
receiving pressure data uploaded by the built-in pressure sensor; and/or receiving temperature data uploaded by the built-in temperature sensor; and/or receiving humidity data uploaded by the built-in humidity sensor.
In a possible embodiment, the interaction feature information further includes interaction location information indicating where the user performs an interactive action on the smart figure;
detecting, by the smart figure through built-in sensors, the interaction feature information of actions the user performs further comprises:
determining the identifiers of the sensors that uploaded the interaction feature information, and determining the location of each triggered sensor based on the determined identifiers;
determining, from the locations of the triggered sensors, the interaction location information of the user's interactive action on the smart figure.
In a third aspect, an embodiment of the present application further provides a behavioral data processing method, comprising:
receiving the interaction feature information sent by a smart figure;
determining the user's interaction behavior based on the interaction feature information and a prestored interaction determination database;
outputting corresponding voice information and/or image information based on the user's interaction behavior.
In a fourth aspect, an embodiment of the present application further provides a behavioral data processing apparatus, comprising:
a detection module, configured to detect, through built-in sensors, the interaction feature information of actions the user performs on the behavioral data processing apparatus;
a sending module, configured to send the interaction feature information to the VR device.
In a possible embodiment, the detection module is further configured to: receive pressure data uploaded by the built-in pressure sensor; and/or receive temperature data uploaded by the built-in temperature sensor; and/or receive humidity data uploaded by the built-in humidity sensor.
In a possible embodiment, the detection module is further configured to: determine the identifiers of the sensors that uploaded the interaction feature information, and determine the location of each triggered sensor based on the determined identifiers.
In a possible embodiment, the sending module is further configured to: send the interaction location information of the user's interactive action on the behavioral data processing apparatus, the interaction location information being determined by the detection module based on the locations of the triggered sensors.
In a fifth aspect, an embodiment of the present application further provides another behavioral data processing apparatus, comprising:
a receiving module, configured to receive the interaction feature information sent by the associated smart figure;
a comparison module, configured to determine the user's interaction behavior based on the interaction feature information and a prestored interaction determination database;
an output module, configured to output corresponding voice information and/or image information based on the user's interaction behavior.
In a sixth aspect, an embodiment of the present application further provides an electronic device, comprising a processor, a memory, and a bus, the memory storing machine-readable instructions executable by the processor. When the electronic device runs, the processor communicates with the memory via the bus, and when the machine-readable instructions are executed by the processor, they perform the steps of the behavioral data processing method in the second aspect or in any possible embodiment of the second aspect, or the steps of the behavioral data processing method in the third aspect or in any possible embodiment of the third aspect.
In a seventh aspect, an embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when run by a processor, performs the steps of the behavioral data processing method in the second aspect or in any possible embodiment of the second aspect, or the steps of the behavioral data processing method in the third aspect or in any possible embodiment of the third aspect.
The embodiments of the present application provide a behavioral data processing system and method that capture, through the smart figure's built-in sensors, the interaction feature information produced when the user performs an interactive action on the smart figure; determine the user's interaction behavior based on the captured interaction feature information; determine the corresponding voice and/or image information based on the user's interaction behavior; and output the corresponding voice and image information to the user through the VR device, thereby realizing a simulated interaction process between the user and the smart figure.
With the above system, when the user interacts with the smart figure, the user's behavior can be determined from multiple elements, such as different interactive actions and voice, separately or in cooperation, so that the user's interaction intent is determined accurately and the corresponding voice and/or image information is provided in response, improving the human-likeness of the simulated social interaction and strengthening the user's immersive experience. By contrast, simulated social interaction in the prior art is essentially based on single fixed commands and judgment elements; the interaction results are relatively fixed and inflexible, and the rates of misreading and misjudgment are high. The above system can determine multiple kinds of user interactive actions, improving the accuracy of the judgment.
To make the above objects, features, and advantages of the present application clearer and easier to understand, preferred embodiments are described in detail below in conjunction with the accompanying drawings.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly introduced below. It should be understood that the following drawings show only some embodiments of the present application and should therefore not be regarded as limiting its scope. Those of ordinary skill in the art can obtain other related drawings from these drawings without creative effort.
Fig. 1 shows a structural diagram of a behavioral data processing system provided by an embodiment of the present application;
Fig. 2 shows a flow chart of a behavioral data processing method provided by an embodiment of the present application;
Fig. 3 shows a flow chart of another behavioral data processing method provided by an embodiment of the present application;
Fig. 4 shows a schematic diagram of a behavioral data processing apparatus provided by an embodiment of the present application;
Fig. 5 shows a schematic diagram of another behavioral data processing apparatus provided by an embodiment of the present application;
Fig. 6 shows a structural diagram of an electronic device provided by an embodiment of the present application.
Detailed description
To make the purposes, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. The components of the embodiments, as generally described and illustrated in the drawings herein, can be arranged and designed in a variety of different configurations. The following detailed description of the embodiments provided in the drawings is therefore not intended to limit the claimed scope of the present application but merely represents selected embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
At present, in simulated social interaction realized with companion robots or smart figures, the main behavioral data processing modes include control by buttons, by fixed voice commands, by fixed action commands, and the like. When processing behavioral data, such systems judge the user's behavioral intent from a single element and respond according to fixed rules.
However, in simulated social interaction the user's interaction behaviors are of many types, and in some cases an interaction behavior involves multiple determining elements, such as voice and movement, at the same time. Judging user behavior from a single element then leads to misreading or misjudging the user's interaction intent and to erroneous responses, degrading the user experience and failing to meet the user's requirements.
Based on this, the embodiments of the present application provide a behavioral data processing system and method. When the behavioral data processing system detects that the user is interacting with the smart figure, it captures the interaction feature information of the interaction behavior through the smart figure's built-in sensors, captures voice interaction through a microphone, and captures the user's limb movement information through a camera or a motion-sensing controller, determining the user's interaction intent through the cooperation of multiple elements. Compared with the prior art, the application judges with higher accuracy, reduces the rates of misreading and misjudgment, and, because the corresponding voice and/or image information output by the VR device has a stronger sense of presence, brings the user a more immersive experience, meeting the user's requirements for simulated social interaction.
It should also be noted that similar labels and letters denote similar items in the following drawings; once an item is defined in one drawing, it therefore need not be further defined and explained in subsequent drawings.
Embodiment one
An embodiment of the present application provides a behavioral data processing system, which can be applied to a VR-based companion smart figure system to process the user's action behavior data and output corresponding voice and/or image information. As shown in Fig. 1, the behavioral data processing system 100 provided by Embodiment One comprises a smart figure 101 and a VR device 102. The communication modes between the smart figure 101 and the VR device 102 include, but are not limited to, ultra-wideband (UWB), radio frequency identification (RFID), and Bluetooth; other communication modes may of course be used. Wherein:
The smart figure 101 is configured to detect, through built-in sensors, the interaction feature information of actions the user performs on the smart figure, and to send the interaction feature information to the VR device 102.
Here, the smart figure 101 may be a humanoid, semi-humanoid, or doll-shaped figure made of any material and containing electronic devices. It captures the user's interaction feature information during the interaction process through its internal sensors and sends it to the VR device 102, so that the user's interaction behavior can be determined.
The interaction feature information includes the running state information and trigger data of the sensors. The smart figure 101 therefore obtains, in real time, the running state information and trigger data of each of its built-in sensors.
The built-in sensors include at least one of a pressure sensor, a temperature sensor, and a humidity sensor. The trigger data of the different sensors include:
for the pressure sensor, the trigger start time, trigger pressure magnitude, and trigger duration; for the temperature sensor, the trigger start time, trigger temperature, and trigger duration; for the humidity sensor, the trigger start time, trigger humidity, and trigger duration.
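As a concrete illustration, the following minimal Python sketch shows one way such a trigger-data record could be represented; the field names and types are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class TriggerData:
    """Illustrative trigger-data record; fields mirror the patent's list."""
    sensor_id: str      # unique number of the sensor, e.g. "001"
    sensor_type: str    # "pressure" | "temperature" | "humidity"
    start_time: float   # trigger start time, seconds since epoch
    value: float        # trigger pressure / temperature / humidity reading
    duration: float     # trigger duration in seconds

sample = TriggerData("001", "pressure", 1553760000.0, 2.4, 0.8)
```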
The running state information of the sensors includes the number of sensors triggered during one interactive action and the location of each triggered sensor. Here, the location of a triggered sensor is determined from the identifier of the sensor that uploaded the trigger data.
The location of a triggered sensor can be obtained through the following implementations:
In a possible embodiment, the smart figure can obtain the identification features of the triggered sensors by parsing the trigger data it receives. For example, by extracting the header of the data frame containing the trigger data, the smart figure obtains the identification features of the device that sent the frame, and from these identification features it determines the region of the smart figure 101 where the triggered sensor is located.
The identification features include the type of the sensor and the sensor's unique number.
In a possible embodiment, the smart figure stores a mapping table between sensor identification features and the regions where the sensors are located. After the identification features of a triggered sensor are determined, the location of the triggered sensor can be determined from the mapping table.
For example, if the extracted identifier is pressure sensor No. 001, the mapping table may indicate that pressure sensor No. 001 is installed in the facial region of the smart figure, so the location of the triggered sensor is determined to be the facial region.
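The identifier-to-location lookup described above can be pictured as a small dictionary keyed by sensor type and unique number; the table contents and function name below are hypothetical:

```python
# Hypothetical mapping table from sensor identification features to the
# region of the smart figure where each sensor is installed.
SENSOR_LOCATIONS = {
    ("pressure", "001"): "face",
    ("pressure", "002"): "left_hand",
    ("temperature", "001"): "torso",
}

def locate_triggered_sensor(frame_header: bytes) -> str:
    """Parse a frame header such as b'pressure-001' and look up the region."""
    sensor_type, number = frame_header.decode("ascii").split("-")
    return SENSOR_LOCATIONS.get((sensor_type, number), "unknown")

print(locate_triggered_sensor(b"pressure-001"))  # -> "face"
```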
After determining the interaction feature information, the smart figure 101 sends the interaction feature information to the VR device 102.
The VR device 102 is configured to receive the interaction feature information; to determine the user's interaction behavior based on the interaction feature information and a prestored interaction determination database; and to output corresponding voice information and/or image information based on the user's interaction behavior. The image information includes, for example, images or animations, which the present application does not limit.
The VR device 102 may be a VR head-mounted display comprising at least an information receiver, an internal processor, a storage device, and video/audio output devices; or a phone-based VR device, in which the inserted mobile phone serves as the information receiver, the processor, and the video/audio output device.
Specifically, the VR device 102 determines the user's interaction behavior information in the following manner:
after receiving the interaction feature information sent by the smart figure 101, it matches the interaction feature information against the interaction feature information in the prestored interaction determination database.
In this embodiment, the prestored interaction determination database includes a preset interaction behavior set and an interaction feature information set in one-to-one correspondence with the preset interaction behavior set.
Specifically, the interaction behavior set may include sample sets of interactive actions in social settings, where an interactive action can be understood as a body contact that occurs in social interaction. Such contacts arise from different social relationships and include, without limitation, caressing and kissing between lovers, and hugging and shaking hands between friends. The interaction feature information in the interaction feature information set is the sensor running state information and trigger data corresponding to these different interactive actions.
In the embodiment of the present application, after the interaction feature information sent by the smart figure 101 is received, the interaction feature information set in the prestored interaction determination database is retrieved. The interaction feature information sent by the smart figure is matched one by one against the interaction feature information in the set, and the entry with the highest similarity is determined to be the target interaction feature information.
According to the correspondence between the interaction feature information set and the interaction behavior set, the preset interaction behavior in the interaction behavior set corresponding to the target interaction feature information is determined as the interaction behavior performed by the user.
Then, according to the preset mapping relations between user interaction behaviors and preset output information, the voice and/or image information corresponding to the user's interaction behavior is output.
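The matching and output steps can be pictured as a highest-similarity search over stored feature vectors followed by a table lookup; the vector encoding, the cosine-similarity measure, and all names below are illustrative assumptions rather than the patent's prescribed implementation:

```python
import math

def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two interaction feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

INTERACTION_DB = {   # preset behavior -> stored interaction feature vector
    "stroke_head": [0.9, 0.1, 0.2],
    "handshake":   [0.2, 0.8, 0.1],
    "hug":         [0.5, 0.5, 0.9],
}
OUTPUT_MAP = {       # preset behavior -> preset output information
    "stroke_head": ("giggle.wav", "blush_animation"),
    "handshake":   ("greeting.wav", "smile_animation"),
    "hug":         ("warm_reply.wav", "hug_back_animation"),
}

def determine_behavior(received: list[float]) -> str:
    """Pick the preset behavior whose stored vector matches best."""
    return max(INTERACTION_DB, key=lambda k: similarity(received, INTERACTION_DB[k]))

behavior = determine_behavior([0.85, 0.15, 0.25])
print(behavior, OUTPUT_MAP[behavior])  # "stroke_head" and its voice/image output
```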
In some embodiments of the present application, the user's interaction behavior may also be predicted based on artificial intelligence techniques. For example, in a machine learning approach, a prediction model for predicting user interaction behavior is trained on a training sample set; the received interaction feature information is then fed into the trained prediction model, which outputs the predicted user interaction behavior. The specific prediction model may be, for example, a convolutional neural network model, which the present application does not limit.
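A minimal sketch of this learning-based alternative, here a small fully connected classifier over encoded feature vectors (the patent names convolutional networks only as one example; the architecture, dimensions, and framework choice below are assumptions):

```python
import torch
import torch.nn as nn

NUM_FEATURES, NUM_BEHAVIORS = 16, 8   # assumed encoding size and behavior count

model = nn.Sequential(
    nn.Linear(NUM_FEATURES, 32),
    nn.ReLU(),
    nn.Linear(32, NUM_BEHAVIORS),     # one logit per preset interaction behavior
)

features = torch.randn(1, NUM_FEATURES)           # encoded interaction features
predicted = model(features).argmax(dim=1).item()  # index of the predicted behavior
```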
In a possible embodiment, when the user's interaction behavior is determined to be stroking the figure's head, image information of blinking and blushing can be output through the VR device 102. The image information may be output through a built-in screen carried by the output device, or through a mobile phone connected to the VR glasses.
The VR device 102 can also be used to process the user's voice behavior data and to output corresponding voice and image information.
The VR device 102 is configured to capture the user's voice to be recognized; to match the user's voice to be recognized against the voices stored in a prestored voice determination database and determine the target voice with the highest matching degree in the voice determination database; to determine the target voice as the user's voice interaction; and, based on prestored mapping relations between preset user voice interactions and preset output information, to determine the preset output information corresponding to the user's voice interaction, the preset output information including voice information and/or image information.
The VR device 102 captures the user's voice information to be recognized through a microphone and, after capturing the voice information to be recognized, retrieves the voice information stored in the prestored voice determination database for matching.
In this embodiment, the prestored voice determination database includes a preset voice interaction set and a voice interaction feature set in one-to-one correspondence with the preset voice interaction set.
Specifically, the voice interaction set may include dialogue sample sets of voice interactions in social settings. The voice interaction features include tone, language, and semantic information.
In this possible embodiment, the process of matching the voice to be recognized against the voice determination database to determine the user's voice interaction is similar to the process, described in the preceding possible embodiment, of matching interaction feature information against the interaction determination database to determine the user's interaction behavior, and is not repeated here.
The VR device 102 can also process the user's limb movement behavior data and output corresponding voice and image information.
The VR device 102 is configured to capture the user's limb movement information; when the user's limbs are detected to be in contact with the smart figure 101, to match the user's limb movement information against the limb movement feature information stored in a prestored limb movement determination database and determine the target limb movement with the highest matching degree in the limb movement determination database; to determine the target limb movement as the user's limb movement interaction; and, based on prestored mapping relations between preset user limb movement interactions and preset output information, to determine the preset output information corresponding to the user's limb movement interaction, the preset output information including voice information and/or image information.
The VR device 102 captures the user's limb movements through a camera or a motion-sensing controller. When the user's limbs are detected to be in contact with the smart figure 101, the user's limb trajectory information and limb movement velocity information within a preset time period before contact, together with the contact location information, are determined as the limb movement information to be recognized. In a specific embodiment, the limb movement trajectory information and limb velocity information within the 0.5 or 1 second before the user's limbs come into contact with the smart figure 101, together with the location at which the limbs contact the smart figure 101, constitute the limb movement information to be recognized.
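The pre-contact window can be maintained as a rolling buffer of timestamped position samples from the camera or motion-sensing controller; the buffer layout and helper names in this sketch are illustrative assumptions:

```python
from collections import deque

WINDOW_S = 0.5   # preset time period before contact (0.5 s in the example above)
buffer: deque = deque(maxlen=300)  # (timestamp, (x, y, z)) samples

def on_sample(t: float, pos: tuple[float, float, float]) -> None:
    buffer.append((t, pos))

def pre_contact_window(contact_time: float):
    """Trajectory and average straight-line speed within WINDOW_S before contact."""
    traj = [(t, p) for t, p in buffer if contact_time - WINDOW_S <= t <= contact_time]
    if len(traj) < 2:
        return traj, 0.0
    (t0, p0), (t1, p1) = traj[0], traj[-1]
    if t1 == t0:
        return traj, 0.0
    dist = sum((a - b) ** 2 for a, b in zip(p1, p0)) ** 0.5
    return traj, dist / (t1 - t0)
```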
After the limb movement information to be recognized is determined, the limb movement feature information preset in the prestored limb movement determination database is retrieved for matching.
In this embodiment, the prestored limb movement determination database includes a preset limb movement interaction set and a limb movement feature information set in one-to-one correspondence with the preset limb movement interaction set.
Specifically, the limb movement feature information includes the limb movement trajectory information, the limb movement velocity information, and the location at which the limbs contact the smart figure 101.
In this embodiment, the process of matching the limb movement to be recognized against the limb movement determination database to determine the user's limb movement interaction is similar to the process, described in the preceding possible embodiment, of matching interaction feature information against the interaction determination database to determine the user's interaction behavior, and is not repeated here.
In a possible embodiment, before the user interacts with the smart figure, the VR device must first output the current virtual image of the smart figure. Specifically:
the smart figure sends its current terminal status information to the VR device, the terminal status information including the smart figure's location information, velocity information, and inertial orientation information; the VR device receives the terminal status information sent by the smart figure and, based on the smart figure's preset virtual appearance information, generates the virtual image of the smart figure.
The terminal status information can be obtained through electronic devices installed in the body and joint parts of the smart figure. These electronic devices include one or more of a locator, a gyroscope, and an accelerometer.
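For illustration, the terminal status message might be serialized as follows before being sent over the UWB, RFID, or Bluetooth link; all field names are assumptions, not taken from the patent:

```python
import json

status = {
    "position":    [0.2, 0.0, 1.1],       # from the locator
    "velocity":    [0.0, 0.0, 0.0],       # from the accelerometer
    "orientation": [0.0, 0.0, 0.0, 1.0],  # quaternion, from the gyroscope
    "joints": {"head": [0.0, 0.1, 0.0], "left_arm": [0.3, 0.0, 0.0]},
}
payload = json.dumps(status).encode("utf-8")  # handed to the wireless link
```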
In this way, the user's interactive action is determined, separately or jointly, from multiple different elements: voice interaction, limb movement, and sensor trigger data. When processing user behavior data, the system can effectively distinguish between different interaction behaviors and thus make correct, human-like responses. This avoids the misjudgments of user interaction behavior caused by judging from too few elements, and, by introducing VR technology, gives the output voice and/or image information a higher degree of human-likeness. Compared with simulated social interaction in the prior art, the above scheme provided by the present application can effectively improve the immersive experience of simulated social interaction and meet user demand.
Embodiment two
Referring to Fig. 2, an embodiment of the present application further provides a behavioral data processing method, performed by the smart figure in the above behavioral data processing system and comprising the following steps:
Step S201: the smart figure detects, through built-in sensors, the interaction feature information of actions the user performs on the smart figure.
Step S202: the interaction feature information is sent to the VR device, so that the VR device can determine the user's interaction behavior based on the interaction feature information and the prestored interaction determination database, and output corresponding voice information and/or image information based on the user's interaction behavior.
In a possible embodiment, detecting, by the smart figure through built-in sensors, the interaction feature information of actions the user performs comprises:
receiving pressure data uploaded by the built-in pressure sensor; and/or receiving temperature data uploaded by the built-in temperature sensor; and/or receiving humidity data uploaded by the built-in humidity sensor.
In a possible embodiment, the interaction feature information further includes interaction location information indicating where the user performs an interactive action on the smart figure.
In specific implementation, detecting, by the smart figure through built-in sensors, the interaction feature information of actions the user performs specifically comprises:
determining the identifiers of the sensors that uploaded the interaction feature information, and determining the location of each triggered sensor based on the determined identifiers;
determining, from the locations of the triggered sensors, the interaction location information of the user's interactive action on the smart figure.
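A minimal sketch of the smart-figure side of this method (steps S201 and S202); read_triggered_sensors and send_to_vr are assumed helpers standing in for the sensor bus and the wireless link, not real APIs:

```python
import time

def behavioral_data_loop():
    # read_triggered_sensors() and send_to_vr() are assumed helpers standing
    # in for the sensor bus and the UWB/Bluetooth link to the VR device.
    while True:
        triggers = read_triggered_sensors()     # S201: detect interaction features
        if triggers:
            send_to_vr({"triggers": triggers})  # S202: send to the VR device
        time.sleep(0.01)                        # poll at roughly 100 Hz
```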
For the specific execution process of the above behavioral data processing method, refer to the description of the smart figure in the behavioral data processing system; it is not expanded upon here.
Embodiment three
Referring to Fig. 3, an embodiment of the present application further provides a behavioral data processing method, performed by the VR device in the above behavioral data processing system and comprising the following steps:
Step S301: receive the interaction feature information sent by the smart figure;
Step S302: determine the user's interaction behavior based on the interaction feature information and the prestored interaction determination database;
Step S303: output corresponding voice information and/or image information based on the user's interaction behavior.
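Correspondingly, a minimal sketch of the VR-device side (steps S301 to S303), reusing the hypothetical determine_behavior and OUTPUT_MAP from Embodiment One; receive_from_figure, play_voice, and show_image are likewise assumed helpers:

```python
def vr_behavior_pipeline():
    feature_info = receive_from_figure()                   # S301: receive
    behavior = determine_behavior(feature_info["vector"])  # S302: match in the DB
    voice, image = OUTPUT_MAP[behavior]                    # S303: look up output
    play_voice(voice)                                      # output voice information
    show_image(image)                                      # output image information
```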
For the specific execution process of the above behavioral data processing method, refer to the description of the VR device in the behavioral data processing system; it is not expanded upon here.
Example IV
Referring to Fig. 4, an embodiment of the present application further provides a behavioral data processing apparatus 400, comprising:
a detection module 401, configured to detect, through built-in sensors, the interaction feature information of actions the user performs on the behavioral data processing apparatus 400;
a sending module 402, configured to send the interaction feature information to the VR device, so that the VR device can determine the user's interaction behavior based on the interaction feature information and the prestored interaction determination database, and output corresponding voice information and/or image information based on the user's interaction behavior.
In a possible embodiment, when detecting, through built-in sensors, the interaction feature information of actions the user performs on the behavioral data processing apparatus 400, the detection module 401 is specifically configured to:
receive pressure data uploaded by the built-in pressure sensor; and/or receive temperature data uploaded by the built-in temperature sensor; and/or receive humidity data uploaded by the built-in humidity sensor.
In a possible embodiment, when detecting, through built-in sensors, the interaction feature information of actions the user performs on the behavioral data processing apparatus, the detection module 401 is further configured to determine the interaction location information of the user's interactive action on the behavioral data processing apparatus 400:
detecting, through built-in sensors, the interaction feature information of actions the user performs on the behavioral data processing apparatus 400 further comprises:
determining the identifiers of the sensors that uploaded the interaction feature information, and determining the location of each triggered sensor based on the determined identifiers;
determining, from the locations of the triggered sensors, the interaction location information of the user's interactive action on the behavioral data processing apparatus.
Embodiment five
Referring to Fig. 5, an embodiment of the present application further provides a behavioral data processing apparatus 500, comprising:
a receiving module 501, configured to receive the interaction feature information sent by the associated smart figure;
a comparison module 502, configured to determine the user's interaction behavior based on the interaction feature information and the prestored interaction determination database;
an output module 503, configured to output corresponding voice information and/or image information based on the user's interaction behavior.
Embodiment six
Fig. 6 shows an electronic device 600 provided by an embodiment of the present application, comprising a processor 601, a memory 602, and a bus 603, the processor 601 and the memory 602 being connected via the bus 603. The processor 601 is used to execute executable modules, such as computer programs, stored in the memory 602.
The memory 602 may include high-speed random access memory (RAM) and may also include non-volatile memory, for example at least one magnetic disk storage.
The bus 603 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For convenience of representation it is indicated by only one double-headed arrow in Fig. 6, but this does not mean there is only one bus or one type of bus.
The memory 602 is used to store a program, and the processor 601 executes the program after receiving an execution instruction. The behavioral data processing method disclosed in any of the foregoing embodiments may be applied in, or implemented by, the processor 601.
The processor 601 may be an integrated circuit chip with signal processing capability. During implementation, the steps of the above methods may be completed by integrated logic circuits in hardware or by instructions in software form in the processor 601. The processor 601 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. It can implement or execute the methods, steps, and logic diagrams disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor or any conventional processor.
The steps of the methods disclosed in the embodiments of the present application may be completed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in this field, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register. The storage medium is located in the memory 602; the processor 601 reads the information in the memory 602 and, in combination with its hardware, completes the steps of the behavioral data processing method of Embodiment Two above, or performs the steps of the behavioral data processing method of Embodiment Three above.
The computer program product of the behavioral data processing method provided by the embodiments of the present application includes a computer-readable storage medium storing program code; the instructions included in the program code can be used to perform the behavioral data processing methods described in the foregoing method embodiments. For specific implementation, refer to the method embodiments; details are not repeated here.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and apparatus described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a division by logical function, and there may be other divisions in actual implementation. As another example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through communication interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units: they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment's scheme.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, may exist physically alone, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the embodiments described above are only specific embodiments of the present application, intended to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that anyone familiar with the technical field can still, within the technical scope disclosed by the present application, modify the technical solutions recorded in the foregoing embodiments, readily conceive of variations, or make equivalent replacements of some of the technical features; such modifications, variations, or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A behavioral data processing system, characterized by comprising: a smart figure and a virtual reality (VR) device; wherein:
the smart figure is configured to detect, through built-in sensors, the interaction feature information of actions the user performs on the smart figure, and to send the interaction feature information to the VR device;
the VR device is configured to receive the interaction feature information; to determine the user's interaction behavior based on the interaction feature information and a prestored interaction determination database; and to output corresponding voice information and/or image information based on the user's interaction behavior.
2. The behavioral data processing system according to claim 1, characterized in that the built-in sensors include at least one of the following:
a pressure sensor; a temperature sensor; a humidity sensor;
the smart figure being configured to obtain the interaction feature information by the following steps:
receiving pressure data uploaded by the built-in pressure sensor; and/or receiving temperature data uploaded by the built-in temperature sensor; and/or receiving humidity data uploaded by the built-in humidity sensor.
3. The behavioral data processing system according to claim 2, characterized in that the interaction feature information further includes interaction location information indicating where the user performs an interactive action on the smart figure;
the smart figure being further configured to obtain the interaction location information by the following steps:
determining the identifiers of the sensors that uploaded the interaction feature information, and determining the location of each triggered sensor based on the determined identifiers;
determining, from the locations of the triggered sensors, the interaction location information of the user's interactive action on the smart figure.
4. The behavioral data processing system according to claim 1, characterized in that the interaction determination database stores interaction feature information corresponding to preset behaviors;
the VR device being configured to determine the user's interaction behavior in the following manner:
matching the received interaction feature information against the interaction feature information stored in the interaction determination database, and determining the target interaction feature information with the highest matching degree in the interaction determination database;
determining the preset behavior corresponding to the target interaction feature information as the user's interaction behavior.
5. The behavioral data processing system according to claim 1, characterized in that the VR device is configured to determine the voice information and/or image information to output in the following manner:
based on prestored mapping relations between preset user interaction behaviors and preset output information, determining the preset output information corresponding to the user's interaction behavior, the preset output information including voice information and/or image information.
6. The behavioral data processing system according to claim 1, characterized in that the VR device is further configured to:
capture the user's voice to be recognized;
match the user's voice to be recognized against the voices stored in a prestored voice determination database, and determine the target voice with the highest matching degree in the voice determination database;
determine the target voice as the user's voice interaction;
based on prestored mapping relations between preset user voice interactions and preset output information, determine the preset output information corresponding to the user's voice interaction, the preset output information including voice information and/or image information.
7. The behavioral data processing system according to claim 1, characterized in that the VR device is further configured to:
capture the user's limb movement information to be recognized;
when the user's limbs are detected to be in contact with the smart figure, match the user's limb movement information to be recognized against the limb movement feature information stored in a prestored limb movement determination database, and determine the target limb movement with the highest matching degree in the limb movement determination database, the user's limb movement information to be recognized being the user's limb trajectory information and limb movement velocity information within a preset time period before contact, together with the location at which the limbs contact the smart figure;
determine the target limb movement as the user's limb movement interaction;
based on prestored mapping relations between preset user limb movement interactions and preset output information, determine the preset output information corresponding to the user's limb movement interaction, the preset output information including voice information and/or image information.
8. The behavioral data processing system according to claim 1, characterized in that the smart figure is further configured to:
send the smart figure's current terminal status information to the VR device;
and the VR device is further configured to:
receive the smart figure's current terminal status information;
generate the virtual image of the smart figure based on the terminal status information and the smart figure's preset virtual appearance information.
9. A behavioral data processing method, characterized in that the method comprises:
detecting, by a smart figure through built-in sensors, the interaction feature information of actions the user performs on the smart figure;
sending the interaction feature information to a virtual reality (VR) device, so that the VR device can determine the user's interaction behavior based on the interaction feature information and a prestored interaction determination database, and output corresponding voice information and/or image information based on the user's interaction behavior.
10. The behavioral data processing method according to claim 9, characterized in that detecting, by the smart figure through built-in sensors, the interaction feature information of actions the user performs comprises:
receiving pressure data uploaded by the built-in pressure sensor; and/or receiving temperature data uploaded by the built-in temperature sensor; and/or receiving humidity data uploaded by the built-in humidity sensor.
11. The behavioral data processing method according to claim 10, characterized in that the interaction feature information further includes interaction location information indicating where the user performs an interactive action on the smart figure;
detecting, by the smart figure through built-in sensors, the interaction feature information of actions the user performs further comprises:
determining the identifiers of the sensors that uploaded the interaction feature information, and determining the location of each triggered sensor based on the determined identifiers;
determining, from the locations of the triggered sensors, the interaction location information of the user's interactive action on the smart figure.
12. A behavioral data processing method, characterized in that the method comprises:
receiving the interaction feature information sent by a smart figure;
determining the user's interaction behavior based on the interaction feature information and a prestored interaction determination database;
outputting corresponding voice information and/or image information based on the user's interaction behavior.
CN201910242486.2A 2019-03-28 2019-03-28 Behavior data processing system and method Active CN110209264B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910242486.2A CN110209264B (en) 2019-03-28 2019-03-28 Behavior data processing system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910242486.2A CN110209264B (en) 2019-03-28 2019-03-28 Behavior data processing system and method

Publications (2)

Publication Number Publication Date
CN110209264A true CN110209264A (en) 2019-09-06
CN110209264B CN110209264B (en) 2022-07-05

Family

ID=67785235

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910242486.2A Active CN110209264B (en) 2019-03-28 2019-03-28 Behavior data processing system and method

Country Status (1)

Country Link
CN (1) CN110209264B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206906843U * 2017-06-30 2018-01-19 深圳光启合众科技有限公司 Robot control device and robot
CN107639620A * 2017-09-29 2018-01-30 西安交通大学 Robot control method, motion-sensing interaction device, and robot
CN107643820A * 2016-07-20 2018-01-30 郎焘 VR passive humanoid robot and implementation method thereof
KR20180077974A (en) * 2016-12-29 2018-07-09 유한책임회사 매드제너레이터 Vr-robot synchronize system and method for providing feedback using robot


Also Published As

Publication number Publication date
CN110209264B (en) 2022-07-05

Similar Documents

Publication Publication Date Title
US8819596B2 (en) Gesture control system
CN106055088B In-air writing and gesture system for interactive wearable devices
CN103959282B Selective feedback for text recognition systems
LaViola Jr 3d gestural interaction: The state of the field
KR20210022498A (en) Pose prediction with recurrent neural networks
JP2020507835A5 (en)
WO2021136131A1 (en) Information recommendation method and related device
CN102693007A (en) Gesture detection and recognition
CN102985897A (en) Efficient gesture processing
CN109446961A (en) Pose detection method, device, equipment and storage medium
CN112527113A (en) Method and apparatus for training gesture recognition and gesture recognition network, medium, and device
CN109725699A Identification code recognition method, device, and equipment
CN108318042A (en) Navigation mode-switching method, device, terminal and storage medium
CN108568820A (en) Robot control method and device, electronic equipment and storage medium
CN109525697A Method, apparatus, and terminal for sharing and displaying contacts
CN111158487A (en) Man-machine interaction method for interacting with intelligent terminal by using wireless earphone
CN109710332A Boarding application processing method, device, and computer-readable storage medium
KR102476619B1 (en) Electronic device and control method thereof
CN113537122A (en) Motion recognition method and device, storage medium and electronic equipment
CN110209264A (en) A kind of behavioral data processing system and method
CN112181148A (en) Multimodal man-machine interaction method based on reinforcement learning
CN115100689B (en) Object detection method and device, electronic equipment and storage medium
CN115291786A (en) False touch judgment method and device based on machine learning and storage medium
CN105608469A (en) Image resolution determination method and device
CN115645929A Method and device for detecting game plug-in (cheating) behavior, and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant