CN107918518A - Interactive operation method, apparatus, terminal device and operating system - Google Patents

Interactive operation method, apparatus, terminal device and operating system

Info

Publication number
CN107918518A
Authority
CN
China
Prior art keywords
user
response
business object
interactive
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610887827.8A
Other languages
Chinese (zh)
Inventor
陈艺梵
张永超
熊薰园
卢珊
周堡英
甘志文
罗祺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201610887827.8A
Publication of CN107918518A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

Embodiments of the present application provide an interactive operation method, apparatus, terminal device and operating system. The method includes: displaying an interface and detecting operation information based on the interface; displaying a business object in the interface according to the operation information, and using the business object to perform a response operation to a user operation. The business object is used for interacting with the user and includes an interactive character built from business data, and the response operation includes an action performed by the interactive character. Through interactive responses in the interface, the user can be assisted interactively to perform the required operations, so that the user's use of the terminal device is effectively controlled and assisted.

Description

Interactive operation method, apparatus, terminal device and operating system
Technical field
The present application relates to the field of terminal technology, and in particular to an interactive operation method, an interactive operation apparatus, a terminal device, and an operating system for a terminal device.
Background technology
With the development of terminal technology, more and more users use terminals, and terminals provide increasingly rich functions and services. Some learning software has therefore been developed for children: children can use learning software on a terminal to study, for example school-age children use learning software to assist with coursework, and preschool children use it to learn general knowledge. A terminal usually does not only have learning software installed; it also provides entertainment functions such as audio, video and games.
However, for various reasons children are often unable to control the time they spend studying and being entertained, and some preschool children who cannot yet write also need a parent's help to study, so the use of the terminal's learning functions is not effectively controlled.
Summary of the invention
The technical problem to be solved by the embodiments of the present application is to provide an interactive operation method, so as to effectively assist the use of a device.
Correspondingly, the embodiments of the present application further provide an interactive operation apparatus, a terminal device, and an operating system for a terminal device, to ensure the implementation and application of the above method.
To solve the above problems, the present application discloses an interactive operation method, including: displaying an interface and detecting operation information based on the interface; displaying a business object in the interface according to the operation information, and using the business object to perform a response operation to a user operation; the business object is used for interacting with the user, the business object includes an interactive character built from business data, and the response operation includes an action performed by the interactive character.
Optionally, the response operation further includes: outputting a voice response.
Optionally, detecting the operation information based on the interface includes: determining the operation information based on a user operation on the interface, and detecting the operation information.
Optionally, displaying a business object in the interface according to the operation information includes: obtaining business data according to the operation information, and parsing the business data to display the corresponding business object.
Optionally, using the business object to perform a response operation to the user operation includes: when the user operation is received, calling the interactive character to display an input-sensing activation response.
Optionally, the input-sensing activation response includes: a listening action response, or a watching action response.
Optionally, using the business object to perform a response operation to the user operation includes: determining first response information based on the user operation, calling the interactive character to display an action feedback response according to the first response information, and outputting a voice feedback response.
Optionally, the user operation includes a query operation; the action feedback response includes a first action feedback response for a successful query and/or a second action feedback response for a failed query; the voice feedback response includes a first voice feedback response corresponding to a successful query and/or a second voice feedback response for a failed query.
Optionally, detecting the operation information includes: detecting whether the operation information satisfies a filter condition.
Optionally, using the business object to perform a response operation to the user operation includes: determining second response information corresponding to the filter condition according to the operation information, calling the interactive character to display an action prompt response, and outputting a voice prompt response.
Optionally, detecting the operation information includes: detecting the input mode of the user operation corresponding to the operation information; the response operation includes: outputting a response corresponding to the input mode.
Optionally, the input mode includes: a voice input mode and/or a character input mode.
Optionally, the method further includes: presetting user information of the user, and binding the user with the business object according to the user information.
Optionally, binding the user with the business object according to the user information includes: setting the interactive character corresponding to the business object according to the user information, so that the interactive character matches a feature of the user.
Optionally, the feature includes age, and the interactive character corresponding to the business object is updated over time to keep matching the age of the user.
Optionally, the method further includes: when user intent information cannot be identified from input information, using the business object to perform a response operation that guides the user's input so as to identify the user intent.
Optionally, the method further includes: synchronously downloading scene information corresponding to the business object from a server.
Optionally, the scene information includes: model data, an animation curve table, material information, and scene parameter information.
Optionally, the method further includes: configuring the business object and the response operation of the business object according to a preset mode and the scene information.
Optionally, the interactive character includes a 3D interactive character.
Optionally, the method is applied to the operating system of a network set-top box and/or a television.
Optionally, the method is applied to the field of children's education.
The embodiments of the present application also disclose an interactive operation apparatus, including: a display module, configured to display an interface and to display a business object in the interface according to operation information; a detection module, configured to detect the operation information based on the interface; and a response module, configured to use the business object to perform a response operation to a user operation; the business object is used for interacting with the user, the business object includes an interactive character built from business data, and the response operation includes an action performed by the interactive character.
The embodiments of the present application also disclose a terminal device, including: a processor and a display; the display, coupled to the processor, is configured to display an interface and to display a business object in the interface according to operation information; the processor is configured to detect the operation information based on the interface and to use the business object to perform a response operation to a user operation; the business object is used for interacting with the user, the business object includes an interactive character built from business data, and the response operation includes an action performed by the interactive character.
The embodiments of the present application also disclose an operating system for a terminal device, including: a display unit, which displays an interface and displays a business object in the interface according to operation information; a detection unit, which detects the operation information based on the interface; and a response unit, which uses the business object to perform a response operation to a user operation; the business object is used for interacting with the user, the business object includes an interactive character built from business data, and the response operation includes an action performed by the interactive character.
Compared with the prior art, the embodiments of the present application have the following advantages:
In the embodiments of the present application, after an interface is displayed, operation information based on the interface can be detected, and a business object is then displayed in the interface according to the operation information. The business object is used for interacting with the user and includes an interactive character built from business data, and the business object can be used to perform a response operation to a user operation, where the response operation includes an action performed by the interactive character. The user can thus be assisted interactively in the interface to perform the required operations, so that the user's use of the terminal device is effectively controlled and assisted.
Brief description of the drawings
Fig. 1 is a schematic diagram of an interactive display according to an embodiment of the present application;
Fig. 2 is a flow chart of the steps of an interactive operation method embodiment of the present application;
Fig. 3 is a flow chart of the steps of an interactive operation method according to an embodiment of the present application;
Fig. 4 is a schematic diagram of another interactive display according to an embodiment of the present application;
Fig. 5 is a flow chart of the steps of an interactive operation method according to another embodiment of the present application;
Fig. 6 is a schematic diagram of yet another interactive display according to an embodiment of the present application;
Fig. 7 is a schematic diagram of an interactive service according to an embodiment of the present application;
Fig. 8 is a structural block diagram of an interactive operation apparatus embodiment of the present application;
Fig. 9 is a hardware structure diagram of a terminal device provided by an embodiment of the present application;
Fig. 10 is a hardware structure diagram of a terminal device provided by another embodiment of the present application;
Fig. 11 is a schematic diagram of an operating system for a terminal device of the present application.
Detailed description of the embodiments
To make the above objects, features and advantages of the present application clearer and easier to understand, the present application is described in further detail below with reference to the accompanying drawings and specific embodiments.
In the embodiments of the present application, a terminal device refers to a terminal device with multimedia functions, i.e. a device that supports audio, video, data and similar functions. In this embodiment the terminal device has a touch screen, and includes intelligent mobile terminals such as smartphones and tablet computers, smart wearable devices, and devices with a touch screen such as smart televisions and personal computers. The terminal device can run various intelligent operating systems, such as iOS, Android, cloud OS and TVOS.
To make a terminal easier for children to use, the embodiments of the present application provide an interactive operation method that uses an interactive character to assist children in operating the terminal, provides the learning and entertainment functions children need, prompts children about their use of the terminal through interaction, arranges usage time reasonably and effectively controls the terminal. The embodiments of the present application are described by taking a smart television as an example: the smart television may be equipped with a network high-definition set-top box, such as a TV box suitable for children, or may run a smart television operating system, so that the interactive character can be displayed on the television to assist children in operating it.
Referring to Fig. 1, a schematic diagram of an interactive display according to an embodiment of the present application is shown.
A business object that interacts with the user can be displayed on the screen of the terminal. The business object includes an interactive character built from business data, and the interactive character may be a two-dimensional or three-dimensional animated character. The 3D animated interactive character shown in Fig. 1 wears pajamas and prompts the user that it is already late and time to sleep; in addition to prompting the user through the appearance and actions of the 3D interactive character, a corresponding prompt voice can also be output so that the user understands easily. The business data is the data of the interactive business and can include scene information of the interactive character under various scenes, so that the interactive character can show different appearances, actions and so on in different scenes, thereby effectively giving the user the required feedback and interaction.
In the embodiments of the present application, the business object can be triggered and displayed in various preset ways, for example displaying the business object on the desktop after the terminal is switched on, displaying it after the user makes an input, or displaying it after certain behaviors of the user are detected; triggering in multiple ways makes the display of the business object more flexible. After the business object is displayed, it can also be used to perform response operations through the actions of the displayed interactive character and voice output, for example prompting the user to rest when the usage time is too long, or wishing the user a happy birthday or happy holiday.
In the embodiments of the present application, the interactive character can be a 3D interactive character, which is built from 3D model data. The 3D model data can be developed in various ways and then rendered to generate the corresponding 3D interactive character, for example based on DirectX or OpenGL (Open Graphics Library). Taking OpenGL as an example, OpenGL defines a cross-language, cross-platform programming interface specification, a professional graphics API that is a powerful and convenient low-level graphics library for three-dimensional (or two-dimensional) images. OpenGL is an open 3D graphics software package that is independent of the window system and operating system, applications developed on it can be ported easily between platforms, and it is simple to use and efficient.
Thus, when displaying the 3D interactive character of the business object, the model data developed on OpenGL can be used: the corresponding API is called to render the model data in memory, and the rendered 3D interactive character is then displayed in the interface; through each rendered frame, the 3D interactive character can show various interactive actions on the interface.
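Purely as an illustrative sketch under assumed helper names (the application only states that OpenGL APIs are called to render the model data frame by frame), a render loop for such a character on an Android/OpenGL ES device might look as follows:

```java
// Minimal sketch of a per-frame OpenGL ES renderer for the 3D interactive character.
// CharacterModel and AnimationCurve are hypothetical stand-ins for the model data and
// the "animation curve table" described in the application; only the render loop is real API.
public class CharacterRenderer implements android.opengl.GLSurfaceView.Renderer {
    public interface CharacterModel {
        void uploadToGpu();            // compile shaders, upload vertex buffers
        void applyPose(float[] pose);  // set joint transforms for this frame
        void draw();                   // issue the OpenGL ES draw calls
    }
    public interface AnimationCurve {
        float[] poseAt(float seconds); // look up the pose on the motion track
    }

    private final CharacterModel model;
    private final AnimationCurve curve;
    private long startMillis;

    public CharacterRenderer(CharacterModel model, AnimationCurve curve) {
        this.model = model;
        this.curve = curve;
    }

    @Override
    public void onSurfaceCreated(javax.microedition.khronos.opengles.GL10 gl,
                                 javax.microedition.khronos.egl.EGLConfig config) {
        android.opengl.GLES20.glClearColor(0f, 0f, 0f, 0f);
        model.uploadToGpu();
        startMillis = android.os.SystemClock.uptimeMillis();
    }

    @Override
    public void onSurfaceChanged(javax.microedition.khronos.opengles.GL10 gl, int width, int height) {
        android.opengl.GLES20.glViewport(0, 0, width, height);
    }

    @Override
    public void onDrawFrame(javax.microedition.khronos.opengles.GL10 gl) {
        android.opengl.GLES20.glClear(android.opengl.GLES20.GL_COLOR_BUFFER_BIT
                | android.opengl.GLES20.GL_DEPTH_BUFFER_BIT);
        float t = (android.os.SystemClock.uptimeMillis() - startMillis) / 1000f;
        model.applyPose(curve.poseAt(t));  // pose for the current animation frame
        model.draw();
    }
}
```

Setting such a renderer on a view overlaid on the desktop would then play the character's animated actions frame by frame.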
Referring to Fig. 2, a flow chart of the steps of an interactive operation method embodiment of the present application is shown.
Step 202: display an interface and detect operation information based on the interface.
Step 204: display a business object in the interface according to the operation information, and use the business object to perform a response operation to a user operation. The business object is used for interacting with the user, the business object includes an interactive character built from business data, and the response operation includes an action performed by the interactive character.
In the embodiments of the present application, various ways of triggering the display of the business object can be preset, for example when the terminal starts, when software is closed or switched, when a user input is received, or when a preset operation is detected; the preset modes therefore include active triggering and/or passive wake-up. An active triggering mode is a mode triggered by a user operation, such as powering on or off, switching software, or receiving a user input; a passive wake-up mode is a mode woken up by a monitored operation, i.e. not actively triggered by the user, for example detecting that the user's usage time is too long or that it is late. The business object can be displayed after the trigger is received, for example displaying a 3D interactive character built from 3D model data, as in the sketch below.
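As a small illustrative sketch (the trigger reasons and names below are assumptions, not taken from the application), both kinds of trigger can funnel into the same display path:

```java
// Sketch: user-driven triggers and monitored conditions both wake the same business object.
// The application only distinguishes active triggering from passive wake-up.
final class BusinessObjectTrigger {
    enum Reason {
        POWER_ON, APP_SWITCH, USER_INPUT,   // active triggering: driven by a user operation
        USAGE_TOO_LONG, LATE_AT_NIGHT       // passive wake-up: driven by monitoring
    }
    interface Display { void showCharacter(Reason reason); }

    private final Display display;

    BusinessObjectTrigger(Display display) { this.display = display; }

    void onTrigger(Reason reason) {
        display.showCharacter(reason);      // scene selection happens downstream
    }
}
```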
When using the terminal device, a corresponding interface is displayed on the terminal device, and the interface provides corresponding service functions. For example, when applied to the field of children's education, the interface displays controls for various service functions, such as a search control, audio/video controls, learning-assistance controls and a parental management control. In the example interface shown in Fig. 1, the controls include: search, watch cartoons, listen to nursery rhymes, learn knowledge, children's variety shows, cartoon stars, parent channel, and so on. The user can operate on the interface, corresponding operation information is generated for the user operation, a business object is then displayed in the interface according to the operation information, and the business object is used to perform a response operation to the user operation. The response operation includes an action performed by the interactive character and may further include outputting a voice, so that the user is prompted both visually and audibly and understands easily.
Depending on the operation information, the business object displayed on the interface and the response operation performed by the business object also differ. For example, for operation information of a user input being received, the interactive character can display a response operation of receiving the input, such as listening to or watching the user's input. As another example, after the user inputs a search for a certain cartoon, the response operation is performed based on the response information of the query result. As yet another example, on the user's birthday, the interactive character can display a response action of a birthday greeting according to the date.
In summary, after the interface is displayed, the operation information based on the interface can be detected, and the business object is then displayed in the interface according to the operation information. The business object is used for interacting with the user and includes an interactive character built from business data, and the business object can be used to perform a response operation to the user operation, where the response operation includes an action performed by the interactive character. Through interactive responses in the interface, the user can be assisted interactively to perform the required operations, so that the user's use of the terminal device is effectively controlled and assisted.
Based on the display of and responses by the business object, the embodiments of the present application can provide various assistance and control operations while the user uses the terminal. For example, when children aged 0-14 are using the terminal, the 3D interactive character can interactively assist children with functions such as searching, and monitor usage, so as to assist children in using the terminal effectively while monitoring that they use it reasonably.
The business object can perform different responses for different inputs so that children understand. The response operation includes an action response of the interactive character and/or a voice output response; in the embodiments of the present application, the interactive character is described by taking a 3D interactive character as an example. The action response of the 3D interactive character includes the various action responses it performs, such as a sleeping action, a listening action or a game action. To make it easier for the user to understand, a corresponding appearance can also be configured for the 3D interactive character, for example wearing different clothes or being given different accessories (such as a ball or a scooter) for different response actions. The business object can also perform a voice output response, i.e. play corresponding voice data, which is easy for children to understand. For preschool children who cannot yet read, interaction can take place by voice, so that children of every age are assisted in using the terminal effectively.
In the embodiments of the present application, the operation information can be determined based on a user operation on the interface and then detected; that is, a user operation of the user on the interface is received and the operation information is determined according to the user operation. For example, a monitored input can be taken as received operation information, or the operation information can be determined based on information such as a date entered by the user. After a function service is provided to the user based on the user operation, operation information can also be determined from the provided service; for example, the operation information includes the played video and its playing duration, so that the corresponding use by the user can be monitored. When the operation information is detected, the input mode of the user operation corresponding to the operation information can be detected, for example a voice input mode so that a voice response can subsequently be made; as another example, time information such as the date and duration in the operation information is used to monitor whether the operation information satisfies a filter condition.
Business data is then obtained based on the detection of the operation information, and the business data is parsed to display the corresponding business object; that is, the detection of the operation information yields a business scene, the scene information corresponding to that business scene is called, and the model data and other auxiliary information are called according to the scene information to render each frame, so that the business object is displayed on the interface.
The embodiments of the present application can thus perform various responses based on user operations and monitor the user's operations, so as to effectively assist the user.
In one example, a response can be made based on a user operation: from receiving the input to obtaining the feedback result after the input is completed, the response actions of the interactive character are displayed, so as to assist the user in performing the required operation.
Referring to Fig. 3, a flow chart of the steps of an interactive operation method according to another embodiment of the present application is shown.
Step 302: display an interface.
Step 304: when a user input is received, detect the operation information corresponding to the input.
When using the terminal device, a corresponding interface is displayed on the terminal device, and the user can perform various operations on this interface, so that corresponding user operations are received. The user can operate on the interface to obtain the required function services, and during operation the terminal can receive the user's input in various ways: for example, if the user directly inputs a voice, the recorded voice data is received as input information; if the user performs text input, the received character string is taken as input information. The preset input modes include a voice input mode, a character input mode and so on, so that based on the user's active input and interaction, the information expressing the user's intent is identified by understanding the input information, in order to assist the user's operation. The input mode can be detected, so that an output response corresponding to the input mode is performed in the response.
Step 306: call the interactive character to display an input-sensing activation response.
When the user actively triggers by means of an input, the user's input information, such as voice data or a character string, can be received, and the business object is triggered and woken up based on the input. When the input is received, the first business scene triggered by the input is called, the scene information of that first business scene is then obtained, the API is called based on the scene information to obtain the model data, motion track information and other scene parameters of the interactive character, each frame of the interactive character is then drawn and rendered based on the various scene parameters, and the interactive character is displayed on the interface according to the time information. In this way the interactive character can show an input-sensing activation response on the interface, i.e. the displayed interactive character can show an action of sensing the input, so that the user sees that the system has received the input, which improves interactivity. In the embodiments of the present application, an input-sensing activation response is a response that represents that the user's input has been sensed; since there are multiple input modes, there are also multiple kinds of response information, and the input-sensing activation response includes: a listening action response, or a watching action response. For a voice input, a listening action response can be performed: for example, when the user makes an input, the 3D interactive character is shown on the interface with its hand placed to its ear, or is shown leaning in to listen, as in Fig. 4 where it is listening to the user speak, so that the user can intuitively perceive the 3D animated interactive character. For character inputs such as typed text, a watching action response can be performed, for example displaying the 3D interactive character leaning forward to look, or showing it with wide-open eyes, symbolizing that the 3D interactive character is looking at the content the user is writing.
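As an illustrative sketch only (the action identifiers are assumptions; the application only distinguishes a listening action for voice input and a watching action for character input), the dispatch could look like this:

```java
// Sketch: choose the input-sensing activation response from the input mode.
// The action names "listen" and "watch" are illustrative, not from the application.
final class ActivationResponder {
    enum InputMode { VOICE, CHARACTER }
    interface InteractiveCharacter { void playAction(String actionId); }

    private final InteractiveCharacter character;

    ActivationResponder(InteractiveCharacter character) {
        this.character = character;
    }

    void onUserInputStarted(InputMode mode) {
        if (mode == InputMode.VOICE) {
            character.playAction("listen");   // hand to ear / leaning in, as in Fig. 4
        } else {
            character.playAction("watch");    // leaning forward to look at the typed text
        }
    }
}
```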
Thus the business object can be displayed based on the user's active triggering by input, and when the user makes an input, a response can be made to the input operation, so that the user perceives that the input has been received, which improves interactivity. When children use the terminal, they can see the action of the 3D animated interactive character while they speak and feel that the character is listening to them, which improves children's enthusiasm and experience and can help them form the habit of communicating with people.
Step 308: determine first response information based on the user operation.
Step 310: call the interactive character to display an action feedback response according to the first response information, and output a voice feedback response.
After the user's input information is received, the input information can also be recognized, for example speech recognition is performed on the recorded voice data and semantic recognition is performed on the input character string. For example, a voice input is recognized into text information, the text is then segmented and its semantic information identified, and a corresponding recognition result is obtained.
Corresponding first response information is then determined according to the recognition result. For example, if the recognition result is a search for the cartoon "Pleasant Goat", a search operation can be performed to obtain the search result, and the first response information is generated according to the search result combined with the scene information of the search scene; the first response information includes data such as the action to be performed by the business object and the voice to be output. The business object is then called to perform the response operation according to the first response information, i.e. the voice feedback response is output and the action feedback response is displayed, thereby providing feedback to the user.
The first response information is generated based on the result of the query operation, and the query result includes: query success (the required information is found) or query failure (the required information is not found). In the embodiments of the present application, the interactive character can give different responses based on the different query results: the action feedback response includes a first action feedback response for a successful query and a second action feedback response for a failed query, and the voice feedback response includes a first voice feedback response corresponding to a successful query and a second voice feedback response for a failed query. Different voices are thus output and different actions displayed by the interactive character for different query results; that is, the first response information based on the different query results matches different business scenes, and the business data corresponding to the business scene is called to match the response operation.
For example, in the above scene of searching for "Pleasant Goat", after the cartoon "Pleasant Goat and Big Big Wolf" is found, first response information for a successful query can be generated, the second business scene for a successful query is then called, and the 3D interactive character is rendered and played based on that second business scene, so that the 3D interactive character displays an action inviting the user to watch; voice data for a voice feedback response can also be generated and the voice prompt "Playing Pleasant Goat right away" is broadcast, after which the terminal jumps to playing the corresponding video.
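A rough sketch under assumed names (the scene identifiers, prompt texts and collaborator interfaces below are illustrative, not taken from the application) of the success/failure branching:

```java
// Sketch: map a query result to an action feedback response plus a voice feedback response.
// VideoSearch, InteractiveCharacter and VoiceOutput are hypothetical interfaces standing in
// for the search, animation and voice facilities the application describes.
final class QueryResponder {
    interface VideoSearch { String findVideoId(String title); }       // null if not found
    interface InteractiveCharacter { void playScene(String sceneId); }
    interface VoiceOutput { void speak(String text); }

    private final VideoSearch search;
    private final InteractiveCharacter character;
    private final VoiceOutput voice;

    QueryResponder(VideoSearch s, InteractiveCharacter c, VoiceOutput v) {
        this.search = s; this.character = c; this.voice = v;
    }

    void respondToQuery(String title) {
        String videoId = search.findVideoId(title);
        if (videoId != null) {
            character.playScene("query_success");                 // first action feedback response
            voice.speak("Playing " + title + " right away");      // first voice feedback response
        } else {
            character.playScene("query_failure");                 // second action feedback response
            voice.speak("Sorry, " + title + " was not found");    // second voice feedback response
        }
    }
}
```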
In another example, a response can be made based on a user operation, and during the whole process from receiving the user operation to responding to it and performing the service function, the use of the terminal also needs to be monitored; in particular, some children have poor self-control, and monitoring and management can help children use the terminal more reasonably. For example, monitoring can avoid overly long usage time or use during rest times such as at night, so some filter conditions can be set to filter the user's operation behavior and monitor the use of the terminal.
One kind of filter condition is a time monitoring condition. The time includes absolute time, such as monitoring of the date and the clock, and relative time, such as the usage duration. An example of time monitoring and filtering is as follows:
Referring to Fig. 5, a flow chart of the steps of another interactive operation method according to another embodiment of the present application is shown.
Step 502: while displaying the interface, detect whether the operation information satisfies a filter condition.
The time can be monitored while the user uses the terminal, including a time monitoring operation and a usage-duration monitoring operation. For example, if the operation information includes time information, after power-on it can be checked whether today's date is a festival or holiday, the birthday of the user of the device, an anniversary and so on; operation information matching any such date satisfies the filter condition. The time information can also include clock information, i.e. whether the current time during use is outside the set usage time; for example, 7 pm to 9 pm every evening is the allowed usage time, or after 10 pm every evening the filter condition is satisfied, and matching operation information is operation information that satisfies the filter condition. As another example, after responding to the user operation and providing the corresponding service function, the duration of the service function can be monitored; for example, if the set duration is 2 hours, continuous viewing for more than two hours satisfies the filter condition.
When an operation behavior satisfying the filter condition is detected, the business object can be actively woken up to prompt the user.
Step 504: display the business object in the interface according to the operation information.
Step 506: determine second response information corresponding to the filter condition according to the operation information.
Step 508: call the interactive character to display an action prompt response, and output a voice prompt response.
After an operation behavior satisfying the filter condition is detected, a business scene can be determined according to the filter condition and second response information corresponding to that business scene is generated. The business object can then be woken up and displayed, and the business object is used to perform the second response information, i.e. the corresponding action prompt response is displayed using the interactive character and a voice prompt response is output.
For example, when a birthday or a festival is detected, a voice prompt such as "Happy holiday" or "Happy birthday" can be output, and the 3D interactive character can be shown scattering flowers or holding a birthday cake. When the terminal is used after 10 pm, the action prompt response shown in Fig. 1 can be displayed and the voice prompt "It's time to sleep; go to bed early and get up early to grow up quickly" can be played. When the usage time is too long, the 3D interactive character shown in Fig. 6 can be shown riding a scooter from far to near, and the voice prompt "You have been watching for a while, take a break" can be played.
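The following sketch, under assumed names (the thresholds, scene identifiers and prompt wording are illustrative only), shows how such a time filter condition might be checked and turned into an action prompt plus a voice prompt:

```java
// Sketch: a time-based filter condition that wakes the interactive character with an action
// prompt response and a voice prompt response. Thresholds and scene ids are assumptions.
final class TimeFilter {
    interface InteractiveCharacter { void playScene(String sceneId); }
    interface VoiceOutput { void speak(String text); }

    private static final java.time.LocalTime BEDTIME = java.time.LocalTime.of(22, 0);
    private static final java.time.Duration MAX_VIEWING = java.time.Duration.ofHours(2);

    private final InteractiveCharacter character;
    private final VoiceOutput voice;

    TimeFilter(InteractiveCharacter c, VoiceOutput v) { this.character = c; this.voice = v; }

    /** Called periodically with the current time and the accumulated viewing duration. */
    void check(java.time.LocalTime now, java.time.Duration viewed) {
        if (now.isAfter(BEDTIME)) {
            character.playScene("bedtime_prompt");   // e.g. the pajama scene of Fig. 1
            voice.speak("It's time to sleep; go to bed early and get up early");
        } else if (viewed.compareTo(MAX_VIEWING) > 0) {
            character.playScene("rest_prompt");      // e.g. the scooter scene of Fig. 6
            voice.speak("You have been watching for a while, take a break");
        }
    }
}
```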
The filter conditions of the embodiments of the present application also include other filter conditions, and the filtering can be combined with the user's active input to filter the user's requests; for example, if the cartoon a child asks to see is not suitable for his or her age, the prompt "Let's look at other cartoons" can be given.
The business object can be woken up and displayed under the above active triggering, passive wake-up and other modes. Different triggering modes and requirements all correspond to different business scenes, and the scene information is called based on the business scene to display the interactive character. Therefore, in the embodiments of the present application, scene information of various business scenes can also be configured for the business object, so that different response operations are performed under different scenes, for example displaying different actions, playing different voice information and showing different character appearances under scenes such as search feedback, rest prompts, festival greetings and chatting.
In an example shown in Fig. 7, the scene information of the business object can be set on the server. The scene information includes the general information of the various business scenes, such as the character appearance and actions, and the server can synchronize the updated scene information to smart devices such as televisions and TV boxes; that is, the device synchronously downloads the scene information corresponding to the business object from the server. The business object can use a resident service to run the corresponding object instance, which can handle intent requests and filter the operation actions of interface applications, and then display the interactive character through a starting-window widget. The display of the business object can be rendered based on the scene information: the corresponding scene information is called via scene management to render and display the interactive character. The scene information includes: model data, an animation curve table, material information and scene parameter information. The model data is the data for generating the 3D interactive character; based on the model data, the main animated image of the 3D interactive character, such as the little bear in Fig. 1, can be generated. The animation curve table determines the motion track of the 3D interactive character, i.e. responses such as the actions performed by the 3D interactive character are generated based on the animation curve table. The material information is used to determine the appearance of the 3D interactive character, such as accessories, clothes and props. The scene parameter information is the parameter information under the business scene, such as software and hardware parameters.
Therefore, the embodiments of the present application can determine the business scene according to the preset mode, then obtain the scene information under that scene, and configure the business object and the response operation of the business object according to the scene information. That is, at least one matching business scene is determined from the user operation, the operation information, the input information corresponding to the user operation or the response information; the scene information corresponding to that business scene is obtained; the display information of the business object is determined according to the scene information; and the action response and so on are determined in combination with the response information, so as to give the user a variety of feedback and improve interactivity. Thus, after a certain business scene is matched, the scene is loaded and the scene information is called through the API, the interactive character is rendered based on the model data, the animation curve table, the material information and the scene parameter information, the animation frames of each frame of the 3D interactive character are rendered based on animation management, and the animated actions of the 3D interactive character are played based on those animation frames, so that the response actions performed by the 3D interactive character are shown on the interface. The corresponding voice data is also played under that scene, interacting with the user through animation and voice.
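A minimal sketch of how such scene information might be modeled on the device side, assuming field and method names that are not specified in the application beyond the four listed categories:

```java
// Sketch: device-side model of the synced scene information and a scene manager that
// looks it up for a given business scene. Names beyond the four categories are assumptions.
final class SceneInfo {
    byte[] modelData;                               // 3D model used to build the character
    java.util.Map<String, float[]> animationCurves; // animation curve table: action -> keyframes
    java.util.Map<String, String> materials;        // material info: clothes, props, accessories
    java.util.Map<String, String> sceneParams;      // scene parameters: music, size, age range...
}

final class SceneManager {
    private final java.util.Map<String, SceneInfo> cache = new java.util.HashMap<>();

    /** Store scene information downloaded from the server during synchronization. */
    void onSceneSynced(String sceneId, SceneInfo info) {
        cache.put(sceneId, info);
    }

    /** Look up the scene information used to render the character for a business scene. */
    SceneInfo sceneFor(String sceneId) {
        return cache.get(sceneId);
    }
}
```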
In order to interact better with the user, the embodiments of the present application can obtain user information and then bind the user with the business object using that user information, so that the business object can give different feedback responses for different users and meet the user's individual needs.
In the embodiments of the present application, taking users who are children aged 0-14 as an example, the 3D interactive character is used as the business object to provide users with assistance and management of learning and entertainment. When the user uses voice input, the 3D interactive character appears and uses sound (audio output) and body language (displayed actions), so that the interaction between the 3D interactive character and the child is more vivid and easier to understand; the character's body language can follow the age characteristics of the character and be matched with voice, so that the user understands and accepts it more easily.
Therefore, user information can be registered on the terminal in advance, including the user ID, name, age, gender, preferences and so on; a nickname can also be set so that the 3D interactive character is more easily understood by children when interacting with them. When the business object is bound with the user information, the 3D interactive character corresponding to the business object can be set according to the user information, so that the 3D interactive character matches the features of the user, for example configuring a 3D animated interactive character that better suits the child's age and gender: the 3D animated character for children aged 0-6 is cuter, while the one for children aged 7-14 is more abstract.
The interaction style of the 3D interactive character can also be set according to information such as the user's age and gender, and this style can be configured in the scene information, so that when the 3D interactive character performs a response operation, the same type of response action and output voice differ in language style, giving a response that is more easily understood and accepted at the child's age. For example, when a cartoon a child searches for is not found, for younger children aged 0-6 the character can perform a head-shaking action and output "XX isn't here, let's look at something else", while for children aged 7-14 it can perform an apologetic action and output "Sorry, XX was not found", so that it is easier to empathize with children and be accepted by them. Thus, as a companion of the same generation as the child, the 3D interactive character gives matched animated actions and voice feedback according to the growth characteristics of the user's age group.
Moreover, the 3D interactive character corresponding to the business object is updated over time to keep matching the user's age. That is, when the 3D interactive character is set according to the user's birthday and other information, the character can appear with different images at different ages, for example a cartoon character growing from a baby to school age, so that the 3D interactive character also changes as time passes: it grows up together with the child and becomes a playmate who grows up with the child. For example, when the user grows from 6 to 7 years old the 3D interactive character grows up with the child, and the character can also be configured with different clothing and other appearances as the region or season changes. The 3D interactive character is thus carefully configured according to the division of age groups and grows with the child, so that the 3D interactive character and the desktop UI can grow together with the user.
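A brief sketch of one way the age matching could be expressed, with the age brackets taken from the description (0-6 and 7-14) but the variant names assumed:

```java
// Sketch: pick the character variant from the user's birthday so the character
// "grows up" with the child. Variant names are assumptions for illustration.
final class CharacterSelector {
    enum Variant { TODDLER, SCHOOL_AGE }   // e.g. cuter vs. more abstract styling

    static Variant variantFor(java.time.LocalDate birthday, java.time.LocalDate today) {
        int age = java.time.Period.between(birthday, today).getYears();
        return (age <= 6) ? Variant.TODDLER : Variant.SCHOOL_AGE;
    }
}
```

Re-evaluating this on each launch (or on the child's birthday) is what would let the character follow the user's age over time.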
When children use the television for learning or entertainment, feedback responses can be provided based on the children's voice input. When the user (for example a child aged 0-14) uses voice, the corresponding voice is accompanied by an animation of the interactive character, which helps the child understand through actions and expressions, and the user's control instructions, chatting and so on can be supported. The display of and responses by the 3D interactive character are triggered in various ways such as active triggering and passive wake-up, so that active care can be given to the child using the terminal: for example, in scenes such as entering the desktop at night, watching for too long, exceeding the daily viewing time, completing a learning task, a birthday or a festival, the 3D interactive character is displayed and a voice response is output, encouraging and cultivating good study and living habits in the child through the interactive character, and letting the child understand the operations and the feedback.
In the embodiments of the present application, the 3D interactive character provides different responses, including actions and voice, according to different scenes, so that children can understand easily and interaction is better achieved to assist children in using the device reasonably.
When the desktop is entered at night, the 3D interactive character appears with an animation and prompts "It's time to sleep; go to bed early and get up early to grow up quickly"; when the viewing time is too long, the 3D interactive character appears and prompts "You have been watching for a while, take a break"; when the desktop is entered for the first time each day, the 3D interactive character prompts "Hi, welcome back; remember to keep a certain distance from the TV!"; when the viewing duration per session or per day reaches the limit, the 3D interactive character appears and prompts "Time is up, see you next time (tomorrow)"; and on a birthday or festival, a related 3D interactive character also appears and prompts the first time the desktop is entered that day.
In addition to the above business scenes, the embodiments of the present application also set a language model for each age group for the 3D interactive character, so that the 3D interactive character can chat with the user about topics that are easily accepted by children. For example, when the user says "I got a good mark in today's exam and the teacher praised me", the 3D interactive character shows a happy expression and outputs encouraging and praising voice data, and can also display praising actions such as clapping or a thumbs-up.
Thus, when applied to the field of children's education, the 3D interactive character interacts with children to help users form good living and study habits, and also influences how users communicate with people.
When a child is young, situations such as indistinct speech or being unable to express an intent completely may occur, so the embodiments of the present application can guide children to express their needs and intent accurately. That is, when the user intent information cannot be identified from the input information, the business object is used to perform a response operation that guides the user's input so as to identify the user's intent. After the user's input information is received, the input information is recognized; when the recognized content is not sufficient to fully understand the user's intent, it is confirmed that the user intent information cannot be identified, guiding voice data can then be generated based on the recognized information, and the 3D interactive character is used to perform a response action and output the guiding voice data, so as to guide the child to further express the required content. The input content is then further recognized, and the child is guided step by step to express his or her intent, so that the content the child needs is fed back. For example, if the text recognized from the voice data is "goat goat goat goat", the user intent guidance can be analyzed to lead the child toward the required content, for example outputting "Does XX want to watch Pleasant Goat?" to guide the child's expression.
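A simple sketch of this guidance loop, with the "cannot identify intent" check and the prompt text as assumptions (the application does not specify how they are implemented):

```java
// Sketch: when the recognized text cannot be mapped to an intent, have the character
// ask a guiding question instead of failing. Names and prompts are illustrative.
final class IntentGuide {
    interface IntentRecognizer { String recognize(String text); }   // null when no intent found
    interface InteractiveCharacter { void playAction(String actionId); }
    interface VoiceOutput { void speak(String text); }

    private final IntentRecognizer recognizer;
    private final InteractiveCharacter character;
    private final VoiceOutput voice;

    IntentGuide(IntentRecognizer r, InteractiveCharacter c, VoiceOutput v) {
        this.recognizer = r; this.character = c; this.voice = v;
    }

    /** Returns the identified intent, or asks a guiding question and returns null. */
    String handleUtterance(String recognizedText) {
        String intent = recognizer.recognize(recognizedText);
        if (intent == null) {
            character.playAction("ask");                        // gentle prompting gesture
            voice.speak("Do you want to watch Pleasant Goat?"); // guide toward a clearer request
        }
        return intent;
    }
}
```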
Thus, the embodiments of the present application can support playing the 3D interactive character using OpenGL, with the character reading out fixed audio or TTS voice configured in the back end. Interaction and chatting with the 3D animated interactive character can be supported, showing various expressions or search results. Character animations of multiple priorities are handled, including interrupting, preemptive, normal priority and ignorable priority. The back end can configure animation parameters: the animation files used, the number of times and the speed the animation is played, the animation recording file, the TTS text the animation reads out, and so on. Scene parameters can also be configured: the list of animations used, the background music, the background picture, the applicable user age, the size, position and transparency of the animation interface, whether it has focus, and so on. Files uploaded to the back end are encrypted; the client can connect to the network to obtain the configured lists of animations and scenes from the server side, download and decompress the encrypted files, and update them locally.
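As an illustrative sketch only, one way the back-end animation and scene configuration described above could be laid out on the client (field names and structure are assumptions; the application lists the configurable items but not a format):

```java
// Sketch: data classes mirroring the back-end animation and scene configuration.
// Field names are assumptions; the application lists the items without naming a schema.
final class AnimationConfig {
    String animationFile;       // animation file used
    int playCount;              // number of times to play
    float playSpeed;            // playback speed
    String recordingFile;       // animation recording file
    String ttsText;             // text the character reads out via TTS
    int priority;               // interrupting / preemptive / normal / ignorable
}

final class SceneConfig {
    java.util.List<String> animations;  // list of animations used in this scene
    String backgroundMusic;
    String backgroundPicture;
    int minUserAge, maxUserAge;         // applicable user age range
    int width, height, x, y;            // size and position of the animation surface
    float alpha;                        // transparency
    boolean focusable;                  // whether the animation window takes focus
}
```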
It should be noted that the method embodiments are, for simplicity of description, expressed as a series of action combinations, but those skilled in the art should know that the embodiments of the present application are not limited by the described sequence of actions, because according to the embodiments of the present application some steps can be performed in other orders or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in this specification are preferred embodiments, and the actions involved are not necessarily required by the embodiments of the present application.
On the basis of the above embodiments, the embodiments of the present application further provide an interactive operation apparatus.
Referring to Fig. 8, a structural block diagram of an interactive operation apparatus embodiment of the present application is shown, which may specifically include the following modules:
A display module 802, configured to display an interface and to display a business object in the interface according to operation information.
A detection module 804, configured to detect the operation information based on the interface.
A response module 806, configured to use the business object to perform a response operation to a user operation; the business object is used for interacting with the user, the business object includes an interactive character built from business data, and the response operation includes an action performed by the interactive character.
The preset mode includes an active triggering mode and/or a passive wake-up mode. The response operation includes: an action response of the 3D interactive character, and/or a voice output response.
The detection module 804 is configured to determine the operation information based on a user operation on the interface and to detect the operation information.
The display module 802 is configured to obtain business data according to the operation information and to parse the business data to display the corresponding business object.
The response module 806 includes: a first response sub-module, configured to call the interactive character to display an input-sensing activation response when the user operation is received. The input-sensing activation response includes: a listening action response, or a watching action response.
The response module 806 includes: a second response sub-module, configured to determine first response information based on the user operation, to call the interactive character to display an action feedback response according to the first response information, and to output a voice feedback response. The user operation includes a query operation; the action feedback response includes a first action feedback response for a successful query and/or a second action feedback response for a failed query; the voice feedback response includes a first voice feedback response corresponding to a successful query and/or a second voice feedback response for a failed query.
The detection module 804 is configured to detect whether the operation information satisfies a filter condition.
The response module 806 includes: a third response sub-module, configured to determine second response information corresponding to the filter condition according to the operation information, to call the interactive character to display an action prompt response, and to output a voice prompt response.
The detection module 804 is configured to detect the input mode of the user operation corresponding to the operation information; the response operation includes: outputting a response corresponding to the input mode. The input mode includes: a voice input mode and/or a character input mode.
A binding module is configured to preset user information of the user and to bind the user with the business object according to the user information.
The binding module is configured to set the interactive character corresponding to the business object according to the user information, so that the interactive character matches a feature of the user. The feature includes age, and the interactive character corresponding to the business object is updated over time to keep matching the age of the user. The response operation performed by the business object is set according to the user's age.
The response module 806 is further configured to, when the user intent information cannot be identified from the input information, use the business object to perform a response operation that guides the user's input so as to identify the user intent.
The apparatus further includes: a synchronization module, configured to synchronously download the scene information corresponding to the business object from a server. The scene information includes: model data, an animation curve table, material information and scene parameter information.
The response module 806 is further configured to configure the business object and the response operation of the business object according to a preset mode and the scene information.
In the embodiments of the present application, the interactive character includes a 3D interactive character. The above apparatus can be applied to the operating system of a network set-top box and/or a television, and applied to the field of children's education.
The embodiments of the present application further provide a non-volatile readable storage medium in which one or more modules (programs) are stored; when the one or more modules are applied to a terminal device, the terminal device can be caused to execute the instructions of the various method steps in the embodiments of the present application.
Fig. 9 is a hardware structural diagram of a terminal device provided by an embodiment of the present application. As shown in Fig. 9, the terminal device may include an input device 70, a processor 71, an output device 72, a memory 73, and at least one communication bus 74. The communication bus 74 is used to implement the communication connections between these elements. The memory 73 may include a high-speed RAM memory and may also include a non-volatile memory (NVM), for example at least one magnetic disk memory; the memory 73 can store various programs for completing various processing functions and implementing the method steps of this embodiment.
Optionally, the above processor 71 may be implemented as, for example, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field programmable gate array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components; the processor 71 is coupled to the above input device 70 and output device 72 through a wired or wireless connection.
Optionally, the above input device 70 may include a variety of input devices, for example at least one of a user-oriented user interface, a device-oriented device interface, a programmable software interface, a camera, and a sensor. Optionally, the device-oriented device interface may be a wired interface for data transmission between devices, or a hardware plug-in interface (such as a USB interface or a serial port) for data transmission between devices. Optionally, the user-oriented user interface may be, for example, user-oriented control buttons, a voice input device for receiving voice input, and a touch sensing device for receiving the user's touch input (such as a touch screen or a touch pad with a touch sensing function). Optionally, the above programmable software interface may be, for example, an entry for the user to edit or modify a program, such as an input pin interface or an input interface of a chip. Optionally, a transceiver may include a radio-frequency chip with a communication function, a baseband processing chip, a transceiver antenna, and the like. An audio input device such as a microphone can receive voice data. The output device 72 may include output devices such as a display and audio equipment.
In this embodiment, the processor of the terminal device includes the functions for executing each module of the data processing apparatus in each device; for specific functions and technical effects, reference is made to the above embodiments, and details are not repeated here.
Fig. 10 is a hardware structural diagram of a terminal device provided by another embodiment of the present application. Fig. 10 is a specific embodiment of Fig. 9 in implementation. As shown in Fig. 10, the terminal device of this embodiment includes a processor 81 and a memory 82.
The processor 81 executes the computer program code stored in the memory 82 to implement the data processing methods of Fig. 1 to Fig. 7 in the above embodiments.
The memory 82 is configured to store various types of data to support the operation of the terminal device. Examples of such data include the instructions of any application program or method operated on the terminal device, such as messages, pictures, and videos. The memory 82 may include a random access memory (RAM) and may also include a non-volatile memory, for example at least one magnetic disk memory.
Optionally, the processor 81 is arranged in a processing component 80. The terminal device may further include: a communication component 83, a power supply component 84, a multimedia component 85, an audio component 86, an input/output interface 87, and/or a sensor component 88. The components specifically included in the terminal device are set according to actual demands, which is not limited in this embodiment.
The processing component 80 usually controls the overall operation of the terminal device. The processing component 80 may include one or more processors 81 to execute instructions so as to complete all or some of the steps of the methods of Fig. 1 to Fig. 7. In addition, the processing component 80 may include one or more modules to facilitate interaction between the processing component 80 and other components. For example, the processing component 80 may include a multimedia module to facilitate interaction between the multimedia component 85 and the processing component 80.
The power supply component 84 provides electric power for the various components of the terminal device. The power supply component 84 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing electric power for the terminal device.
The multimedia component 85 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a liquid crystal display (LCD) and a touch panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
The audio component 86 is configured to output and/or input audio signals. For example, the audio component 86 includes a microphone (MIC); when the terminal device is in an operating mode, such as a speech recognition mode, the microphone is configured to receive external audio signals. The received audio signals may be further stored in the memory 82 or sent via the communication component 83. In some embodiments, the audio component 86 further includes a speaker for outputting audio signals.
The input/output interface 87 provides an interface between the processing component 80 and peripheral interface modules. The above peripheral interface modules may be click wheels, buttons, and the like. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 88 includes one or more sensors for providing state assessments of various aspects of the terminal device. For example, the sensor component 88 can detect the on/off state of the terminal device, the relative positioning of components, and the presence or absence of contact between the user and the terminal device. The sensor component 88 may include a proximity sensor, configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor component 88 may also include a camera or the like.
The communication component 83 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device can log on to a GPRS network and establish communication with a server via the Internet.
It can be seen from the above that the communication component 83, the audio component 86, the input/output interface 87, and the sensor component 88 involved in the embodiment of Fig. 10 can serve as implementations of the input device in the embodiment of Fig. 9.
In a terminal device of this embodiment: the display, coupled to the processor, displays an interface and displays a business object in the interface according to operation information; the processor detects the operation information based on the interface and performs, by the business object, a response operation on a user operation; the business object is used for interacting with the user, the business object includes an interactive image built from business data, and the response operation includes an action performed by the interactive image.
The embodiments of the present application also provide an operating system for a terminal device. As shown in Fig. 11, the operating system of the terminal device includes: a display unit 1102, a detection unit 1104, and a response unit 1106.
The display unit 1102 displays an interface and displays a business object in the interface according to operation information.
The detection unit 1104 detects the operation information based on the interface.
The response unit 1106 performs, by the business object, a response operation on a user operation; the business object is used for interacting with the user, the business object includes an interactive image built from business data, and the response operation includes an action performed by the interactive image.
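As a minimal illustration of how these three functional units could cooperate (an assumption-laden sketch, not part of the original disclosure), the code below chains a detection step, a display step, and a response step; all class names simply mirror the units described above.

```python
# Hypothetical sketch of the operating-system pipeline described above:
# detect operation information, display the business object, respond.
class DisplayUnit:
    def show(self, operation_info: str) -> None:
        print(f"[display] showing business object for: {operation_info}")


class DetectionUnit:
    def detect(self, raw_event: str) -> str:
        # Derive operation information from a raw interface event.
        return raw_event.strip().lower()


class ResponseUnit:
    def respond(self, operation_info: str) -> str:
        # The interactive image performs an action as the response operation.
        return f"interactive image reacts to '{operation_info}'"


if __name__ == "__main__":
    detection, display, response = DetectionUnit(), DisplayUnit(), ResponseUnit()
    info = detection.detect("  Tap On Character ")
    display.show(info)
    print(response.respond(info))
```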
As for the apparatus embodiments, since they are substantially similar to the method embodiments, their description is relatively simple; for relevant parts, reference is made to the description of the method embodiments.
Each embodiment in this specification is described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts among the embodiments, reference may be made to one another.
Those skilled in the art should understand that the embodiments of the present application may be provided as a method, an apparatus, or a computer program product. Therefore, the embodiments of the present application may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to magnetic disk memories, CD-ROMs, and optical memories) containing computer-usable program code.
In a typical configuration, the computer device includes one or more processors (CPUs), an input/output interface, a network interface, and a memory. The memory may include a volatile memory on a computer-readable medium, a random access memory (RAM), and/or a non-volatile memory, such as a read-only memory (ROM) or a flash memory (flash RAM). The memory is an example of a computer-readable medium. Computer-readable media include permanent and non-permanent, removable and non-removable media, and can implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
The embodiments of the present application are described with reference to flowcharts and/or block diagrams of the method, the terminal device (system), and the computer program product according to the embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing terminal equipment to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing terminal equipment produce an apparatus for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce a manufactured article including an instruction apparatus, and the instruction apparatus realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal equipment, so that a series of operation steps are performed on the computer or other programmable terminal equipment to produce computer-implemented processing, and the instructions executed on the computer or other programmable terminal equipment provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although the preferred embodiments of the embodiments of the present application have been described, those skilled in the art, once they know the basic inventive concept, can make further changes and modifications to these embodiments. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all changes and modifications that fall within the scope of the embodiments of the present application.
Finally, it should be noted that, in this document, relational terms such as first and second are only used to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or terminal device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or also includes elements inherent to such a process, method, article, or terminal device. Without further restrictions, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or terminal device that includes that element.
An interactive operation method, an interactive operation apparatus, a terminal device, and an operating system for a terminal device provided in the present application have been described in detail above. Specific examples have been used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, for those of ordinary skill in the art, there will be changes in the specific implementations and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as a limitation on the present application.

Claims (25)

  1. An interactive operation method, characterized by comprising:
    displaying an interface, and detecting operation information based on the interface;
    displaying a business object in the interface according to the operation information, and performing, by the business object, a response operation on a user operation;
    wherein the business object is used for interacting with a user, the business object comprises an interactive image built from business data, and the response operation comprises an action performed by the interactive image.
  2. The method according to claim 1, characterized in that the response operation further comprises: a voice output response.
  3. The method according to claim 1, characterized in that detecting the operation information based on the interface comprises:
    determining the operation information based on a user operation at the interface, and detecting the operation information.
  4. The method according to claim 3, characterized in that displaying the business object in the interface according to the operation information comprises:
    acquiring business data according to the operation information, parsing the business data, and displaying the corresponding business object.
  5. The method according to claim 3, characterized in that performing, by the business object, the response operation on the user operation comprises:
    when the user operation is received, calling the interactive image to display an input-sensing activation response.
  6. The method according to claim 5, characterized in that the input-sensing activation response comprises: a listening action response, or a viewing action response.
  7. The method according to claim 3, characterized in that performing, by the business object, the response operation on the user operation comprises:
    determining first response information based on the user operation, calling the interactive image to display an action feedback response according to the first response information, and outputting a voice feedback response.
  8. The method according to claim 7, characterized in that the user operation comprises a query operation; the action feedback response comprises a first action feedback response for a successful query and/or a second action feedback response for a failed query; and the voice feedback response comprises a first voice feedback response corresponding to a successful query and/or a second voice feedback response for a failed query.
  9. The method according to claim 3, characterized in that detecting the operation information comprises:
    detecting whether the operation information meets a filter condition.
  10. The method according to claim 9, characterized in that performing, by the business object, the response operation on the user operation comprises:
    determining, according to the operation information, second response information corresponding to the filter condition, calling the interactive image to display a prompt action response, and outputting a voice prompt response.
  11. The method according to claim 3, characterized in that detecting the operation information comprises: detecting an input mode of the user operation corresponding to the operation information;
    and the response operation comprises: outputting a response corresponding to the input mode.
  12. The method according to claim 11, characterized in that the input mode comprises: a voice input mode and/or a character input mode.
  13. The method according to claim 1, characterized by further comprising:
    presetting user information of a user, and binding the user with the business object according to the user information.
  14. The method according to claim 13, characterized in that binding the user with the business object according to the user information comprises:
    setting the interactive image corresponding to the business object according to the user information, so that the interactive image matches a characteristic of the user.
  15. The method according to claim 14, characterized in that the characteristic comprises age, and the interactive image corresponding to the business object is updated over time so as to remain matched to the age of the user.
  16. The method according to claim 1, characterized by further comprising:
    when user intent information cannot be identified from input information, performing, by the business object, a response operation that guides the user's input so as to identify the user intent.
  17. The method according to claim 1, characterized by further comprising:
    synchronously downloading scene information corresponding to the business object from a server.
  18. The method according to claim 17, characterized in that the scene information comprises: model data, an animation curve table, material information, and scenario parameter information.
  19. The method according to claim 18, characterized by further comprising:
    configuring the business object and the response operation of the business object according to a predetermined manner and the scene information.
  20. The method according to any one of claims 1-10 and 12-19, characterized in that the interactive image comprises a 3D interactive image.
  21. The method according to any one of claims 1-10 and 12-19, characterized in that the method is applied to an operating system of a network set-top box and/or a television.
  22. The method according to claim 21, characterized in that the method is applied to the field of children's education.
  23. An interactive operation apparatus, characterized by comprising:
    a display module, configured to display an interface and to display a business object in the interface according to operation information;
    a detection module, configured to detect the operation information based on the interface;
    a response module, configured to perform, by the business object, a response operation on a user operation; wherein the business object is used for interacting with a user, the business object comprises an interactive image built from business data, and the response operation comprises an action performed by the interactive image.
  24. A terminal device, characterized by comprising: a processor and a display;
    the display, coupled to the processor, is configured to display an interface and to display a business object in the interface according to operation information;
    the processor is configured to detect the operation information based on the interface and to perform, by the business object, a response operation on a user operation; wherein the business object is used for interacting with a user, the business object comprises an interactive image built from business data, and the response operation comprises an action performed by the interactive image.
  25. An operating system for a terminal device, characterized by comprising:
    a display unit, configured to display an interface and to display a business object in the interface according to operation information;
    a detection unit, configured to detect the operation information based on the interface;
    a response unit, configured to perform, by the business object, a response operation on a user operation; wherein the business object is used for interacting with a user, the business object comprises an interactive image built from business data, and the response operation comprises an action performed by the interactive image.
CN201610887827.8A 2016-10-11 2016-10-11 Interactive operation method, apparatus, terminal device and operating system Pending CN107918518A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610887827.8A CN107918518A (en) 2016-10-11 2016-10-11 Interactive operation method, apparatus, terminal device and operating system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610887827.8A CN107918518A (en) 2016-10-11 2016-10-11 Interactive operation method, apparatus, terminal device and operating system

Publications (1)

Publication Number Publication Date
CN107918518A true CN107918518A (en) 2018-04-17

Family

ID=61891929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610887827.8A Pending CN107918518A (en) 2016-10-11 2016-10-11 Interactive operation method, apparatus, terminal device and operating system

Country Status (1)

Country Link
CN (1) CN107918518A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1372660A (en) * 2000-03-09 2002-10-02 皇家菲利浦电子有限公司 Method for interacting with a consumer electronics system
US20140125678A1 (en) * 2012-07-11 2014-05-08 GeriJoy Inc. Virtual Companion
CN104503568A (en) * 2014-12-05 2015-04-08 广东小天才科技有限公司 Implementation method and device for terminal usage rules
CN105141587A (en) * 2015-08-04 2015-12-09 广东小天才科技有限公司 Virtual doll interaction method and device
CN105389461A (en) * 2015-10-21 2016-03-09 胡习 Interactive children self-management system and management method thereof

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11883948B2 (en) 2018-08-29 2024-01-30 Huawei Technologies Co., Ltd. Virtual robot image presentation method and apparatus
WO2020042987A1 (en) * 2018-08-29 2020-03-05 华为技术有限公司 Method and apparatus for presenting virtual robot image
CN109727597A (en) * 2019-01-08 2019-05-07 未来电视有限公司 The interaction householder method and device of voice messaging
CN111443905A (en) * 2019-01-16 2020-07-24 阿里巴巴集团控股有限公司 Service data processing method, device and system and electronic equipment
CN111443905B (en) * 2019-01-16 2023-04-11 阿里巴巴集团控股有限公司 Service data processing method, device and system and electronic equipment
CN110175061A (en) * 2019-05-20 2019-08-27 北京大米科技有限公司 Exchange method, device and electronic equipment based on animation
CN112348927A (en) * 2019-07-22 2021-02-09 菜鸟智能物流控股有限公司 Data processing method, device, equipment and machine readable medium
CN112348927B (en) * 2019-07-22 2024-04-02 菜鸟智能物流控股有限公司 Data processing method, device, equipment and machine-readable medium
CN113885551A (en) * 2020-07-01 2022-01-04 丰田自动车株式会社 Information processing apparatus, information processing method, and moving object
CN113885551B (en) * 2020-07-01 2024-03-26 丰田自动车株式会社 Information processing device, information processing method, and moving object
CN111898016A (en) * 2020-07-31 2020-11-06 北京百度网讯科技有限公司 Guide interaction method, resource database establishing method and device
CN111898016B (en) * 2020-07-31 2024-04-26 北京百度网讯科技有限公司 Method for guiding interaction, method and device for establishing resource database
CN112214115A (en) * 2020-09-25 2021-01-12 汉海信息技术(上海)有限公司 Input mode identification method and device, electronic equipment and storage medium
CN112214115B (en) * 2020-09-25 2024-04-30 汉海信息技术(上海)有限公司 Input mode identification method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
KR102306624B1 (en) Persistent companion device configuration and deployment platform
US10357881B2 (en) Multi-segment social robot
CN107918518A (en) Interactive operation method, apparatus, terminal device and operating system
US20170206064A1 (en) Persistent companion device configuration and deployment platform
WO2020083021A1 (en) Video recording method and apparatus, video playback method and apparatus, device, and storage medium
AU2014236686B2 (en) Apparatus and methods for providing a persistent companion device
CN108353103B (en) User terminal device for recommending response message and method thereof
WO2016011159A1 (en) Apparatus and methods for providing a persistent companion device
Irwin Digital media: Human–technology connection
KR101927706B1 (en) Method for recommending music for various situations and apparatus using the same
CN107005612A (en) Digital assistants warning system
CN111031386B (en) Video dubbing method and device based on voice synthesis, computer equipment and medium
CN103430217A (en) Input support device, input support method, and recording medium
CN104035995A (en) Method and device for generating group tags
CN107423106A (en) The method and apparatus for supporting more frame grammars
CN105721904B (en) The method of the content output of display device and control display device
CN108140045A (en) Enhancing and supporting to perceive and dialog process amount in alternative communication system
CN108289237A (en) Play method, apparatus, terminal and the computer readable storage medium of dynamic picture
CN106845928A (en) Wake method and device
CN107368562A (en) Display methods, device and the terminal of the page
WO2018183812A1 (en) Persistent companion device configuration and deployment platform
WO2022262560A1 (en) Image display method and apparatus, device, and storage medium
CN109348353B (en) Service processing method and device of intelligent sound box and intelligent sound box
CN116564272A (en) Method for providing voice content and electronic equipment
WO2023230629A1 (en) Engagement and synchronization using received audio or visual cues

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180417
