CN108236786A - Virtual feeding method for a machine animal, and machine animal - Google Patents


Info

Publication number
CN108236786A
Authority
CN
China
Prior art keywords
food
target food
machine
animal
machine animal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201611213583.1A
Other languages
Chinese (zh)
Other versions
CN108236786B (en)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Kuang Chi Hezhong Technology Ltd
Shenzhen Guangqi Hezhong Technology Co Ltd
Original Assignee
Shenzhen Guangqi Hezhong Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Guangqi Hezhong Technology Co Ltd
Priority to CN201611213583.1A
Publication of CN108236786A
Application granted
Publication of CN108236786B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 13/00 Toy figures with self-moving parts, with or without movement of the toy as a whole
    • A63H 13/02 Toy figures with self-moving parts, with or without movement of the toy as a whole, imitating natural actions, e.g. catching a mouse by a cat, the kicking of an animal
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 Manipulators not otherwise provided for
    • B25J 11/0005 Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01 Indexing scheme relating to G06F 3/01
    • G06F 2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The invention discloses a virtual feeding method for a machine animal, and a machine animal. In the method, the machine animal carries out virtual feeding in the following manner: positioning a target food and generating a positioning result; obtaining the target food according to the positioning result; and digesting the obtained target food. The invention solves the technical problem in the related art that there is no method for virtual "feeding" between a human and a machine animal.

Description

Virtual feeding method for a machine animal, and machine animal
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a virtual feeding method for a machine animal, and to a machine animal.
Background technology
At present, with the continuing destruction of the environment, more and more species on Earth are on the verge of extinction, and people may never again see the species that have disappeared. In addition, some rare animals are protected (for example, the panda), so ordinary tourists cannot come into direct contact with them; and species that are already extinct (for example, dinosaurs) cannot be encountered at all. As society develops, people increasingly long to meet and interact with rare animals, but the real-world channels for doing so are very limited, so that people cannot reach rare animals.
With the development of simulation technology, people have begun to simulate all kinds of things. At present, however, there is no record of simulating animals. Because animals have attributes such as walking, making sounds, and forming groups, simulating real animals with machine animals is often difficult to realize; for example, simulating virtual feeding between a machine animal and a human.
No effective solution to the above problems has yet been proposed.
Summary of the invention
Embodiments of the present invention provide a virtual feeding method for a machine animal, and a machine animal, at least to solve the technical problem in the related art that there is no method for virtual "feeding" between a human and a machine animal.
According to one aspect of the embodiments of the present invention, a virtual feeding method applied to a machine animal is provided. The machine animal carries out virtual feeding in the following manner: positioning a target food and generating a positioning result; obtaining the target food according to the positioning result; and digesting the obtained target food.
Further, in the case where the target food is machine food and/or real food, the machine animal positioning the target food and generating the positioning result includes: the machine animal positions the target food and generates the positioning result by way of visual scanning and/or by acquiring images.
Further, in the case where the target food is the machine food, the machine animal positioning the target food and generating the positioning result, and the machine animal obtaining the target food according to the positioning result, include: the machine animal receives the position signal sent to it by the machine food; the machine animal positions the target food according to the received position signal and generates the positioning result; and, according to the positioning result, the machine animal navigates to the position of the target food and obtains the target food.
Further, the manner in which the machine food sends the position signal to the machine animal includes at least one of the following: electromagnetic waves; light waves; sound waves.
Further, the machine animal digesting the obtained target food includes: the machine animal sucks the target food into its body; or the machine animal adsorbs the target food onto its body.
Further, in the case where the target food is virtual food, the machine animal positioning the target food and generating the positioning result includes: the machine animal communicates with a virtual device to obtain the location information of the virtual food in the virtual device; and, according to the obtained location information, positions the virtual food and generates the positioning result.
Further, the communication mode by which the machine animal communicates with the virtual device includes at least one of the following: WiFi, near-field communication, Bluetooth, ZigBee.
Further, the machine animal positioning the target food includes: when the user emits a sound wave, the machine animal recognizes the sound wave and determines, according to the recognition result, whether to obtain the target food; if so, the machine animal starts positioning the target food.
Further, the machine animal recognizing the sound wave and determining, according to the recognition result, whether it is the target food includes: obtaining the food information carried by the sound wave; judging whether the food information is consistent with the food information corresponding to the target food; and, if so, determining that it is the target food.
Further, digesting the obtained target food includes: the machine animal recognizes the type of the target food; and, according to the recognition result and according to the type of the machine animal, the machine animal adopts a corresponding feeding method to digest the target food.
According to another aspect of the present invention, a machine animal is also provided, including: a positioning unit, configured to position a target food and generate a positioning result; an acquiring unit, configured to obtain the target food according to the positioning result; and a digestion unit, configured to digest the obtained target food.
Further, the positioning unit positions the target food and generates the positioning result by way of visual scanning and/or by acquiring images, where the target food is machine food and/or real food.
Further, the positioning unit includes: a receiving module, configured to receive the position signal sent to it by the machine food; a first positioning module, configured to position the target food according to the received position signal and generate the positioning result; and a navigation module, configured to navigate, according to the positioning result, to the position of the target food and obtain the target food.
Further, the positioning unit includes: an acquisition module, configured to communicate with a virtual device in the case where the target food is virtual food, so as to obtain the location information of the virtual food in the virtual device; and a second positioning module, configured to position the virtual food according to the obtained location information and generate the positioning result.
Further, the positioning unit includes: a first recognition module, configured to recognize a sound wave when the user emits it and to determine, according to the recognition result, whether it is the target food; and a third positioning module, configured to position the target food in the case where the recognition result determines that it is the target food.
Further, the first recognition module includes: an acquisition submodule, configured to obtain the food information carried by the sound wave; a judging submodule, configured to judge whether the food information is consistent with the food information corresponding to the target food; and a determination submodule, configured to determine that it is the target food when the food information is judged consistent with the food information corresponding to the target food.
Further, the digestion unit includes: a second recognition module, configured to recognize the type of the target food; and a digestion module, configured to adopt, according to the recognition result and according to the type of the machine animal, a corresponding feeding method to digest the target food.
Further, the digestion module includes: a suction module, configured to suck the target food into the body; and/or an adsorption module, configured to adsorb the target food onto the body of the machine animal.
In the embodiments of the present invention, virtual feeding is applied to a machine animal, where the method includes the machine animal carrying out virtual feeding in the following manner: positioning a target food and generating a positioning result; obtaining the target food according to the positioning result; and digesting the obtained target food. This achieves the purpose of "feeding" between a human and a machine animal. Since in real life people seldom have the opportunity to interact with rare species or species that have disappeared, the virtual feeding approach achieves the technical effect of simulating the interaction between a human and a machine animal, and thereby solves the technical problem in the related art that there is no method for virtual "feeding" between a human and a machine animal.
Description of the drawings
The drawings described here are provided for a further understanding of the present invention and form part of this application. The illustrative embodiments of the present invention and their description are used to explain the present invention and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a flowchart of an optional virtual feeding method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of optionally feeding with machine food or real food according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of optionally feeding with machine food according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of optionally feeding with virtual food according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of an optional machine animal according to an embodiment of the present invention.
Detailed description of the embodiments
In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the scope of protection of the present invention.
It should be noted that the terms "first", "second", and the like in the description, claims, and drawings of this specification are used to distinguish between similar objects, not to describe a specific order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present invention described here can be implemented in orders other than those illustrated or described here. In addition, the terms "comprising" and "having", and any variants thereof, are intended to cover non-exclusive inclusion.
According to an embodiment of the present invention, a method embodiment of virtual feeding is provided.
Fig. 1 is a flowchart of an optional virtual feeding method according to an embodiment of the present invention. As shown in Fig. 1, the method is applied to a machine animal, and the machine animal carries out virtual feeding in the following manner, including the following steps:
S102: position a target food and generate a positioning result;
S104: obtain the target food according to the positioning result;
S106: digest the obtained target food.
That is, the machine animal first positions the target food and determines related information such as its position; then, according to the positioning result (i.e., the related information), the machine animal obtains the target food; finally, it digests the target food in one of various ways. It should be noted that the machine animal may model an animal that currently exists on Earth, such as a panda, kangaroo, or koala; an animal that has become extinct on Earth, such as a dinosaur; or one of the various creatures that do not exist on Earth and have been invented by humans.
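The three-step flow above (S102, S104, S106) can be sketched in code. Everything here is an assumption for illustration: the patent prescribes no data model, class names, or APIs, only the locate / acquire / digest sequence.

```python
from dataclasses import dataclass

@dataclass
class PositioningResult:
    # Hypothetical representation of the "positioning result" (S102's output).
    x: float
    y: float

class RobotAnimal:
    # Minimal stand-in for the machine animal; all names are assumptions.
    def __init__(self):
        self.stomach = []

    def locate(self, food):
        # S102: position the target food and generate a positioning result.
        return PositioningResult(*food["position"])

    def acquire(self, food, result):
        # S104: move to the positioned location and obtain the food.
        return food["name"]

    def digest(self, item):
        # S106: digest the obtained target food (here: just record it).
        self.stomach.append(item)

def virtual_feed(animal, food):
    # The three steps of Fig. 1, in order.
    result = animal.locate(food)
    item = animal.acquire(food, result)
    animal.digest(item)
    return result
```

For example, feeding `{"name": "bamboo", "position": (1.0, 2.0)}` to a `RobotAnimal` leaves `"bamboo"` in its `stomach` list.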
In the embodiments of the present invention, virtual feeding is applied to a machine animal, where the method includes the machine animal carrying out virtual feeding in the following manner: positioning a target food; obtaining the target food according to the positioning result; and digesting the obtained target food. This achieves the purpose of "feeding" between a human and a machine animal. Since in real life people seldom have the opportunity to interact with rare species or species that have disappeared, the virtual feeding approach achieves the technical effect of simulating the interaction between a human and a machine animal, and thereby solves the technical problem in the related art that there is no method for "feeding" between a human and a machine animal. In addition, machine animals using the method of the present invention can be assembled into a machine-animal park; when tourists view the machine animals, they can interact with them, for example through a feeding attraction, which brings the tourists fun, lets them experience a different kind of entertainment, and makes a zoo more enjoyable.
Optionally, as shown in Fig. 2, in the case where the target food is machine food and/or real food, the machine animal positioning the target food and generating the positioning result includes: the machine animal positions the target food and generates the positioning result by way of visual scanning and/or by acquiring images.
Here, the machine food may be, but is not limited to, a ferrous object, which can simulate various real foods, such as machine vegetables or machine meat. The real food may be, but is not limited to, real-world objects such as plant leaves or paper scraps. The machine animal distinguishes the positions of the machine food and/or real food by machine vision or by acquiring and processing images, and thereby positions the machine food and/or real food.
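The "acquiring images" variant of the positioning step can be illustrated with a toy grid scan. A real robot would run object detection on camera frames; this hypothetical `locate_in_image` helper only shows the shape of the step, scanning a labelled 2-D grid and reporting where the food was found.

```python
def locate_in_image(image, food_label):
    # Scan a labelled 2-D grid (a stand-in for an acquired camera frame)
    # and return the (row, col) of the first cell matching the food label,
    # or None when the food is not in view.
    for r, row in enumerate(image):
        for c, cell in enumerate(row):
            if cell == food_label:
                return (r, c)
    return None
```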
Optionally, as shown in Fig. 3, in the case where the target food is machine food, the machine animal positioning the target food includes: the machine animal receives the position signal sent to it by the machine food; the machine animal positions the target food according to the received position signal; and, according to the positioning result, the machine animal navigates to the position of the target food. In the case where a user feeds the machine animal with machine food, the machine food can communicate with the machine animal by some communication mode and send it a position signal; the machine animal then uses the position signal to position the food and navigates to the position of the machine food.
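The receive-signal, position, navigate sequence of this embodiment can be sketched as follows. The signal encoding and the step-wise navigation are assumptions; the patent only says the machine food emits a position signal and the machine animal navigates to the resulting position.

```python
import math

def locate_from_signal(signal):
    # Turn a received position signal into a target position. The dict
    # encoding is an assumption; the patent allows electromagnetic, light,
    # or sound waves but specifies no format.
    return (signal["x"], signal["y"])

def navigate(start, target, step=0.5):
    # Move toward the target in fixed-length steps until within one step,
    # then snap to the target; returns the path taken.
    x, y = start
    tx, ty = target
    path = [(x, y)]
    while True:
        d = math.hypot(tx - x, ty - y)
        if d <= step:
            break
        x += step * (tx - x) / d
        y += step * (ty - y) / d
        path.append((x, y))
    path.append((tx, ty))
    return path
```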
Optionally, the manner in which the machine food sends the position signal to the machine animal includes at least one of the following: electromagnetic waves; light waves; sound waves. It should be noted that the communication modes between the machine food and the machine animal include, but are not limited to, electromagnetic waves, light waves, and sound waves.
Optionally, the machine animal digesting the obtained target food includes: the machine animal sucks the target food into its body through a suction device; or the machine animal adsorbs the target food onto its body through an electromagnetic induction device. When the machine food is made of iron or another material that supports electromagnetic induction, the machine animal can use the electromagnetic induction device to adsorb the target food onto its own body (such as the neck or head of the machine animal, among other positions); alternatively, the machine animal's lips can automatically suck the machine food into its mouth.
Optionally, in the case where the target food is virtual food, the machine animal positioning the target food includes: the machine animal communicates with a virtual device to obtain the location information of the virtual food; and positions the virtual food according to the obtained location information.
In this case, the user needs to put on a virtual device, which may be, but is not limited to, AR equipment (augmented reality equipment), MR equipment (mixed reality equipment), and the like. Through augmented reality and mixed reality technology, the user sees the virtual food superimposed on the machine animal; as the machine animal's "eating" action proceeds, the virtual food slowly shrinks and disappears, and through the virtual device the user watches the process of the machine animal digesting the virtual food. The types of virtual food are varied and can be changed. It should be noted that the virtual device is equipment that enables the user to experience and participate in the virtual world; it is not imaginary equipment.
Optionally, as shown in Fig. 4, the communication mode by which the machine animal communicates with the virtual device includes at least one of the following: WiFi, near-field communication, Bluetooth, ZigBee. That is, the machine animal communicates with the virtual device through a communication mode such as WiFi, near-field communication, Bluetooth, or ZigBee (a low-power local-area wireless communication technology based on the IEEE 802.15.4 standard), obtains the location information of the virtual food, and can then perform the "eating" action. The virtual food can be presented in many different ways.
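Fetching the virtual food's location from the virtual device might look like the sketch below. The transport names mirror the four modes listed above, but the `device` dict and the function itself are hypothetical stand-ins for a real device connection.

```python
def query_virtual_food(device, transport="wifi"):
    # Ask the AR/MR device for the virtual food's location over one of the
    # four transports named in the text; `device` is a plain dict standing
    # in for a real device connection.
    supported = {"wifi", "nfc", "bluetooth", "zigbee"}
    if transport not in supported:
        raise ValueError(f"unsupported transport: {transport}")
    return device["virtual_food_position"]
```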
Optionally, the machine animal positioning the target food includes: when the user emits a sound wave, the machine animal recognizes the sound wave and determines, according to the recognition result, whether to obtain the target food; if so, the machine animal starts positioning the target food. If it determines from the recognition result that it is not the target food, the machine animal keeps its original state. For example, when the machine animal is a panda, the user can say the word "panda", or emit another sound wave the machine animal can recognize; the machine animal judges the sound wave emitted by the user, and when it judges that the target food should be obtained, it starts positioning the target food; if it judges that the sound wave is not one the machine animal recognizes, the machine animal does not position the target food. In addition, different sound waves can be defined for different types of machine animals: a dinosaur machine corresponds to a dinosaur sound wave, a panda machine to a panda sound wave, and so on, and different types of machine animals use voice recognition to tell whether the food is edible for them.
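The sound-wave gate described above, which compares the food information carried by the user's sound wave against the target food and starts positioning only on a match, can be sketched as:

```python
def handle_sound_wave(current_state, sound_wave, target_food):
    # Gate from the embodiment: extract the food information carried by the
    # user's sound wave, compare it with the target food, and switch to
    # positioning only on a match; otherwise keep the original state.
    # The dict-with-a-"food"-label encoding is an assumption.
    food_info = sound_wave.get("food")
    if food_info == target_food:
        return "positioning"
    return current_state
```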
Optionally, the machine animal digesting the obtained target food includes: the machine animal recognizes the type of the target food; and, according to the recognition result and according to the type of the machine animal, adopts a corresponding feeding method. That is, the machine animal can recognize the type of the target food; for example, when the recognized category of the target food is paper scraps or leaves, the machine animal may adopt different feeding methods for the paper scraps and for the leaves, or the same feeding method. Alternatively, depending on the type of machine animal, it decides whether to suck the target food into its body and/or adsorb it onto the body of the machine animal.
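A minimal sketch of choosing a feeding method from the recognized food type and the animal type follows. The dispatch rule is invented for illustration; the patent only names suction into the body and adsorption onto it as the two outcomes.

```python
def choose_feeding_method(animal_type, food_type):
    # Pick a digestion method from the recognized food type and the animal
    # type. The rule is a hypothetical example: machine food, which can
    # support electromagnetic induction, is adsorbed by animals assumed to
    # carry an adsorption device; everything else is sucked into the body.
    adsorbing_animals = {"panda"}  # hypothetical capability table
    if food_type == "machine" and animal_type in adsorbing_animals:
        return "adsorb"
    return "suck"
```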
Embodiment 2
According to an embodiment of the present invention, a machine animal embodiment is also provided. Fig. 5 is a schematic diagram of an optional machine animal according to an embodiment of the present invention. As shown in Fig. 5, a machine animal is provided, including: a positioning unit 20, configured to position a target food and generate a positioning result; an acquiring unit 40, configured to obtain the target food according to the positioning result; and a digestion unit 60, configured to digest the obtained target food.
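The unit decomposition of Fig. 5 can be sketched as three cooperating classes. The unit numbers 20, 40, and 60 come from the figure; every interface name below is an assumption.

```python
class PositioningUnit:
    # Unit 20: positions the target food and generates a positioning result.
    def locate(self, food):
        return food["position"]

class AcquiringUnit:
    # Unit 40: obtains the target food according to the positioning result.
    def acquire(self, food, result):
        return food["name"]

class DigestionUnit:
    # Unit 60: digests the obtained target food (here: just records it).
    def __init__(self):
        self.digested = []

    def digest(self, item):
        self.digested.append(item)

class MachineAnimal:
    # Composition of the three units of Fig. 5.
    def __init__(self):
        self.positioning = PositioningUnit()
        self.acquiring = AcquiringUnit()
        self.digestion = DigestionUnit()

    def feed(self, food):
        result = self.positioning.locate(food)
        item = self.acquiring.acquire(food, result)
        self.digestion.digest(item)
        return item
```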
Optionally, the positioning unit positions the target food and generates the positioning result by way of visual scanning and/or by acquiring images, where the target food is machine food and/or real food.
Optionally, the positioning unit includes: a receiving module, configured to receive the position signal sent to it by the machine food; a first positioning module, configured to position the target food according to the received position signal and generate the positioning result; and a navigation module, configured to navigate, according to the positioning result, to the position of the target food and obtain the target food.
Optionally, the positioning unit includes: an acquisition module, configured to communicate with a virtual device in the case where the target food is virtual food, so as to obtain the location information of the virtual food in the virtual device; and a second positioning module, configured to position the virtual food according to the obtained location information and generate the positioning result.
Optionally, the positioning unit includes: a first recognition module, configured to recognize a sound wave when the user emits it and to determine, according to the recognition result, whether it is the target food; and a third positioning module, configured to position the target food in the case where the recognition result determines that it is the target food.
Optionally, the first recognition module includes: an acquisition submodule, configured to obtain the food information carried by the sound wave; a judging submodule, configured to judge whether the food information is consistent with the food information corresponding to the target food; and a determination submodule, configured to determine that it is the target food when the food information is judged consistent with the food information corresponding to the target food.
Optionally, the digestion unit includes: a second recognition module, configured to recognize the type of the target food; and a digestion module, configured to adopt, according to the recognition result and according to the type of the machine animal, a corresponding feeding method to digest the target food.
Optionally, the digestion module includes: a suction module, configured to suck the target food into the body; and/or an adsorption module, configured to adsorb the target food onto the body of the machine animal.
In the embodiments of the present invention, the purpose of "feeding" between a human and a machine animal is achieved. Since in real life people seldom have the opportunity to interact with rare species or species that have disappeared, the virtual feeding approach achieves the technical effect of simulating the interaction between a human and a machine animal, and thereby solves the technical problem in the related art that there is no method for "feeding" between a human and a machine animal. In addition, machine animals using the method of the present invention can be assembled into a machine-animal park; when tourists view the machine animals, they can interact with them, for example through a feeding attraction, which brings the tourists fun, lets them experience a different kind of entertainment, and makes a zoo more enjoyable.
It should be noted that the machine animal in Embodiment 2 corresponds to the machine animal in Embodiment 1, and the machine animal in Embodiment 2 can be applied in Embodiment 1; for details, please refer to Embodiment 1, which are not repeated here.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed technical content can be realized in other ways. The apparatus embodiments described above are merely exemplary. For example, the division of the units may be a division by logical function, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Further, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be realized in the form of hardware or in the form of a software functional unit.
If the integrated unit is realized in the form of a software functional unit and is sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a read-only memory (ROM), a random-access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The above are only preferred embodiments of the present invention. It should be noted that those of ordinary skill in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications shall also be regarded as falling within the scope of protection of the present invention.

Claims (18)

1. A virtual feeding method for a machine animal, wherein the machine animal performs virtual feeding in the following manner:
locating a target food and generating a locating result;
acquiring the target food according to the locating result; and
digesting the acquired target food.
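As an illustrative, non-limiting sketch, the three steps of claim 1 (locate, acquire, digest) may be modeled as a simple pipeline. All class and method names below (`MachineAnimal`, `locate`, `acquire`, `digest`, `LocatingResult`) are hypothetical and are not specified by the patent:

```python
from dataclasses import dataclass

# Hypothetical sketch of the claim-1 pipeline: locate -> acquire -> digest.

@dataclass
class LocatingResult:
    x: float
    y: float

class MachineAnimal:
    def locate(self, target_food):
        """Locate the target food and generate a locating result."""
        # Placeholder: a real robot would use vision or signals (claims 2-4).
        return LocatingResult(x=target_food["x"], y=target_food["y"])

    def acquire(self, result):
        """Move to the located position and acquire the food."""
        return {"position": (result.x, result.y), "acquired": True}

    def digest(self, food):
        """Digest the acquired food (e.g., draw it into the body, claim 5)."""
        return food["acquired"]

animal = MachineAnimal()
result = animal.locate({"x": 1.0, "y": 2.0})
food = animal.acquire(result)
print(animal.digest(food))  # True
```

The sketch only fixes the order of the three claimed steps; each step's internal behavior is left open, as the dependent claims refine them in different ways.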
2. The method according to claim 1, wherein, when the target food is machine food and/or real food, locating the target food and generating the locating result comprises:
the machine animal locating the target food and generating the locating result by visual scanning and/or image acquisition.
3. The method according to claim 2, wherein, when the target food is the machine food, locating the target food and generating the locating result, and acquiring the target food according to the locating result, comprise:
the machine animal receiving a position signal emitted to it by the machine food;
the machine animal locating the target food and generating the locating result according to the received position signal; and
the machine animal navigating, according to the locating result, to the position of the target food and acquiring the target food.
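The navigation step of claim 3 can be sketched as a simple step-by-step homing loop toward the position derived from the received signal. The function name, fixed step size, and tolerance below are assumptions for illustration only:

```python
import math

# Illustrative sketch of claim 3: given a locating result derived from the
# machine food's position signal, move toward it in fixed-size steps until
# within a tolerance radius.

def navigate_to(animal_pos, food_pos, step=0.5, tolerance=0.25):
    """Move the animal toward the food position until within tolerance."""
    x, y = animal_pos
    fx, fy = food_pos
    while math.hypot(fx - x, fy - y) > tolerance:
        dist = math.hypot(fx - x, fy - y)
        # Advance one step along the unit vector toward the food.
        x += step * (fx - x) / dist
        y += step * (fy - y) / dist
    return (x, y)

final = navigate_to((0.0, 0.0), (3.0, 4.0))
print(math.hypot(3.0 - final[0], 4.0 - final[1]) <= 0.25)  # True
```

A real machine animal would replace this straight-line motion with its actual drive and obstacle-avoidance system; the sketch only shows the role of the locating result in guiding motion.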
4. The method according to claim 3, wherein the machine food sends the position signal to the machine animal by at least one of:
an electromagnetic wave;
a light wave;
a sound wave.
5. The method according to claim 1, wherein digesting the acquired target food comprises:
the machine animal drawing the target food into its body; or
the machine animal adsorbing the target food onto the body of the machine animal.
6. The method according to claim 1, wherein, when the target food is virtual food, locating the target food and generating the locating result comprises:
the machine animal communicating with a virtual device to obtain position information of the virtual food in the virtual device; and
locating the virtual food and generating the locating result according to the obtained position information.
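The exchange in claim 6 can be sketched as a query to the virtual device for a virtual food's position. The message format and identifiers below are assumptions; the underlying link could be any of the claim-7 options (WIFI, NFC, Bluetooth, Zigbee):

```python
# Illustrative sketch of claim 6: the machine animal asks a virtual device
# for the position information of a virtual food item and uses the reply
# as its locating result. The dict-based state is a stand-in for a real
# wireless query.

def query_virtual_device(device_state, food_id):
    """Obtain the position information of a virtual food from the device."""
    return device_state["foods"].get(food_id)

device = {"foods": {"apple_01": {"x": 2.0, "y": 3.0}}}
pos = query_virtual_device(device, "apple_01")
print(pos)  # {'x': 2.0, 'y': 3.0}
```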
7. The method according to claim 6, wherein the machine animal communicates with the virtual device by at least one of:
WIFI, near-field communication (NFC), Bluetooth, Zigbee.
8. The method according to claim 1, wherein locating the target food comprises:
when a user emits a sound wave, the machine animal recognizing the sound wave and determining, according to the recognition result, whether it indicates the target food; and
if so, the machine animal locating the target food.
9. The method according to claim 8, wherein the machine animal recognizing the sound wave and determining, according to the recognition result, whether it indicates the target food comprises:
obtaining food information carried by the sound wave;
judging whether the food information is consistent with food information corresponding to the target food; and
if so, determining that it is the target food.
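The matching step of claims 8–9 can be sketched as extracting the food information carried by the sound wave and comparing it against the target food's information. The dict-based "decoding" below is an assumption; a real system would apply acoustic or speech recognition:

```python
# Illustrative sketch of claims 8-9: compare the food information carried
# by a user's sound wave with the food information of the target food.

def extract_food_info(sound_wave):
    """Obtain the food information carried by the sound wave."""
    return sound_wave.get("food_info")

def is_target_food(sound_wave, target_food_info):
    """Judge whether the carried food info matches the target food's info."""
    return extract_food_info(sound_wave) == target_food_info

wave = {"food_info": "bone"}
print(is_target_food(wave, "bone"))  # True
print(is_target_food(wave, "fish"))  # False
```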
10. The method according to claim 1, wherein digesting the acquired target food comprises:
the machine animal recognizing the type of the target food; and
the machine animal digesting the target food using a corresponding feeding method according to the recognition result and the type of the machine animal.
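The dispatch in claim 10 can be sketched as a lookup keyed on the recognized food type and the kind of machine animal. The animal types, food types, and method table below are hypothetical examples, not defined by the patent:

```python
# Illustrative sketch of claim 10: choose a feeding method based on the
# recognized food type and the type of machine animal.

FEEDING_METHODS = {
    ("machine_dog", "machine_food"): "suction",    # draw into the body
    ("machine_dog", "real_food"): "adsorption",    # adsorb onto the body
    ("machine_fish", "virtual_food"): "virtual",   # consume in software
}

def digest(animal_type, food_type):
    """Pick the feeding method for this animal/food combination."""
    return FEEDING_METHODS.get((animal_type, food_type), "unsupported")

print(digest("machine_dog", "machine_food"))  # suction
print(digest("machine_cat", "real_food"))     # unsupported
```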
11. A machine animal, comprising:
a locating unit configured to locate a target food and generate a locating result;
an acquiring unit configured to acquire the target food according to the locating result; and
a digesting unit configured to digest the acquired target food.
12. The machine animal according to claim 11, wherein the locating unit locates the target food and generates the locating result by visual scanning and/or image acquisition, and the target food is machine food and/or real food.
13. The machine animal according to claim 11, wherein the locating unit comprises:
a receiving module configured to receive a position signal emitted to it by machine food;
a first locating module configured to locate the target food and generate the locating result according to the received position signal; and
a navigation module configured to navigate, according to the locating result, to the position of the target food and acquire the target food.
14. The machine animal according to claim 11, wherein the locating unit comprises:
an obtaining module configured to, when the target food is virtual food, communicate with a virtual device to obtain position information of the virtual food in the virtual device; and
a second locating module configured to locate the virtual food and generate the locating result according to the obtained position information.
15. The machine animal according to claim 11, wherein the locating unit comprises:
a first recognition module configured to, when a user emits a sound wave, recognize the sound wave and determine, according to the recognition result, whether it indicates the target food; and
a third locating module configured to locate the target food when the recognition result indicates that it is the target food.
16. The machine animal according to claim 15, wherein the first recognition module comprises:
an obtaining submodule configured to obtain food information carried by the sound wave;
a judging submodule configured to judge whether the food information is consistent with food information corresponding to the target food; and
a determining submodule configured to determine that it is the target food when the food information is consistent with the food information corresponding to the target food.
17. The machine animal according to claim 11, wherein the digesting unit comprises:
a second recognition module configured to recognize the type of the target food; and
a digesting module configured to digest the target food using a corresponding feeding method according to the recognition result and the type of the machine animal.
18. The machine animal according to claim 17, wherein the digesting module comprises:
a suction module configured to draw the target food into the body; and/or
an adsorption module configured to adsorb the target food onto the body of the machine animal.
CN201611213583.1A 2016-12-23 2016-12-23 Virtual feeding method of machine animal and machine animal Active CN108236786B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611213583.1A CN108236786B (en) 2016-12-23 2016-12-23 Virtual feeding method of machine animal and machine animal

Publications (2)

Publication Number Publication Date
CN108236786A (en) 2018-07-03
CN108236786B (en) 2020-11-27

Family

ID=62704547

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611213583.1A Active CN108236786B (en) 2016-12-23 2016-12-23 Virtual feeding method of machine animal and machine animal

Country Status (1)

Country Link
CN (1) CN108236786B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2945320A (en) * 1957-12-23 1960-07-19 Marvin I Glass Dynamic toy simulating a feeding bird or animal
JPH08323047A (en) * 1995-05-31 1996-12-10 Iwaya Co Ltd Action toy
US5924387A (en) * 1995-11-08 1999-07-20 Schramer; D. Gregory Interactive pet toy
CN1562575A (en) * 2004-03-25 2005-01-12 上海交通大学 Robot of feeding paper for copying machine automatically
CN101239239A (en) * 2007-02-08 2008-08-13 陈灿龄 Feeding game device and method thereof
CN101648079A (en) * 2009-09-02 2010-02-17 杭州全动科技有限公司 Emotional doll
CN201931466U (en) * 2010-12-31 2011-08-17 东莞理工学院 Manipulator structure with machine vision navigating function
CN102527045A (en) * 2012-01-14 2012-07-04 李慈 Intelligent learning doll and realizing method and circuit system thereof
CN202342857U (en) * 2011-11-29 2012-07-25 梁湘婕 Rag baby
CN103522291A (en) * 2013-10-29 2014-01-22 中国人民解放军总装备部军械技术研究所 Target capturing system and method of explosive ordnance disposal robot
CN103699136A (en) * 2014-01-14 2014-04-02 河海大学常州校区 Intelligent household service robot system and service method based on leapfrogging algorithm
CN103745615A (en) * 2014-01-08 2014-04-23 赵伟 Smart camera and WiFi (wireless fidelity) location positioning-based park parking space navigation system
CN104067781A (en) * 2014-06-16 2014-10-01 华南农业大学 Virtual robot and real robot integration based picking system and method
CN203965888U (en) * 2013-12-30 2014-11-26 深圳市德宝威科技有限公司 Robot system and robot office, teaching, design, engineering, home system
CN205394558U (en) * 2015-12-23 2016-07-27 无锡吾芯互联科技有限公司 Family expenses intelligent robot that helps elderly
CN107596698A (en) * 2017-09-27 2018-01-19 深圳市天博智科技有限公司 A kind of control system and implementation method of Intelligent bionic machinery dog
CN207682392U (en) * 2017-11-13 2018-08-03 惠州壹创意互动科技有限公司 Robot with identification crowd's position functions


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110796043A (en) * 2019-10-16 2020-02-14 北京海益同展信息科技有限公司 Container detection and feeding detection method and device and feeding system
CN110796043B (en) * 2019-10-16 2021-04-30 北京海益同展信息科技有限公司 Container detection and feeding detection method and device and feeding system

Also Published As

Publication number Publication date
CN108236786B (en) 2020-11-27

Similar Documents

Publication Publication Date Title
US9836929B2 (en) Mobile devices and methods employing haptics
KR101832693B1 (en) Intuitive computing methods and systems
CN106462725A (en) Systems and methods of monitoring activities at a gaming venue
CN106060520B (en) A kind of display mode switching method and its device, intelligent terminal
CN101923669A (en) Intelligent adaptive design
US10019628B1 (en) Incentivizing foodstuff consumption through the use of augmented reality features
CN105632263A (en) Augmented reality-based music enlightenment learning device and method
CN107728482A (en) Control system, control process method and device
CN108744516A (en) Obtain method and apparatus, storage medium and the electronic device of location information
US20140314327A1 (en) Systems and Methods for Computer Recognition of Plush Toys
WO2006100513A1 (en) Manipulable interactive devices
CN108108996A (en) Advertisement placement method, device, computer equipment and readable medium in video
CN106406537A (en) Display method and device
CN108236786A (en) The virtual feeding method and machine animal of machine animal
Szklanny et al. Creating an interactive and storytelling educational physics app for mobile devices
CN106960475A (en) Click on processing method and processing device, storage medium and processor in the position of threedimensional model
Schraffenberger Arguably augmented reality: relationships between the virtual and the real
Tang et al. Emerging human-toy interaction techniques with augmented and mixed reality
CN108269460B (en) Electronic screen reading method and system and terminal equipment
CN108089833A (en) The method of intelligent mobile terminal and its broadcasting music, the device with store function
CN110448903A (en) Determination method, apparatus, processor and the terminal of control strategy in game
CN108681398A (en) Visual interactive method and system based on visual human
CN108833671A (en) Electronic device falls detection method and Related product
TWM497315U (en) A kind perception of touch with the augmented reality functionality and device tags
De Albuquerque et al. IoT4Fun Rapid Prototyping Toolkit for Smart Toys

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant