CN215117126U - Intelligent projection equipment - Google Patents
- Publication number
- CN215117126U CN215117126U CN202120878548.1U CN202120878548U CN215117126U CN 215117126 U CN215117126 U CN 215117126U CN 202120878548 U CN202120878548 U CN 202120878548U CN 215117126 U CN215117126 U CN 215117126U
- Authority
- CN
- China
- Prior art keywords
- projection device
- signal
- virtual pet
- projection
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Controls And Circuits For Display Device (AREA)
Abstract
The embodiments of the utility model relate to the technical field of intelligent equipment and disclose an intelligent projection device comprising a projection device, a rotating device, a sensor assembly, and a controller. The rotating device drives the projection device to rotate, and the sensor assembly collects instruction signals from a user and/or environment signals from a real space. After receiving an instruction signal, the controller sends the projection device a projection signal controlling the state of a virtual pet and sends the rotating device an action signal controlling the pet's movement, so the virtual pet presents different states and actions according to the user's instructions and thereby interacts with the user. Likewise, after receiving an environment signal, the controller sends the projection signal and the action signal, so the virtual pet presents different states and behaviors according to its environment and thereby interacts with the environment.
Description
Technical Field
The embodiments of the utility model relate to smart device technology, and in particular to an intelligent projection device.
Background
With the development of society, the pace of work and life keeps accelerating and mental stress keeps rising. Keeping a pet can ease the lack of companionship, help regulate mood, and add pleasure to life; however, most people lack the time and energy to care for a pet and give up on keeping one.
Electronic game pets on the market are limited to operating on electronic screens such as mobile phones and computers: the pets cannot move or walk in real space and do not interact with the real world, so they feel very different from real pets. Some AI robot pets can walk in real space, but they are expensive, and their actions and expressions are simple and not rich enough.
SUMMARY OF THE UTILITY MODEL
The main technical problem solved by the embodiments of the utility model is to provide an intelligent projection device with which a virtual pet can not only move about in real space but also present different states and actions, enabling rich interactive activity.
To solve this technical problem, an embodiment of the utility model provides an intelligent projection device, comprising:
the projection device is used for projecting the virtual pet to a real space;
the rotating device is used for driving the projection device to rotate so as to enable the virtual pet to move in the real space;
a sensor assembly for acquiring an instruction signal of a user and/or an environment signal of the real space;
the controller is connected to the projection device, the rotating device, and the sensor assembly respectively; after receiving the instruction signal, the controller sends a projection signal to the projection device and an action signal to the rotating device, and/or after receiving the environment signal, the controller sends the projection signal to the projection device and the action signal to the rotating device;
the projection signal is used for controlling the state of the virtual pet, and the action signal is used for controlling the action of the virtual pet.
In some embodiments, the sensor assembly comprises at least one camera.
In some embodiments, at least one of the cameras is mounted on the rotating device.
In some embodiments, the sensor assembly comprises a microphone.
In some embodiments, the sensor assembly further comprises a temperature sensor.
In some embodiments, the smart projection device further comprises a broadcaster connected to the controller.
In some embodiments, the intelligent projection device further comprises a communication module connected to the controller and configured to communicatively connect with a mobile terminal.
In some embodiments, the intelligent projection device further comprises at least one of a USB interface, an audio interface, a VGA interface, a DVI interface, an HDMI interface, an MHL interface, and a memory expansion interface.
In some embodiments, the intelligent projection device further comprises a power supply device electrically connected to the projection device, the rotating device, the sensor assembly, and the controller respectively.
In some embodiments, the intelligent projection device further comprises a bracket, and the projection device, the rotating device, the sensor assembly, and the controller are housed on the bracket.
Beneficial effects of the embodiments of the utility model: different from the prior art, the embodiments of the present invention provide an intelligent projection device comprising a projection device, a rotating device, a sensor assembly, and a controller, with the controller connected to the projection device, the rotating device, and the sensor assembly respectively. The projection device projects a virtual pet into a real space, the rotating device drives the projection device to rotate so that the virtual pet moves in the real space, and the sensor assembly collects instruction signals from a user and/or environment signals from the real space. When the controller receives an instruction signal, it sends a projection signal controlling the state of the virtual pet to the projection device and an action signal controlling the movement of the virtual pet to the rotating device, so the virtual pet can present different states and behaviors according to the user's instructions and thereby interact with the user. After receiving an environment signal, the controller likewise sends a projection signal and an action signal, so the virtual pet can present different states and behaviors according to the environment and thereby interact with the environment.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings. Like reference numerals refer to similar elements, and the figures are not to scale unless otherwise specified.
Fig. 1 is a schematic view of an application environment of an intelligent projection apparatus according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a hardware structure of the intelligent projection apparatus provided in an embodiment of the present invention.
Detailed Description
In order to facilitate understanding of the present invention, the present invention will be described in more detail with reference to the accompanying drawings and specific embodiments. It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may be present. The terms "vertical," "horizontal," "left," "right," and the like as used herein are for descriptive purposes only.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Please refer to fig. 1, which is a schematic diagram of an application scenario of the intelligent projection apparatus according to an embodiment of the present invention. As shown in fig. 1, the application scenario 100 includes a smart projection device 10 and a real space 20.
The real space 20 may be a user's living room or office, etc. For example, the real space 20 shown in fig. 1 includes a desk area 21, a stand 22, a flowerpot 23, a pet rest area 24, a bay window 25, and a door 26. The intelligent projection device 10 is placed on the stand 22, though it may also be suspended from a ceiling (not shown) of the real space or placed on a desktop. It is understood that neither the placement of the smart projection device 10 nor the real space 20 itself is limited, as long as the device can project into the real space 20; the application scenario in fig. 1 is only an exemplary illustration and does not limit the application scenarios of the intelligent projection device.
It is understood that the smart projection device stores a pet status database defining the features of pets, including a pet type library, an action library, a food library, a skin color library, and a texture library. The pet type library includes crawling pets (e.g., cats, dogs, lizards), flying pets (e.g., birds, butterflies, bees), and non-realistic pets (e.g., cartoon-shaped pets such as magic fairies and robots). The action library contains the actions the pet can perform, such as walking, circling, rolling, shaking its head, wagging its tail, and sleeping; the food library contains foods the pet can eat, such as bananas, apples, cakes, and dried fish. The skin color library provides optional skins for the virtual pet, for example red, blue, green, or multicolor, and the texture library provides optional textures, for example heart, leopard, tiger, flower, polka dot, and zebra stripe, so a user can obtain a favorite pet appearance by choosing skin color and texture. Pet types, actions, and foods can be combined and matched with each other, for example a kitten rolling while eating dried fish, which simulates animals more realistically and makes the virtual pet vivid; by changing the pet type, the user can experience raising different pets, such as a lizard this month and a dog next month. The user can also update and maintain the pet status database, continuously enriching the pet features and improving the playability of the virtual pet: for example, the user may download feature data from the product's official website, and those skilled in the art may upload feature data made according to an open standard to the official website for users to download.
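To make the database structure concrete, the following is a minimal sketch of how such a pet status database might be organized; the Python names and entries are illustrative assumptions, not the patent's actual data format.

```python
# Hypothetical layout of the pet status database; all keys and entries
# are assumptions for illustration.
PET_STATUS_DB = {
    "pet_types": {
        "crawling": ["cat", "dog", "lizard"],
        "flying": ["bird", "butterfly", "bee"],
        "non_realistic": ["magic_fairy", "robot"],
    },
    "actions": ["walk", "circle", "roll", "shake_head", "wag_tail", "sleep"],
    "foods": ["banana", "apple", "cake", "dried_fish"],
    "skin_colors": ["red", "blue", "green", "multicolor"],
    "textures": ["heart", "leopard", "tiger", "flower", "polka_dot", "zebra"],
}

def make_pet(category, pet_type, skin_color, texture):
    """Combine database entries into one pet configuration, validating
    each choice against the library it comes from."""
    assert pet_type in PET_STATUS_DB["pet_types"][category]
    assert skin_color in PET_STATUS_DB["skin_colors"]
    assert texture in PET_STATUS_DB["textures"]
    return {"type": pet_type, "skin_color": skin_color, "texture": texture}

# e.g. a blue, leopard-textured kitten
pet = make_pet("crawling", "cat", "blue", "leopard")
```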
As shown in fig. 2, the intelligent projection apparatus 10 includes a projection device 11, a rotation device 12, a sensor assembly 13, and a controller 14, wherein the controller 14 is connected to the projection device 11, the rotation device 12, and the sensor assembly 13, respectively.
The smart projection device 10 may preset a virtual pet: based on the pet status database, it obtains the pet state input by the user and generates the virtual pet. The type, skin color, texture, sound, size, age, etc. of the virtual pet may be set according to the pet status database. The virtual pet may be a crawling animal such as a kitten, a puppy, or a lizard, a flying animal such as a bird, a butterfly, or a bee, or a non-realistic character such as an animation character. It is understood that a default template may also be provided in the smart projection device 10; a user may select the default template directly as the virtual pet, or customize it according to preference, for example by adjusting the skin color and texture. In this way, the user can freely change between virtual pets of different styles and experience raising different pets, such as a magic fairy this month and a dog next month.
The projection device 11 is used for projecting the virtual pet into the real space 20. It is understood that the projection device 11 may be a projector or another device having a projection function; the virtual pet is the image formed where the projected light falls in the real space 20. For example, the projector projects the picture of a kitten onto the ground, and the picture so formed is the virtual pet kitten.
The rotating device 12 is used for driving the projection device 11 to rotate so that the virtual pet moves in the real space 20. For example, the rotating device 12 drives the projection device 11 to rotate toward the window at a certain speed, and combined with the looping in-place walking animation of the virtual pet, the virtual pet walks toward the window at that speed. It can be understood that the walking speed of the virtual pet is determined by the moving speed of the projected picture and the step frequency of the walking animation, and is proportional to the moving speed of the projected picture.
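As a rough illustration of this proportionality, the sketch below computes the angular speed the rotating device would need and the matching animation step rate for a desired walking speed; the small-angle geometry and all parameter values are assumptions, not figures from the patent.

```python
def rotation_speed(pet_speed_m_s, projection_distance_m):
    """Angular speed (rad/s) of the rotating device so that the projected
    picture sweeps the floor at the desired walking speed. Assumes a
    small sweep angle, where arc length ~ distance * angle."""
    return pet_speed_m_s / projection_distance_m

def step_rate(pet_speed_m_s, stride_m):
    """Steps per second of the in-place walking animation, matched to the
    picture's ground speed so the pet's feet do not appear to slide."""
    return pet_speed_m_s / stride_m

# e.g. a pet walking at 0.3 m/s, projected from 2 m away, 0.15 m stride
print(rotation_speed(0.3, 2.0))  # 0.15 rad/s
print(step_rate(0.3, 0.15))      # 2.0 steps per second
```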
It is understood that the rotating device 12 and the projection device 11 may be connected through an auxiliary structure, or the projection device 11 may be mounted on the rotating device 12, so that when the rotating device 12 rotates it drives the projection device 11 to rotate, thereby making the virtual pet move in the real space 20. The rotating device 12 may be a three-dimensional rotating device; for example, it may include a rotating shaft and a rotating base, where the rotating base rotates around the rotating shaft and the rotating shaft itself can turn freely, and the projection device 11 is mounted on the rotating base so that it can rotate freely in three dimensions.
The sensor assembly 13 is used for acquiring instruction signals from the user and/or environment signals of the real space 20. An instruction signal is issued by the user and reflects the user's control intention for the virtual pet; it can take different forms depending on the sensor, for example an image or a sound. An environment signal reflects the actual environment of the real space 20 and may be an image, a radar wave, or the like.
After receiving an instruction signal, the controller 14 sends a projection signal controlling the state of the virtual pet to the projection device 11 and an action signal controlling the movement of the virtual pet to the rotating device 12, so that the virtual pet presents different states and actions according to the user's instructions and interacts with the user. After receiving an environment signal, the controller 14 likewise sends a projection signal to the projection device 11 and an action signal to the rotating device 12, so that the virtual pet presents different states and actions according to the environment and interacts with it.
The state of the virtual pet includes its type, skin color, and texture; the behavior of the virtual pet comprises its various actions, such as walking, circling, rolling, shaking its head, wagging its tail, lifting a foot, or sleeping.
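The dispatch described above can be pictured as in the following sketch: a controller loop that turns each incoming signal into the two outputs the patent names, a projection signal (state) and an action signal (movement). The signal format and the mapping are placeholders, not the actual firmware.

```python
def send_to_projector(projection_signal):
    # Stand-in for the link to the projection device 11.
    print("projection signal:", projection_signal)

def send_to_rotator(action_signal):
    # Stand-in for the link to the rotating device 12.
    print("action signal:", action_signal)

def handle_signal(signal):
    """Route one sensor reading to both output channels. 'kind'
    separates user instruction signals from environment signals."""
    if signal["kind"] == "instruction":
        projection = {"state": signal.get("state", "idle")}
        action = {"move": signal.get("action", "stay")}
    else:  # environment signal
        projection = {"state": "adapt_to_surroundings"}
        action = {"move": "replan_path"}
    send_to_projector(projection)
    send_to_rotator(action)

handle_signal({"kind": "instruction", "state": "red_skin", "action": "walk"})
```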
In some embodiments, the sensor assembly 13 includes at least one camera mounted on the rotating device 12, so that the camera changes its viewing direction along with the rotating device 12 and can capture images of the entire real space 20 in all directions. In this embodiment, both the instruction signal and the environment signal may be images.
For example, the camera captures an image of the user that reflects the user's posture. After receiving the image, the controller 14 recognizes it with an existing posture-recognition algorithm to obtain the posture, and then sends the projection device 11 a projection signal that makes the virtual pet imitate it, so that the posture of the projected virtual pet matches the user's. Taking a lizard as an example: when the user raises the right hand, the projected lizard raises its right forefoot, and when the user lifts the left foot, the projected lizard lifts its left forefoot.
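A table-driven sketch of that mirroring follows, using only the two pairings given in the text; the posture labels are hypothetical outputs of whatever recognition algorithm is used.

```python
# Hypothetical posture labels -> pet poses, from the lizard example.
POSTURE_TO_PET_POSE = {
    "raise_right_hand": "raise_right_forefoot",
    "raise_left_foot": "raise_left_forefoot",
}

def mimic(user_posture):
    """Return the pet pose mirroring the recognized user posture,
    falling back to an idle pose for anything unrecognized."""
    return POSTURE_TO_PET_POSE.get(user_posture, "idle")

assert mimic("raise_right_hand") == "raise_right_forefoot"
assert mimic("unknown") == "idle"
```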
In some embodiments, the instruction signal may also be an image reflecting a gesture of the user. After receiving the image, the controller 14 recognizes it with an existing gesture-recognition algorithm to obtain the gesture and sends a corresponding behavior signal to the rotating device 12, which rotates accordingly so that the virtual pet projected by the projection device 11 presents the behavior the signal reflects. For example, when the user beckons, the recognized gesture causes the rotating device 12 to drive the projection device 11 to rotate so that the virtual pet moves from its current position to the front of the user; when the user waves the pet away, the recognized gesture causes the rotating device 12 to drive the projection device to rotate so that the virtual pet leaves from in front of the user.
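A sketch of that gesture-to-behavior mapping, with made-up gesture names and a toy signal format:

```python
def gesture_to_behavior(gesture, user_pos, pet_pos):
    """Map a recognized gesture to a behavior signal for the rotating
    device; positions are (x, y) floor coordinates in meters."""
    if gesture == "beckon":      # wave the pet over
        return {"target": user_pos, "motion": "approach"}
    if gesture == "wave_away":   # send the pet off
        return {"target": None, "motion": "leave"}
    return {"target": pet_pos, "motion": "stay"}

print(gesture_to_behavior("beckon", user_pos=(1.0, 0.5), pet_pos=(3.0, 2.0)))
```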
In some embodiments, the environment signal may be an image of the real space 20. After receiving the image, the controller 14 recognizes it with an existing recognition algorithm to obtain three-dimensional information about the real space 20, including the shapes and sizes of the objects in it, and then sends a corresponding action signal to the rotating device 12, which rotates so that the virtual pet projected by the projection device 11 presents the corresponding behavior. For example, the walking path of the virtual pet bypasses obstacles such as flowerpots and furniture, so that its habits resemble those of a real pet, increasing its realism. As further examples, the virtual pet can be directed to play at the window or to go to a corner to sleep.
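One simple way to realize obstacle-avoiding paths from such three-dimensional information is breadth-first search on an occupancy grid, sketched below; a real device might use a richer planner, and the grid here is invented for illustration.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search on an occupancy grid built from the
    recognized 3D information; cells marked 1 are obstacles such as
    flowerpots and furniture. Returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # no obstacle-free path exists

grid = [[0, 0, 0],
        [0, 1, 0],   # a flowerpot blocks the middle cell
        [0, 0, 0]]
print(plan_path(grid, (0, 0), (2, 2)))  # the path detours around it
```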
It is understood that, after receiving the environment signal, the controller may also send the projection device 11 a projection signal that changes the pet's color, so that the color of the projected virtual pet changes with its surroundings. For example, when the virtual pet lizard crawls onto a red wall its skin is made to turn red, and when it crawls onto a green object its skin is made to turn green.
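A toy version of that camouflage rule: average the pixels of the surface region under the pet (assumed to be segmented out of the camera image already) and use the result as the new skin color.

```python
def dominant_color(pixels):
    """Average a list of (r, g, b) pixels from the surface region
    under the pet and return the mean color for the pet's skin."""
    n = len(pixels)
    return (sum(p[0] for p in pixels) // n,
            sum(p[1] for p in pixels) // n,
            sum(p[2] for p in pixels) // n)

# e.g. a mostly red wall region -> a red skin tone for the lizard
wall_region = [(220, 30, 40), (235, 25, 35), (210, 40, 50)]
print(dominant_color(wall_region))  # (221, 31, 41)
```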
It can also be understood that, after receiving the environment signal, the controller may send the projection device 11 a projection signal for interacting with an object, so that the projected virtual pet interacts with the objects around it. For example, when the user places a banana in a specific area, the virtual pet eats the banana; when the virtual pet encounters a toy on its walking path, it climbs onto the toy to play. In this way the virtual pet interacts with surrounding objects, and its behavior comes closer to that of a real pet.
In some embodiments, the sensor assembly 13 includes a microphone for acquiring the user's voice; in this embodiment the instruction signal is a voice signal reflecting a desired state or action of the virtual pet. For example, the voice signal "skin color becomes red" corresponds to the state "skin turns red", and the voice signal "lie down" corresponds to the action "lie down". After receiving a voice signal, the controller 14 recognizes it with an existing speech-recognition algorithm, obtains the corresponding state and/or action, and gives the virtual pet that state or action by sending the matching projection signal to the projection device 11 and/or the matching action signal to the rotating device 12. For example, when the user says "Hi, Xiyi, I'm back", the virtual pet lizard wakes up and runs to the doorway to greet the user; when the user says "Hi, Xiyi, time to eat", the lizard runs to its feeding spot; and when the user says "Hi, Xiyi, become red", the lizard's skin turns red.
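The phrase-to-signal mapping could be as simple as a lookup table over recognized text, as in this sketch; the wake phrase and command strings are adapted from the examples above, and the speech-recognition step itself is assumed to happen upstream.

```python
# Hypothetical command table keyed on recognized text (wake phrase removed).
VOICE_COMMANDS = {
    "i'm back": {"action": "run_to_door"},
    "time to eat": {"action": "run_to_feeding_spot"},
    "become red": {"state": {"skin_color": "red"}},
    "lie down": {"action": "lie_down"},
}

WAKE = "hi, xiyi, "

def handle_utterance(text):
    """Strip the wake phrase and look up the projection/action signal;
    returns None for unrecognized commands."""
    phrase = text.lower()
    if phrase.startswith(WAKE):
        phrase = phrase[len(WAKE):]
    return VOICE_COMMANDS.get(phrase.strip())

print(handle_utterance("Hi, Xiyi, become red"))  # {'state': {'skin_color': 'red'}}
```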
In some embodiments, the sensor assembly 13 further includes a temperature sensor (not shown) that acquires the temperature of the real space 20, i.e., the ambient temperature, in real time. After receiving the temperature signal, the controller 14 can send the projection device 11 a projection signal reflecting the pet's clothing, so that the projected virtual pet wears clothes suited to the temperature: for example, at 0-10 degrees the virtual pet wears cotton-padded clothes, at 10-20 degrees a jacket, and above 20 degrees a skirt. In this embodiment, the temperature sensor lets the virtual pet dress for the ambient temperature, i.e., interact with it, which adds interest.
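The band logic is a plain threshold cascade; the sketch below assumes the bands are in degrees Celsius and resolves the overlapping band edges with half-open intervals, neither of which the text specifies.

```python
def outfit_for(temp_c):
    """Pick the pet's clothing from the ambient temperature, using the
    bands given in the example (assumed Celsius, half-open intervals)."""
    if temp_c < 10:
        return "cotton_padded_clothes"
    if temp_c < 20:
        return "jacket"
    return "skirt"

for t in (5, 15, 25):
    print(t, "->", outfit_for(t))
```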
It is noted that the controller 14 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a single-chip microcomputer, an ARM (Acorn RISC Machine) processor or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of these components.
In some embodiments, the smart projection device 10 further includes a broadcaster (not shown), i.e., a speaker, that plays sounds in coordination with the virtual pet to increase its realism; for example, when the virtual pet is a kitten, the broadcaster may play meowing in step with it.
In some embodiments, the smart projection device 10 further includes a communication module (not shown) connected to the controller 14 for communicatively coupling to a mobile terminal; the communication module may be, for example, a Bluetooth module or a Wi-Fi module. In this embodiment, the user can set the state and actions of the virtual pet on the mobile terminal.
In some embodiments, the smart projection device 10 further includes at least one of a USB interface, an audio interface, a VGA interface, a DVI interface, an HDMI interface, an MHL interface, and a memory expansion interface, through which the device can be extended accordingly. For example, operating data of the smart projection device 10 may be copied out through the USB interface, or an operating program may be loaded to extend the device's functionality.
In some embodiments, the smart projection device 10 further includes a power supply device electrically connected to the projection device 11, the rotating device 12, the sensor assembly 13, and the controller 14 respectively, so as to power each of them. It will be appreciated that the power supply device may be a rechargeable lithium battery.
In some embodiments, the smart projection device 10 further includes a bracket (not shown) on which the projection device 11, the rotating device 12, the sensor assembly 13, and the controller 14 are housed. It will be appreciated that the bracket may stand on the ground or a desktop, or be suspended from a ceiling.
To sum up, the embodiments of the present invention provide an intelligent projection device 10 comprising a projection device 11, a rotating device 12, a sensor assembly 13, and a controller 14, with the controller 14 connected to the projection device 11, the rotating device 12, and the sensor assembly 13 respectively. The projection device 11 projects a virtual pet into a real space 20; the rotating device 12 drives the projection device 11 to rotate so that the virtual pet moves in the real space 20; and the sensor assembly 13 collects instruction signals from a user and/or environment signals from the real space 20. After receiving an instruction signal, the controller 14 sends a projection signal controlling the state of the virtual pet to the projection device 11 and an action signal controlling the movement of the virtual pet to the rotating device 12, so the virtual pet can present different states and actions according to the user's instructions and interact with the user. After receiving an environment signal, the controller 14 likewise sends a projection signal and an action signal, so the virtual pet can present different states and behaviors according to the environment and interact with it.
It should be noted that while the preferred embodiments of the present invention are described in the specification and drawings, the invention can be realized in many different forms and is not limited to the embodiments described herein; these embodiments are provided not as additional limitations but to make the understanding of the disclosure more thorough and complete. Moreover, the above technical features may be combined with each other to form various embodiments not listed above, all of which are regarded as within the scope of the present invention; further, modifications and variations will occur to those skilled in the art in light of the foregoing description, and all such modifications and variations are intended to fall within the scope of the invention as defined by the appended claims.
Claims (10)
1. An intelligent projection device, comprising:
the projection device is used for projecting the virtual pet to a real space;
the rotating device is used for driving the projection device to rotate so as to enable the virtual pet to move in the real space;
a sensor assembly for acquiring an instruction signal of a user and/or an environment signal of the real space;
the controller is connected to the projection device, the rotating device, and the sensor assembly respectively; after receiving the instruction signal, the controller sends a projection signal to the projection device and an action signal to the rotating device, and/or after receiving the environment signal, the controller sends the projection signal to the projection device and the action signal to the rotating device;
the projection signal is used for controlling the state of the virtual pet, and the action signal is used for controlling the action of the virtual pet.
2. The smart projection device of claim 1, wherein the sensor assembly comprises at least one camera.
3. The intelligent projection device of claim 2, wherein at least one of the cameras is mounted on the rotating device.
4. The smart projection device of claim 1, wherein the sensor assembly comprises a microphone.
5. The smart projection device of claim 1, wherein the sensor assembly further comprises a temperature sensor.
6. The smart projection device of claim 1, further comprising a broadcaster connected to the controller.
7. The intelligent projection device of any one of claims 1-6, further comprising a communication module, the communication module being connected to the controller, the communication module being configured to be communicatively coupled to a mobile terminal.
8. The intelligent projection device of claim 7, further comprising at least one of a USB interface, an audio interface, a VGA interface, a DVI interface, an HDMI interface, an MHL interface, and a memory expansion interface.
9. The intelligent projection device of claim 1, further comprising a power supply device electrically connected to the projection device, the rotation device, the sensor assembly, and the controller, respectively.
10. The intelligent projection device of claim 1, further comprising a bracket, wherein the projection device, the rotation device, the sensor assembly, and the controller are housed on the bracket.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202120878548.1U | 2021-04-26 | 2021-04-26 | Intelligent projection equipment
Publications (1)

Publication Number | Publication Date
---|---
CN215117126U | 2021-12-10
Family
ID=79272424

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202120878548.1U (Expired - Fee Related) | Intelligent projection equipment | 2021-04-26 | 2021-04-26

Country Status (1)

Country | Link
---|---
CN | CN215117126U
Cited By (1)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN113313836A | 2021-04-26 | 2021-08-27 | 广景视睿科技(深圳)有限公司 | Method for controlling virtual pet and intelligent projection equipment
Legal Events

Date | Code | Title | Description
---|---|---|---
| GR01 | Patent grant |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20211210