CN109157845B - Toy storage method and device, user terminal and toy combination - Google Patents

Toy storage method and device, user terminal and toy combination Download PDF

Info

Publication number
CN109157845B
CN109157845B CN201811017219.7A CN201811017219A CN109157845B CN 109157845 B CN109157845 B CN 109157845B CN 201811017219 A CN201811017219 A CN 201811017219A CN 109157845 B CN109157845 B CN 109157845B
Authority
CN
China
Prior art keywords
toy
homing
interface
user interface
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811017219.7A
Other languages
Chinese (zh)
Other versions
CN109157845A (en
Inventor
王晗
韩晋
张茜明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201811017219.7A priority Critical patent/CN109157845B/en
Publication of CN109157845A publication Critical patent/CN109157845A/en
Application granted granted Critical
Publication of CN109157845B publication Critical patent/CN109157845B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63HTOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H17/00Toy vehicles, e.g. with self-drive; ; Cranes, winches or the like; Accessories therefor
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63HTOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H17/00Toy vehicles, e.g. with self-drive; ; Cranes, winches or the like; Accessories therefor
    • A63H17/26Details; Accessories
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63HTOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H17/00Toy vehicles, e.g. with self-drive; ; Cranes, winches or the like; Accessories therefor
    • A63H17/26Details; Accessories
    • A63H17/36Steering-mechanisms for toy vehicles

Landscapes

  • Toys (AREA)

Abstract

The disclosure provides a toy storage method and device, a user terminal and a toy combination, and relates to the technical field of intelligent storage. The toy storage method comprises the following steps: detecting a homing operation in a preset application interface, wherein the preset application interface is mapped with a toy blanket and an object of a toy to be received; acquiring a homing instruction according to the detected homing operation, wherein the homing instruction at least comprises an identifier of a toy to be received and parking position information; sending the homing instruction to move the toy to be received to the toy blanket.

Description

Toy storage method and device, user terminal and toy combination
Technical Field
The disclosure relates to the field of intelligent storage, in particular to a toy storage method and device, a user terminal and a toy combination.
Background
More and more families will prepare a large number of toys for children, making toy storage a frequent daily task. Usually, the toy storage can only be carried out manually, and in this case, it takes a lot of time to store a large number of toys, which brings inconvenience to life.
Disclosure of Invention
In view of the above, the present disclosure provides a toy storage method and apparatus, a user terminal, and a toy combination to solve the problems in the related art.
According to a first aspect of the embodiments of the present disclosure, there is provided a toy storage method applied to a user terminal, the method including: detecting a homing operation in a preset application interface, wherein the preset application interface is mapped with a toy blanket and an object of a toy to be received;
acquiring a homing instruction according to the detected homing operation, wherein the homing instruction at least comprises an identifier of a toy to be received and parking position information;
sending the homing instruction to the trailer to move the toy to be received onto the toy carpet.
Optionally, the detecting a homing operation in the preset application interface includes: providing a first user interface in the preset application interface, wherein the first user interface comprises a first object corresponding to the toy to be accommodated;
detecting a trigger operation for the first object;
providing a second user interface in the preset application interface after detecting a trigger operation for the first object, the second user interface including a second object corresponding to a different area on the toy carpet.
Optionally, the detecting a homing operation in the preset application interface further includes:
providing a second user interface in the preset application interface after detecting a trigger operation for the first object, the second user interface including a second object corresponding to a different area on the toy carpet;
detecting a trigger operation for the second object.
Optionally, obtaining a homing instruction according to the detected homing operation includes:
acquiring the corresponding identification according to the detected first target object;
acquiring corresponding parking position information according to the detected second target object;
and generating the homing instruction according to the identification and the parking position information.
Optionally, before providing the second user interface in the preset application interface, the method further includes:
providing an input interface for inputting docking rules in the preset application interface;
detecting a parking rule input operation in the input interface to obtain a parking rule; the docking rules include: and the corresponding relation between the preset characteristic information of the toy to be stored and the preset parking position on the toy blanket.
Optionally, the second user interface comprises: and the corresponding relation between the second object corresponding to the preset parking position and the preset characteristic information.
Optionally, the providing a second user interface in the preset application interface includes: determining preset characteristic information of a first target object;
matching the parking rule according to the preset feature information, and determining an optional parking position corresponding to the first target object;
and displaying a second object corresponding to the selectable parking position in the second user interface.
Optionally, the detecting a homing operation in the preset application interface includes: providing a third user interface in the preset application interface, wherein the third user interface comprises a third object;
the third object corresponds to different historical parking rules stored in a parking rule base, and the parking rules comprise the identification of the toy to be stored and parking position information corresponding to the toy to be stored on the toy blanket;
detecting a trigger operation for the third object.
Optionally, the third user interface includes a third object comprising: and displaying the third object in the third user interface according to the record of the historical parking rule and the preset priority.
Optionally, obtaining a homing instruction according to the detected homing operation includes: acquiring the corresponding historical docking rule according to the detected third target object,
and generating the homing instruction according to the historical parking rule.
According to a second aspect of the embodiments of the present disclosure, there is provided a toy storage method for use with a trailer, the method including: the trailer receives a homing instruction, wherein the homing instruction at least comprises an identifier of a toy to be received and parking position information;
and the trailer moves the toy to be received corresponding to the identifier to a target position, wherein the target position corresponds to the position on the toy blanket corresponding to the parking position information.
Optionally, the trailer moves the toy to be received corresponding to the identifier to a target position, including: and the trailer identifies the toy to be accommodated according to the identification and is connected with the toy to be accommodated.
Optionally, the trailer moves the toy to be received corresponding to the identifier to a target position, including: the trailer acquires the distance between the real-time position of the trailer and the target position,
and judging whether the distance is smaller than or equal to a preset parking distance, if so, stopping the movement of the trailer, and releasing the connection with the toy so as to park the toy at the target position.
According to a third aspect of the embodiments of the present disclosure, there is provided a toy storage method applied to a toy, including:
receiving a homing instruction, wherein the homing instruction at least comprises parking position information;
and moving to a target position according to the homing instruction, wherein the target position is a position corresponding to the parking position information on the toy blanket.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a toy storage device including:
the detection module is used for detecting homing operation in the preset application interface, and the preset application interface is mapped with a toy blanket and an object of a toy to be received;
the first obtaining module is used for obtaining a homing instruction according to the detected homing operation, and the homing instruction at least comprises an identifier of a toy to be received and parking position information;
and the sending module is used for sending the homing instruction so as to move the toy to be received to the toy blanket.
Optionally, the detection module comprises: the first interface unit is used for providing a first user interface on the preset application interface, and the first user interface comprises a first object corresponding to the toy to be received;
a first detection unit configured to detect a trigger operation for the first object.
Optionally, the detection module further comprises:
a second interface unit, configured to provide a second user interface at the preset application interface after the first detection unit detects the trigger operation for the first object, where the second user interface includes second objects corresponding to different areas on the toy carpet;
a second detection unit configured to detect a trigger operation for the second object.
Optionally, the first obtaining module includes: the first acquisition unit is used for acquiring the corresponding identification according to the detected first target object;
the second acquisition unit is used for acquiring corresponding parking position information according to the detected second target object;
and the first instruction generating unit is used for generating a homing instruction according to the identification and the parking position information.
Optionally, the method further comprises: the interface module is used for providing an input interface for inputting the docking rule in the preset application interface before the homing operation is detected in the preset application interface;
a second obtaining module, configured to detect a docking rule input operation in the input interface, and obtain the docking rule, where the docking rule includes: and the corresponding relation between the preset characteristic information of the toy to be stored and the preset parking position on the toy blanket.
Optionally, the second interface unit comprises: the first determining subunit is used for determining preset characteristic information of the first target object;
the second determining subunit is configured to match the parking rule according to the preset feature information, and determine an optional parking position corresponding to the first target object;
and the display subunit is used for displaying the second object corresponding to the selectable parking position in the second user interface.
Optionally, the detection module comprises: a third interface unit for providing a third user interface in the preset application interface, the third user interface displaying a third object,
the third object corresponds to different historical docking rules stored in a docking rule base, the docking rules including: the identification of the toy to be stored and the parking position information corresponding to the toy to be stored on the toy blanket;
a third detection unit configured to detect a trigger operation for the third object.
Optionally, the third interface unit is further configured to display the third object in the third user interface according to a preset priority according to the record of the historical landing rule.
Optionally, the obtaining module includes: the third acquisition unit is used for acquiring the corresponding historical parking rule according to the detected third target object;
and the second instruction generating unit is used for generating the homing instruction according to the historical parking rule.
According to a fifth aspect of the embodiments of the present disclosure, there is provided a user terminal, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is operable with the executable instructions to implement the method as provided by the first aspect.
According to a sixth aspect of embodiments of the present disclosure, there is provided a toy combination comprising:
a toy blanket comprising a first chip,
a trailer, on which a second chip is arranged,
a toy, wherein a third chip is arranged on the toy,
the first chip, the second chip and the third chip implement the housing method as provided by the second aspect.
According to a seventh aspect of embodiments of the present disclosure, there is provided a toy combination comprising:
a toy blanket comprising a first chip,
a toy, wherein a third chip is arranged on the toy,
the first chip and the third chip realize the housing method of the third aspect.
According to an eighth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the method as provided by the first aspect, or the method as provided by the second aspect.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the toy storage mode, the user terminal is used for controlling the trailer to move the toy to be stored to the toy blanket, the interestingness is high, boring storage work is converted into a game process of teaching through lively activities, and meanwhile storage can be enhanced; moreover, the method is efficient and convenient, the storage efficiency can be effectively improved, and the labor amount required by the storage work is reduced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a toy storage method according to an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating a preset application interface in accordance with an illustrative embodiment;
FIG. 3 is a flowchart illustrating a detect homing operation in a preset application interface, according to an exemplary embodiment;
FIG. 4 is a schematic illustration of a first user interface shown in accordance with an exemplary embodiment;
FIG. 5 is a schematic diagram illustrating a second user interface in accordance with an exemplary embodiment;
FIG. 6 is a flow chart illustrating a method of stowing according to an exemplary embodiment;
FIG. 7 is a schematic diagram illustrating a second user interface in accordance with an exemplary embodiment;
FIG. 8 is a schematic flow chart diagram illustrating the provision of a second user interface in accordance with an exemplary embodiment;
FIG. 9 is a schematic illustration of a second user interface shown in accordance with an exemplary embodiment;
FIG. 10 is a flowchart illustrating a detect homing operation in a preset application interface, according to another exemplary embodiment;
FIG. 11 is a schematic illustration of a third user interface shown in accordance with an exemplary embodiment;
FIG. 12 is a schematic flow diagram illustrating a get homing instruction in accordance with an illustrative embodiment;
FIG. 13 is a flowchart illustrating a get homing instruction in accordance with another illustrative embodiment;
FIG. 14 is a schematic flow diagram illustrating a get homing instruction in accordance with another illustrative embodiment;
FIG. 15 is a flow chart illustrating a toy storage method according to one exemplary embodiment;
FIG. 16 is a flow chart illustrating a toy storage method according to one exemplary embodiment;
FIG. 17 is a block diagram of a toy storage device shown in accordance with an exemplary embodiment;
FIG. 18 is a block diagram of a detection module shown in accordance with an exemplary embodiment;
FIG. 19 is a block diagram illustrating a first acquisition module, according to an exemplary embodiment;
FIG. 20 is a block diagram of a toy storage device shown in accordance with another exemplary embodiment;
FIG. 21 is a block diagram illustrating a second interface element, according to an exemplary embodiment;
FIG. 22 is a block diagram of a detection module shown in accordance with an exemplary embodiment;
FIG. 23 is a block diagram illustrating a second acquisition module, according to an exemplary embodiment;
fig. 24 is a block diagram illustrating a user terminal according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "at … …" or "when … …" or "in response to a determination", depending on the context.
In the present disclosure, an execution body is referred to which includes a user terminal and a toy combination. The user terminal may be a smart phone, a Personal Digital Assistant (PDA), a tablet computer, a wearable device such as a smart watch, a smart bracelet, or the like. The toy combination includes: a toy blanket provided with a first chip, a trailer provided with a second chip, and a toy provided with a third chip. Alternatively, the toy combination comprises: the toy blanket is provided with a first chip and the toy is provided with a third chip. The chip of setting in the toy combination makes the toy combination interact with user terminal, realizes accomodating through user terminal realization toy. When the toy combination includes a trailer, the type of toy is not limited, and examples thereof include a doll, a block, a toy car, and the like. When the toy combination does not include a trailer, the toy is a mobile toy, such as a toy car or the like. For convenience of explanation, the embodiments of the present disclosure are illustrated with a toy vehicle as an example.
The user terminal can establish communication connection with the toy combination through at least one communication mode of a mobile communication network, a Wireless Local Area Network (WLAN), such as Wireless fidelity (Wifi), infrared communication and the like. In addition, in order to facilitate understanding, the present disclosure provides an application program that can be run on a user terminal, in which a toy carpet, a trailer, and a virtual object of a toy are mapped, and the user terminal controls the trailer to store the toy in the toy carpet through the application program.
FIG. 1 is a flow chart illustrating a toy storage method according to an exemplary embodiment, as shown in FIG. 1, the method including the steps of:
and 11, detecting the homing operation in a preset application interface, wherein the preset application interface is mapped with a toy blanket, a trailer and an object of the toy to be received.
The preset application interface is an operation interface in an application program. Optionally, the objects of the mapped toy carpet and the toy to be received correspond to the toy carpet and the toy to be received in the real usage scene one to one, and the relative position relationship of each object in the preset application interface reflects the relative position relationship of the toy carpet and the toy to be received in the real usage scene. The number and the distribution condition of the current toys to be accommodated are visually displayed through the preset application interface, so that a user can carry out remote accommodation, for example, the user accommodates the toys in a bedroom in a living room, the convenience of the accommodation method is improved, and the application range is expanded.
Fig. 2 is a schematic diagram of a preset application interface according to an exemplary embodiment, and the embodiment of the present disclosure does not limit the specific implementation form of mapping the toy carpet and the object of the toy to be received, for example, the mapping is represented by different color or shape blocks. Or as shown in fig. 2, the cartoon icons corresponding to the toy carpet and the toy to be received are adopted, so that the interestingness of the operation process is improved.
The homing operation includes at least determining a target toy to be stowed, and determining a target parking position of the target toy on the toy carpet. And determining the corresponding relation between the toy to be received and the parking position of the toy on the toy blanket through detecting the homing operation. Among other things, the present disclosure provides the following two alternative implementations of step 11.
In an alternative manner of step 11, referring to a flowchart of detecting a homing operation in a preset application interface shown in fig. 3 according to an exemplary embodiment, the detecting a homing operation in a preset application interface includes:
step 31, providing a first user interface in a preset application interface, wherein the first user interface comprises a first object corresponding to the toy to be received.
By providing the first user interface in the preset application interface, an operation interface for selecting the target storage toy is provided for the user. And providing the first user interface in the preset application interface in response to the preset operation. The preset operation is optionally a preset gesture operation or a trigger operation for a preset button. The first object is a triggerable virtual option button, and each option button corresponds to a different toy to be received. Fig. 4 is a schematic diagram of a first user interface, shown in fig. 4, the first object being a triggerable icon capable of characterizing a toy figure to be stowed, according to an exemplary embodiment. And optionally, providing a secondary textual explanation in the first user interface that facilitates determining the first object's correspondence with the toy to be received, such as indicating in fig. 4 below the first object the type of the collection toy to which the first object corresponds.
Optionally, the first user interface includes a first object corresponding to all of the toys to be accommodated, and comprehensively shows the distribution of the toys to be accommodated. Alternatively, the first user interface includes first objects corresponding to a part of the toys to be accommodated, in which case, a plurality of switchable first user interfaces through which the first objects corresponding to the entire toys to be accommodated can be included are provided in the preset application interface. When the number of toys to be accommodated is large, the number of the first objects in each first user interface can be reduced by providing the plurality of first user interfaces, the first objects are prevented from being difficult to distinguish due to high density of the first objects in the first user interfaces, and difficulty in achieving expected triggering operation is increased.
Step 32, detecting a trigger operation for the first object. If the fact that a certain first object in the first user interface is triggered is detected, the triggered first object is used as a first target object correspondingly, and a toy to be accommodated corresponding to the first target object is used as a target accommodating toy.
And step 33, after detecting the trigger operation for the first object, providing a second user interface in the preset application interface, wherein the second user interface comprises a second object corresponding to different areas on the toy carpet.
And providing a second user interface in the preset application interface to provide an operation interface for selecting the target parking position for the user. Optionally, after detecting the trigger operation for the first object, the preset application interface automatically switches to the second user interface. Wherein, the second object is a triggerable virtual button corresponding to different parking positions on the toy blanket, and the parking positions can be parked with points or parking areas. Fig. 5 shows a schematic view of a second user interface according to an exemplary embodiment, as shown in fig. 5, the second object is distributed at different positions of the object mapping the toy carpet, corresponding to different docking positions on the toy carpet. Optionally, a label is used to distinguish between the different second objects.
Step 34, detecting a trigger operation for the second object. And if detecting that a certain second object in the second user interface is triggered, correspondingly taking the triggered second object as a second target object, and taking a parking position corresponding to the second target object as a target parking position of the target storage toy.
Through the first optional mode, the corresponding relation between the toy to be stored and the parking position is established one by one. Through this kind of mode, can wait to accomodate the toy to every and carry out individualized accomodating, formulate different accomodating according to user's hobby and put the mode, promote the toy and accomodate the interest.
Also, it is also possible to provide a second user interface to detect an operation with respect to the second object first, and provide a first user interface to detect an operation with respect to the first object after detecting an operation with respect to the second object. In other words, the order of selecting the target toy to be stored and the target resting position is not limited, so as to further enrich the operation mode of storing the toy.
It should be noted that, on the basis of the first embodiment, referring to a flowchart of a storage method shown in fig. 6 according to an exemplary embodiment, before step 33, the toy storage method further includes:
and step 61, providing an input interface for inputting the docking rules in the preset application interface.
And providing an input interface in the preset application interface through preset operations such as triggering a preset button and the like. The input interface may be a text input interface or a voice input interface.
Step 62, detecting a parking rule input operation in an input interface to obtain a parking rule; the docking rules include: and the corresponding relation between the preset characteristic information of the toy to be stored and the preset stop position on the toy blanket.
Wherein, predetermine characteristic information can be for waiting to accomodate material, kind, shape, colour etc. of toy. Illustratively, according to the text information "park a yellow car at a position a" input by the user, preset feature information is obtained as yellow, the parking position is a, and the corresponding relationship between the yellow and the position a.
By the method, the user can classify the toys to be stored by himself, and the corresponding relation between the different types of the toys to be stored and the parking positions is established, so that the storage method and effect are enriched, and different requirements of the user are met.
In such a case, the monitoring of the homing operation in the preset application interface includes the following two embodiments:
as a first embodiment, step 11 includes only step 31 and step 32, in which case, the second user interface is not provided in the preset application interface, the trigger operation for the first object is detected only in the first user interface, and the docking rule input operation is detected in the input interface.
As a second implementation manner, a second user interface is still provided in the preset application interface, where the second user interface includes the following two implementation manners:
as an alternative, the second user interface comprises: and presetting the corresponding relation between the second object corresponding to the parking position and the preset characteristic information.
Referring to fig. 7, a schematic diagram of a second user interface according to an exemplary embodiment is shown, in which in this alternative, the second user interface directly displays a corresponding relationship between a second object and preset feature information. When the user uses the system, different second objects are distinguished through different corresponding relations, the second target object is selected according to the corresponding relations, and the target parking position is determined. The optional mode enriches the user interaction process, is particularly suitable for children, is helpful for training the children to accurately identify the characteristics of the toy such as the type, the color, the shape and the like, and plays a role in edutainment.
Alternatively, referring to the flowchart of fig. 8 illustrating the providing of the second user interface according to an exemplary embodiment, the providing of the second user interface in the preset application interface includes the following steps:
and step 81, determining characteristic information of the first target object. The characteristic information of the first target object may be a material, a type, a volume, a color, and the like. Illustratively, a first object feature information correspondence table is stored in the user terminal or the server, as shown in table 1 below. And after the first target object is determined, acquiring the characteristic information of the first target object according to the list.
TABLE 1 first object characteristic information correspondence table
First object 1 | Red | Car | Small vehicle
First object 2 | Yellow | Minibus | Medium vehicle
First object 3 | Green | Taxi | Small vehicle
…… | …… | …… | ……
First object n | Yellow | Truck | Large vehicle
When the first object feature information correspondence table is stored on the server side, before step 81 the method further includes: after detecting the trigger operation for the first object, sending a first request to the server to acquire the feature information of the first target object.
Step 82, matching the docking rule according to the feature information of the first target object, and determining the selectable parking position corresponding to the first target object.
If no docking rule can be matched from the feature information of the first target object (for example, the user inputs the rule "park a yellow car at position A" but the first target object is a red car), a voice prompt or a text prompt to reselect the first target object is issued in the preset application interface.
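Step 82 can be sketched as a simple lookup over the entered docking rules; returning no position corresponds to the reselection prompt described above. The dict-based schema and the helper name are illustrative assumptions, not the disclosure's API.

```python
def match_docking_rule(feature_info, rules):
    """Return the parking position of the first rule whose feature fields
    all match the first target object's feature information, or None if
    no rule matches (the UI would then prompt the user to reselect)."""
    for rule in rules:
        wanted = {k: v for k, v in rule.items() if k != "position"}
        if all(feature_info.get(k) == v for k, v in wanted.items()):
            return rule["position"]
    return None

rules = [{"color": "yellow", "kind": "car", "position": "A"}]
print(match_docking_rule({"color": "yellow", "kind": "car"}, rules))  # A
print(match_docking_rule({"color": "red", "kind": "car"}, rules))     # None
```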
And step 83, displaying a second object corresponding to the selectable parking position in the second user interface.
Optionally, only the second objects corresponding to the selectable parking positions are displayed in the second user interface. Alternatively, referring to FIG. 9, a schematic diagram of a second user interface according to an exemplary embodiment is shown, in which the second object corresponding to a selectable parking position is distinguished from the second objects corresponding to other parking positions by different brightness or color (the shading indicates the different brightness or color).
In this alternative, the second user interface directly displays the selectable second objects, so the user does not need to make the selection from the correspondences; this further improves the intelligence of the storage method and raises its storage efficiency.
In an alternative mode two of step 11, referring to a flowchart of detecting a homing operation in a preset application interface shown in fig. 10 according to another exemplary embodiment, the detecting a homing operation in a preset application interface includes:
step 101, providing a third user interface in the preset application interface, wherein the third user interface comprises a third object. The third object corresponds to different historical docking rules stored in the docking rule base, and the docking rules include the identification of the toy to be received and the docking position information corresponding to the toy to be received on the toy carpet.
The identification of the toy to be stored in a docking rule is determined by the third chip on that toy, and identifications correspond one-to-one to toys. A docking rule may apply to one toy or one class of toys to be stored, such as the rule "red car parked at position A"; alternatively, a docking rule may apply to several toys or classes, such as the rule "red cars parked at position A, yellow cars parked at position B, blue cars parked at position C".
The third user interface is provided in the preset application interface in response to a preset operation, such as a preset trigger operation or a gesture operation. Optionally, the third object is a triggerable virtual key. Referring to FIG. 11, a third user interface according to an exemplary embodiment is shown, in which each third object includes a textual description so that the user can quickly learn the content of the corresponding docking rule and make a selection.
Optionally, the third objects are displayed in the third user interface according to a preset priority based on the records of the historical docking rules. The preset priority may be set according to the user's preference: for example, the third objects may be sorted and displayed in descending order of execution count within a preset time period, or the most recently executed historical docking rules may be displayed first, in order of execution time.
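The two orderings just described can be sketched as follows; the record fields `exec_count` and `last_used` are assumed names for the execution count and last execution time kept with each historical docking rule.

```python
from datetime import datetime

def order_third_objects(history, by="count"):
    """Order historical docking rules for display in the third user
    interface: by execution count within the preset period, or by recency."""
    if by == "count":
        # most-executed rules first
        return sorted(history, key=lambda h: h["exec_count"], reverse=True)
    # most recently executed rules first
    return sorted(history, key=lambda h: h["last_used"], reverse=True)

history = [
    {"rule": "red car at A",    "exec_count": 2, "last_used": datetime(2018, 9, 1)},
    {"rule": "yellow car at B", "exec_count": 5, "last_used": datetime(2018, 8, 1)},
]
print([h["rule"] for h in order_third_objects(history, by="count")])
# ['yellow car at B', 'red car at A']
```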
Step 102, detecting a trigger operation for a third object. If a third object in the third user interface is triggered, the triggered third object is taken as the third target object, and the historical docking rule corresponding to it is taken as the target historical docking rule.
Through the second embodiment, the user can repeat a docking rule that was executed before. This reduces the user's operation steps, effectively improves the intelligence of the storage method, and achieves quick and efficient toy storage.
And step 12, acquiring a homing instruction according to the detected homing operation, wherein the homing instruction at least comprises an identifier of the toy to be received and parking position information.
The homing instruction is determined by the homing operation detected in step 11. Since step 11 has the first and second embodiments described above as its basis, step 12 correspondingly has different embodiments, which are explained in detail below.
When step 11 is implemented in one way, referring to fig. 12, a flowchart illustrating a process of obtaining a homing instruction according to an exemplary embodiment, step 12 includes:
and step 121, acquiring a corresponding identifier according to the detected first target object. The mark corresponding to the first object is determined by a third chip arranged in the toy to be received, and the mark corresponds to the toy to be received one by one. Optionally, the user terminal or the server stores a first object identifier correspondence table, as shown in table 2 below. And acquiring the identifier of the first target object through the list according to the detected first target object.
Table 2 first object identification correspondence table
First object 1 | Identifier 1
First object 2 | Identifier 2
First object 3 | Identifier 3
…… | ……
First object n | Identifier n
Step 122, acquiring the corresponding parking position information according to the detected second target object. The parking position information corresponding to a second object is determined by the first chips arranged in the toy carpet, and second objects correspond one-to-one to parking position information. Illustratively, at least three first chips for measuring distance are provided in the toy carpet, and the three first chips are not collinear. The parking position information includes the distances from the parking position to the three first chips. It can be understood that the three first chips form position reference points, so the parking position information uniquely determines the parking position.
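The uniqueness noted above is ordinary planar trilateration: three non-collinear reference points determine a position from its three distances. A sketch under assumed chip coordinates (the disclosure stores only the three distances, not coordinates):

```python
import math

# Hypothetical planar coordinates for the three non-collinear first chips.
CHIPS = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]

def position_from_distances(distances):
    """Recover a parking position (x, y) on the carpet from its distances
    to the three first chips by standard planar trilateration."""
    (x1, y1), (x2, y2), (x3, y3) = CHIPS
    d1, d2, d3 = distances
    # Subtracting pairs of circle equations gives two linear equations in x, y.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero because the chips are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# The position (1, 1) is recovered from its three chip distances.
x, y = position_from_distances((math.sqrt(2), math.sqrt(10), math.sqrt(5)))
```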
In this case, optionally, the user terminal or the server stores a second object parking position information correspondence table, as shown in Table 3 below. The parking position information of the second target object is acquired from this table according to the detected second target object.
TABLE 3 correspondence table of the second object stop position information
Second object 1 | (x1, y1, z1)
Second object 2 | (x2, y2, z2)
Second object 3 | (x3, y3, z3)
…… | ……
Second object n | (xn, yn, zn)
In addition, when the first object identifier correspondence table or the second object parking position information correspondence table is stored on the server side, before acquiring the identifier or the parking position information the method further includes: after detecting the first target object or the second target object, sending a request to the server to acquire the identifier of the first target object or the parking position information of the second target object.
Step 123, generating the homing instruction according to the identifier and the parking position information. Since identifiers correspond one-to-one to toys to be stored and parking position information corresponds one-to-one to parking positions, the homing instruction parks the desired toy at the desired position.
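Steps 121 to 123 can be sketched as two table lookups followed by assembling the instruction; the class and field names below are illustrative assumptions rather than the disclosure's wording.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class HomingInstruction:
    toy_id: str                          # one-to-one identifier of the toy
    parking_position: Tuple[float, ...]  # distances to the three first chips

def generate_homing_instruction(identifier_table, position_table,
                                first_target, second_target):
    """Look up the identifier (Table 2) and the parking position
    information (Table 3), then assemble the homing instruction."""
    return HomingInstruction(identifier_table[first_target],
                             position_table[second_target])

instr = generate_homing_instruction(
    {"first object 1": "identifier 1"},
    {"second object 1": (1.0, 2.0, 3.0)},
    "first object 1", "second object 1")
print(instr.toy_id)  # identifier 1
```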
It should be noted that, after the homing instruction is generated, the docking rule corresponding to the homing instruction may also be stored in the docking rule base, supporting the second alternative of step 11.
With this embodiment, the toys to be stored are parked one by one at the designated positions; the achievable storage modes are diverse, and the storage process is highly engaging.
Alternatively, when step 11 adopts the first implementation and the second user interface is no longer displayed in the preset application interface, referring to the flowchart of obtaining a homing instruction shown in fig. 13 according to another exemplary embodiment, step 12 includes:
step 131, acquiring a corresponding identifier according to the detected first target object;
step 132, obtaining a parking rule according to the detected input operation;
step 133, determining the parking position information for the first target object according to the obtained docking rule;
and step 134, generating a homing instruction according to the identification and the parking position information.
The order of steps 131 to 133 is not specifically limited; equally, the parking position information may be obtained first according to the docking rule detected in the input interface, and the corresponding identifier then obtained according to the first target object detected in the first user interface. In this case, a personalized storage scheme is achieved while the user's operation steps are reduced; the method is highly intelligent and engaging.
When the second implementation manner is adopted, referring to fig. 14, which is a schematic flowchart illustrating a process of acquiring a homing instruction according to another exemplary embodiment, step 12 includes:
and step 141, acquiring a corresponding historical parking rule according to the detected third target object.
And 142, generating a homing instruction according to the historical parking rule.
With this embodiment, a historical storage mode can be repeated. The method is convenient and efficient to operate and effectively reduces the labor of storage work.
And step 13, sending a homing instruction to the trailer so that the trailer can move the toy to be received to the toy blanket.
Continuing to refer to fig. 1, after the homing instruction is obtained, it is sent to the trailer, which, based on the identifier, moves the toy to be stored corresponding to the identifier to the position corresponding to the parking position information, completing the storage.
In the toy storage method provided by this first aspect of the disclosure, the user terminal controls the trailer to move the toy to be stored onto the toy blanket. The method is highly engaging, turning tedious tidying work into a playful process that combines education with entertainment. In addition, the method is convenient to use and quick to store, effectively improving storage efficiency and reducing the labor required for storage work.
A second aspect of the present disclosure provides a toy storage method for use with a trailer, with reference to fig. 15, which is a flow chart of a toy storage method according to an exemplary embodiment, the method comprising:
and 151, the trailer receives a homing instruction, wherein the homing instruction at least comprises the identification of the toy to be received and the parking position information.
The trailer determines the toy to be stored through the identification and determines the parking position through the parking position information, docking the desired toy at the desired position.
In one embodiment, step 151 includes: the trailer identifies the toy to be accommodated according to the identification and is connected with the toy to be accommodated.
Illustratively, the second chip of the trailer transmits a signal in the frequency band corresponding to the identification, so that the toy to be stored corresponding to that identification receives the signal. Upon receiving the signal sent by the trailer, the toy to be stored communicates with the trailer and feeds back a connection signal, so that the trailer identifies the toy to be stored according to the identification.
After the trailer recognizes the toy to be stored, it moves to the toy and establishes a physical connection with it, so that the two move synchronously and the trailer can move the toy to be stored.
And 152, the trailer moves the toy to be received corresponding to the identifier to a target position, wherein the target position corresponds to the position on the toy blanket corresponding to the parking position information.
Optionally, the trailer obtains the distance between its real-time position and the target position and determines whether this distance is less than or equal to a preset parking distance; if so, the trailer stops moving and disconnects from the toy, parking the toy at the target position.
At least three non-collinear first chips are arranged in the toy carpet, and the distances from the at least three first chips to the second chip in the trailer are monitored in real time. Because the trailer drives the toy to be stored to move synchronously with it, monitoring the distance from the first chips to the second chip serves as monitoring the distance to the toy to be stored. When the distances acquired by the three first chips match the corresponding distances in the parking position information, the trailer parks the toy to be stored at the target position. Here, matching means that the distance acquired in real time by a first chip deviates from the corresponding distance by no more than a preset stopping distance, the preset stopping distance being the distance from a preset area centered on the parking position to the first chip; when the toy to be stored stops within the preset area, it can be considered stored and parked in place.
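A hedged reading of this matching criterion, with each real-time distance required to lie within the preset stopping distance of the corresponding stored distance:

```python
def parked_in_place(realtime_distances, target_distances, preset_stop):
    """True when every distance measured in real time by the three first
    chips is within the preset stopping distance of the corresponding
    distance in the parking position information."""
    return all(abs(r - t) <= preset_stop
               for r, t in zip(realtime_distances, target_distances))

print(parked_in_place((1.02, 1.98, 3.0), (1.0, 2.0, 3.0), 0.05))  # True
print(parked_in_place((1.5, 2.0, 3.0), (1.0, 2.0, 3.0), 0.05))    # False
```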
A third aspect of the present disclosure provides a toy storage method, applied to a toy, with reference to fig. 16, which shows a flowchart of a toy storage method according to an exemplary embodiment, the method comprising:
step 161, receiving a homing instruction, wherein the homing instruction at least comprises parking position information;
and step 162, moving to a target position according to the homing instruction, wherein the target position is a position corresponding to the parking position information on the toy blanket.
In this manner, the toy is a self-moving toy, such as a toy car, or another toy configured with a drive assembly. The homing instruction is also used to actuate the toy, or the drive assembly in the toy, so that the toy moves to the target parking position after receiving the homing instruction.
A fourth aspect of the present disclosure provides a toy storage device; see fig. 17, a block diagram of a toy storage device according to an exemplary embodiment. The device comprises:
the detection module 1701 is configured to detect a homing operation in a preset application interface, where a toy carpet and an object of a toy to be received are mapped to the preset application interface.
A first obtaining module 1702, configured to obtain a homing instruction according to the detected homing operation, where the homing instruction at least includes an identifier of a toy to be received and parking position information;
and a sending module 1703, configured to send the homing instruction so that the toy to be received is moved onto the toy carpet.
Referring to fig. 18, a block diagram of a detection module is shown according to an exemplary embodiment, the detection module comprising:
a first interface unit 1801, configured to provide a first user interface on a preset application interface, where the first user interface includes a first object corresponding to a toy to be received;
a first detection unit 1802 for detecting a trigger operation for a first object.
With continued reference to fig. 18, optionally, the detection module 1701 further includes:
a second interface unit 1803, configured to provide a second user interface at the preset application interface after the first detection unit detects the trigger operation for the first object, where the second user interface includes second objects corresponding to different areas on the toy carpet;
a second detecting unit 1804, configured to detect a triggering operation for a second object.
Referring to fig. 19, a block diagram of a first acquisition module is shown, according to an exemplary embodiment, the first acquisition module comprising:
a first obtaining unit 1901, configured to obtain a corresponding identifier according to the detected first target object;
a second obtaining unit 1902, configured to obtain corresponding stop position information according to the detected second target object;
a first instruction generating unit 1903, configured to generate a homing instruction according to the identifier and the parking position information.
Referring to fig. 20, a block diagram of a toy storage device is shown according to an exemplary embodiment, the device further comprising:
an interface module 2001, configured to provide an input interface for inputting a docking rule in a preset application interface before detecting a homing operation in the preset application interface;
a second obtaining module 2002, configured to detect a docking rule input operation in the input interface, and obtain a docking rule, where the docking rule includes: and the corresponding relation between the preset characteristic information of the toy to be stored and the preset stop position on the toy blanket.
Referring to fig. 21, a block diagram of a second interface unit is shown, according to an exemplary embodiment, the second interface unit including:
a first determining subunit 2101, configured to determine preset feature information of the first target object;
a second determining subunit 2102, configured to determine, according to the preset feature information matching the docking rule, an optional docking position corresponding to the first target object;
a display subunit 2103, configured to display, in the second user interface, the second object corresponding to the selectable parking position.
Referring to FIG. 22, a block diagram of a detection module is shown according to an exemplary embodiment, the detection module comprising:
a third interface unit 2201 for providing a third user interface in the preset application interface, the third user interface displaying a third object,
the third object corresponds to different historical docking rules stored in a docking rule base, the docking rules including: the identification of the toy to be received and the parking position information corresponding to the toy to be received on the toy blanket;
a third detection unit 2202 configured to detect a trigger operation for a third object.
In one embodiment, the third interface element 2201 is further configured to display the third object in the third user interface according to a preset priority based on the record of the historic docking rules.
Referring to fig. 23, a block diagram of a second acquisition module is shown, according to an exemplary embodiment, the second acquisition module comprising:
a third obtaining unit 2301, configured to obtain a corresponding historical docking rule according to the detected third target object;
a second instruction generating unit 2302 for generating a homing instruction according to the historical landing rule.
Referring to fig. 24, which is a block diagram of a user terminal according to an example embodiment, a fifth aspect of the present disclosure provides a user terminal comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor executes the executable instructions to implement the storage method provided by the first aspect.
Fig. 24 is a schematic diagram illustrating the structure of a user terminal 2400 according to an exemplary embodiment. For example, the device 2400 may be user equipment, embodied as a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or a wearable device such as a smart watch, smart glasses, a smart bracelet, or smart running shoes.
Referring to fig. 24, device 2400 may include one or more of the following components: a processing component 2402, a memory 2404, a power component 2406, a multimedia component 2408, an audio component 2410, an interface for input/output (I/O) 2412, a sensor component 2414, and a communication component 2416.
Processing component 2402 generally controls overall operation of device 2400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 2402 may include one or more processors 2420 to execute instructions to perform all or part of the steps of the methods described above. Further, processing component 2402 may include one or more modules that facilitate interaction between processing component 2402 and other components. For example, the processing component 2402 may include a multimedia module to facilitate interaction between the multimedia component 2408 and the processing component 2402.
Memory 2404 is configured to store various types of data to support operation at device 2400. Examples of such data include instructions for any application or method operating on device 2400, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 2404 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
Power supply component 2406 provides power to the various components of device 2400. Power components 2406 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 2400.
The multimedia component 2408 includes a screen that provides an output interface between the device 2400 and a user as described above. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of the touch or slide action but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 2408 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the back-facing camera may receive external multimedia data when device 2400 is in an operational mode, such as a capture mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 2410 is configured to output and/or input audio signals. For example, audio component 2410 may include a Microphone (MIC) configured to receive external audio signals when device 2400 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 2404 or transmitted via the communication component 2416. In some embodiments, the audio component 2410 further comprises a speaker for outputting audio signals.
I/O interface 2412 provides an interface between processing component 2402 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
Sensor component 2414 includes one or more sensors for providing various aspects of state assessment for device 2400. For example, sensor assembly 2414 may detect the open/closed state of device 2400, the relative positioning of components such as a display and keypad of device 2400, the change in position of device 2400 or a component of device 2400, the presence or absence of user contact with device 2400, the orientation or acceleration/deceleration of device 2400, and the change in temperature of device 2400. The sensor component 2414 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 2414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 2414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 2416 is configured to facilitate communication between the apparatus 2400 and other devices in a wired or wireless manner. Device 2400 may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G LTE, 5G, or a combination thereof. In an example embodiment, the communication component 2416 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 2416 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 2400 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
A sixth aspect of the present disclosure provides a toy combination comprising: a toy blanket including a first chip; a trailer provided with a second chip; and a toy provided with a third chip. The first chip, the second chip, and the third chip implement the storage method provided in the second aspect.
A seventh aspect of the present disclosure provides a toy combination comprising: a toy blanket including a first chip; and a toy provided with a third chip. The first chip and the third chip implement the storage method provided in the third aspect.
An eighth aspect of the present disclosure provides a computer-readable storage medium having computer instructions stored thereon, wherein the instructions, when executed by a processor, implement the method as provided in the first aspect, or the method as provided in the second aspect or the third aspect. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (23)

1. A toy storage method is applied to a user terminal and comprises the following steps:
detecting a homing operation in a preset application interface, wherein the preset application interface is mapped with a toy blanket and an object of a toy to be received;
acquiring a homing instruction according to the detected homing operation, wherein the homing instruction at least comprises an identifier of a toy to be received and parking position information;
and sending the homing instruction to the toy to be received so as to enable the toy to be received to move to the toy blanket, wherein the toy to be received is a movable toy.
2. The method of claim 1, wherein detecting a homing operation in a preset application interface comprises:
providing a first user interface in the preset application interface, wherein the first user interface comprises a first object corresponding to the toy to be accommodated;
a trigger operation for the first object is detected.
3. The method of claim 2, wherein detecting a homing operation in a preset application interface further comprises:
providing a second user interface in the preset application interface after detecting a trigger operation for the first object, the second user interface including a second object corresponding to a different area on the toy carpet;
detecting a trigger operation for the second object.
4. The method of claim 3, wherein obtaining a homing instruction according to the detected homing operation comprises:
acquiring a corresponding identifier according to the detected first target object;
acquiring corresponding parking position information according to the detected second target object;
and generating the homing instruction according to the identification and the parking position information.
5. The method of claim 3, wherein before providing the second user interface in the preset application interface, the method further comprises:
providing, in the preset application interface, an input interface for inputting parking rules; and
detecting a parking rule input operation in the input interface to obtain a parking rule, wherein the parking rule comprises a correspondence between preset characteristic information of the toy to be stored and preset parking positions on the toy blanket.
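One way to picture the parking rule of claim 5 is as a mapping from preset characteristic information to preset parking positions. This is an illustrative sketch only; the characteristic names and zone labels below are invented, not taken from the patent:

```python
from typing import Optional

# Hypothetical parking rule: preset characteristic information of a toy
# mapped to a preset parking position on the toy blanket.
parking_rule = {
    "vehicle": "zone-A",
    "doll": "zone-B",
    "building-block": "zone-C",
}

def preset_position_for(characteristic: str) -> Optional[str]:
    """Look up the preset parking position for a toy's characteristic,
    or None if the rule defines no position for it."""
    return parking_rule.get(characteristic)
```

The input interface of claim 5 would, under this reading, simply let the user add or edit entries of such a correspondence.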
6. The method of claim 5, wherein the second user interface comprises a correspondence between second objects corresponding to the preset parking positions and the preset characteristic information.
7. The method of claim 6, wherein providing the second user interface in the preset application interface comprises:
determining characteristic information of the first object;
matching the characteristic information against the parking rule to determine selectable parking positions corresponding to the first object; and
displaying, in the second user interface, the second objects corresponding to the selectable parking positions.
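The matching step of claim 7 can be sketched as filtering the parking rule by the first object's characteristic information. In this hypothetical sketch a rule is represented as (characteristic, position) pairs so that one characteristic may allow several selectable positions; all entries and labels are invented for illustration:

```python
# Hypothetical rule entries: (preset characteristic, selectable parking position).
rule_entries = [
    ("vehicle", "zone-A1"),
    ("vehicle", "zone-A2"),
    ("doll", "zone-B1"),
]

def selectable_positions(characteristic: str, entries: list) -> list:
    """Match the characteristic information against the parking rule and
    return the selectable positions whose second objects should be shown
    in the second user interface."""
    return [position for feature, position in entries if feature == characteristic]
```

Only the second objects returned here would be displayed, so the user is offered just the parking positions the rule permits for that toy.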
8. The method of claim 1, wherein detecting the homing operation in the preset application interface comprises:
providing a third user interface in the preset application interface, wherein the third user interface comprises third objects, the third objects correspond to different historical parking rules stored in a parking rule base, and each parking rule comprises an identifier of a toy to be stored and the parking position information corresponding to that toy on the toy blanket; and
detecting a trigger operation for a third object.
9. The method of claim 8, wherein providing the third user interface comprising the third objects comprises:
displaying the third objects in the third user interface according to records of the historical parking rules and a preset priority.
10. The method of claim 8, wherein acquiring the homing instruction according to the detected homing operation comprises:
acquiring the corresponding historical parking rule according to the detected third object; and
generating the homing instruction according to the historical parking rule.
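Claims 8 to 10 together describe reusing history: ordering stored rules by a preset priority for display, then turning the selected rule into a homing instruction. A minimal sketch, assuming a use count drives the priority (the rule records, field names, and priority criterion are all hypothetical):

```python
# Hypothetical historical parking rules; "uses" stands in for whatever
# record drives the preset priority (here: most frequently used first).
history_rules = [
    {"toy_id": "toy-car-01", "park_at": "zone-A1", "uses": 5},
    {"toy_id": "toy-doll-02", "park_at": "zone-B1", "uses": 9},
]

def ordered_for_display(rules: list) -> list:
    """Order third objects in the third user interface by preset priority."""
    return sorted(rules, key=lambda rule: rule["uses"], reverse=True)

def instruction_from_history(rule: dict) -> dict:
    """Generate a homing instruction from the selected historical rule."""
    return {"toy_id": rule["toy_id"], "park_at": rule["park_at"]}
```

Any other preset priority (most recent first, user-pinned entries first) would fit the same shape by changing the sort key.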
11. A toy storage method, applied to a movable toy, the method comprising:
receiving, by the movable toy, a homing instruction, wherein the homing instruction comprises at least parking position information; and
moving, by the movable toy, to a target position according to the homing instruction, wherein the target position is the position on a toy blanket corresponding to the parking position information.
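On the movable-toy side of claim 11, the essential step is resolving the received parking position information to a physical target on the blanket. A sketch under invented assumptions (zone names and normalized (x, y) coordinates are hypothetical; the patent does not specify how the blanket encodes positions):

```python
# Hypothetical blanket layout: parking position information resolved to
# (x, y) coordinates on the toy blanket, normalized to the range [0, 1].
blanket_layout = {
    "zone-A1": (0.2, 0.5),
    "zone-B1": (0.8, 0.5),
}

def target_position(instruction: dict, layout: dict) -> tuple:
    """Movable-toy side: resolve the parking position information in a
    received homing instruction to the target position on the blanket."""
    return layout[instruction["park_at"]]
```

The toy's drive system would then navigate to the returned coordinates, e.g. by sensing markers on the blanket.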
12. A toy storage device, comprising:
a detection module configured to detect a homing operation in a preset application interface, wherein objects mapped to a toy blanket and to a toy to be stored are displayed in the preset application interface;
a first obtaining module configured to acquire a homing instruction according to the detected homing operation, wherein the homing instruction comprises at least an identifier of the toy to be stored and parking position information; and
a sending module configured to send the homing instruction to the toy to be stored so that the toy to be stored moves onto the toy blanket, wherein the toy to be stored is a movable toy.
13. The apparatus of claim 12, wherein the detection module comprises:
a first interface unit configured to provide a first user interface in the preset application interface, wherein the first user interface comprises a first object corresponding to the toy to be stored; and
a first detection unit configured to detect a trigger operation for the first object.
14. The apparatus of claim 13, wherein the detection module further comprises:
a second interface unit configured to provide a second user interface in the preset application interface after the first detection unit detects the trigger operation for the first object, wherein the second user interface comprises second objects corresponding to different areas on the toy blanket; and
a second detection unit configured to detect a trigger operation for a second object.
15. The apparatus of claim 14, wherein the first obtaining module comprises:
a first acquisition unit configured to acquire a corresponding identifier according to the detected first object;
a second acquisition unit configured to acquire corresponding parking position information according to the detected second object; and
a first instruction generating unit configured to generate the homing instruction from the identifier and the parking position information.
16. The apparatus of claim 13, further comprising:
an interface module configured to provide, in the preset application interface, an input interface for inputting parking rules before the homing operation is detected in the preset application interface; and
a second obtaining module configured to detect a parking rule input operation in the input interface and obtain a parking rule, wherein the parking rule comprises a correspondence between preset characteristic information of the toy to be stored and preset parking positions on the toy blanket.
17. The apparatus of claim 14, wherein the second interface unit comprises:
a first determining subunit configured to determine preset characteristic information of the first object;
a second determining subunit configured to match the preset characteristic information against the parking rule and determine selectable parking positions corresponding to the first object; and
a display subunit configured to display, in the second user interface, the second objects corresponding to the selectable parking positions.
18. The apparatus of claim 12, wherein the detection module comprises:
a third interface unit configured to provide a third user interface in the preset application interface, wherein the third user interface displays third objects, the third objects correspond to different historical parking rules stored in a parking rule base, and each parking rule comprises an identifier of a toy to be stored and the parking position information corresponding to that toy on the toy blanket; and
a third detection unit configured to detect a trigger operation for a third object.
19. The apparatus of claim 18, wherein the third interface unit is further configured to display the third objects in the third user interface according to a preset priority based on records of the historical parking rules.
20. The apparatus of claim 18, wherein the first obtaining module comprises:
a third acquisition unit configured to acquire the corresponding historical parking rule according to the detected third object; and
a second instruction generating unit configured to generate the homing instruction according to the historical parking rule.
21. A user terminal, comprising:
a processor; and
a memory for storing processor-executable instructions,
wherein the processor is configured to execute the executable instructions to implement the method of any one of claims 1 to 10.
22. A toy combination, comprising:
a toy blanket provided with a first chip; and
a movable toy provided with a third chip,
wherein the first chip and the third chip cooperate to implement the toy storage method of claim 11.
23. A computer-readable storage medium having computer instructions stored thereon, which when executed by a processor implement the method of any one of claims 1 to 10 or the method of claim 11.
CN201811017219.7A 2018-08-31 2018-08-31 Toy storage method and device, user terminal and toy combination Active CN109157845B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811017219.7A CN109157845B (en) 2018-08-31 2018-08-31 Toy storage method and device, user terminal and toy combination


Publications (2)

Publication Number Publication Date
CN109157845A CN109157845A (en) 2019-01-08
CN109157845B true CN109157845B (en) 2021-08-10

Family

ID=64893757


Country Status (1)

Country Link
CN (1) CN109157845B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110561442B (en) * 2019-08-02 2021-06-01 苏州昭轩数字科技有限公司 Intelligent toy distribution and arrangement robot and working method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5387759A (en) * 1991-03-29 1995-02-07 Yamaha Corporation Automatic performance apparatus using separately stored note and technique data for reducing performance data storage requirements
US7902447B1 (en) * 2006-10-03 2011-03-08 Sony Computer Entertainment Inc. Automatic composition of sound sequences using finite state automata
CN207480612U (en) * 2017-12-05 2018-06-12 内蒙古科技大学 A kind of toy managing device
CN108209220A (en) * 2018-01-18 2018-06-29 吴静 A kind of intelligent cosmetic collecting apparatus and its method of work based on image procossing




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant