US20190168128A1 - Interactive learning blocks - Google Patents

Interactive learning blocks

Info

Publication number
US20190168128A1
US20190168128A1
Authority
US
United States
Prior art keywords
interactive
block
blocks
task
interactive block
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/266,962
Inventor
Kenneth Madsen
Mikkel Wossner Moos
Frederik S. Nielsen
Loic De Waele
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dxtr Tactile Ivs
Original Assignee
Dxtr Tactile Ivs
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dxtr Tactile Ivs filed Critical Dxtr Tactile Ivs
Priority to US16/266,962
Publication of US20190168128A1
Legal status: Abandoned

Classifications

    • A63F 3/04: Geographical or like games; educational games
    • A63F 3/0421: Electric word or number games
    • A63F 9/1204: Puzzles consisting of non-interlocking identical blocks, e.g., children's block puzzles
    • A63F 2009/2442: Sensors or detectors
    • A63F 2009/2444: Light detector
    • A63F 2009/2489: Remotely playable by radio transmitters, e.g., using RFID
    • A63H 33/04: Building blocks, strips, or similar building parts
    • A63H 33/042: Mechanical, electrical, optical, pneumatic or hydraulic arrangements; Motors
    • A63H 33/046: Building blocks comprising magnetic interaction means, e.g., holding together by magnetic attraction
    • A63H 33/26: Magnetic or electric toys
    • A63H 2200/00: Computerized interactive toys, e.g., dolls
    • G09B 19/00: Teaching not covered by other main groups of this subclass
    • H04W 4/026: Services making use of location information using orientation information, e.g., compass
    • H04W 4/70: Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H05B 37/0272
    • H05B 47/19: Controlling the light source by remote control via wireless transmission

Definitions

  • the present technology pertains to interactive objects, and more specifically to interactive blocks wirelessly connected to a learning platform through a host device.
  • Interactive objects having electronic components have been developed in an attempt to enhance the appeal of these objects. Also, some attempts have been made to represent physical objects in virtual environments. However, there are no robust solutions for collecting information about how tactile objects are interacted with, analyzing that interactivity, providing feedback to a user, assessing manual dexterity, cognitive ability, and memory and retention, and providing challenges based on past performance.
  • Some embodiments of the present technology involve a group of interactive blocks with sensors for collecting information about how tactile objects are interacted with and with one or more communication interfaces for sending the interaction data to one or more other devices or platforms that analyze the interactivity, provide feedback to a user, assess manual dexterity, cognitive ability, and memory and retention, provide challenges based on past performance, etc.
  • the interactive blocks can connect to form a network of interactive blocks.
  • the interactive blocks can include one or more sensors, coupling mechanisms for connecting with neighboring blocks, a wireless communications interface for connecting with another electronic device, and a communications interface for communicating with a neighboring block.
  • the interactive blocks can also include a processor and a memory for storing a unique identifier and instructions for processing sensor data, neighbor data from the neighboring interactive block, and other state data.
  • the processor can further cause the wireless communication interface to send the processed data to the electronic device.
  • the processor causes the wireless communications interface to send raw or partially processed data to the electronic device or to other interactive blocks. The manner and degree of pre-processing the processor applies to the sensor data can be decided based on timing, the connections between blocks and the base station, and the capabilities of the sensor data source.
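The pre-processing decision described above can be sketched as follows; the function name, inputs, and thresholds are illustrative assumptions, not taken from the disclosure.

```python
# Sketch of how a block's processor might choose a pre-processing level
# before transmitting sensor data. Names and thresholds are assumed.

def choose_processing_level(latency_ms: float, hops_to_base: int,
                            sample_rate_hz: float) -> str:
    """Return 'raw', 'partial', or 'full' pre-processing."""
    # A fast, direct link to the base station can carry raw samples.
    if hops_to_base <= 1 and latency_ms < 20:
        return "raw"
    # High-rate sensors or long relay paths favour on-block aggregation.
    if sample_rate_hz > 100 or hops_to_base > 3:
        return "full"
    # Otherwise send partially processed data (e.g., filtered samples).
    return "partial"

level = choose_processing_level(latency_ms=10, hops_to_base=1, sample_rate_hz=50)  # "raw"
```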
  • the interactive blocks can also include a light source, such as a field-programmable light-emitting diode (LED), and the blocks can receive a lighting instruction that causes the light source to light up.
  • the blocks have sides with translucent regions for allowing light from the light source therethrough.
  • the interactive blocks can also include a base station that receives interaction and sensor data and sends the data to an electronic device or network-based platform.
  • the base station can also be configured with a wireless (e.g., inductive) or physical (e.g., conductive) charging interface, and the blocks can include a rechargeable power source and an electrical contact capable of communicating power-level information and receiving power from the base station.
  • an electronic device is coupled with a group of interactive blocks, and the electronic device displays a task relating to the group of interactive blocks that tests a wide variety of the user's abilities: mental acuity, dexterity, spatial awareness, creativity, etc.
  • the electronic device can continuously receive state data for the collection of interactive blocks that describes how a user manipulates the interactive blocks relative to each other while attempting to complete the task.
  • Some embodiments of the present technology involve displaying, in near real-time, a dynamic virtual representation of the group of interactive blocks and determining, based on the received state information, task progress for the task. Based on the task progress, the electronic device, the blocks, or both can provide the user with a hint for completing the task. After detecting a task-completion event, the electronic device can send task information to a tactile IQ platform and receive, in return, a task score and/or a recommendation for performing one or more further tasks.
  • the task score can involve a comparison of how well the user completed the task compared to a cohort or peer group.
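A cohort comparison of this kind could be sketched as a percentile ranking of completion times; the scoring scheme below is an assumption for illustration, not the platform's documented method.

```python
from bisect import bisect_left

# Illustrative sketch of a cohort comparison for a task score: rank the
# user's completion time against a peer group.

def cohort_percentile(completion_s: float, cohort_times_s: list) -> float:
    """Percentage of the cohort the user was at least as fast as."""
    ordered = sorted(cohort_times_s)
    slower_or_equal = len(ordered) - bisect_left(ordered, completion_s)
    return 100.0 * slower_or_equal / len(ordered)

# A user finishing in 42 s against a five-person cohort is at least as
# fast as three of the five peers, i.e., a score of 60.0.
score = cohort_percentile(42.0, [30.0, 40.0, 45.0, 60.0, 90.0])
```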
  • FIG. 1 illustrates an example of a group of interactive objects according to some embodiments of the present technology
  • FIG. 2 illustrates a group of interactive objects in communication with a portable electronic device according to some embodiments of the present technology
  • FIG. 3A illustrates an example of an interactive block according to some embodiments of the present technology
  • FIG. 3B illustrates a close-up view of an interactive block according to some embodiments of the present technology
  • FIGS. 4A-4C illustrate various examples of how interactive blocks can inter-couple according to some embodiments of the present technology
  • FIG. 5 shows a partially exploded view of an interactive object showing some components that can be present within the interactive object according to some embodiments of the present technology
  • FIG. 6 shows a two-dimensional view of a face of a carrier for an interactive object according to some embodiments of the present technology
  • FIG. 7 illustrates a group of interactive blocks connected through a main base station according to some embodiments of the present technology
  • FIG. 8A illustrates a group of interactive objects in communication with a portable electronic device according to some embodiments of the present technology
  • FIG. 8B illustrates a network-based tactile IQ platform configured to monitor user interaction with interactive blocks, collect user task data, and synthesize feedback based on the task data for the user as well as task data from a community of users according to some embodiments of the present technology
  • FIG. 9 illustrates a method of monitoring user interaction with a group of interactive blocks, sending interaction data to a tactile IQ platform, and receiving a task score from the tactile IQ platform according to some embodiments of the present technology.
  • FIGS. 10A-10B illustrate exemplary possible system embodiments.
  • the disclosed technology addresses the need in the art for robust solutions for collecting information about how tactile objects are interacted with, analyzing that interactivity, providing feedback to a user, assessing manual dexterity, cognitive ability, and memory and retention, and providing challenges based on past performance.
  • FIG. 1 illustrates an example of a group of interactive objects 100 ( 100 a - 100 g ).
  • the interactive objects 100 a - 100 g can be physically coupled to multiple other interactive objects 100 a - 100 g (represented by a chain link in FIG. 1 ) and, similarly, can be decoupled.
  • the coupling between interactive objects 100 a - 100 g can be accomplished through a wide variety of means, for example, a mechanical latch, magnetic attraction, adhesive, ball and socket, tube and stud, fastener, etc.
  • the physical coupling is accomplished by one interactive object 100 having a male component and a complementary interactive object 100 having a female component; alternatively, both interactive objects 100 in a coupling can have symmetrical coupling features.
  • interactive object 100 can couple along multiple faces or features of interactive object 100 .
  • the physical coupling is accomplished or assisted by gravity and/or friction.
  • Coupling multiple interactive objects 100 can allow the creation of various three dimensional shapes, figures, designs, and arrangements.
  • FIG. 1 crudely depicts a tree.
  • interactive object 100 is a cube or cuboid with six faces.
  • interactive object 100 can be any n-faced object that can be coupled with an m-faced object, where n and m may be identical or different.
  • while planar faces provide certain benefits, such as retaining a degree of freedom when coupling, interactive object 100 can have at least some curved faces as well.
  • an interactive object of one configuration (e.g., the number or type of faces) can be coupled with an interactive object of a different configuration.
  • each interactive object 100 contains a processor, a sensor, a communication interface, internal circuitry, and can be configured to communicate with each other.
  • interactive object 100 a can be in communication with interactive object 100 b (represented by dashed double arrow).
  • the communication interface is wireless (e.g., sound, light, or other electromagnetic radiation such as Bluetooth, WiFi, NFC, RFID, etc.).
  • interactive objects 100 share a common bus. Additionally, in some embodiments, interactive object 100 can also provide and/or receive power through the communication interface, as explained in greater detail below.
  • interactive objects 100 are only in communication with adjacent interactive objects 100 (e.g., those that are inter-coupled). Alternatively, as explained in greater detail below, interactive objects 100 can be in communication with nearby interactive objects 100 , regardless of adjacency (e.g. in a mesh network). For example, interactive object 100 d can be in communication with interactive object 100 f .
  • interactive objects 100 a - 100 g can be configured with sensors that can capture data relating its individual orientation, its position relative to another object, whether it is connected to one or more other objects, etc.
  • the interactive objects 100 a - 100 g can include one or more of an accelerometer, gyroscope, magnetometer, inertial measurement unit (IMU), etc.
  • each of the interactive objects 100 a - 100 g can be configured with a communication circuitry and can communicate information pertaining to its location (e.g., its placement in space, its relative position with respect to a coordinate system, or its position relative to other interactive objects), orientation, movement (e.g., translation or rotation), coupling status, nearby interactive objects, or current state.
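The kind of state message a block might communicate could look like the following sketch; the field names and JSON encoding are assumptions, as the disclosure does not specify a message format.

```python
import json

# A minimal sketch of the kind of state message a block might emit.
# Field names and the JSON encoding are assumed, not a documented format.

def make_state_message(uuid, orientation_quat, coupled_to, moving):
    return json.dumps({
        "uuid": uuid,                     # the block's unique identifier
        "orientation": orientation_quat,  # e.g., derived from IMU data
        "coupled_to": coupled_to,         # identifiers of adjacent blocks
        "moving": moving,                 # derived from the accelerometer
    })

msg = make_state_message("blk-7", [1.0, 0.0, 0.0, 0.0], ["blk-3"], False)
```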
  • interactive object 100 can also communicate with a base station or other client device.
  • the communication circuitry can receive information from a base station or client device for controlling one or more aspect of the interactive object 100 .
  • the interactive objects 100 a - 100 g can contain light sources and a client device can send the interactive objects 100 a - 100 g a control signal for controlling the light sources, as explained in greater detail below.
  • FIG. 2 illustrates a group of interactive objects 100 a - 100 g in communication with a portable electronic device 200 (e.g., via a base station) according to some embodiments of the present technology.
  • the portable electronic device 200 can communicate with a network of interactive objects through a base station.
  • the network of interactive objects can be of various topologies (e.g., a mesh, bus, point-to-point, star, ring, tree, etc.).
  • each interactive object can independently connect to a base station and the base station can then connect to the portable electronic device 200 .
  • portable electronic device 200 communicates with only one interactive object 100 (which might, in turn, communicate with other interactive objects 100 and relay information to portable electronic device 200 ).
  • portable electronic device 200 can communicate with multiple interactive objects 100 .
  • the communication is wireless (e.g., sound, light, or other electromagnetic radiation such as Bluetooth, WiFi, NFC, RFID, etc.) or wired (e.g., a physical connection such as through a wire connected to interactive object 100 or by placing the interactive object on a conductive surface).
  • communication between portable electronic device 200 and interactive object 100 can be bidirectional.
  • portable electronic device 200 can send commands to interactive object 100 , e.g., to change the color of light(s) embedded in interactive object, activate a motor, play a sound, etc.
  • one or more of the interactive objects 100 a - 100 g can communicate back to the portable electronic device 200 .
  • an interactive object can send the portable electronic device 200 information describing the interactive object's 100 state, position, location, orientation, etc.
  • a first interactive object 100 can communicate information about another interactive object 100 . Accordingly, the portable electronic device 200 can learn the relative orientation and position of each of the interactive objects 100 a - 100 g .
  • the portable electronic device 200 can also execute one or more applications 201 configured for bidirectional communication and interaction with the interactive objects 100 a - 100 g and for providing a user with feedback about the orientation of the interactive objects 100 a - 100 g and about a task to be performed using the interactive objects 100 a - 100 g , as explained in greater detail below.
  • the application 201 can control (e.g., turn on, off, blink, etc.) a light source (e.g., using a Boolean toggle or instruction such as RGB, HSL, sRGB, CMYK, grayscale value, etc.) of the interactive objects 100 a - 100 g and present a task to a user (e.g. assembling the interactive objects 100 a - 100 g ) and keep track of how well (e.g. how quickly) the user performs the task.
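A lighting instruction of the kind described (a Boolean toggle plus a color value such as RGB) might be modeled as below; the command structure and field names are illustrative assumptions.

```python
# Sketch of a lighting instruction combining a Boolean toggle with an
# RGB color value. The command structure and names are assumed.

def lighting_instruction(block_id: str, on: bool,
                         rgb=(255, 255, 255), blink=False) -> dict:
    """Build a command the application could send to one block's light."""
    return {"target": block_id, "on": on, "rgb": rgb, "blink": blink}

# Turn block 100a's light solid green:
cmd = lighting_instruction("100a", True, rgb=(0, 255, 0))
```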
  • as the interactive objects 100 a - 100 g are manipulated by the user, they send information back to the application regarding their configuration relative to each other.
  • the application 201 can provide task feedback to the user. Also, as explained in greater detail below, the application 201 can be configured to communicate task performance details with a task platform over a network connection and the task platform can process the task performance details, compare the task performance details with other users' data, send the application 201 comparison results, send the application 201 suggestions for further tasks, etc.
  • the application 201 can also be configured to allow a user to perform “free play”. Free play can involve the application 201 tracking the location, orientation, connection status, etc. of a group of interactive objects 100 a - 100 g and presenting a virtual model of the interactive objects 100 a - 100 g on the portable electronic device 200 . Likewise, the application 201 can provide user-selectable controls for controlling one or more aspects of the interactive objects, e.g. the color of the light source in an interactive object 100 .
  • FIG. 3A illustrates an example of an interactive block 300 according to some embodiments of the present technology.
  • the interactive block 300 can have communications interface 301 for communicating with nearby interactive blocks or other devices.
  • communications interface 301 is centrally located near an edge of the interactive block 300 .
  • communications interface 301 can be located in the centroid of a face or near a corner.
  • communications interface 301 is wireless and can be located anywhere within interactive block 300 .
  • a coupling mechanism 302 can be used to physically connect two interactive blocks 300 as described above.
  • a coupling mechanism 302 is located at each corner of interactive block 300 .
  • FIG. 3B illustrates a close-up view of an interactive block 300 according to some embodiments of the present technology.
  • the communications interface 301 can include multiple contacts 304 a - 304 e (collectively, “contact 304 ”).
  • contact 304 can be exposed so that a circuit is complete when the faces of two interactive blocks 300 come in contact.
  • contacts 304 are reversible such that contacts 304 can align with similarly-ordered contacts 304 or reversed-ordered contacts 304 .
  • contacts 304 a - 304 e of one interactive block 300 can connect with, respectively, 304 a - 304 e of another interactive block 300 or with, respectively 304 e - 304 a of the other interactive block 300 .
  • interactive block 300 can have window 303 which can allow light to pass through.
  • Window 303 can be transparent, semitransparent, or translucent.
  • window 303 can include an entire face of interactive object; alternatively, window 303 can include only a portion of a face.
  • interactive block 300 can have a single window 303 or multiple window portions 303 (e.g., each face of interactive block 300 can have a window 303 ).
  • a single light source from within interactive object 100 can shine through multiple window portions 303 .
  • the single light source can be a field-programmable light-emitting diode (“LED”).
  • each window 303 can have distinct light sources.
  • window 303 can be a light affixed without any covering (e.g., an LED attached to the face of interactive block 300 ).
  • FIGS. 4A-4C illustrate various examples of how interactive blocks can inter-couple according to some embodiments of the present technology.
  • interactive block 401 can connect with interactive block 402 such that the facing sides of the interactive blocks 401 , 402 are completely aligned.
  • multiple coupling mechanisms 302 can couple the two interactive objects (e.g., four magnet pairs) and multiple communications interfaces 301 can interconnect the two interactive objects (e.g., four communications interface 301 pairs).
  • in some embodiments, the interactive blocks connect using only one communications interface 301 pair.
  • one interactive block 411 can connect with another interactive block 412 on an offset, with only half of a face being in contact.
  • this configuration can utilize only two pairs of coupling mechanisms 302 and one pair of communications interfaces 301 .
  • because communications interface 301 can be reversible, interactive blocks can connect as shown in FIG. 4A and FIG. 4B while maintaining communication.
  • FIG. 4C shows another example configuration of two interactive blocks.
  • interactive block 421 can be connected with interactive block 422 at an offset with only a quarter of a face being in contact.
  • an interactive block is configured such that it can physically couple via coupling mechanism 302 even if communications interface 301 is incapable of establishing a connection between two interactive blocks.
  • FIG. 5 shows a partially exploded view of an interactive object showing some components that can be present within interactive object.
  • One or more light sources 502 can be various light emitting sources such as LEDs, light bulbs, electroluminescent wire, etc.
  • a carrier 503 can fit within the faces of interactive object 100 .
  • the carrier 503 can be made of a stiff material such as foam or plastic.
  • the carrier 503 as shown in FIG. 5 is only one side of a multi-sided carrier 503 .
  • the two carrier 503 portions in FIG. 5 can be attached; alternatively, three or more portions can be combined to form a cuboid.
  • the carrier 503 can provide structure for interactive object 100 and hold components in place, such as communications interface 301 , coupling mechanism 302 , lights 502 , etc.
  • carrier 503 has a hollow portion.
  • FIG. 6 shows a two-dimensional view of a face of the carrier 503 .
  • the carrier 503 can contain a communications interface (not shown) as well as contacts 304 a - 304 e .
  • carrier 503 is or contains a printed circuit board with traces supplying power and connectivity to different components.
  • communications interface 301 can be reversible; thus, in some embodiments, contacts can be symmetrical.
  • contact 304 a can be connected to 304 e .
  • two contacts 304 can be programmatically interchangeable; for example, a processor can detect that contact 304 b should be configured for one role (e.g., output) and another contact 304 d should be configured for another role (e.g., input).
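The reversible, role-swappable contacts can be illustrated with a small sketch; the contact labels follow the figures, but the mapping function itself is an assumed illustration.

```python
# Sketch of the reversible contact idea: contacts 304a-304e of one block
# can line up with the peer's contacts in the same or reversed order.

CONTACTS = ["304a", "304b", "304c", "304d", "304e"]

def peer_contact(local: str, reversed_mating: bool) -> str:
    """Which peer contact a local contact touches, for either alignment."""
    i = CONTACTS.index(local)
    return CONTACTS[len(CONTACTS) - 1 - i] if reversed_mating else CONTACTS[i]

# In a reversed mating, local 304b touches the peer's 304d, so a contact
# configured as an output locally would face a peer contact set as input.
```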
  • the carrier 503 can facilitate various communications standards such as TWI, I2C, SPI, etc. for internal communication and communication with other blocks.
  • a block can communicate information such as a universally unique identifier (UUID).
  • the carrier 503 can contain through-holes to interconnect with components internal to the carrier.
  • the carrier 503 can have cutouts 601 a - 601 d to house coupling mechanisms 302 (e.g., a spherical magnet).
  • Carrier 503 can also have an aperture 602 so that light from lights 502 can pass through to window 303 .
  • trace 608 and trace 610 each provide one of positive voltage or ground to contacts 304 and other components.
  • one interactive object 100 can impart power through contacts 304 to another interactive object.
  • the interactive blocks are also configured with one or more microcontroller units (MCUs) connected to the PCB and its traces and coupled with a power supply.
  • the MCU is also connected to one or more memory devices storing a unique identification number.
  • the interactive blocks are connected in a network using a base station. Alternatively, the interactive blocks can be connected using other network topologies.
  • FIG. 7 illustrates a group of interactive blocks 700 connected through a main base station 750 according to some embodiments of the present technology.
  • the main base station 750 includes a power source 752 with plug cord 753 , a processor 754 and a memory 756 storing a main program for connecting the blocks in a network, registering captured data, managing power and charging for the blocks, etc.
  • the main base station 750 can also include a communication interface 760 for connecting to a portable electronic device and/or a cloud-based platform for tracking block data and task data (explained in greater detail below).
  • the group of interactive blocks 700 can be connected together in a network.
  • the unique identification number of each of the blocks allows the base station to know which services are available, and the base station can capture the data produced by an internal accelerometer, gyroscope, magnetometer, and/or inertial measurement unit (IMU).
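A base-station registry keyed by each block's unique identification number, with simple service discovery, might look like the sketch below; the class and service names are assumptions for illustration.

```python
# Minimal sketch of a base-station registry keyed by each block's unique
# identification number. Class and service names are assumed.

class BaseStationRegistry:
    def __init__(self):
        self.blocks = {}  # identifier -> set of advertised services

    def register(self, uuid, services):
        self.blocks[uuid] = set(services)

    def blocks_with(self, service):
        """Identifiers of registered blocks offering a given service."""
        return [u for u, s in self.blocks.items() if service in s]

reg = BaseStationRegistry()
reg.register("blk-1", ["imu", "led"])
reg.register("blk-2", ["led"])
```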
  • the communication interface 760 can also communicate with a display, a portable electronic device, an application in a cloud-based platform, etc.
  • the base station 750 can track how the blocks 700 are manipulated, and the base station (or a connected portable device, an application of a cloud-based platform, etc.) can provide feedback to a user, e.g. by creating a virtual image of what is being built, providing task completion data, etc., and displaying the virtual image.
  • each block effectively works as a node in the network, allowing several other blocks to connect to the network through it, thereby allowing the number of attached blocks to a single network to grow to the limits of whichever protocol is the basis for the network.
  • each interactive block is embedded with a battery and charging logic, and distributed charging can be achieved through a physical, inductive, or other wireless connection, allowing simultaneous or consecutive recharging by the base station 750 without any disassembly or the introduction of a cable to the object.
  • the base station 750 will send a current up to a first layer of blocks.
  • Each of the blocks 700 will manage its own battery status and alert the base station 750 if it needs power.
  • the base station 750 sends a power supply out through multiple channels to ensure the right amount of power for multiple blocks at the same time.
  • the embedded circuit of a single block will direct any input of power to the battery charger. When the battery is fully charged the circuit will lock the input to the battery in order to let the other connected blocks in the stack have more charging power.
  • battery life is prolonged through power-down actions, prompted by a lack of activity in the accelerometers/motion sensors or in the connections of the blocks. This initiates an interrupt in a main program stored in memory of the base station 750 and executed by the blocks' MCU, thereby issuing a power down of the main system components. After a power down, the blocks 700 exist in a standby mode in which next to no power drain occurs. When activity is registered by the accelerometer or external connections in standby mode, an interrupt activates the main program and signs the object back into the system.
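The charging lock-out and inactivity power-down behavior described above can be sketched as a small state model; the state names, thresholds, and charge representation are illustrative assumptions.

```python
# Sketch of the per-block power logic: charge input is locked out once
# the battery is full, and inactivity triggers standby. Names assumed.

class BlockPower:
    def __init__(self, idle_limit_s=60.0):
        self.charge = 0.0          # battery level, 0.0 .. 1.0
        self.input_locked = False  # full battery yields power to peers
        self.mode = "active"
        self.idle_s = 0.0
        self.idle_limit_s = idle_limit_s

    def on_charge(self, amount):
        if self.input_locked:
            return                 # let other blocks in the stack charge
        self.charge = min(1.0, self.charge + amount)
        if self.charge >= 1.0:
            self.input_locked = True

    def tick(self, motion_detected, dt_s):
        if motion_detected:        # interrupt wakes the main program
            self.idle_s = 0.0
            self.mode = "active"
        else:
            self.idle_s += dt_s
            if self.idle_s >= self.idle_limit_s:
                self.mode = "standby"  # power down main components

blk = BlockPower()
blk.on_charge(1.0)     # fully charged: further input is locked out
blk.tick(False, 60.0)  # a minute of inactivity: enters standby
```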
  • battery life cycles are tracked through the base station and compared across all batteries.
  • the blocks can be discharged and then recharged during downtime so as to preserve maximum long-term capacity for current lithium-ion battery technology.
  • some embodiments of the present technology involve interactive objects communicating information pertaining to their location, orientation, movement (e.g., translation or rotation), coupling status, nearby interactive objects, current lighting status, etc. to a host device.
  • a host device can be a base station, a phone, tablet, other portable electronic device, laptop computer, dedicated machine, gateway, router, etc.
  • the interactive objects can communicate the information through a base station or an integrated communication module (e.g. Bluetooth).
  • FIG. 8A illustrates a group of interactive objects 850 in communication with a portable electronic device 800 according to some embodiments of the present technology.
  • portable electronic device 800 directly connects with only one interactive object (e.g., 850 3 ), which can route communications to other interactive objects 850 .
  • portable electronic device 800 can connect with multiple interactive objects 850 .
  • portable electronic device 800 connects with all available interactive objects 850 .
  • portable electronic device 800 only connects to interactive objects 850 that satisfy certain criteria, such as the closest one(s) (e.g., interactive object 850 3 and 850 7 ) which then relay communications with other interactive objects 850 .
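The relaying behavior described above, in which the device links directly only to the closest block(s) and those blocks forward traffic to the remaining interactive objects, can be sketched as a simple flood over the block-to-block connection graph. The function and variable names below are illustrative assumptions, not part of the disclosure.

```python
from collections import deque

def relay_message(direct_links, neighbors, message):
    """Flood `message` from the blocks directly linked to the host device
    through the block-to-block mesh; return the set of blocks reached.

    direct_links: blocks the device connects to (e.g., the closest ones)
    neighbors:    adjacency map of physical/wireless couplings
    """
    received = set()
    queue = deque(direct_links)
    while queue:
        block = queue.popleft()
        if block in received:
            continue
        received.add(block)  # the block would process `message` here
        queue.extend(neighbors.get(block, []))
    return received
```

A block with no coupling path back to a directly connected block is never reached, which is consistent with each block acting as a network node for those attached through it.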
  • a limited number of interactive objects 850 have the capability (e.g., a Bluetooth module) for communicating with portable electronic device 800 .
  • communication between portable electronic device 800 and interactive objects 850 is accomplished via base station 803 .
  • base station 803 receives data from one entity (e.g., interactive object 850 or portable electronic device 800 ) and processes it, relaying only the processed data to another entity.
  • Communication between interactive objects 850 can be accomplished using a communications interface (e.g. communication interface 301 in FIG. 3A above).
  • interactive objects 850 share data with each neighboring interactive object 850 .
  • interactive object 850 1 can share data with interactive objects 850 2 , 850 4 , and 850 5 .
  • interactive objects 850 use the data they receive to gain situational awareness. For example, if interactive object 850 5 communicates its orientation to interactive object 850 6 , interactive object 850 6 can determine its own orientation even if it otherwise lacks a module to help determine its orientation.
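One way to picture the orientation-sharing example above: the neighbor reports its gravity ("up") vector, and the fixed rotation between the two blocks' frames, known from which faces are coupled, maps that vector into the sensorless block's frame. The representation below is an assumed sketch, not the disclosed implementation.

```python
def infer_up_vector(neighbor_up, relative_rotation):
    """Derive a block's 'up' vector from a neighbor's reported orientation.

    relative_rotation is a 3x3 rotation matrix (list of rows) fixed by the
    geometry of the coupled faces; neighbor_up is the neighbor's up vector.
    """
    return [
        sum(relative_rotation[i][j] * neighbor_up[j] for j in range(3))
        for i in range(3)
    ]
```

With the identity rotation (blocks stacked face-to-face in the same orientation) the inferred vector equals the neighbor's; a 90-degree coupling rotates it accordingly.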
  • the data shared between interactive objects 850 and portable electronic device 800 can include sensor data, relative position data, status data (e.g., current state of lights), and data relayed from other interactive objects 850 .
  • the portable electronic device 800 can determine the overall shape and configuration of multiple interactive objects 850 that are connected in the network.
  • the data can describe that the interactive objects 850 are forming the shape of a house and that interactive objects 850 with lights of specific colors are in various positions of the house shape.
  • multiple connected interactive objects 850 can act as one unit.
  • the present technology can address the need in the art for robust solutions for collecting information about how tactile objects are interacted with, analyzing the interactivity, providing feedback to a user, assessing manual ability and dexterousness, cognitive ability, creativity, memory and retention, etc., and providing challenges based on past performance.
  • the interactive objects can be used to perform tasks that test a user's mental acuity, spatial awareness, creativity, etc.
  • a user can be presented a task through an application 801 on the portable electronic device 800 and the user's performance can be monitored by the application 801 .
  • the application 801 can present the user with a challenge and begin a timer that measures the amount of time the user has taken to complete the challenge.
  • the application 801 can monitor the configuration of the interactive objects 850 and, when the interactive objects are in the appropriate configuration to complete the challenge, the application 801 can determine that the challenge is complete and can stop the timer.
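The challenge-timing behavior described in the two bullets above might look like the following sketch: the application starts a timer, polls the current block configuration, and stops the timer once the configuration completing the challenge is reached. Function names, the polling approach, and the timeout are assumptions for illustration.

```python
import time

def run_challenge(target_config, poll_configuration, timeout=120.0):
    """Time a user's attempt at a challenge.

    Returns elapsed seconds if the blocks reach `target_config` before
    `timeout`, else None. `poll_configuration` reads the current block
    arrangement (e.g., as reported via the base station).
    """
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        if poll_configuration() == target_config:
            return time.monotonic() - start
        time.sleep(0.01)  # avoid busy-waiting between sensor reads
    return None
```

`time.monotonic()` is used rather than wall-clock time so the measured duration is unaffected by system clock adjustments.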
  • Sensor data from the interactive objects 850 can describe the user's attempt at completing the task. This task data, collected throughout the attempt, can provide an indication as to how effective and efficient the user was at completing (or partially completing) the task.
  • the task data can include a history of how the interactive objects are manipulated, connected, disconnected, reconnected, etc. over the duration of the test.
  • the task data can include information about the extent to which clues are provided to the user, either through the application 801 or through the interactive objects 850 themselves (e.g., via LEDs).
  • the application 801 is configured to provide a wide variety of tasks for observing, testing, analyzing, etc. a wide variety of mental abilities, tactile and dexterity abilities, memory abilities, etc.
  • Example tasks (e.g., tests or challenges) include puzzles, memory tests, retention tests, tap tests, fine motor tests, dexterity tests, logical thinking tests, critical thinking tests, and other cognitive assessment tests.
  • tasks are similar to standardized and accredited tests.
  • a task might include having the user create structures such as a representation of a human out of interactive objects.
  • Some tasks can include each face of interactive objects being illuminated by one or more colors and a certain cue (e.g., a blinking face of a certain color) can indicate that the user should connect that face with a similarly colored face on another interactive object.
  • Some tasks can include noticing faces that have illuminated in sequence and reproducing the sequence by tapping those faces in the correct order.
  • Some tasks can include building a structure using interactive objects 850 based on a picture displayed in an application. In some embodiments, such a structure can require that specific interactive objects are placed in specific places with designated orientations, as indicated by colors shown on their faces.
  • Some tasks can utilize interactive features of the application in combination with interactive objects to prompt a user to complete two tasks (one using interactive objects and another using the electronic device) simultaneously, thus causing the user to divide their attention between the two tasks. For example, a user might be asked to build a structure with interactive objects while memorizing a sentence played on a speaker of an electronic device running the user application. Some tasks can include rebuilding a structure out of interactive objects from memory. For example, the user can be asked to tell a story on one day using interactive objects, and the next day the user will be asked to repeat the story while building the same structure. Some tasks can include having a user design a structure with interactive objects; another person (e.g., a parent or observer) can then attempt to guess what the structure represents (e.g., a chair).
  • user application provides a game-like environment in which the user can be presented with various challenges, games, and tasks. These challenges, games, and tasks can entertain, train, and monitor the user.
  • task data can be summarized individually, and multiple task data summaries can be aggregated to form scores describing the user's “tactile IQ.”
  • a tactile IQ can be a measure of metrics including speed, dexterity, short term memory, long term memory, working memory, critical thinking, creativity, spatial cognition, adaption, problem solving, concentration, etc.
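The disclosure does not give a formula for tactile IQ. Purely as an illustrative sketch, per-task metric summaries (speed, dexterity, memory, and so on) could be averaged across tasks and then combined with optional weights into a single score; the metric names and the equal default weighting below are assumptions.

```python
def tactile_iq(task_summaries, weights=None):
    """Aggregate per-task metric summaries into one illustrative score.

    task_summaries: list of dicts mapping metric name -> 0-100 value
    weights:        optional dict mapping metric name -> relative weight
    """
    metrics = {}
    for summary in task_summaries:
        for name, value in summary.items():
            metrics.setdefault(name, []).append(value)
    # Mean of each metric across all tasks performed by the user.
    means = {name: sum(vals) / len(vals) for name, vals in metrics.items()}
    weights = weights or {name: 1.0 for name in means}
    total = sum(weights.values())
    # Weighted combination of the metric means.
    return sum(means[name] * weights.get(name, 0.0) for name in means) / total
```

Weighting a metric more heavily (e.g., working memory for a memory-focused assessment) would shift the score toward that metric's mean.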
  • an application administering a task collects data from interactive objects, it can send raw or synthesized data to another platform for analysis.
  • the platform can process the task data and compare a user with the user's peers or other members.
  • a user's peer group can consist of users of the same age, gender, location, etc.
  • FIG. 8B illustrates a network-based tactile IQ platform 802 configured to monitor user interaction with interactive blocks 860 , collect user task data, and synthesize feedback based on the task data for the user as well as task data from a community of users.
  • a user can be administered a task relating to a group of interactive blocks 860 1 , 860 2 , . . . 860 n through a portable electronic device 814 running an application 816 .
  • the interactive blocks 860 can communicate data regarding their orientation, position, coupling status etc. to the application 816 —either directly via a communication link or through a base station.
  • the application 816 can process and package the task data and send it to the platform 802 through a network 899 for analysis.
  • the platform 802 can include a processor 806 , memory 808 , and a communication interface 810 .
  • the platform 802 can receive task data from the portable electronic device 814 through the communication interface 810 , process the data, compare the processed data to other users' data that are stored in the memory 808 , and calculate a result in the form of a pass/fail, score, tactile IQ, etc.
  • the platform 802 can then send the results back to the portable electronic device 814 for display in the application 816 .
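The comparison the platform 802 performs against stored data from other users could, for example, be expressed as a percentile rank within the user's peer group (users of the same age, gender, location, etc.). The percentile approach below is an assumption for illustration, not a method stated in the disclosure.

```python
def peer_percentile(user_score, peer_scores):
    """Percent of peers the user scored at or above.

    Returns None when no peer data is stored for the cohort.
    """
    if not peer_scores:
        return None
    at_or_below = sum(1 for s in peer_scores if s <= user_score)
    return 100.0 * at_or_below / len(peer_scores)
```

The resulting percentile could accompany the pass/fail result or score sent back to the portable electronic device 814 for display in the application 816.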
  • Task results can provide an overview on the performance of the user and can provide suggestions to enhance the user's performance. For example, task results can provide suggestions that are based on general knowledge and data provided by other users using similar interactive objects.
  • the platform 802 can also suggest tasks that should be performed by the user to most effectively enhance their Tactile IQ. In some embodiments, platform 802 can test the ability for certain tasks to enhance a user's capabilities.
  • the platform 802 first emulates existing standardized and accredited tests in the form of games and challenges and, as the data of the platform grows and its density along with it, new tests will emerge that measure the full range of metrics more correctly in a more natural, playful setting, in whichever environment the user is in.
  • the platform 802 allows a growing amount of generated data to be streamed and stored to a cloud or remote computing facility for statistical analysis. The result of this computation is presented back to the user in the backend application, where decisions on further games and challenges can be made based on the objective input from the measurements. Finally, the next games a user will be presented with are a result of the games played specifically by that one user, with the results analyzed, compared to the nearest peers, and optimized for the best learning outcome.
  • FIG. 9 illustrates a method 900 of monitoring user interaction with a group of interactive blocks, sending interaction data to a tactile IQ platform, and receiving a task score from the tactile IQ platform according to some embodiments of the present technology.
  • the method 900 involves connecting an electronic device to a group of interactive blocks 905 .
  • connecting an electronic device to a group of interactive blocks 905 can involve connecting the electronic device to a base station that is itself connected to the blocks.
  • the electronic device connects to a subset of a group of interactive blocks and the blocks in the subset connect the electronic device to the rest of the group of interactive blocks.
  • the method 900 involves the electronic device displaying tasks related to manipulating the group of interactive objects 910 .
  • tasks can be used to test a user's mental acuity, dexterity, spatial awareness, creativity, etc.
  • the method 900 involves continuously receiving state information from the group of interactive blocks 915 .
  • the state information describes the orientation of the blocks, rotations and movements of each of the blocks, the position of the blocks with respect to other blocks, the coupling status of each of the blocks, the lighting status of the blocks, etc.
  • the method 900 also involves displaying a dynamic virtual representation of the group of interactive blocks 920 in substantially real-time and based on the received state information. Also, the method 900 involves determining, based on the received state information, task progress for the task 925 . For example, if the task involves building a structure within a time period, determining task progress can involve determining what percent of the structure has been correctly completed at a given time.
  • the method 900 can involve determining, based on the task progress, whether to provide the user with a hint for completing the task 930 .
  • providing a hint can comprise sending a lighting signal to one or more of the interactive blocks that causes the interactive blocks to light up.
  • providing a hint can involve displaying a textual or graphical hint on the display of the electronic device.
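Steps 925 and 930 of method 900 (determining task progress and deciding whether to offer a hint) can be sketched as follows. The placement representation and the 30-second stall threshold are illustrative assumptions, not details from the disclosure.

```python
def task_progress(required_placements, current_placements):
    """Fraction of the target structure correctly completed, where each
    placement is a (block_id, position) pair."""
    done = sum(1 for p in required_placements if p in current_placements)
    return done / len(required_placements)

def should_hint(progress, seconds_since_last_change, stall_after=30):
    """Offer a hint (e.g., light an LED on a block, or display text on the
    electronic device) when the task is unfinished and the user has made
    no change for a while."""
    return progress < 1.0 and seconds_since_last_change >= stall_after
```

In use, the application would recompute progress each time new state information arrives from the blocks and trigger the hint path whenever `should_hint` returns True.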
  • a task completion event can include an actual completion of a task.
  • a task completion event can involve the task ending by nature of the task not being completed according to one or more requirements, e.g. within a certain time period, with a requisite accuracy, in a more efficient manner than a previous attempt, etc.
  • the method 900 next involves sending task information to a tactile IQ platform 940 and receiving, from the tactile IQ platform, a task score and a recommendation for performing one or more further tasks 945 .
  • the method 900 can involve displaying the task score 950 on the electronic device and/or recommending further tasks 955 through the electronic device.
  • FIG. 10A and FIG. 10B illustrate exemplary possible system embodiments. The more appropriate embodiment will be apparent to those of ordinary skill in the art when practicing the present technology. Persons of ordinary skill in the art will also readily appreciate that other system embodiments are possible.
  • FIG. 10A illustrates a conventional system bus computing system architecture 1000 wherein the components of the system are in electrical communication with each other using a bus 1005 .
  • Exemplary system 1000 includes a processing unit (CPU or processor) 1010 and a system bus 1005 that couples various system components including the system memory 1015 , such as read only memory (ROM) 1020 and random access memory (RAM) 1025 , to the processor 1010 .
  • the system 1000 can include a cache of high-speed memory connected directly with, in close proximity to, or integrated as part of the processor 1010 .
  • the system 1000 can copy data from the memory 1015 and/or the storage device 1030 to the cache 1012 for quick access by the processor 1010 .
  • the cache can provide a performance boost that avoids processor 1010 delays while waiting for data.
  • These and other modules can control or be configured to control the processor 1010 to perform various actions.
  • Other system memory 1015 may be available for use as well.
  • the memory 1015 can include multiple different types of memory with different performance characteristics.
  • the processor 1010 can include any general purpose processor and a hardware module or software module, such as module 1 1032 , module 2 1034 , and module 3 1036 stored in storage device 1030 , configured to control the processor 1010 as well as a special-purpose processor where software instructions are incorporated into the actual processor design.
  • the processor 1010 may essentially be a completely self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc.
  • a multi-core processor may be symmetric or asymmetric.
  • an input device 1045 can represent any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, keyboard, mouse, motion input, and so forth.
  • An output device 1035 can also be one or more of a number of output mechanisms known to those of skill in the art.
  • multimodal systems can enable a user to provide multiple types of input to communicate with the computing device 1000 .
  • the communications interface 1040 can generally govern and manage the user input and system output. There is no restriction on operating on any particular hardware arrangement and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.
  • Storage device 1030 is a non-volatile memory and can be a hard disk or other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, solid state memory devices, digital versatile disks, cartridges, random access memories (RAMs) 1025 , read only memory (ROM) 1020 , and hybrids thereof.
  • RAMs random access memories
  • ROM read only memory
  • the storage device 1030 can include software modules 1032 , 1034 , 1036 for controlling the processor 1010 .
  • Other hardware or software modules are contemplated.
  • the storage device 1030 can be connected to the system bus 1005 .
  • a hardware module that performs a particular function can include the software component stored in a computer-readable medium in connection with the necessary hardware components, such as the processor 1010 , bus 1005 , display 1035 , and so forth, to carry out the function.
  • FIG. 10B illustrates a computer system 1050 having a chipset architecture that can be used in executing the described method and generating and displaying a graphical user interface (GUI).
  • Computer system 1050 is an example of computer hardware, software, and firmware that can be used to implement the disclosed technology.
  • System 1050 can include a processor 1055 , representative of any number of physically and/or logically distinct resources capable of executing software, firmware, and hardware configured to perform identified computations.
  • Processor 1055 can communicate with a chipset 1060 that can control input to and output from processor 1055 .
  • chipset 1060 outputs information to output 1065 , such as a display, and can read and write information to storage device 1070 , which can include magnetic media, and solid state media, for example.
  • Chipset 1060 can also read data from and write data to RAM 1075 .
  • a bridge 1080 for interfacing with a variety of user interface components 1085 can be provided for interfacing with chipset 1060 .
  • Such user interface components 1085 can include a keyboard, a microphone, touch detection and processing circuitry, a pointing device, such as a mouse, and so on.
  • inputs to system 1050 can come from any of a variety of sources, machine generated and/or human generated.
  • Chipset 1060 can also interface with one or more communication interfaces 1090 that can have different physical interfaces.
  • Such communication interfaces can include interfaces for wired and wireless local area networks, for broadband wireless networks, as well as personal area networks.
  • Some applications of the methods for generating, displaying, and using the GUI disclosed herein can include receiving ordered datasets over the physical interface or be generated by the machine itself by processor 1055 analyzing data stored in storage 1070 or 1075 . Further, the machine can receive inputs from a user via user interface components 1085 and execute appropriate functions, such as browsing functions by interpreting these inputs using processor 1055 .
  • exemplary systems 1000 and 1050 can have more than one processor 1010 or be part of a group or cluster of computing devices networked together to provide greater processing capability.
  • the present technology may be presented as including individual functional blocks including functional blocks comprising devices, device components, steps or routines in a method embodied in software, or combinations of hardware and software.
  • the computer-readable storage devices, mediums, and memories can include a cable or wireless signal containing a bit stream and the like.
  • non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
  • Such instructions can comprise, for example, instructions and data which cause or otherwise configure a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Portions of computer resources used can be accessible over a network.
  • the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, firmware, or source code. Examples of computer-readable media that may be used to store instructions, information used, and/or information created during methods according to described examples include magnetic or optical disks, flash memory, USB devices provided with non-volatile memory, networked storage devices, and so on.
  • Devices implementing methods according to these disclosures can comprise hardware, firmware and/or software, and can take any of a variety of form factors. Typical examples of such form factors include laptops, smart phones, small form factor personal computers, personal digital assistants, and so on. Functionality described herein also can be embodied in peripherals or add-in cards. Such functionality can also be implemented on a circuit board among different chips or different processes executing in a single device, by way of further example.
  • the instructions, media for conveying such instructions, computing resources for executing them, and other structures for supporting such computing resources are means for providing the functions described in these disclosures.


Abstract

Interactive blocks with sensors for collecting information about how tactile objects are interacted with, and with one or more communication interfaces for sending the interaction data to one or more other devices or platforms for analyzing the interactivity, providing feedback to a user, and assessing manual ability and dexterousness, cognitive ability, memory and retention, providing challenges based on past performance, learning curve, etc.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. provisional application No. 62/061,621, filed on Oct. 8, 2014, entitled “TACTILE INTELLIGENCE PLATFORM”, which is expressly incorporated by reference herein in its entirety.
  • TECHNICAL FIELD
  • The present technology pertains to interactive objects, and more specifically pertains to wirelessly connected interactive blocks connected to a learning platform through a host device.
  • BACKGROUND
  • Human interaction with blocks, toy bricks, and other objects is essentially ubiquitous across societies and cultures. This interaction is encouraged at a very young age to help develop motor skills, learn spatial relationships and basic physics, etc. Even after youth, tactile play is encouraged to entertain, develop further manual dexterity, express artistic and creative skills, develop logical and other cognitive acumen, etc.
  • Interactive objects having electronic components have been developed in an attempt to enhance the appeal of these objects. Also, some attempts to represent physical objects in virtual environments have been made. However, there are not robust solutions for collecting information about how tactile objects are interacted with, analyzing the interactivity, providing feedback to a user, and assessing manual ability and dexterousness, cognitive ability, memory and retention, providing challenges based on past performance, etc.
  • SUMMARY
  • Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be obvious from the description, or can be learned by practice of the herein disclosed principles. The features and advantages of the disclosure can be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features of the disclosure will become more fully apparent from the following description and appended claims, or can be learned by the practice of the principles set forth herein.
  • Some embodiments of the present technology involve a group of interactive blocks with sensors for collecting information about how tactile objects are interacted with and with one or more communication interfaces for sending the interaction data to one or more other devices or platforms for analyzing the interactivity, providing feedback to a user, and assessing manual ability and dexterousness, cognitive ability, memory and retention, providing challenges based on past performance, etc. The interactive blocks can connect to form a network of interactive blocks.
  • The interactive blocks can include one or more sensors, coupling mechanisms for connecting with neighboring blocks, a wireless communications interface for connecting with another electronic device, a communications interface for communicating with a neighboring block. The interactive blocks can also include a processor and a memory for storing a unique identifier and instructions for processing sensor data, neighbor data from the neighboring interactive block, and other state data. The processor can further cause the wireless communication interface to send the processed data to the electronic device. In some embodiments, the processor causes the wireless communications interface to send raw or partially processed data to the electronic device or other interactive blocks. A decision can be made regarding manner and degree of pre-processing the processor can apply to the sensor data based on timing, connections between blocks and the base station, and the capabilities of the sensor data source.
  • The interactive blocks can also include a light source, such as a field-programmable light-emitting diode (LED), and the blocks can receive a lighting instruction that causes the light source to light up. In some embodiments, the blocks have sides with translucent regions for allowing light from the light source therethrough.
  • The interactive blocks can also include a base station that receives interaction and sensor data and sends the data to an electronic device or network-based platform. The base station can also be configured with a wireless (e.g., inductive) or physical (e.g., conductive) charging interface, and the blocks can include a rechargeable power source and an electrical contact capable of communicating power level information and receiving power from the base station.
  • In some embodiments of the present technology, an electronic device is coupled with a group of interactive blocks and the electronic device displays a task relating to the group of interactive blocks that tests the user's mental acuity, dexterity, spatial awareness, creativity, etc. The electronic device can continuously receive state data for the collection of interactive blocks that describes how a user manipulates the interactive blocks relative to each other while attempting to complete the task.
  • Some embodiments of the present technology involve displaying, in near real-time, a dynamic virtual representation of the group of interactive blocks and determining, based on the received state information, task progress for the task. Based on the task progress, the electronic device, the blocks, or both can provide the user with a hint for completing the task. After detecting a task completion event, the electronic device can send task information to a tactile IQ platform and receive, in return, a task score and/or a recommendation for performing one or more further tasks. In some embodiments of the present technology, the task score can involve a comparison of how well the user completed the task compared to a cohort or peer group.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order to describe the manner in which the above-recited and other advantages and features of the disclosure can be obtained, a more particular description of the principles briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only exemplary embodiments of the disclosure and are not therefore to be considered to be limiting of its scope, the principles herein are described and explained with additional specificity and detail through the use of the accompanying drawings in which:
  • FIG. 1 illustrates an example of a group of interactive objects according to some embodiments of the present technology;
  • FIG. 2 illustrates a group of interactive objects in communication with a portable electronic device according to some embodiments of the present technology;
  • FIG. 3A illustrates an example of an interactive block according to some embodiments of the present technology;
  • FIG. 3B illustrates a close-up view of an interactive block according to some embodiments of the present technology;
  • FIGS. 4A-4C illustrate various examples of how interactive blocks can inter-couple according to some embodiments of the present technology;
  • FIG. 5 shows a partially exploded view of an interactive object showing some components that can be present within interactive object according to some embodiments of the present technology;
  • FIG. 6 shows a two-dimensional view of a face of carrier for an interactive object according to some embodiments of the present technology;
  • FIG. 7 illustrates a group of interactive blocks connected through a main base station according to some embodiments of the present technology;
  • FIG. 8A illustrates a group of interactive objects in communication with a portable electronic device according to some embodiments of the present technology;
  • FIG. 8B illustrates a network-based tactile IQ platform configured to monitor user interaction with interactive blocks, collect user task data, and synthesize feedback based on the task data for the user as well as task data from a community of users according to some embodiments of the present technology;
  • FIG. 9 illustrates a method of monitoring user interaction with a group of interactive blocks, sending interaction data to a tactile IQ platform, and receiving a task score from the tactile IQ platform according to some embodiments of the present technology; and
  • FIGS. 10A-10B illustrate exemplary possible system embodiments.
  • DESCRIPTION
  • Various embodiments of the disclosure are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without parting from the spirit and scope of the disclosure.
  • As used herein, the term “configured” shall be considered to interchangeably be used to refer to configured and configurable, unless the term “configurable” is explicitly used to distinguish from “configured”. The proper understanding of the term will be apparent to persons of ordinary skill in the art in the context in which the term is used.
  • The disclosed technology addresses the need in the art for robust solutions for collecting information about how tactile objects are interacted with, analyzing the interactivity, providing feedback to a user, assessing manual ability and dexterousness, cognitive ability, and memory and retention, providing challenges based on past performance, etc.
  • I. Introduction—Interactive Objects and Communication with a Host Device
  • FIG. 1 illustrates an example of a group of interactive objects 100 (100 a-100 g). The interactive objects 100 a-100 g can be physically coupled to multiple other interactive objects 100 a-100 g (represented by a chain link in FIG. 1) and, similarly, can be decoupled.
  • The coupling between interactive objects 100 a-100 g can be accomplished through a wide variety of means, for example, a mechanical latch, magnetic attraction, adhesive, ball and socket, tube and stud, fastener, etc. In some embodiments, the physical coupling is accomplished by one interactive object 100 having a male component and a complementary interactive object 100 having a female component; alternatively, both interactive objects 100 in a coupling can have symmetrical coupling features. In some embodiments, interactive object 100 can couple along multiple faces or features of interactive object 100. In some embodiments, the physical coupling is accomplished or assisted by gravity and/or friction.
  • Coupling multiple interactive objects 100 can allow the creation of various three dimensional shapes, figures, designs, and arrangements. For example, FIG. 1 crudely depicts a tree.
  • In some embodiments, interactive object 100 is a cube or cuboid with six faces. Alternatively, interactive object 100 can be any n-faced object that can be coupled with an m-faced object, where n and m may be identical or different. Although planar faces provide certain benefits such as retaining a degree of freedom when coupling, interactive object 100 can have at least some curved faces as well. In some embodiments, an interactive object of one configuration (e.g., the number or type of faces) can couple with an interactive object of a different configuration.
  • Additionally, each interactive object 100 contains a processor, a sensor, a communication interface, and internal circuitry, and the interactive objects 100 can be configured to communicate with each other. For example, interactive object 100 a can be in communication with interactive object 100 b (represented by the dashed double arrow). In some embodiments, the communication interface is wireless (e.g., sound, light, or other electromagnetic radiation such as Bluetooth, WiFi, NFC, RFID, etc.). In some embodiments, interactive objects 100 share a common bus. Additionally, in some embodiments, interactive object 100 can also provide and/or receive power through the communication interface, as explained in greater detail below.
  • In some embodiments, interactive objects 100 are only in communication with adjacent interactive objects 100 (e.g., those that are inter-coupled). Alternatively, as explained in greater detail below, interactive objects 100 can be in communication with nearby interactive objects 100, regardless of adjacency (e.g. in a mesh network). For example, interactive object 100 d can be in communication with interactive object 100 f.
  • As explained in greater detail below, interactive objects 100 a-100 g can be configured with sensors that can capture data relating to its individual orientation, its position relative to another object, whether it is connected to one or more other objects, etc. For example, the interactive objects 100 a-100 g can include one or more of an accelerometer, gyroscope, magnetometer, inertial measurement unit, etc.
  • Furthermore, each of the interactive objects 100 a-100 g can be configured with communication circuitry and can communicate information pertaining to its location (e.g., its placement in space, its relative position with respect to a coordinate system, or its position relative to other interactive objects), orientation, movement (e.g., translation or rotation), coupling status, nearby interactive objects, or current state. For example, in some embodiments, interactive object 100 can also communicate with a base station or other client device. Likewise, the communication circuitry can receive information from a base station or client device for controlling one or more aspects of the interactive object 100. For example, the interactive objects 100 a-100 g can contain light sources and a client device can send the interactive objects 100 a-100 g a control signal for controlling the light sources, as explained in greater detail below.
  • FIG. 2 illustrates a group of interactive objects 100 a-100 g in communication with a portable electronic device 200 (e.g., via a base station) according to some embodiments of the present technology. As explained below in greater detail, the portable electronic device 200 can communicate with a network of interactive objects through a base station. The network of interactive objects can be of various topologies (e.g., a mesh, bus, point-to-point, star, ring, tree, etc.). For example, each interactive object can independently connect to a base station and the base station can then connect to the portable electronic device 200.
  • In some embodiments, portable electronic device 200 communicates with only one interactive object 100 (which might, in turn, communicate with other interactive objects 100 and relay information to portable electronic device 200). Alternatively, portable electronic device 200 can communicate with multiple interactive objects 100. In some embodiments, the communication is wireless (e.g., sound, light, or other electromagnetic radiation such as Bluetooth, WiFi, NFC, RFID, etc.) or wired (e.g., a physical connection such as through a wire connected to interactive object 100 or by placing the interactive object on a conductive surface).
  • In some embodiments, communication between portable electronic device 200 and interactive object 100 can be bidirectional. For example, portable electronic device 200 can send commands to interactive object 100, e.g., to change the color of light(s) embedded in interactive object, activate a motor, play a sound, etc. Also, one or more of the interactive objects 100 a-100 g can communicate back to the portable electronic device 200. For example, an interactive object can send the portable electronic device 200 information describing the interactive object's 100 state, position, location, orientation, etc. Likewise, a first interactive object 100 can communicate information about another interactive object 100. Accordingly, the portable electronic device 200 can learn the relative orientation and position of each of the interactive objects 100 a-100 g.
  • The portable electronic device 200 can also execute one or more applications 201 configured for bidirectional communication and interaction with the interactive objects 100 a-100 g and for providing a user with feedback about the orientation of the interactive objects 100 a-100 g and about a task to be performed using the interactive objects 100 a-100 g, as explained in greater detail below.
  • For example, as shown in FIG. 2, the application 201 can control (e.g., turn on, off, blink, etc.) a light source (e.g., using a Boolean toggle or an instruction such as an RGB, HSL, sRGB, CMYK, or grayscale value, etc.) of the interactive objects 100 a-100 g, present a task to a user (e.g., assembling the interactive objects 100 a-100 g), and keep track of how well (e.g., how quickly) the user performs the task. As the interactive objects 100 a-100 g are manipulated by the user, the interactive objects 100 a-100 g send back information to the application regarding their configuration relative to each other. Once the interactive objects 100 a-100 g are configured according to the task, the application 201 can provide task feedback to the user. Also, as explained in greater detail below, the application 201 can be configured to communicate task performance details with a task platform over a network connection, and the task platform can process the task performance details, compare the task performance details with other users' data, send the application 201 comparison results, send the application 201 suggestions for further tasks, etc.
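The task-monitoring behavior described above can be sketched roughly as follows. This is a minimal illustration only; the names `read_block_states`, `run_task`, and the dictionary-based layout representation are assumptions for the sketch, not part of the disclosed technology:

```python
import time

def read_block_states(blocks):
    """Poll each block for its currently reported position (placeholder)."""
    return {b["id"]: b["position"] for b in blocks}

def run_task(blocks, target_layout, timeout_s=60.0, poll=lambda: None):
    """Time how long the user takes to arrange the blocks into `target_layout`."""
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        poll()  # in a real application this would service the radio link
        if read_block_states(blocks) == target_layout:
            return time.monotonic() - start  # task complete: report elapsed time
    return None  # the user did not complete the task within the time budget
```

An application could call `run_task` once per challenge and forward the elapsed time (or the timeout) to the task platform as part of the task performance details.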
  • The application 201 can also be configured to allow a user to perform “free play”. Free play can involve the application 201 tracking the location, orientation, connection status, etc. of a group of interactive objects 100 a-100 g and presenting a virtual model of the interactive objects 100 a-100 g on the portable electronic device 200. Likewise, the application 201 can provide user-selectable controls for controlling one or more aspects of the interactive objects, e.g., the color of the light source in an interactive object 100.
  • A wide variety of other applications can be used in conjunction with the interactive objects 100 a-100 g, some of which are explained in greater detail below and some of which will be apparent to those with ordinary skill in the art having the benefit of the present disclosure.
  • II. Interactive Object Details
  • FIG. 3A illustrates an example of an interactive block 300 according to some embodiments of the present technology. The interactive block 300 can have communications interface 301 for communicating with nearby interactive blocks or other devices. In some embodiments, communications interface 301 is centrally located near an edge of the interactive block 300. In some embodiments, there are communications interfaces 301 along each edge of each face of interactive block 300—making twenty-four communications interfaces 301 per interactive block 300. In some embodiments, communications interface 301 can be located in the centroid of a face or near a corner. In some embodiments, communications interface 301 is wireless and can be located anywhere within interactive block 300.
  • As shown in FIG. 3A, a coupling mechanism 302 can be used to physically connect two interactive blocks 300 as described above. In some embodiments, a coupling mechanism 302 is located at each corner of interactive block 300.
  • FIG. 3B illustrates a close-up view of an interactive block 300 according to some embodiments of the present technology. As shown, the communications interface 301 can include multiple contacts 304 a-304 e (collectively, “contact 304”). In some embodiments, contact 304 can be exposed so that a circuit is completed when the faces of two interactive blocks 300 come in contact. In some embodiments, contacts 304 are reversible such that contacts 304 can align with similarly-ordered contacts 304 or reverse-ordered contacts 304. For example, contacts 304 a-304 e of one interactive block 300 can connect with, respectively, contacts 304 a-304 e of another interactive block 300 or with, respectively, contacts 304 e-304 a of the other interactive block 300.
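The reversible pairing described above can be sketched as a simple index mapping: contact i on one block mates either with contact i or with contact (n−1−i) on the other block, depending on orientation. The function and identifier names below are illustrative assumptions:

```python
# Contact labels from FIG. 3B (five contacts per communications interface).
CONTACTS = ["304a", "304b", "304c", "304d", "304e"]

def mate(contacts, reversed_orientation):
    """Return the pairing of local contacts to a neighbor's contacts.

    With `reversed_orientation=True`, contact i pairs with contact n-1-i,
    modeling the reverse-ordered alignment of two reversible interfaces.
    """
    other = list(reversed(contacts)) if reversed_orientation else list(contacts)
    return list(zip(contacts, other))
```

Note that with an odd contact count, the middle contact (304c) pairs with itself in either orientation, and the outer pairs must carry electrically interchangeable signals for reversibility to work.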
  • In some embodiments, interactive block 300 can have a window 303 which can allow light to pass through. Window 303 can be transparent, semitransparent, or translucent. In some embodiments, window 303 can include an entire face of the interactive object; alternatively, window 303 can include only a portion of a face. In some embodiments, interactive block 300 can have a single window 303 or multiple window portions 303 (e.g., each face of interactive block 300 can have a window 303). In some embodiments, a single light source from within interactive object 100 can shine through multiple window portions 303. For example, the single light source can be a field-programmable light-emitting diode (“LED”). Alternatively, each window 303 can have a distinct light source. In some embodiments, window 303 can be a light affixed without any covering (e.g., an LED attached to the face of interactive block 300).
  • FIGS. 4A-4C illustrate various examples of how interactive blocks can inter-couple according to some embodiments of the present technology. For example, in FIG. 4A, interactive block 401 can connect with interactive block 402 such that the facing sides of the interactive blocks 401, 402 are completely aligned. In such a configuration, multiple coupling mechanisms 302 can couple the two interactive objects (e.g., four magnet pairs) and multiple communications interfaces 301 can interconnect the two interactive objects (e.g., four communications interface 301 pairs). Alternatively, the interactive blocks can connect using only one communications interface 301 pair. As shown in FIG. 4B, one interactive block 411 can connect with another interactive block 412 on an offset, with only half of a face being in contact. In some embodiments, this configuration can utilize only two pairs of coupling mechanisms 302 and one pair of communications interfaces 301. In some embodiments, communications interface 301 can be reversible; this can permit interactive blocks to connect as shown in FIG. 4A and FIG. 4B while maintaining communication. FIG. 4C shows another example configuration of two interactive blocks. For example, interactive block 421 can be connected with interactive block 422 at an offset with only a quarter of a face being in contact. In some embodiments, an interactive block is configured such that it can physically couple via coupling mechanism 302 even if communications interface 301 is incapable of establishing a connection between two interactive blocks. Using only the connection options provided in FIGS. 4A-4C, there are 6²×4 (FIG. 4A)+24² (FIG. 4B)+24² (FIG. 4C)=1,296 connection possibilities using just two blocks. Using more blocks can exponentially increase the connection possibilities.
  • FIG. 5 shows a partially exploded view of an interactive object showing some components that can be present within the interactive object. One or more light sources 502 can be various light-emitting sources such as LEDs, light bulbs, electroluminescent wire, etc. A carrier 503 can fit within the faces of interactive object 100. The carrier 503 can be made of a stiff material such as foam or plastic. In some embodiments, the carrier 503 as shown in FIG. 5 is only one side of a multi-sided carrier 503. For example, the two carrier 503 portions in FIG. 5 can be attached; alternatively, three or more portions can be combined to form a cuboid. The carrier 503 can provide structure for interactive object 100 and hold components in place, such as communications interface 301, coupling mechanism 302, lights 502, etc. In some embodiments, carrier 503 has a hollow portion.
  • FIG. 6 shows a two-dimensional view of a face of the carrier 503. The carrier 503 can contain a communications interface (not shown) as well as contacts 304 a-304 e. In some embodiments, carrier 503 is or contains a printed circuit board with traces supplying power and connectivity to different components. As described previously, communications interface 301 can be reversible; thus, in some embodiments, contacts can be symmetrical. For example, contact 304 a can be connected to contact 304 e. In some embodiments, two contacts 304 can be programmatically interchangeable; for example, a processor can detect that contact 304 b should be configured for one role (e.g., output) and another contact 304 d should be configured for another role (e.g., input). The carrier 503 can facilitate various communications standards such as TWI, I2C, SPI, etc. for internal communication and communication with other blocks. In some embodiments, a block can communicate information such as a universally unique identifier (UUID). The carrier 503 can contain through-holes to interconnect with components internal to the carrier. The carrier 503 can have cutouts 601 a-601 d to house coupling mechanisms 302 (e.g., a spherical magnet). Carrier 503 can also have an aperture 602 so that light from lights 502 can pass through to window 303. In some embodiments, trace 608 and trace 610 provide one of positive or ground voltage to contacts 304 and other components. In some embodiments, one interactive object 100 can impart power through contacts 304 to another interactive object.
  • The interactive blocks are also configured with one or more microcontroller units (MCUs) connected to the PCB and traces and coupled with a power supply. The MCU is also connected to one or more memory devices storing a unique identification number. In some embodiments, the interactive blocks are connected in a network using a base station. Alternatively, the interactive blocks can be connected using other network topologies.
  • FIG. 7 illustrates a group of interactive blocks 700 connected through a main base station 750 according to some embodiments of the present technology. The main base station 750 includes a power source 752 with plug cord 753, a processor 754 and a memory 756 storing a main program for connecting the blocks in a network, registering captured data, managing power and charging for the blocks, etc. The main base station 750 can also include a communication interface 760 for connecting to a portable electronic device and/or a cloud-based platform for tracking block data and task data (explained in greater detail below).
  • The group of interactive blocks 700 can be connected together in a network. The unique identification number of each of the blocks allows the base station to know which services are available, and the base station can capture the data captured by an internal accelerometer, gyroscope, magnetometer, and/or inertial measurement unit (IMU). The communication interface 760 can also communicate with a display, a portable electronic device, an application in a cloud-based platform, etc.
  • Using this captured data, the base station 750 can track how the blocks 700 are manipulated, and the base station (or a connected portable device, an application of a cloud-based platform, etc.) can provide feedback to a user, e.g., by creating a virtual image of what is being built, providing task completion data, etc., and displaying the virtual image.
  • In some embodiments, each block effectively works as a node in the network, allowing several other blocks to connect to the network through it, thereby allowing the number of attached blocks to a single network to grow to the limits of whichever protocol is the basis for the network.
  • Additionally, in some embodiments of the present technology, the base station 750 is configured for physical, inductive, or other wireless charging of the blocks 700. In these embodiments, each interactive block is embedded with a battery and charging logic, and distributed simultaneous charging can be achieved through either a physical, inductive, or other wireless connection, ensuring simultaneous or consecutive recharging by the base station 750 without any form of disassembly or introduction of a cable to the object. The base station 750 will send a current up to a first layer of blocks. Each of the blocks 700 will manage its own battery status and alert the base station 750 if it needs power. The base station 750 sends a power supply out through multiple channels to ensure the right amount of power for multiple blocks at the same time. During charging, the embedded circuit of a single block will direct any input of power to the battery charger. When the battery is fully charged, the circuit will lock the input to the battery in order to let the other connected blocks in the stack have more charging power.
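The charging behavior described above can be modeled with a small sketch, under the assumption that each block exposes a battery level and an input lock; the class and function names are illustrative, not from the disclosure:

```python
FULL = 100  # battery percentage considered fully charged

class Block:
    def __init__(self, level):
        self.level = level          # current battery percentage
        self.input_locked = False   # locked once fully charged

    def needs_power(self):
        """A block alerts the base station while below full charge."""
        return self.level < FULL

    def accept_charge(self, amount):
        if self.input_locked:
            return  # full: let downstream blocks in the stack take the power
        self.level = min(FULL, self.level + amount)
        if self.level >= FULL:
            self.input_locked = True  # lock the input to the battery

def charge_cycle(blocks, amount=10):
    """One base-station charging round: power only blocks that alerted."""
    for block in blocks:
        if block.needs_power():
            block.accept_charge(amount)
```

Repeated calls to `charge_cycle` then mimic consecutive recharging rounds in which fully charged blocks drop out of the charging set.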
  • In some embodiments, battery life is prolonged through power-down actions, prompted by the lack of activity in the accelerometers/motion sensors or in the connections of the blocks. This initiates an interrupt in a main program stored in memory of the base station 750 and executed by the blocks' MCUs, thereby issuing a power down of the main system components. After a power down, the blocks 700 exist in a standby mode with next to no power drain. When activity is registered by the accelerometer or external connections in standby mode, an interrupt activates the main program and signs the object back into the system.
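The standby logic above amounts to a small state machine: consecutive idle polls push the block into standby, and any activity wakes it. The idle threshold and the state encoding below are assumptions for the sake of the sketch:

```python
def update_power_state(state, activity, idle_limit=3):
    """Advance the (mode, idle_count) power state of a block by one poll.

    `activity` is True when the motion sensor or a connection reports
    an event; the idle_limit of 3 polls is an assumed threshold.
    """
    mode, idle = state
    if activity:
        return ("active", 0)        # interrupt wakes the main program
    idle += 1
    if idle >= idle_limit:
        return ("standby", idle)    # power down main system components
    return (mode, idle)
```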
  • In some embodiments, battery life cycles are tracked through the base station and compared across all batteries. In order to optimize life cycles, the blocks can be discharged and then recharged during downtime so as to preserve maximum long-term capacity for current lithium-ion battery technology.
  • III. Tracking Object Interaction, Tactile Intelligence, and Cloud-Based Platform
  • As explained above, some embodiments of the present technology involve interactive objects communicating information pertaining to their location, orientation, movement (e.g., translation or rotation), coupling status, nearby interactive objects, current lighting status, etc. to a host device. A host device can be a base station, a phone, tablet, other portable electronic device, laptop computer, dedicated machine, gateway, router, etc. The interactive objects can communicate the information through a base station or an integrated communication module (e.g., Bluetooth).
  • FIG. 8A illustrates a group of interactive objects 850 in communication with a portable electronic device 800 according to some embodiments of the present technology. In some embodiments, portable electronic device 800 directly connects with only one interactive object (e.g., 850 3), which can route communications to other interactive objects 850. Alternatively, portable electronic device 800 can connect with multiple interactive objects 850. In some embodiments, portable electronic device 800 connects with all available interactive objects 850. In some embodiments, portable electronic device 800 only connects to interactive objects 850 that satisfy certain criteria, such as the closest one(s) (e.g., interactive objects 850 3 and 850 7), which then relay communications with other interactive objects 850. In some embodiments, a limited number of interactive objects 850 have the capability (e.g., a Bluetooth module) of communicating with portable electronic device 800. In some embodiments, communication between portable electronic device 800 and interactive objects 850 is accomplished via base station 803. In some embodiments, base station 803 receives data from one entity (e.g., interactive object 850 or portable electronic device 800) and processes it, relaying only the processed data to another entity.
  • Communication between interactive objects 850 can be accomplished using a communications interface (e.g., communications interface 301 in FIG. 3A above). In some embodiments, interactive objects 850 share data with each neighboring interactive object 850. For example, interactive object 850 1 can share data with interactive objects 850 2, 850 4, and 850 5. In some embodiments, interactive objects 850 use the data they receive to gain situational awareness. For example, if interactive object 850 5 communicates its orientation to interactive object 850 6, interactive object 850 6 can determine its own orientation even if it otherwise lacks a module to help determine its orientation.
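The neighbor-assisted orientation idea above can be sketched in a simplified form: a block without its own orientation sensor combines a neighbor's reported heading with the known geometry of the shared face. The face names, the quarter-turn table, and the single-axis simplification are all illustrative assumptions:

```python
# Assumed mapping from the neighbor face we are attached to onto a
# relative rotation, simplified here to quarter-turns about one axis.
FACE_ROTATION = {"north": 0, "east": 90, "south": 180, "west": 270}

def infer_orientation(neighbor_heading, attached_face):
    """Derive this block's heading (degrees) from a neighbor's heading
    and the face through which the two blocks are coupled."""
    return (neighbor_heading + FACE_ROTATION[attached_face]) % 360
```

In a full implementation this would use three-dimensional rotations, but the principle is the same: orientation propagates hop by hop through the coupled network.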
  • The data shared between interactive objects 850 and portable electronic device 800 can include sensor data, relative position data, status data (e.g., the current state of lights), and data relayed from other interactive objects 850. Using this data, the portable electronic device 800 can determine the overall shape and configuration of multiple interactive objects 850 that are connected in the network. For example, the data can describe that the interactive objects 850 are forming the shape of a house and that interactive objects 850 with lights of specific colors are in various positions of the house shape. In some embodiments, multiple connected interactive objects 850 can act as one unit.
  • As explained above, the present technology can address the need in the art for robust solutions for collecting information about how tactile objects are interacted with, analyzing the interactivity, providing feedback to a user, assessing manual ability and dexterousness, cognitive ability, creativity, and memory and retention, and providing challenges based on past performance.
  • For example, the interactive objects can be used to perform tasks that test a user's mental acuity, spatial awareness, creativity, etc. As shown in FIG. 8A, a user can be presented a task through an application 801 on the portable electronic device 800 and the user's performance can be monitored by the application 801. For example, the application 801 can present the user with a challenge and begin a timer that measures the amount of time the user has taken to complete the challenge. The application 801 can monitor the configuration of the interactive objects 850 and, when the interactive objects are in the appropriate configuration to complete the challenge, the application 801 can determine that the challenge is complete and can stop the timer.
  • Sensor data from the interactive objects 850 can describe the user's attempt at completing the task. This task data collected throughout the attempt can provide an indication as to how effective and efficient the user was at completing (or partially completing) the task. The task data can include a history of how the interactive objects are manipulated, connected, disconnected, reconnected, etc. over the duration of the test. Also, the test data can include information about the extent to which clues are provided to the user—either through the application 801 or through the interactive objects 850 themselves (e.g., via LEDs).
  • In some embodiments of the present technology, the application 801 is configured to provide a wide variety of tasks for observing, testing, analyzing, etc. a wide variety of mental abilities, tactile and dexterity abilities, memory abilities, etc. Example tasks (e.g., tests or challenges) include puzzles, memory tests, retention tests, tap tests, fine motor tests, dexterity tests, logical thinking tests, critical thinking tests, and other cognitive assessment tests. In some embodiments, tasks are similar to standardized and accredited tests.
  • In some embodiments, a task might include having the user create structures such as a representation of a human out of interactive objects. Some tasks can include each face of the interactive objects being illuminated by one or more colors, and a certain cue (e.g., a blinking face of a certain color) can indicate that the user should connect that face with a similarly colored face on another interactive object. Some tasks can include noticing faces that have illuminated in sequence and reproducing the sequence by tapping those faces in the correct order. Some tasks can include building a structure using interactive objects 850 based on a picture displayed in an application. In some embodiments, such a structure can require that specific interactive objects are placed in specific places with designated orientations, as indicated by colors shown on their faces.
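The illuminate-then-reproduce task above can be sketched as a Simon-style game loop. The face identifiers, the scoring rule, and the fixed random seed are illustrative assumptions:

```python
import random

def make_sequence(faces, length, rng=None):
    """Generate the sequence of faces to blink (seeded here for determinism)."""
    rng = rng or random.Random(0)
    return [rng.choice(faces) for _ in range(length)]

def check_reproduction(sequence, taps):
    """Score the user's attempt: number of correct taps before the first
    mistake, plus whether the whole sequence was reproduced."""
    correct = 0
    for expected, tapped in zip(sequence, taps):
        if expected != tapped:
            break
        correct += 1
    return correct, correct == len(sequence)
```

The per-attempt score returned here is the kind of task datum that could be forwarded to the platform along with timing information.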
  • Some tasks can utilize interactive features of the application in combination with the interactive objects to prompt a user to complete two tasks (one using an interactive object and another using the electronic device) simultaneously, thus causing the user to divide their attention between two tasks. For example, a user might be asked to build a structure with interactive objects while memorizing a sentence played through a speaker of an electronic device by the user application. Some tasks can include rebuilding a structure out of interactive objects from memory. For example, the user can be asked to tell a story one day using the interactive objects, and the next day the user will be asked to repeat the story while building the same structure. Some tasks can include having a user design a structure with the interactive objects; another person (e.g., a parent or observer) can then attempt to guess what the structure represents (e.g., a chair).
  • In some embodiments, user application provides a game-like environment in which the user can be presented with various challenges, games, and tasks. These challenges, games, and tasks can entertain, train, and monitor the user.
  • In some embodiments of the present technology, individual task data can be summarized individually and multiple task data summaries can be aggregated to form scores describing the user's “tactile IQ.” A tactile IQ can be a measure of metrics including speed, dexterity, short term memory, long term memory, working memory, critical thinking, creativity, spatial cognition, adaption, problem solving, concentration, etc.
  • Also, after an application administering a task collects data from interactive objects, it can send raw or synthesized data to another platform for analysis. The platform can process the task data and compare a user with the user's peers or other members. A user's peer group can consist of users of the same age, gender, location, etc.
  • FIG. 8B illustrates a network-based tactile IQ platform 802 configured to monitor user interaction with interactive blocks 860, collect user task data, and synthesize feedback based on the task data for the user as well as task data from a community of users.
  • A user can be administered a task relating to a group of interactive blocks 860 1, 860 2, . . . 860 n through a portable electronic device 814 running an application 816. The interactive blocks 860 can communicate data regarding their orientation, position, coupling status etc. to the application 816—either directly via a communication link or through a base station. The application 816 can process and package the task data and send it to the platform 802 through a network 899 for analysis.
  • The platform 802 can include a processor 806, memory 808, and a communication interface 810. The platform 802 can receive task data from the portable electronic device 814 through the communication interface 810, process the data, compare the processed data to other users' data that are stored in the memory 808, and calculate a result in the form of a pass/fail, score, tactile IQ, etc. The platform 802 can then send the results back to the portable electronic device 814 for display in the application 816.
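The peer-comparison step described above can be sketched as a simple percentile ranking of a completion time against stored peer times. The function name and the lower-is-better scoring convention are assumptions for illustration:

```python
def percentile_rank(user_time, peer_times):
    """Percent of peers the user outperformed (lower completion time wins).

    Returns None when no peer data is stored yet for this peer group.
    """
    if not peer_times:
        return None
    beaten = sum(1 for t in peer_times if user_time < t)
    return 100.0 * beaten / len(peer_times)
```

A platform could compute one such rank per metric (speed, accuracy, retention, etc.) and aggregate them into the composite score sent back to the application.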
  • Task results can provide an overview of the performance of the user and can provide suggestions to enhance the user's performance. For example, task results can provide suggestions that are based on general knowledge and data provided by other users using similar interactive objects. The platform 802 can also suggest tasks that should be performed by the user to most effectively enhance their tactile IQ. In some embodiments, platform 802 can test the ability of certain tasks to enhance a user's capabilities.
  • In some embodiments, the platform 802 first emulates existing standardized and accredited tests in the form of games and challenges and, as the data of the platform grows, and the density of the data along with it, new tests will emerge that better and more correctly measure the full range of metrics in a more natural, playful setting, in whichever environment the user is.
  • The platform 802 allows a growing amount of generated data to be streamed and stored to a cloud or remote computing facility for statistical analysis. The result of this computation is presented back to the user in the backend application, where decisions on further games and challenges can be made based on the objective input from the measurements. Finally, the next games a user will be presented with are a result of the games played specifically by that one user, with the results analyzed, compared to the nearest peers, and optimized for the best learning outcome.
  • Additionally, when a group of users shares a common history and common games, activities, and results, the system can combine these users into a cohort. Since these users with high probability share learning patterns, the system can automatically optimize and experiment to find the optimal next game to present a user with, through A/B testing on the whole cohort. The objective of this Self-Improving Adaptive Learning and Cognitive Assessment Feedback Loop is to tighten the scores for the whole cohort. A user can at any point in time be associated with a new cohort; as patterns and histories of users evolve, new trends will emerge, and as such, the system shall at all times try to match a user with his or her nearest peers. A cohort thus becomes an ever-changing entity of fluctuating sample size, accommodating the changing nature of the mind.
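The cohort A/B step above can be sketched as a minimal variant selection: split the cohort, show each half a candidate next game, and keep the variant with the better mean score. The mean-score criterion is an assumed stand-in for whatever statistic the platform would actually use:

```python
def ab_select(cohort_scores_a, cohort_scores_b):
    """Pick the game variant whose cohort half achieved the higher mean score.

    Ties go to variant A; a real system would also check statistical
    significance before committing to a variant.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return "A" if mean(cohort_scores_a) >= mean(cohort_scores_b) else "B"
```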
  • FIG. 9 illustrates a method 900 of monitoring user interaction with a group of interactive blocks, sending interaction data to a tactile IQ platform, and receiving a task score from the tactile IQ platform according to some embodiments of the present technology.
  • The method 900 involves connecting an electronic device to a group of interactive blocks 905. In some embodiments, connecting an electronic device to a group of interactive blocks 905 can involve connecting the electronic device to a base station that is itself connected to the blocks. In some embodiments, the electronic device connects to a subset of a group of interactive blocks and the blocks in the subset connect the electronic device to the rest of the group of interactive blocks.
  • Next, the method 900 involves the electronic device displaying tasks related to manipulating the group of interactive objects 910. As explained above, a wide variety of tasks can be used to test a user's mental acuity, dexterity, spatial awareness, creativity, etc. After receiving a user selection of a task 912, the method 900 involves continuously receiving state information from the group of interactive blocks 915. The state information describes the orientation of the blocks, rotations and movements of each of the blocks, the position of the blocks with respect to other blocks, the coupling status of each of the blocks, the lighting status of the blocks, etc.
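The state information enumerated above can be modeled as a simple per-block record. The field names and types below are illustrative assumptions for discussion, not the patent's data format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class BlockState:
    """Snapshot of one interactive block, mirroring the state
    information described for method 900 (field names assumed)."""
    block_id: int
    orientation: Tuple[float, float, float]   # e.g. roll, pitch, yaw in degrees
    position: Tuple[int, int, int]             # grid position relative to a reference block
    coupled_to: List[int] = field(default_factory=list)  # ids of attached neighbors
    light_on: bool = False                     # lighting status

    def is_coupled(self) -> bool:
        """Coupling status: True when attached to at least one neighbor."""
        return bool(self.coupled_to)
```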
  • The method 900 also involves displaying a dynamic virtual representation of the group of interactive blocks 920 in substantially real-time and based on the received state information. Also, the method 900 involves determining, based on the received state information, task progress for the task 925. For example, if the task involves building a structure within a time period, determining task progress can involve determining what percent of the structure has been correctly completed at a given time.
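For the structure-building example, progress can be computed as the fraction of target placements currently satisfied. The representation here (a mapping from block id to grid position) is an assumption made for illustration.

```python
def task_progress(current, target):
    """Percent of the target structure correctly completed: the share
    of blocks whose current grid position matches the target layout.

    current, target: dict mapping block_id -> (x, y, z) grid position.
    """
    if not target:
        return 100.0
    correct = sum(1 for block_id, pos in target.items()
                  if current.get(block_id) == pos)
    return 100.0 * correct / len(target)
```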
  • In some embodiments, the method 900 can involve determining, based on the task progress, whether to provide the user with a hint for completing the task 930. For example, providing a hint can comprise sending a lighting signal to one or more of the interactive blocks that causes the interactive blocks to light up. In another example, providing a hint can involve displaying a textual or graphical hint on the display of the electronic device.
  • Next, the method 900 involves detecting a task completion event 935. For example, a task completion event can include an actual completion of a task. On the other hand, a task completion event can involve the task ending by nature of the task not being completed according to one or more requirements, e.g. within a certain time period, with a requisite accuracy, in a more efficient manner than a previous attempt, etc.
  • The method 900 next involves sending task information to a tactile IQ platform 940 and receiving, from the tactile IQ platform, a task score and a recommendation for performing one or more further tasks 945. Finally, the method 900 can involve displaying the task score 950 on the electronic device and/or recommending further tasks 955 through the electronic device.
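Steps 925 through 955 can be sketched as a simple control loop over incoming state snapshots. The hint threshold, the timeout-based completion rule, and the `score_fn` callback (standing in for the round trip to the tactile IQ platform) are assumptions for illustration only.

```python
HINT_THRESHOLD = 40.0   # assumed percent-complete floor below which a hint is offered

def run_task(states, target, deadline, score_fn):
    """Walk a sequence of (time, current_layout) snapshots, deciding
    when to hint (step 930) and when the task completes (step 935).

    states: iterable of (time, {block_id: (x, y, z)}) snapshots.
    target: {block_id: (x, y, z)} layout defining the finished structure.
    Returns (score, hints) once complete or timed out, else (None, hints).
    """
    hints = 0
    for t, current in states:
        progress = 100.0 * sum(
            1 for b, p in target.items() if current.get(b) == p) / len(target)
        if progress >= 100.0:
            return score_fn(progress, t), hints  # actual completion
        if t >= deadline:
            return score_fn(progress, t), hints  # task ends unfinished (step 935)
        if progress < HINT_THRESHOLD:
            hints += 1  # step 930: e.g., send a lighting signal to a block
    return None, hints  # task still in progress
```

In the full system, `score_fn` would be replaced by sending the task information to the platform (step 940) and receiving back a score and recommendation (step 945).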
  • FIG. 10A and FIG. 10B illustrate exemplary possible system embodiments. The more appropriate embodiment will be apparent to those of ordinary skill in the art when practicing the present technology. Persons of ordinary skill in the art will also readily appreciate that other system embodiments are possible.
  • FIG. 10A illustrates a conventional system bus computing system architecture 1000 wherein the components of the system are in electrical communication with each other using a bus 1005. Exemplary system 1000 includes a processing unit (CPU or processor) 1010 and a system bus 1005 that couples various system components including the system memory 1015, such as read only memory (ROM) 1020 and random access memory (RAM) 1025, to the processor 1010. The system 1000 can include a cache 1012 of high-speed memory connected directly with, in close proximity to, or integrated as part of the processor 1010. The system 1000 can copy data from the memory 1015 and/or the storage device 1030 to the cache 1012 for quick access by the processor 1010. In this way, the cache 1012 can provide a performance boost that avoids processor 1010 delays while waiting for data. These and other modules can control or be configured to control the processor 1010 to perform various actions. Other system memory 1015 may be available for use as well. The memory 1015 can include multiple different types of memory with different performance characteristics. The processor 1010 can include any general purpose processor and a hardware module or software module, such as module 1 1032, module 2 1034, and module 3 1036 stored in storage device 1030, configured to control the processor 1010 as well as a special-purpose processor where software instructions are incorporated into the actual processor design. The processor 1010 may essentially be a completely self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc. A multi-core processor may be symmetric or asymmetric.
  • To enable user interaction with the computing device 1000, an input device 1045 can represent any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, keyboard, mouse, motion input, and so forth. An output device 1035 can also be one or more of a number of output mechanisms known to those of skill in the art. In some instances, multimodal systems can enable a user to provide multiple types of input to communicate with the computing device 1000. The communications interface 1040 can generally govern and manage the user input and system output. There is no restriction on operating on any particular hardware arrangement and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.
  • Storage device 1030 is a non-volatile memory and can be a hard disk or other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, solid state memory devices, digital versatile disks, cartridges, random access memories (RAMs) 1025, read only memory (ROM) 1020, and hybrids thereof.
  • The storage device 1030 can include software modules 1032, 1034, 1036 for controlling the processor 1010. Other hardware or software modules are contemplated. The storage device 1030 can be connected to the system bus 1005. In one aspect, a hardware module that performs a particular function can include the software component stored in a computer-readable medium in connection with the necessary hardware components, such as the processor 1010, bus 1005, display 1035, and so forth, to carry out the function.
  • FIG. 10B illustrates a computer system 1050 having a chipset architecture that can be used in executing the described method and generating and displaying a graphical user interface (GUI). Computer system 1050 is an example of computer hardware, software, and firmware that can be used to implement the disclosed technology. System 1050 can include a processor 1055, representative of any number of physically and/or logically distinct resources capable of executing software, firmware, and hardware configured to perform identified computations. Processor 1055 can communicate with a chipset 1060 that can control input to and output from processor 1055. In this example, chipset 1060 outputs information to output 1065, such as a display, and can read and write information to storage device 1070, which can include magnetic media, and solid state media, for example. Chipset 1060 can also read data from and write data to RAM 1075. A bridge 1080 for interfacing with a variety of user interface components 1085 can be provided for interfacing with chipset 1060. Such user interface components 1085 can include a keyboard, a microphone, touch detection and processing circuitry, a pointing device, such as a mouse, and so on. In general, inputs to system 1050 can come from any of a variety of sources, machine generated and/or human generated.
  • Chipset 1060 can also interface with one or more communication interfaces 1090 that can have different physical interfaces. Such communication interfaces can include interfaces for wired and wireless local area networks, for broadband wireless networks, as well as personal area networks. Some applications of the methods for generating, displaying, and using the GUI disclosed herein can include receiving ordered datasets over the physical interface or be generated by the machine itself by processor 1055 analyzing data stored in storage 1070 or 1075. Further, the machine can receive inputs from a user via user interface components 1085 and execute appropriate functions, such as browsing functions by interpreting these inputs using processor 1055.
  • It can be appreciated that exemplary systems 1000 and 1050 can have more than one processor 1010 or be part of a group or cluster of computing devices networked together to provide greater processing capability.
  • For clarity of explanation, in some instances the present technology may be presented as including individual functional blocks including functional blocks comprising devices, device components, steps or routines in a method embodied in software, or combinations of hardware and software.
  • In some embodiments the computer-readable storage devices, mediums, and memories can include a cable or wireless signal containing a bit stream and the like. However, when mentioned, non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
  • Methods according to the above-described examples can be implemented using computer-executable instructions that are stored or otherwise available from computer readable media. Such instructions can comprise, for example, instructions and data which cause or otherwise configure a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Portions of computer resources used can be accessible over a network. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, firmware, or source code. Examples of computer-readable media that may be used to store instructions, information used, and/or information created during methods according to described examples include magnetic or optical disks, flash memory, USB devices provided with non-volatile memory, networked storage devices, and so on.
  • Devices implementing methods according to these disclosures can comprise hardware, firmware and/or software, and can take any of a variety of form factors. Typical examples of such form factors include laptops, smart phones, small form factor personal computers, personal digital assistants, and so on. Functionality described herein also can be embodied in peripherals or add-in cards. Such functionality can also be implemented on a circuit board among different chips or different processes executing in a single device, by way of further example.
  • The instructions, media for conveying such instructions, computing resources for executing them, and other structures for supporting such computing resources are means for providing the functions described in these disclosures.
  • Although a variety of examples and other information was used to explain aspects within the scope of the appended claims, no limitation of the claims should be implied based on particular features or arrangements in such examples, as one of ordinary skill would be able to use these examples to derive a wide variety of implementations. Further and although some subject matter may have been described in language specific to examples of structural features and/or method steps, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to these described features or acts. For example, such functionality can be distributed differently or performed in components other than those identified herein. Rather, the described features and steps are disclosed as examples of components of systems and methods within the scope of the appended claims.

Claims (19)

1. A first interactive block comprising:
a sensor configured to determine an orientation of the first interactive block;
connectors configured to couple the first interactive block with a neighboring interactive block;
a wireless communications interface;
a neighboring interactive block communications interface; and
a computer-readable medium storing non-transitory computer-readable instructions, that when executed are effective to cause the first interactive block to:
receive, using the neighboring interactive block communications interface, neighbor data from the neighboring interactive block;
determine, based at least on the neighbor data and the orientation of the first interactive block, state data for the first interactive block and the neighboring interactive block;
send, through the wireless communication interface, the state data to an electronic device.
2. The interactive block of claim 1, further comprising:
a light source configured to turn ON responsive to a lighting instruction received via the wireless communications interface.
3. The interactive block of claim 2, wherein the light source is a multi-colored light source and the lighting instruction defines a predefined wavelength for the light source.
4. The interactive block of claim 2, wherein the block has a substantially cube shape with faces, at least some of the faces being translucent through which light from the light source can penetrate.
5. The interactive block of claim 2, wherein the computer-readable instructions are further effective to cause the block to:
send, using the neighboring interactive block communications interface and in response to the received lighting instruction, an instruction indicating that the neighboring interactive block should illuminate a light source in the neighboring block.
6. The interactive block of claim 1, further comprising an electronic contact that can communicate power level information and receive power from the neighboring interactive block.
7. The interactive block of claim 1, wherein the electronic device comprises a base station for processing relative position data and for charging the interactive block and the neighboring interactive block, and wherein an electronic contact on the interactive block can communicate power level information and receive power from the base station.
8. The interactive block of claim 1, wherein the computer-readable instructions are further effective to cause the interactive block to:
connect with a plurality of interactive blocks to form a network of interactive blocks; and
determine whether any of the other interactive blocks in the network of interactive blocks provides a more efficient connection to the electronic device.
9. The interactive block of claim 1, wherein the connectors comprise a magnet in each corner of the interactive block.
10. A non-transitory computer-readable medium storing instructions that, when executed by one or more computer processors of a computing device, cause the computing device to:
display a task relating to a collection of interactive blocks wirelessly connected to the computing device;
receive state data for the collection of interactive blocks, wherein the state data describes how a user manipulates the interactive blocks relative to each other while attempting to complete the task;
determine task progress for the task based on the state data;
send the task progress to a network-based platform that scores tasks for a distributed user base; and
receive a task score from the network-based platform that describes how well the user performed the task compared to other members of the user base that were presented the same task.
11. The non-transitory computer-readable medium of claim 10, wherein the interactive blocks comprise:
a sensor configured to determine an orientation of the interactive block;
connectors configured to couple the interactive block with a neighboring interactive block;
a wireless communications interface;
a neighboring interactive block communications interface; and
a memory device storing non-transitory computer-readable instructions, that when executed are effective to cause the interactive block to:
receive, using the neighboring interactive block communications interface, neighbor data from the neighboring interactive block;
determine, based at least on the neighbor data and the orientation of the interactive block, state data for the interactive block and the neighboring interactive block;
send, through the wireless communication interface, the state data to the computing device.
12. The non-transitory computer-readable medium of claim 11, wherein the interactive blocks further comprise a light source configured to turn ON responsive to a lighting instruction received via the wireless communications interface.
13. The non-transitory computer-readable medium of claim 12, wherein the light source is a multi-colored light source and the lighting instruction defines a predefined wavelength for the light source.
14. The non-transitory computer-readable medium of claim 12, wherein the interactive blocks have a substantially cube shape with faces, at least some of the faces being translucent through which light from the light source can penetrate.
15. The non-transitory computer-readable medium of claim 12, wherein the lighting instruction is further effective to instruct the interactive block to pass the lighting instruction to a neighboring interactive block for illuminating a light source in the neighboring block.
16. The non-transitory computer-readable medium of claim 10, wherein the instructions further cause the computing device to:
continuously receive state data for the collection of interactive blocks while the user attempts to complete the task; and
display a dynamic virtual representation of the collection of interactive blocks on a screen of the computing device.
17. The non-transitory computer-readable medium of claim 10, wherein the instructions further cause the computing device to:
determine that the task progress is below a predetermined progress threshold; and
provide a hint for the task.
18. The non-transitory computer-readable medium of claim 17, wherein the instructions further cause the computing device to provide the hint for the task through one or more of the computing device and a light source in an interactive block in the collection of interactive blocks.
19. The non-transitory computer-readable medium of claim 10, wherein the instructions further cause the computing device to:
receive, along with the task score and based on the task score, a recommendation for performing at least one additional task.
US16/266,962 2014-10-08 2019-02-04 Interactive learning blocks Abandoned US20190168128A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/266,962 US20190168128A1 (en) 2014-10-08 2019-02-04 Interactive learning blocks

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201462061621P 2014-10-08 2014-10-08
US14/877,596 US10195538B2 (en) 2014-10-08 2015-10-07 Interactive learning blocks
US16/266,962 US20190168128A1 (en) 2014-10-08 2019-02-04 Interactive learning blocks

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/877,596 Continuation US10195538B2 (en) 2014-10-08 2015-10-07 Interactive learning blocks

Publications (1)

Publication Number Publication Date
US20190168128A1 true US20190168128A1 (en) 2019-06-06

Family

ID=55027785

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/877,596 Expired - Fee Related US10195538B2 (en) 2014-10-08 2015-10-07 Interactive learning blocks
US16/266,962 Abandoned US20190168128A1 (en) 2014-10-08 2019-02-04 Interactive learning blocks


Country Status (6)

Country Link
US (2) US10195538B2 (en)
EP (1) EP3188812B1 (en)
JP (2) JP6513204B2 (en)
KR (1) KR20170068532A (en)
CN (1) CN106999781A (en)
WO (1) WO2016055862A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190015736A1 (en) * 2016-02-11 2019-01-17 Attocube Co., Ltd. Cube set operating based on short-distance wireless communication and method and system for recognizing cube pattern using same

Families Citing this family (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012017305A1 (en) * 2012-09-03 2014-03-06 Leonhard Oschütz Connecting construction between construction elements and construction element
JP6324716B2 (en) * 2013-12-26 2018-05-16 株式会社ソニー・インタラクティブエンタテインメント Block, block system, display method, and information processing method
EP3903897A3 (en) 2014-01-09 2022-02-09 Boxine GmbH Toy
US11772003B2 (en) * 2014-02-28 2023-10-03 Alexander Kokhan Electrical construction toy system
UY35546A (en) * 2014-04-28 2014-06-30 Miguel Angel Solari Noble Element and Didactic and Psychopedagogical Method (FUTTDAM)
US10195538B2 (en) * 2014-10-08 2019-02-05 DXTR Tactile IvS Interactive learning blocks
US10047940B2 (en) * 2015-04-25 2018-08-14 Dawson I. Grunzweig Removably connectable units for power, light, data, or other functions
WO2016174494A1 (en) * 2015-04-28 2016-11-03 Miller Kenneth C Multi-function modular robot apparatus with stackable, interchangeable and interlocking modules
US10338753B2 (en) 2015-11-03 2019-07-02 Microsoft Technology Licensing, Llc Flexible multi-layer sensing surface
US10955977B2 (en) 2015-11-03 2021-03-23 Microsoft Technology Licensing, Llc Extender object for multi-modal sensing
US10649572B2 (en) 2015-11-03 2020-05-12 Microsoft Technology Licensing, Llc Multi-modal sensing surface
US20170136380A1 (en) * 2015-11-18 2017-05-18 Matthew Creedican Smart Toys
US11007450B2 (en) * 2015-12-22 2021-05-18 Leigh M. Rothschild System and method for identifying building blocks and then displaying on a smart device the correct and/or alternative ways to assemble the blocks
DE102016000631B3 (en) * 2016-01-25 2017-04-06 Boxine Gmbh Identification holder for a toy for playing music or a spoken story
DE102016000630A1 (en) 2016-01-25 2017-07-27 Boxine Gmbh toy
US9914066B2 (en) * 2016-03-07 2018-03-13 Microsoft Technology Licensing, Llc Electromagnetically coupled building blocks
CA3030251A1 (en) * 2016-03-29 2017-10-05 Play Properties Entertainment Ltd. A computerized system and method of using a physical toy construction set
US10512853B2 (en) * 2016-04-08 2019-12-24 Tenka Inc. Circuit blocks
EP3238796A1 (en) * 2016-04-26 2017-11-01 matoi GmbH Toy set
JP2017225518A (en) * 2016-06-21 2017-12-28 株式会社エフティエルインターナショナル Block and block control system
US11185267B2 (en) * 2016-10-18 2021-11-30 The Johns Hopkins University Automated system for measurement of spatial-cognitive abilities
WO2018101514A1 (en) * 2016-12-01 2018-06-07 엘지전자 주식회사 Image display device and image display system comprising same
WO2018124342A1 (en) * 2016-12-28 2018-07-05 위드큐브(주) System for implementing bodily sensitive content on basis of smart block
US10427065B2 (en) * 2017-03-31 2019-10-01 Intel Corporation Building blocks with lights for guided assembly
CH713688A1 (en) 2017-04-12 2018-10-15 Trihow Ag A device comprising an electronic device, set comprising such a device, associated use and method of using such a set.
CN107644577B (en) * 2017-09-30 2021-02-02 清华大学 Electronic building block
CN107583288A (en) * 2017-09-30 2018-01-16 中国地质大学(武汉) A kind of electronic building blocks based on I2C interface
KR102000673B1 (en) * 2017-12-06 2019-07-17 (주)로보케어 Interactive block tool
WO2019130590A1 (en) * 2017-12-28 2019-07-04 株式会社アイデアスケッチ Display system and display method employing same
TWD196441S (en) * 2018-01-29 2019-03-11 正崴科技有限公司 An electrical cube module of bettary
US10734759B2 (en) * 2018-03-07 2020-08-04 Xcelsis Corporation Configurable smart object system with magnetic contacts and magnetic assembly
US12052631B2 (en) 2018-03-07 2024-07-30 Ayako TAKAHASHI Electronic device and information processing device
US11017129B2 (en) 2018-04-17 2021-05-25 International Business Machines Corporation Template selector
KR102044516B1 (en) * 2018-04-24 2019-11-13 정옥근 Smart block system, method and terminal for controlling the same
DK3806970T3 (en) * 2018-06-12 2022-10-31 Lego As Modular toy construction system with interactive toy construction elements
CN108771857B (en) * 2018-08-01 2023-06-30 方聚众联(大连)科技有限责任公司 Scalable wireless somatosensory control system and method
EP3917634B1 (en) 2019-01-31 2024-02-21 Lego A/S Toy system having a contactless energy transfer system
WO2020156719A1 (en) * 2019-01-31 2020-08-06 Lego A/S A modular toy system with electronic toy modules
JP6539000B1 (en) * 2019-02-13 2019-07-03 株式会社白紙とロック Numerical calculation system of plural individuals, numerical calculation method, program
CA3139735A1 (en) * 2019-05-10 2020-11-19 Brickfit Pty Ltd Interactive human activity tracking system
US11847930B2 (en) * 2019-06-15 2023-12-19 Arjee Cohen Three dimensional cube-like member
KR20220047261A (en) 2019-08-06 2022-04-15 박신 게엠베하 Server providing user downloadable media files, and corresponding system and method
CN111866910B (en) * 2019-09-18 2021-06-15 上海葡萄纬度科技有限公司 Networking method and system of spliced building blocks and spliced building blocks suitable for wireless networking
AU2019474411A1 (en) * 2019-11-14 2022-04-21 Qubs Ag Educational toy set
CN111146641A (en) * 2019-12-28 2020-05-12 深圳市优必选科技股份有限公司 Magnetic connector, circuit and robot
CN111111232A (en) * 2020-01-07 2020-05-08 北京小米移动软件有限公司 Fingertip building block, and control method and control device thereof
US11229841B2 (en) * 2020-01-14 2022-01-25 Luxrobo Co., Ltd. Reader participation type electronic book system using module and operation method
TWM599455U (en) * 2020-01-17 2020-08-01 曾建榮 Building block toy with data collection chip
US11615714B2 (en) 2020-04-30 2023-03-28 Kyndryl, Inc. Adaptive learning in smart products based on context and learner preference modes
ES2877241B2 (en) * 2020-05-12 2022-07-05 Pitk Pelotas S L LUMINOUS TOY
US20220001292A1 (en) * 2020-06-18 2022-01-06 Saifeng Chen Programmable toy building blocks system
AU2021326003B2 (en) * 2020-08-11 2022-07-28 Toco Blox Pty Ltd. Interconnecting block system amd assembly
EP4226355A1 (en) * 2020-10-07 2023-08-16 Cartesia Solutions S.r.l. Ludic-educational modular system
CN115019964A (en) * 2021-06-11 2022-09-06 合肥工业大学 Cognitive ability evaluation system and method based on digital biomarker
GB2622191A (en) * 2022-08-26 2024-03-13 Do Be As Polyhedral device with integrated communication means
KR102704389B1 (en) 2023-06-01 2024-09-10 (주)로보케어 Interactive block, interactive device and method for providing interactive contents

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050196730A1 (en) * 2001-12-14 2005-09-08 Kellman Philip J. System and method for adaptive learning
US20070184722A1 (en) * 2006-02-07 2007-08-09 Dynatech Action, Inc. Powered modular building block toy
US20100001923A1 (en) * 2008-07-02 2010-01-07 Med Et Al, Inc. Communication blocks having multiple-planes of detection components and associated method of conveying information based on their arrangement
US20120258436A1 (en) * 2011-04-08 2012-10-11 Case Western Reserve University Automated assessment of cognitive, fine-motor, and memory skills
US10195538B2 (en) * 2014-10-08 2019-02-05 DXTR Tactile IvS Interactive learning blocks

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3863268B2 (en) 1997-11-04 2006-12-27 株式会社システムワット Toy building block
US20030148700A1 (en) * 2002-02-06 2003-08-07 David Arlinsky Set of playing blocks
WO2006044859A2 (en) * 2004-10-19 2006-04-27 Mega Brands International, Luxembourg, Zug Branch Illuminated, three-dimensional modules with coaxial magnetic connectors for a toy construction kit
JP2006145928A (en) * 2004-11-22 2006-06-08 Olympus Corp Optical block and optical block system
US7556563B2 (en) * 2005-01-10 2009-07-07 Mattel, Inc. Internet enabled multiply interconnectable environmentally interactive character simulation module method and system
JP3119039U (en) * 2005-10-05 2006-02-16 興夫 荒井 3D jigsaw puzzle type electronic circuit learning material device
US8398470B2 (en) * 2005-10-20 2013-03-19 Koninklijke Philips Electronics N.V. Game with programmable light emitting segments
US20080280682A1 (en) * 2007-05-08 2008-11-13 Brunner Kevin P Gaming system having a set of modular game units
JP2009028438A (en) * 2007-07-30 2009-02-12 Eighting:Kk United toy
JP5080292B2 (en) * 2008-01-15 2012-11-21 株式会社ステラアーツ Light emitting block and display device
WO2009100051A1 (en) * 2008-02-04 2009-08-13 Polchin George C Physical data building blocks system for video game interaction
WO2011011084A1 (en) * 2009-07-24 2011-01-27 Modular Robotics Llc Modular robotics
GB201019285D0 (en) * 2010-11-15 2010-12-29 Hepworth Browne Ltd Interactive system and method of modifying user interaction therein
US8873239B2 (en) * 2011-02-28 2014-10-28 Octo23 Technologies Llc Electronic module, control module, and electronic module set
CN102805945A (en) * 2011-06-02 2012-12-05 科域半导体有限公司 Photo-communication toy combination system
WO2013006139A1 (en) 2011-07-07 2013-01-10 Nanyang Technological University A tangible user interface and a system thereof
US9019718B2 (en) 2011-08-26 2015-04-28 Littlebits Electronics Inc. Modular electronic building systems with magnetic interconnections and methods of using the same
DE102012017305A1 (en) 2012-09-03 2014-03-06 Leonhard Oschütz Connecting construction between construction elements and construction element
CN104096366B (en) * 2014-07-31 2016-06-15 深圳市智慧郎数码科技有限公司 A kind of control method of intelligence modular system and intelligence building blocks

Also Published As

Publication number Publication date
EP3188812A1 (en) 2017-07-12
EP3188812B1 (en) 2019-06-05
WO2016055862A1 (en) 2016-04-14
KR20170068532A (en) 2017-06-19
JP2019111411A (en) 2019-07-11
US20160101370A1 (en) 2016-04-14
CN106999781A (en) 2017-08-01
JP2017530848A (en) 2017-10-19
JP6513204B2 (en) 2019-05-15
US10195538B2 (en) 2019-02-05

Similar Documents

Publication Publication Date Title
US20190168128A1 (en) Interactive learning blocks
CN111095150B (en) Robot as personal trainer
WO2020233464A1 (en) Model training method and apparatus, storage medium, and device
US10188939B2 (en) Modular construction for interacting with software
US20170192516A1 (en) Modular sensing device for processing gestures
US9108114B2 (en) Tangible user interface and a system thereof
CN106794378B (en) Storage and charging device for game pieces
TW201518995A (en) Assembled apparatus and control method thereof
US20190091544A1 (en) Dual Motion Sensor Bands for Real Time Gesture Tracking and Interactive Gaming
KR102376816B1 (en) Unlock augmented reality experience with target image detection
KR20170010720A (en) Block Toy Having Topology Recognition Function and Contents Providing System
WO2017014530A1 (en) Block toy having topology recognition function and content providing system using same
de Albuquerque Wheler et al. Toy user interface design—tools for child–computer interaction
JP6926490B2 (en) Behavior construction with constituent toys
KR102229422B1 (en) Method and system for learning coding using light emitting diode blocks
CN107292221A (en) Trajectory processing method and apparatus, and device for trajectory processing
de Albuquerque Wheler et al. IoT4Fun rapid prototyping tools for Toy User Interfaces
Wang et al. Teaching Next Generation Computing Skills: The Challenge of Embedded Computing
US10952662B2 (en) Analysis of cognitive status through object interaction
CN209060549U (en) Smart electronic building block
de Albuquerque et al. Human-centered design tools for smart toys
CN109215418A (en) Magic cube, magic cube tutoring system and magic cube teaching method
KR102207517B1 (en) Smart cube and method for operating the same
KR101598925B1 (en) Apparatus with snippet for highly intelligent object
Ruumpol The creation of the Smart Interactive Mini Table

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION