US20230415051A1 - Interactive build plate - Google Patents
- Publication number
- US20230415051A1 (application US 17/846,407)
- Authority
- US
- United States
- Prior art keywords
- plate
- break beam
- sensors
- sensor data
- workspace
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H33/00—Other toys
- A63H33/04—Building blocks, strips, or similar building parts
- A63H33/042—Mechanical, electrical, optical, pneumatic or hydraulic arrangements; Motors
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H33/00—Other toys
- A63H33/04—Building blocks, strips, or similar building parts
- A63H33/06—Building blocks, strips, or similar building parts to be assembled without the use of additional elements
- A63H33/08—Building blocks, strips, or similar building parts to be assembled without the use of additional elements provided with complementary holes, grooves, or protuberances, e.g. dovetails
- A63H33/086—Building blocks, strips, or similar building parts to be assembled without the use of additional elements provided with complementary holes, grooves, or protuberances, e.g. dovetails with primary projections fitting by friction in complementary spaces between secondary projections, e.g. sidewalls
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
Definitions
- the plate sensors can be integrated with the plate, positioned below the plate, or positioned on the plate surface.
- the plate sensors can include, for example, pressure sensors, contact sensors, or proximity sensors. Each plate sensor can be configured to detect objects resting on the plate surface within a proper subset of the area of the plate surface. When a toy piece is placed on the plate surface, the plate sensors can detect the presence of the toy piece on the plate surface.
- Plate sensor data can be output to the controller.
- the plate sensor data can include data indicating, for a particular plate sensor that detects an object, a location of the plate sensor, a start time of object detection by the plate sensor, a duration of object detection by the plate sensor, a weight of the object, etc.
- the controller can determine, based on the plate sensor data, that more than one object is located at a particular location, e.g., due to being stacked.
- two or more plate sensors can detect objects resting on the plate surface.
- the controller can determine, based on the plate sensor data from the two or more plate sensors, a size of the object and/or a number of objects resting on the plate surface.
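The size and count determination above can be sketched in code. This is a minimal illustration, not the patent's implementation: it assumes each triggered plate sensor reports a (row, column) grid address, groups adjacent cells into 4-connected components to count distinct objects, and reads a footprint off the triggered cells.

```python
def count_objects(triggered):
    """Count distinct objects by grouping triggered plate-sensor
    cells into 4-connected components."""
    remaining = set(triggered)
    count = 0
    while remaining:
        count += 1
        stack = [remaining.pop()]
        while stack:
            r, c = stack.pop()
            for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nb in remaining:
                    remaining.remove(nb)
                    stack.append(nb)
    return count

def footprint(triggered):
    """Estimate one object's size from its triggered cells: cell count
    plus a bounding box (min_row, min_col, max_row, max_col)."""
    rows = [r for r, _ in triggered]
    cols = [c for _, c in triggered]
    return {"cells": len(triggered),
            "bounds": (min(rows), min(cols), max(rows), max(cols))}
```

For example, a 2x2 block covering cells (3,4), (3,5), (4,4), (4,5) yields a footprint of four cells, while two separated pairs of cells count as two objects.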
- the controller can track object paths within the workspace and placement of objects on the plate surface. For example, the controller can track motion of a user's hand placing a first toy piece on a first location of the plate surface, removing a second toy piece from a second location of the plate surface, and placing the second toy piece on a third location of the plate surface.
- the controller can determine a time duration of each action and a path of movement for each action.
- the controller can perform actions based on the tracked object motion. For example, the controller may determine that the user requires assistance, and perform a feedback action to assist the user.
- the controller can determine that the user requires assistance, e.g., based on determining that a placement of an object differs from an expected placement of the object, based on the user's hand moving slowly within the workspace, based on the user's hand being within the workspace for longer than an expected time duration, based on the user moving a toy piece repeatedly between locations of the plate surface, etc.
- the controller can perform a feedback action such as illuminating a warning light, generating an alert sound, outputting audible instructions, outputting visual instructions, energizing a laser pointer, etc.
- the controller can generate a visualization of object paths and placement within the workspace.
- the visualization can include, for example, a heat map showing the path of an object through the workspace and the placement location of the object on the plate surface during a user session.
- the heat map can be presented on a display in near-real time, can be stored for later viewing by a user, or both.
- the controller can generate an aggregated visualization of object paths and placement within the workspace.
- the aggregated visualization can represent multiple user sessions by the same user, or multiple user sessions by multiple users.
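One way to build such a heat map, sketched here under the assumption that the tracked path and placement are reduced to (row, column) grid cells, is to accumulate visit counts per cell across one or more sessions:

```python
from collections import Counter

def heat_map(sessions, grid=(8, 8)):
    """Aggregate per-cell visit counts across user sessions. Each
    session is a list of (row, col) cells the tracked object passed
    through or rested on; the result is a dense grid for display."""
    counts = Counter()
    for path in sessions:
        counts.update(path)
    return [[counts[(r, c)] for c in range(grid[1])]
            for r in range(grid[0])]
```

Passing a single session gives a per-session heat map; passing many sessions (by one user or many users) gives the aggregated visualization.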
- An interactive build plate can use break beam sensors, e.g., including LEDs and photodiodes, to detect user motion and time of interaction.
- the interactive build plate can record how many times, where, and for how long optical sensor sets have their optical paths broken.
- the interactive build plate can be configured such that, when in use, block sets or models are built on top of the build plate. Each block can pass over the build plate when being put into place on the plate.
- Actions performed by a user can be compared with building plans. Based on the actions of the user compared with the building plans, the build plate system can perform actions. Actions can include providing feedback by activating guidance signals to guide the user. Actions can include generating a visualization of object movement. Actions can include comparing average user performance to expected performance specified by the building plans. Based on determining that average user performance does not satisfy performance criteria for a particular building plan, instructions for the building plan can be adjusted. For example, a lower-than-expected performance for a construction project by multiple users can indicate that the instructions for the construction project are not accurate and/or are not adequately specific.
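The instruction-adjustment decision described above can be illustrated with a small sketch. The metric, threshold, and minimum-session count here are assumptions for illustration, not values from the disclosure:

```python
def plan_needs_revision(session_times, expected_time,
                        tolerance=1.5, min_sessions=3):
    """Flag a building plan's instructions for review when average
    completion time across user sessions exceeds the expected time
    by more than a tolerance factor."""
    if len(session_times) < min_sessions:
        return False  # too few sessions to judge the instructions
    average = sum(session_times) / len(session_times)
    return average > tolerance * expected_time
```

A lower-than-expected average performance (here, a much longer average build time) then signals that the plan's instructions may be inaccurate or insufficiently specific.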
- a system including a plate having a surface defining a floor of a three-dimensional workspace.
- the plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces.
- the system includes a plurality of plate sensors configured to detect objects resting on the surface of the plate; a plurality of break beam sensors configured to detect objects within the workspace; and a controller.
- the controller is configured to perform operations including: receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors, receiving plate sensor data indicating object detection by at least one plate sensor of the plurality of plate sensors; and determining, based on the break beam sensor data and the plate sensor data, that a toy piece passed through the workspace to rest on the surface.
- the system includes a plurality of support structures, each support structure extending from an edge of the plate in a non-parallel direction to a plane of the surface.
- Each break beam sensor includes: an emitter configured to emit electromagnetic radiation; and a receiver configured to receive electromagnetic radiation emitted by the emitter. Emitters and receivers of the plurality of break beam sensors are supported by the plurality of support structures.
- each break beam sensor of the plurality of break beam sensors includes: an emitter supported by a first support structure coupled to the plate; and a receiver supported by a second support structure coupled to the plate. Electromagnetic energy traveling from the emitter to the receiver passes through the workspace.
- the plurality of break beam sensors are arranged in an array, the break beam sensor data including data indicating an array address of the at least one break beam sensor.
- receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors includes: receiving break beam sensor data indicating simultaneous object detection by two or more break beam sensors at a first time, the break beam sensor data including data indicating an array address of the two or more break beam sensors; and based on the break beam sensor data, determining a three-dimensional coordinate location of the toy piece within the three-dimensional workspace at the first time.
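The three-dimensional localization step can be sketched as follows. This assumes a particular geometry not fixed by the claims: beams running along x are addressed by (y, z) grid indices, beams running along y by (x, z), and adjacent beams are separated by a uniform pitch. The object's coordinate is taken as the mean of the simultaneously interrupted addresses:

```python
def locate(broken_x_beams, broken_y_beams, pitch=1.0):
    """Estimate a 3-D coordinate from simultaneously broken beams in
    two orthogonal arrays. broken_x_beams holds (y, z) addresses of
    interrupted x-running beams; broken_y_beams holds (x, z) addresses
    of interrupted y-running beams."""
    xs = [x for x, _ in broken_y_beams]
    ys = [y for y, _ in broken_x_beams]
    zs = [z for _, z in broken_x_beams] + [z for _, z in broken_y_beams]
    mean = lambda v: sum(v) / len(v) * pitch
    return (mean(xs), mean(ys), mean(zs))
```

Intersecting the two arrays' addresses at one time instant gives the x-y position, while the shared height index gives z.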
- the break beam sensor data includes data indicating a time of object detection.
- the break beam sensor data includes data indicating a sequence of detections, the operations including determining, based on the break beam sensor data, a path traveled through the workspace by the toy piece.
- each plate sensor of the plurality of plate sensors is configured to detect objects resting on the surface within a respective proper subset of an area of the surface, the operations including: determining, based on the plate sensor data, a location of the toy piece on the surface, the location including a particular proper subset of the area of the surface.
- the operations include comparing a position of the toy piece on the surface to a target position of the toy piece on the surface; and in response to determining that the position of the toy piece does not satisfy similarity criteria for matching the target position, performing one or more actions.
- the one or more actions include at least one of: activating a visual alarm; activating an audible alarm; outputting visual instructions; or outputting audible instructions.
- the plurality of plate sensors are integrated with the plate and arranged in an array.
- the operations include determining, using the break beam sensor data and the plate sensor data, at least one of a size of the toy piece or a shape of the toy piece.
- the operations include: generating a visualization showing: a path of the toy piece through the workspace; and a placement of the toy piece on the surface; and providing the visualization for presentation on a display.
- the plurality of break beam sensors includes a plurality of infrared break beam sensors.
- the plurality of plate sensors includes a plurality of weight sensors, proximity sensors, or contact sensors.
- the method includes receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors.
- the plurality of break beam sensors is configured to detect objects within the workspace.
- the method includes receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors.
- the plurality of plate sensors is configured to detect objects resting on a surface of a plate that defines a floor of the workspace.
- the method includes determining, by the controller and based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest at a position on the surface; comparing, by the controller, the position of the object on the surface to a target position of the object on the surface; and in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing, by the controller, one or more actions.
- the one or more actions include at least one of: activating a visual alarm; activating an audible alarm; outputting visual instructions; or outputting audible instructions.
- the method includes obtaining data indicating a plan for a construction to be built on the plate; and determining the target position of the object on the surface using the obtained data.
- the plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces.
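The comparison-and-feedback step of the method can be sketched as below. The similarity criterion here, a Euclidean distance tolerance, and the particular feedback actions returned are illustrative assumptions; the claims leave both open:

```python
import math

def check_placement(position, target, tolerance=0.5):
    """Compare a detected rest position against the target position
    from the building plan. Return the feedback actions to perform
    when the similarity criterion (a distance tolerance) fails, or an
    empty list when the placement matches."""
    if math.dist(position, target) <= tolerance:
        return []  # placement satisfies the similarity criteria
    return ["activate_visual_alarm", "output_visual_instructions"]
```

The controller would invoke this after determining, from the break beam and plate sensor data, where the object came to rest.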
- FIG. 1 A block diagram illustrating an exemplary computing environment in accordance with the present disclosure.
- FIGS. 3 A and 3 B illustrate tracking of an object by the example interactive build plate system.
- FIG. 6 shows an example of a computing device and a mobile computing device that can be used to implement the techniques described herein.
- the build plate system 100 can have any appropriate size.
- the plate 104 can have an area of thirty square centimeters or greater (e.g., forty square centimeters or greater, fifty square centimeters or greater, sixty square centimeters or greater).
- the plate 104 can have an area of five hundred square centimeters or less (e.g., four hundred square centimeters or less, three hundred square centimeters or less, two hundred square centimeters or less).
- the build plate system 100 can include any number of structures.
- the build plate system 100 can include three structures, four structures, or five or more structures.
- the structures 102 can be positioned in an arrangement that does not impede access to the workspace 108 .
- the number, size, shape, and positioning of the structures 102 can be such that a user can reach into the workspace 108 between the structures 102 .
- a ceiling of the workspace 108 can be defined by top edges 103 a, 103 b of the structures 102 a, 102 b.
- a plane extending from the top edge 103 a of the structure 102 a to the top edge 103 b of the structure 102 b can define the ceiling of the workspace 108 .
- the structures 102 can be coupled to the plate 104 at corners of the plate.
- the plate 104 can have a rectangular shape in the x-y plane.
- a structure 102 can be coupled to the plate 104 at each of the four corners of the plate 104 .
- FIG. 1 B illustrates a perspective view of the example build plate system 100 .
- the build plate system 100 can track the block 110 as a user's hand 140 enters the workspace 108 while holding the block.
- the build plate system 100 can track the block 110 as the user's hand 140 places the block 110 on the surface 112 .
- the build plate system 100 can determine a location and positioning of the block 110 on the surface 112 .
- the break beam sensors of the build plate system 100 can be arranged in an array and supported by structures 102 a, 102 b.
- Each break beam sensor includes an emitter, e.g., emitter 130 , and a receiver, e.g., receiver 132 .
- the break beam sensors can detect the presence and movement of objects within the workspace 108 , e.g., block 110 being carried by a user's hand 140 .
- the break beam sensors can detect and track movement of the user's hand 140 passing through the workspace 108 .
- emitters are represented by white circles, and receivers are represented by black circles.
- an emitter and corresponding receiver of a break beam sensor can be positioned across from each other and supported by opposing structures 102 .
- the direction of travel between an emitter and a receiver is parallel to the x-y plane and parallel to the x-z plane, e.g., path 152 .
- the direction of travel between an emitter and a receiver is parallel to the x-y plane and non-parallel to the x-z plane, e.g., path 154 .
- the direction of travel between an emitter and a receiver is non-parallel to the x-y plane and parallel to the x-z plane, e.g., path 156 . In some examples, the direction of travel between an emitter and a receiver is non-parallel to the x-y plane and non-parallel to the x-z plane, e.g., path 158 .
- FIG. 2 shows a block diagram of the example build plate system 100 .
- the build plate system 100 includes break beam sensors 204 and plate sensors 106 .
- the build plate system 100 optionally includes a camera 206 .
- the build plate system 100 includes a controller 210 .
- the controller 210 includes a movement tracker 212 and a placement tracker 214 .
- the build plate system 100 includes a memory 220 .
- the build plate system 100 optionally includes a visualization generator 222 and a signal device 224 .
- the break beam sensors 204 output break beam sensor data 234 to the controller 210 .
- the break beam sensor data 234 can include data indicating, for a particular break beam sensor that detects the block 110 , an array address of the break beam sensor, a start time of object detection by the break beam sensor, a duration of detection by the particular break beam sensor, or any combination of these.
- the array address of a break beam sensor can be, for example, a coordinate position of the emitter in the respective array, a coordinate position of the receiver in the respective array, or both.
- the break beam sensors 204 include infrared emitters and receivers.
- An example infrared emitter includes an infrared LED.
- the infrared LED can have a diameter of approximately 3.0 millimeters (mm) (e.g., 2.0 mm or greater, 2.5 mm or greater, 3.5 mm or greater).
- Infrared break-beam sensors can be used to detect object presence and object motion.
- An infrared emitter sends out a beam of human-invisible infrared light.
- a receiver that is sensitive to the infrared light, such as a photodiode, is positioned across the workspace from the emitter.
- An array of break beam sensors can be used to detect and localize objects.
- the array of break beam sensors can be used to detect motion of objects, to determine speed of object motion, and to determine two-dimensional and three-dimensional direction of object motion.
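The speed and direction determination can be sketched from timed beam interruptions. This assumes each interruption event has already been resolved into a (time, x, y, z) sample, e.g., via the array-address localization described elsewhere in this disclosure:

```python
def motion(detections):
    """Estimate a 3-D direction vector and average speed from a
    time-ordered series of (time, x, y, z) beam-interruption events,
    using the first and last samples."""
    (t0, *p0), (t1, *p1) = detections[0], detections[-1]
    direction = tuple(b - a for a, b in zip(p0, p1))
    distance = sum(d * d for d in direction) ** 0.5
    return direction, distance / (t1 - t0)
```

Two interruptions one second apart that are 5 units of displacement apart, for instance, yield an average speed of 5 units per second along the displacement vector.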
- the build plate system 100 includes one or more plate sensors 106 .
- a plate sensor 106 is a sensor that detects the presence of the block 110 on the plate 104 .
- the plate sensor 106 can output data indicating a size, shape, location, weight, or any combination of these to the controller 210 .
- the plate sensors 106 can include, for example, pressure sensors, weight sensors, proximity sensors, load cells, contact sensors, capacitive sensors, or any combination of these.
- Each plate sensor can be configured to detect objects resting on the plate surface within a proper subset of the area of the plate surface.
- the plate sensors 106 can detect the presence of the toy piece on the surface 112 .
- Plate sensor data 236 can be output to the controller 210 .
- Plate sensor data 236 output by a particular plate sensor can include data indicating a location of the plate sensor that detected the object, a size of the object, a shape of the object, an orientation of the object, a start time of object detection by the plate sensor, a duration of object detection by the plate sensor, a weight of the object, or any combination of these.
- the plate sensors 106 can be arranged in an array.
- the build plate system 100 can include any appropriate number of plate sensors 106 .
- the resolution of the build plate system 100 can depend on the number of plate sensors. For example, a greater number of plate sensors 106 results in a higher resolution, while a lesser number of plate sensors 106 results in a lower resolution.
- a build plate system having a higher resolution can determine more precise locations and sizes of objects resting on the plate 104 , compared to a build plate system having a lower resolution.
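The resolution relationship is simple arithmetic: with plate sensors in a uniform grid, each sensor covers plate area divided by sensor count, so adding sensors shrinks the detection cell. A minimal sketch (example numbers assumed):

```python
def cell_area(plate_area_cm2, sensor_count):
    """Area covered by each plate sensor in a uniform grid; a smaller
    cell means a higher-resolution build plate system."""
    return plate_area_cm2 / sensor_count
```

For a 400 cm^2 plate, a 16x16 grid (256 sensors) resolves ~1.56 cm^2 cells, versus 6.25 cm^2 cells for an 8x8 grid.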
- the controller 210 includes a movement tracker 212 and a placement tracker 214 .
- the movement tracker 212 can track object paths within the workspace 108 .
- the movement tracker 212 can determine a coordinate location of the block 110 within the workspace 108 at the particular time.
- the coordinate location can be a three-dimensional coordinate location within the workspace 108 where the block 110 is detected.
- the coordinate location can be a location of an estimated center of the block 110 at the particular time.
- the movement tracker 212 can determine a size of the block 110 based on a number of break beam sensors 204 that are simultaneously interrupted at a given time. In some examples, the movement tracker 212 can determine a shape of the block 110 , a location of an estimated center of the block 110 , or both, using the break beam sensor data 234 . The movement tracker 212 can determine the shape of the block 110 and/or the location of the center of the block 110 based on the number of break beam sensors 204 that are simultaneously interrupted at a given time, the array addresses of the break beam sensors 204 that are simultaneously interrupted at the given time, or both.
- the movement tracker 212 can determine a path of the block 110 moving through the workspace 108 .
- the path can include a trajectory of the block 110 , a speed of the block 110 , a time of travel of the block 110 , or any combination of these.
- the trajectory of the block 110 includes a series of coordinate locations of the block 110 in the workspace 108 , with each coordinate location being associated with a time of detection.
- the placement tracker 214 can track placement of objects on the surface 112 .
- the placement tracker 214 can determine a coordinate location of the block on the plate 104 .
- the coordinate location can be a two-dimensional coordinate location on the plate 104 where the block 110 is detected.
- the coordinate location can be a location of an estimated center of the block 110 at the particular time.
- the placement tracker 214 can determine a size of the block 110 based on a number of plate sensors 106 that detect the block 110 . In some examples, the placement tracker 214 can determine a shape of the block 110 , a location of an estimated center of the block 110 , an orientation of the block 110 , or any combination of these, using the plate sensor data 236 . The placement tracker 214 can determine the shape, location, and orientation of the block 110 based on the number of plate sensors 106 that detect the block, the array addresses of the plate sensors 106 that detect the block, or both.
- the controller 210 can track motion of a user's hand 140 moving through the workspace 108 to place the block 110 at a particular location on the plate 104 .
- the controller 210 can determine a time duration of movement of the hand 140 within the workspace 108 , a speed of the hand 140 moving through the workspace 108 , a path that the hand 140 travels through the workspace 108 , and a coordinate address of the particular location on the plate 104 .
- the build plate system 100 includes a camera 206 .
- the camera 206 can capture images of the workspace 108 .
- the camera 206 can be activated when the user (or user's parents/guardians) consent to having the camera capture user interaction with the build plate system 100 and toy pieces.
- the controller 210 can use camera image data 238 captured by the camera 206 to track movement and placement of objects within the workspace 108 .
- the controller 210 can overlay camera image data 238 captured by the camera 206 with break beam sensor data 234 from the break beam sensors 204 , with plate sensor data 236 from the plate sensors 106 , or both.
- Camera image data 238 captured by the camera 206 can be used to verify and/or validate trajectories and placement of objects determined by the controller 210 .
- the build plate system 100 includes a memory 220 .
- the memory 220 can store calibration data.
- Calibration data can include data associating break beam sensor data with location and movement patterns of objects within the workspace 108 .
- Calibration data can include data associating plate sensor data 236 with locations, shapes, and sizes of objects placed on the plate 104 .
- the memory 220 can store a building plan 202 .
- the building plan 202 can be loaded into the memory 220 .
- the building plan 202 can include a plan for a construction to be built on the plate 104 .
- the building plan 202 can include an arrangement of blocks. The arrangement can include a number of blocks to be placed on the plate 104 , a location for each block on the plate 104 , a type of block to be placed at each location, an orientation for each block on the plate 104 , a number of blocks to be placed at each location of the plate 104 , or any combination of these.
- the controller 210 can access the building plan 202 and can compare object movement within the workspace 108 , and object placement on the plate 104 , to the building plan 202 . For example, the controller 210 can compare a sequence of object movement within the workspace 108 to a sequence of object movement specified by the building plan 202 . The controller 210 can compare a placement of a block on the plate 104 to a placement specified by the building plan 202 . The controller 210 can determine whether detected movement and placement of objects satisfies similarity criteria for matching the movement and placement of objects specified by the building plan 202 .
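The sequence comparison can be sketched with a standard subsequence check. This is an illustrative simplification: it treats each placement as an opaque step identifier and requires the plan's steps to appear in order within the observed sequence, with extra adjustments in between allowed.

```python
def sequence_matches(observed, plan):
    """Return True if every step of the building plan appears in the
    observed placement sequence, in plan order (a subsequence check).
    Intermediate extra moves by the user are tolerated."""
    it = iter(observed)
    return all(step in it for step in plan)
```

Here `step in it` advances the shared iterator, so each plan step must be found after the previous one, which is what enforces the ordering.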
- the controller 210 can perform actions based on the tracked object motion. For example, the controller 210 may determine that the motion and/or placement of the block 110 is inaccurate based on determining that the motion and/or placement does not satisfy criteria for matching the building plan 202 . In response to determining that the motion and/or placement of the block 110 does not match the building plan 202 , the controller 210 can determine that the user requires assistance and can determine to perform an action to provide feedback and/or assistance to the user.
- the controller 210 can activate the signal device 224 by broadcasting audible sound through a speaker.
- the sound can include, for example, an alert sound indicating that the block 110 has been placed incorrectly.
- the sound can include verbal instructions.
- the controller 210 can activate the signal device 224 by displaying visual instructions on a display coupled to the build plate system 100 .
- the visual instructions can include, for example, textual or graphical instructions.
- the instructions can specify one or more actions to be performed by the user in order to place the block 110 correctly per the building plan 202 .
- the controller 210 may determine that the user placed the block 110 at an incorrect or inaccurate location of the surface 112 .
- the controller 210 can perform an action to assist the user, e.g., by illuminating a light under the correct location of the surface 112 after a one second delay.
- the controller 210 can perform an action by illuminating a light supported by one of the structures 102 after a three second delay. The light can indicate the incorrect location of the block 110 without revealing the correct location.
- the controller 210 can perform an action by illuminating the light after a ten second delay.
- the build plate system 100 can operate in the different operating modes, e.g., based on the building plan 202 , based on user input, or both.
- the build plate system 100 can provide a user interface for receiving user input indicating the operating mode.
- the build plate system 100 can receive user input, through a user interface, specifying various settings of operation.
- the build plate system can receive user input specifying a setting for a preferred type of signal device 224 to be used for providing user feedback and assistance.
- the setting for the preferred type of signal device 224 can indicate, for example, a user preference for visual guidance over audible guidance.
- the build plate system 100 can include a visualization generator 222 .
- the visualization generator 222 can generate a visualization of object paths and placement within the workspace.
- the visualization can include, for example, a heat map showing the path of the block 110 through the workspace 108 and the placement location of the block 110 on the surface 112 during a user session.
- the visualization can be presented on a display device.
- the visualization can be presented in near-real time, can be stored for later viewing by a user, or both.
- the visualization generator 222 can generate an aggregated visualization of object paths and placement within the workspace.
- the aggregated visualization can represent multiple user sessions by the same user, or multiple user sessions by multiple users.
- FIGS. 3 A and 3 B illustrate tracking of an object by the example interactive build plate system 100 .
- the user's hand 140 holds the block 110 and moves the block 110 through the workspace 108 .
- the build plate system 100 can track movement of the hand 140 and the block 110 using the break beam sensors.
- the build plate system 100 can determine a time of interruption of the beam 302 between the emitter 120 and the receiver 122 .
- the build plate system 100 can determine a three-dimensional coordinate location of the block 110 within the workspace 108 at a particular time based on the array addresses of break beam sensors that detected the block at the particular time.
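Deriving a three-dimensional coordinate from simultaneously triggered array addresses can be sketched as below. The beam geometry is an illustrative assumption (beams running along the x-axis and along the y-axis at known positions and heights), not the geometry required by the disclosure.

```python
def locate_object(triggered):
    """Estimate an object's 3-D coordinate from the break beam
    addresses triggered at one instant.

    `triggered` is a list of (axis, position, height) tuples: a beam
    running along the x-axis at lateral position y and height z is
    ('x', y, z); a beam along the y-axis is ('y', x, z).
    """
    xs = [pos for axis, pos, _ in triggered if axis == 'y']
    ys = [pos for axis, pos, _ in triggered if axis == 'x']
    zs = [z for _, _, z in triggered]
    if not (xs and ys and zs):
        return None  # beams on both axes are needed to fix x and y
    # Centroid of the interrupted beams approximates the object location.
    return (sum(xs) / len(xs), sum(ys) / len(ys), sum(zs) / len(zs))
```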
- FIGS. 4A and 4B illustrate tracking of multiple objects by the example interactive build plate system 100 .
- blocks 110 , 410 are placed side-by-side on the plate 104 .
- the generated plate sensor data 236 can indicate array addresses of the plate sensors 106 that detect the presence of the blocks 110 , 410 .
- the build plate system 100 can determine, based on the plate sensor data 236 , a number of objects resting on the plate 104 , a size of each object, a weight of each object, an orientation of each object, a shape of each object, or any combination of these.
- the build plate system 100 can determine, based on the plate sensor data 236 , that two blocks are resting on the plate 104 .
- the build plate system 100 can track individual objects moving between the workspace 108 and the surface 112 . For example, a user may place the block 110 in a first location on the surface 112 , place the block 410 in a second location on the surface 112 , remove the block 110 from the first location, and place the block 110 in a third location.
- the build plate system 100 can compare the break beam sensor data 234 with the plate sensor data 236 to determine which block, e.g., block 110 or 410 , is being moved.
- the build plate system 100 can generate an identifier for each object that enters the workspace 108 .
- the build plate system 100 can generate an identifier for each block.
- the identifier can be stored in the memory 220 with data indicating the location of the respective block and associated characteristics of the respective block.
- the associated characteristics can be determined using the break beam sensor data 234 , the plate sensor data 236 , the camera image data 238 , or any combination of these. Characteristics can include, for example, a size, shape, weight, and/or color of the block.
- the build plate system 100 can determine the identifier of the block that is moving.
- the build plate system 100 can determine the identifier of the block that is moving based on characteristics of the moving block, based on the starting location of the moving block, or both. The build plate system 100 can then track movement of the identified block from its starting location to an ending location. The build plate system 100 can thus track location and movement of individual objects by storing identifiers, locations, and/or characteristics of each object within the memory 220 .
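The identifier-based tracking described above can be sketched as a small registry. This is a minimal illustration under assumed interfaces (the class and method names are hypothetical): each object receives an identifier on entering the workspace, and a move event is attributed to the object whose stored location matches the starting location.

```python
import itertools

class ObjectTracker:
    """Minimal sketch of per-object identifier tracking, assuming each
    pick-up event reports the location it started from."""

    def __init__(self):
        self._next_id = itertools.count(1)
        self.objects = {}  # identifier -> {'location': ..., 'weight': ...}

    def register(self, location, weight):
        """Assign an identifier to an object entering the workspace."""
        oid = next(self._next_id)
        self.objects[oid] = {'location': location, 'weight': weight}
        return oid

    def move(self, start_location, end_location):
        """Attribute a move to the object stored at start_location and
        update its location; returns the moved object's identifier."""
        for oid, attrs in self.objects.items():
            if attrs['location'] == start_location:
                attrs['location'] = end_location
                return oid
        return None

# Example: two blocks registered, then the first is moved.
tracker = ObjectTracker()
first = tracker.register(location=(0, 0), weight=50)
second = tracker.register(location=(0, 1), weight=50)
moved = tracker.move(start_location=(0, 0), end_location=(2, 2))
```

A fuller implementation could also match on characteristics such as weight or shape when locations are ambiguous, as the description suggests.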
- the build plate system 100 can determine, based on the plate sensor data 236 , a number of objects placed in a same location, e.g., due to being stacked.
- the plate sensor data 236 can indicate a first increase in weight detected by a first plate sensor at a first time.
- the plate sensor data 236 can indicate a second increase in weight detected by the first plate sensor at a second time after the first time. Based on the first increase in weight, followed by the second increase in weight, the build plate system 100 can determine that two objects are stacked at a location of the plate 104 corresponding to the first plate sensor.
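The stacking inference from sequential weight increases can be sketched as follows. The step model below is a simplifying assumption (each increase is one placed object, each decrease one removed object; real sensor data would need debouncing).

```python
def count_stacked(weight_events):
    """Infer how many objects are stacked at one plate sensor location
    from its time-ordered (time, weight) readings."""
    count = 0
    prev = 0.0
    for _, weight in sorted(weight_events):
        if weight > prev:
            count += 1            # a step increase: one object placed
        elif weight < prev:
            count = max(count - 1, 0)  # a decrease: one object removed
        prev = weight
    return count
```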
- objects can be detected by break beam sensors while resting on the plate 104 .
- the block 420 , stacked on top of the block 110 , breaks the beam 302 .
- the block 420 is therefore detected by both the break beam sensors 204 and the plate sensors 106 while stacked on the block 110 .
- the build plate system 100 can overlay the break beam sensor data 234 and the plate sensor data 236 to determine a precise location and other attributes of the block 420 .
- the build plate system 100 can determine a three-dimensional coordinate location of the block 420 , a height of the block 420 when stacked on the block 110 , a shape of the block 420 , a size of the block 420 , a weight of the block 420 , a three-dimensional orientation of the block 420 , or any of these.
- Information about the block 420 determined from both the break beam sensor data 234 and the plate sensor data 236 can be more precise and accurate than information about the block 420 determined from only the break beam sensor data 234 or only the plate sensor data 236 .
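Overlaying the two data sources can be sketched as below: the plate sensor footprint fixes the block's extent in the surface plane, while the interrupted beam heights fix its top. The grid layout and `cell_size` parameter are illustrative assumptions.

```python
def fuse_sensor_data(plate_cells, beam_heights, cell_size=1.0):
    """Combine a plate sensor footprint (set of (row, col) cells) with
    the heights of break beams interrupted above it to estimate a 3-D
    bounding box for a stacked block."""
    rows = [r for r, _ in plate_cells]
    cols = [c for _, c in plate_cells]
    return {
        'x_extent': (min(cols) * cell_size, (max(cols) + 1) * cell_size),
        'y_extent': (min(rows) * cell_size, (max(rows) + 1) * cell_size),
        'top_height': max(beam_heights),
    }

# A block covering two adjacent cells, interrupting beams at two heights.
box = fuse_sensor_data({(0, 0), (0, 1)}, beam_heights=[1.0, 2.0])
```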
- FIG. 5 is a flow diagram of an example process 500 for tracking object movement within a three-dimensional workspace.
- the process 500 includes receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors ( 502 ).
- the plurality of break beam sensors is configured to detect objects within the workspace.
- the array of break beam sensors 205 can detect and track objects moving within the workspace 108 .
- the process 500 includes receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors ( 504 ).
- the plurality of plate sensors is configured to detect objects resting on a surface that defines a floor of the workspace.
- the plate sensors 106 can detect the block 110 resting on the surface 112 that defines the floor of the workspace 108 .
- the process 500 includes determining, based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest on the surface ( 506 ).
- the controller 210 can determine, based on the break beam sensor data 234 and the plate sensor data 236 , that the block 110 passed through the workspace 108 to rest on the surface 112 .
- the controller 210 can determine a trajectory of the block 110 through the workspace 108 , and a location of placement of the block 110 on the surface 112 .
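Step 506 of the process can be sketched as a timing check: the object is judged to have passed through the workspace and come to rest if the last break beam interruption is closely followed by a plate sensor detection. The `max_gap` threshold is an illustrative assumption, not a value fixed by the disclosure.

```python
def object_came_to_rest(beam_times, plate_times, max_gap=1.0):
    """Return True if the last break beam interruption (beam_times) is
    followed within max_gap seconds by the first plate sensor
    detection (plate_times), indicating the object passed through the
    workspace and came to rest on the surface."""
    if not beam_times or not plate_times:
        return False
    last_beam = max(beam_times)
    first_rest = min(plate_times)
    return 0 <= first_rest - last_beam <= max_gap
```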
- FIG. 6 shows an example of a computing device 600 and a mobile computing device 650 that can be used to implement the techniques described here.
- the computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers.
- the mobile computing device 650 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart-phones, and other similar computing devices.
- the components shown here, their connections and relationships, and their functions, are meant to be examples only, and are not meant to be limiting.
- the computing device 600 includes a processor 602 , a memory 604 , a storage device 606 , a high-speed interface 608 connecting to the memory 604 and multiple high-speed expansion ports 610 , and a low-speed interface 612 connecting to a low-speed expansion port 614 and the storage device 606 .
- Each of the processor 602 , the memory 604 , the storage device 606 , the high-speed interface 608 , the high-speed expansion ports 610 , and the low-speed interface 612 is interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate.
- the processor 602 can process instructions for execution within the computing device 600 , including instructions stored in the memory 604 or on the storage device 606 to display graphical information for a GUI on an external input/output device, such as a display 616 coupled to the high-speed interface 608 .
- multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory.
- multiple computing devices may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
- the memory 604 stores information within the computing device 600 .
- the memory 604 is a volatile memory unit or units.
- the memory 604 is a non-volatile memory unit or units.
- the memory 604 may also be another form of computer-readable medium, such as a magnetic or optical disk.
- the storage device 606 is capable of providing mass storage for the computing device 600 .
- the storage device 606 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations.
- Instructions can be stored in an information carrier.
- the instructions when executed by one or more processing devices (for example, processor 602 ), perform one or more methods, such as those described above.
- the instructions can also be stored by one or more storage devices such as computer- or machine-readable mediums (for example, the memory 604 , the storage device 606 , or memory on the processor 602 ).
- the high-speed interface 608 manages bandwidth-intensive operations for the computing device 600 , while the low-speed interface 612 manages lower bandwidth-intensive operations. Such allocation of functions is an example only.
- the high-speed interface 608 is coupled to the memory 604 , the display 616 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 610 , which may accept various expansion cards (not shown).
- the low-speed interface 612 is coupled to the storage device 606 and the low-speed expansion port 614 .
- the low-speed expansion port 614 , which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
- the computing device 600 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 620 , or multiple times in a group of such servers. In addition, it may be implemented in a personal computer such as a laptop computer 622 . It may also be implemented as part of a rack server system 624 . Alternatively, components from the computing device 600 may be combined with other components in a mobile device (not shown), such as a mobile computing device 650 . Each of such devices may contain one or more of the computing device 600 and the mobile computing device 650 , and an entire system may be made up of multiple computing devices communicating with each other.
- the processor 652 can execute instructions within the mobile computing device 650 , including instructions stored in the memory 664 .
- the processor 652 may be implemented as a chipset of chips that include separate and multiple analog and digital processors.
- the processor 652 may provide, for example, for coordination of the other components of the mobile computing device 650 , such as control of user interfaces, applications run by the mobile computing device 650 , and wireless communication by the mobile computing device 650 .
- the processor 652 may communicate with a user through a control interface 658 and a display interface 656 coupled to the display 654 .
- the display 654 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology.
- the display interface 656 may comprise appropriate circuitry for driving the display 654 to present graphical and other information to a user.
- the control interface 658 may receive commands from a user and convert them for submission to the processor 652 .
- an external interface 662 may provide communication with the processor 652 , so as to enable near area communication of the mobile computing device 650 with other devices.
- the external interface 662 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
- the memory 664 stores information within the mobile computing device 650 .
- the memory 664 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units.
- An expansion memory 674 may also be provided and connected to the mobile computing device 650 through an expansion interface 672 , which may include, for example, a SIMM (Single In Line Memory Module) card interface.
- the expansion memory 674 may provide extra storage space for the mobile computing device 650 , or may also store applications or other information for the mobile computing device 650 .
- the expansion memory 674 may include instructions to carry out or supplement the processes described above, and may include secure information also.
- the expansion memory 674 may be provided as a security module for the mobile computing device 650 , and may be programmed with instructions that permit secure use of the mobile computing device 650 .
- secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
- the memory may include, for example, flash memory and/or NVRAM memory (non-volatile random access memory), as discussed below.
- instructions are stored in an information carrier.
- the instructions when executed by one or more processing devices (for example, processor 652 ), perform one or more methods, such as those described above.
- the instructions can also be stored by one or more storage devices, such as one or more computer- or machine-readable mediums (for example, the memory 664 , the expansion memory 674 , or memory on the processor 652 ).
- the instructions can be received in a propagated signal, for example, over the transceiver 668 or the external interface 662 .
- the mobile computing device 650 may communicate wirelessly through the communication interface 666 , which may include digital signal processing circuitry where necessary.
- the communication interface 666 may provide for communications under various modes or protocols, such as GSM voice calls (Global System for Mobile communications), SMS (Short Message Service), EMS (Enhanced Messaging Service), or MMS messaging (Multimedia Messaging Service), CDMA (code division multiple access), TDMA (time division multiple access), PDC (Personal Digital Cellular), WCDMA (Wideband Code Division Multiple Access), CDMA2000, or GPRS (General Packet Radio Service), among others.
- a GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to the mobile computing device 650 , which may be used as appropriate by applications running on the mobile computing device 650 .
- the mobile computing device 650 may also communicate audibly using an audio codec 660 , which may receive spoken information from a user and convert it to usable digital information.
- the audio codec 660 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of the mobile computing device 650 .
- Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on the mobile computing device 650 .
- the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer.
- Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
- the systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components.
- the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), and the Internet.
- the computing system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
Abstract
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for tracking object movement within a three-dimensional workspace. A method includes receiving, by a controller, break beam sensor data indicating object detection by a break beam sensor of a plurality of break beam sensors configured to detect objects within the workspace. The method includes receiving plate sensor data indicating object detection by a plate sensor of a plurality of plate sensors configured to detect objects resting on a surface of a plate defining a floor of the workspace. The method includes determining that an object passed through the workspace to rest at a position on the surface; comparing the position of the object to a target position of the object; and in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing one or more actions.
Description
- The present invention relates to the field of building systems. More particularly, the invention relates to releasably interconnecting building elements and workspaces.
- Certain toy pieces in the form of toy bricks have releasable couplings between bricks, which allow them to be connected to form a larger structure. In their simplest form, the bricks can be used to build inanimate objects such as castles or houses. In some cases, the toy created using toy bricks can be supported on a baseplate, which can include coupling elements to provide stability or proper positioning, or both.
- In general, this disclosure relates to an interactive build plate system for tracking movement of objects within a three-dimensional workspace. The build plate system can be used to track movement and time of interaction with objects, e.g., modular building blocks. The workspace has a floor defined by a surface of a plate. The plate includes plate sensors for detecting objects resting on the plate. Break beam sensors are arranged to detect objects passing through the workspace. The plate sensors and break beam sensors output sensor data to a controller.
- In some examples, the plate surface is configured to be used with releasably coupleable toy pieces, such as toy building blocks. The plate surface can include toy piece coupling elements. A user may reach into the workspace in order to place a toy piece on the plate surface. The break beam sensors can be arranged in an array. A break beam sensor includes an electromagnetic emitter and an electromagnetic receiver. In some examples, emitters, receivers, or both are supported by structures that are attached to the plate and extend non-parallel to the plate surface. The structures can be attached to the edges of the plate. The break beam sensors can detect the presence of the user's hand and/or of the toy piece passing through the workspace. Break beam sensor data can be output to a controller. The break beam sensor data can include data indicating, for a particular break beam sensor that detects an object, an array address of the break beam sensor, a start time of object detection by the break beam sensor, and/or a duration of detection by the particular break beam sensor.
- The plate sensors can be integrated with the plate, positioned below the plate, or positioned on the plate surface. The plate sensors can include, for example, pressure sensors, contact sensors, or proximity sensors. Each plate sensor can be configured to detect objects resting on the plate surface within a proper subset of the area of the plate surface. When a toy piece is placed on the plate surface, the plate sensors can detect the presence of the toy piece on the plate surface. Plate sensor data can be output to the controller. The plate sensor data can include data indicating, for a particular plate sensor that detects an object, a location of the plate sensor, a start time of object detection by the plate sensor, a duration of object detection by the plate sensor, a weight of the object, etc.
- In some examples, the controller can determine, based on the plate sensor data, that more than one object is located at a particular location, e.g., due to being stacked. In some examples, two or more plate sensors can detect objects resting on the plate surface. The controller can determine, based on the plate sensor data from the two or more plate sensors, a size of the object and/or a number of objects resting on the plate surface.
- Using the break beam sensor data and the plate sensor data, the controller can track object paths within the workspace and placement of objects on the plate surface. For example, the controller can track motion of a user's hand placing a first toy piece on a first location of the plate surface, removing a second toy piece from a second location of the plate surface, and placing the second toy piece on a third location of the plate surface. The controller can determine a time duration of each action and a path of movement for each action.
- In some examples, the controller can perform actions based on the tracked object motion. For example, the controller may determine that the user requires assistance, and perform a feedback action to assist the user. The controller can determine that the user requires assistance, e.g., based on determining that a placement of an object differs from an expected placement of the object, based on the user's hand moving slowly within the workspace, based on the user's hand being within the workspace for longer than an expected time duration, based on the user moving a toy piece repeatedly between locations of the plate surface, etc. To assist the user, the controller can perform a feedback action such as illuminating a warning light, generating an alert sound, outputting audible instructions, outputting visual instructions, energizing a laser pointer, etc.
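The comparison of a placement against its expected position can be sketched as follows. The disclosure leaves the similarity criteria open; Euclidean distance with a fixed tolerance, and the action name returned, are assumptions made here for illustration.

```python
import math

def placement_matches(position, target, tolerance=0.5):
    """One possible similarity criterion: the placed position matches
    the target if it lies within `tolerance` surface units of it."""
    return math.hypot(position[0] - target[0],
                      position[1] - target[1]) <= tolerance

def choose_action(position, target, tolerance=0.5):
    """If the placement does not satisfy the criterion, return a
    feedback action (the action name is hypothetical)."""
    if placement_matches(position, target, tolerance):
        return None
    return 'illuminate_light'
```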
- In some examples, the controller can generate a visualization of object paths and placement within the workspace. The visualization can include, for example, a heat map showing the path of an object through the workspace and the placement location of the object on the plate surface during a user session. The heat map can be presented on a display in near-real time, can be stored for later viewing by a user, or both. In some examples, the controller can generate an aggregated visualization of object paths and placement within the workspace. The aggregated visualization can represent multiple user sessions by the same user, or multiple user sessions by multiple users.
- Understanding how people interact with toys and other objects can provide information about the person's development, engagement with the toy, and ability to follow instructions. Sets of modular building blocks can be used to track these parameters. An interactive build plate can use break beam sensors, e.g., including LEDs and photodiodes, to detect user motion and time of interaction. The interactive build plate can record how many times, where, and for how long optical sensor sets have their optical paths broken. The interactive build plate can be configured such that, when in use, block sets or models are built on top of the build plate. Each block can pass over the build plate when being put into place on the plate.
- Actions performed by a user can be compared with building plans. Based on the actions of the user compared with the building plans, the build plate system can perform actions. Actions can include providing feedback by activating guidance signals to guide the user. Actions can include generating a visualization of object movement. Actions can include comparing average user performance to expected performance specified by the building plans. Based on determining that average user performance does not satisfy performance criteria for a particular building plan, instructions for the building plan can be adjusted. For example, a lower-than-expected performance for a construction project by multiple users can indicate that the instructions for the construction project are not accurate and/or are not adequately specific.
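The comparison of average user performance against expected performance can be sketched as below. The performance criterion (average completion time per step against a multiple of the expected time) and the 1.5x slack factor are illustrative assumptions; the disclosure does not fix specific criteria.

```python
def flag_unclear_steps(session_times, expected_times, slack=1.5):
    """Flag building plan steps whose average completion time across
    user sessions exceeds `slack` times the expected time -- a sign
    that the instructions for that step may need revision.

    session_times: list of dicts mapping step name -> completion time.
    expected_times: dict mapping step name -> expected time.
    """
    flagged = []
    for step, expected in expected_times.items():
        times = [s[step] for s in session_times if step in s]
        if times and sum(times) / len(times) > slack * expected:
            flagged.append(step)
    return flagged

# Two users take 10 s and 20 s on a step expected to take 8 s.
sessions = [{'place_red_block': 10.0}, {'place_red_block': 20.0}]
flagged = flag_unclear_steps(sessions, {'place_red_block': 8.0})
```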
- In general, one innovative aspect of the subject matter described in this specification can be embodied in a system including a plate having a surface defining a floor of a three-dimensional workspace. The plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces. The system includes a plurality of plate sensors configured to detect objects resting on the surface of the plate; a plurality of break beam sensors configured to detect objects within the workspace; and a controller. The controller is configured to perform operations including: receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors, receiving plate sensor data indicating object detection by at least one plate sensor of the plurality of plate sensors; and determining, based on the break beam sensor data and the plate sensor data, that a toy piece passed through the workspace to rest on the surface.
- These and other embodiments may each optionally include one or more of the following features. In some implementations, the system includes a plurality of support structures, each support structure extending from an edge of the plate in a non-parallel direction to a plane of the surface. Each break beam sensor includes: an emitter configured to emit electromagnetic radiation; and a receiver configured to receive electromagnetic radiation emitted by the emitter. Emitters and receivers of the plurality of break beam sensors are supported by the plurality of support structures.
- In some implementations, each break beam sensor of the plurality of break beam sensors includes: an emitter supported by a first support structure coupled to the plate; and a receiver supported by a second support structure coupled to the plate. Electromagnetic energy traveling from the emitter to the receiver passes through the workspace.
- In some implementations, the plurality of break beam sensors are arranged in an array, the break beam sensor data including data indicating an array address of the at least one break beam sensor.
- In some implementations, receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors includes: receiving break beam sensor data indicating simultaneous object detection by two or more break beam sensors at a first time, the break beam sensor data including data indicating an array address of the two or more break beam sensors; and based on the break beam sensor data, determining a three-dimensional coordinate location of the toy piece within the three-dimensional workspace at the first time.
- In some implementations, the break beam sensor data includes data indicating a time of object detection.
- In some implementations, the break beam sensor data includes data indicating a sequence of detections, the operations including determining, based on the break beam sensor data, a path traveled through the workspace by the toy piece.
- In some implementations, each plate sensor of the plurality of plate sensors is configured to detect objects resting on the surface within a respective proper subset of an area of the surface, the operations including: determining, based on the plate sensor data, a location of the toy piece on the surface, the location including a particular proper subset of the area of the surface.
- In some implementations, the operations include comparing a position of the toy piece on the surface to a target position of the toy piece on the surface; and in response to determining that the position of the toy piece does not satisfy similarity criteria for matching the target position, performing one or more actions.
- In some implementations, the one or more actions include at least one of: activating a visual alarm; activating an audible alarm; outputting visual instructions; or outputting audible instructions.
- In some implementations, the plurality of plate sensors are integrated with the plate and arranged in an array.
- In some implementations, the operations include determining, using the break beam sensor data and the plate sensor data, at least one of a size of the toy piece or a shape of the toy piece.
- In some implementations, the operations include: generating a visualization showing: a path of the toy piece through the workspace; and a placement of the toy piece on the surface; and providing the visualization for presentation on a display.
- In some implementations, the plurality of break beam sensors includes a plurality of infrared break beam sensors.
- In some implementations, the plurality of plate sensors includes a plurality of weight sensors, proximity sensors, or contact sensors.
- In general, one innovative aspect of the subject matter described in this specification can be embodied in a method for tracking object movement within a three-dimensional workspace. The method includes receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors. The plurality of break beam sensors is configured to detect objects within the workspace. The method includes receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors. The plurality of plate sensors is configured to detect objects resting on a surface of a plate that defines a floor of the workspace. The method includes determining, by the controller and based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest at a position on the surface; comparing, by the controller, the position of the object on the surface to a target position of the object on the surface; and in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing, by the controller, one or more actions.
- These and other embodiments may each optionally include one or more of the following features. In some implementations, the one or more actions include at least one of: activating a visual alarm; activating an audible alarm; outputting visual instructions; or outputting audible instructions.
- In some implementations, the method includes obtaining data indicating a plan for a construction to be built on the plate; and determining the target position of the object on the surface using the obtained data.
- In some implementations, the plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces.
- Other embodiments of these aspects include corresponding computer systems, apparatus, computer program products, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods. A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.
- The details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
-
FIGS. 1A and 1B illustrate an example interactive build plate system. -
FIG. 2 is a block diagram of the example interactive build plate system. -
FIGS. 3A and 3B illustrate tracking of an object by the example interactive build plate system. -
FIGS. 4A and 4B illustrate tracking of multiple objects by the example interactive build plate system. -
FIG. 5 is a flow diagram of an example process for tracking object movement within a three-dimensional workspace. -
FIG. 6 shows an example of a computing device and a mobile computing device that can be used to implement the techniques described herein. - Like reference numbers and designations in the various drawings indicate like elements.
-
FIGS. 1A and 1B illustrate an example interactive build plate system 100 for tracking movement of objects within a three-dimensional workspace 108. The workspace 108 has a floor defined by a surface 112 of a plate 104. The surface 112 extends in a horizontal plane, e.g., in the x-y plane. Sides of the workspace 108 can be defined by side edges 124 a, 124 b (“edges 124”) of the plate 104. For example, a plane extending vertically in the z-direction from the edges 124 of the plate 104 can define the sides of the workspace 108. - The
plate 104 includes plate sensors 106 for detecting objects resting on the plate. In some examples, the surface 112 is configured to be used with releasably coupleable toy pieces, such as block 110. In some examples, the block 110 is a toy building block. The surface 112 can include toy piece coupling elements 114. The coupling elements 114 can provide stability or proper positioning when the block 110 is placed on the surface 112. A user may reach into the workspace in order to place the block 110 on the surface 112 and couple the block 110 to the coupling elements 114. - Break beam sensors are arranged to detect objects passing through the
workspace 108. A break beam sensor includes an electromagnetic emitter 120 and an electromagnetic receiver 122. The plate sensors 106 and break beam sensors can output sensor data to a controller. The plate sensors 106 can be integrated with the plate 104, positioned below the plate 104, or positioned on the surface 112. - The
build plate system 100 can have any appropriate size. In some examples, the plate 104 can have an area of thirty square centimeters or greater (e.g., forty square centimeters or greater, fifty square centimeters or greater, sixty square centimeters or greater). In some examples, the plate 104 can have an area of five hundred square centimeters or less (e.g., four hundred square centimeters or less, three hundred square centimeters or less, two hundred square centimeters or less). - The
plate 104 can have any appropriate shape. In some examples, the plate 104 has a polygonal shape in the x-y plane. For example, the plate 104 can have a triangular, rectangular, square, pentagonal, or hexagonal shape in the x-y plane. In some examples, the plate 104 has a non-polygonal shape in the x-y plane. For example, the plate 104 can have an elliptical, oval, or circular shape in the x-y plane. - The
build plate system 100 includes structures 102 a, 102 b (“structures 102”). As shown in FIG. 1A, the emitter 120 is supported by the structure 102 a and the receiver 122 is supported by the structure 102 b. In some examples, electromagnetic energy travels from the emitter 120 to the receiver 122 in a diagonal direction, e.g., a direction non-parallel to the x-y plane of the surface 112. In some examples, electromagnetic energy travels from an emitter to a receiver in a direction parallel to the x-y plane of the surface 112. - The structures 102 are coupled to the
plate 104. The structures 102 extend non-parallel to the surface 112. For example, an inner surface 105 a, 105 b of each structure 102 can extend non-parallel to the surface 112. In some examples, the structures 102 are attached to the edges 124 of the plate 104. In some examples, each structure 102 is attached to a respective edge 124 of the plate 104. In some examples, the structures 102 are attached to opposing edges of the plate 104, e.g., such that the inner surface 105 a faces the inner surface 105 b across the plate 104. - Although shown as having two structures, the
build plate system 100 can include any number of structures. For example, the build plate system 100 can include three structures, four structures, or five or more structures. The structures 102 can be positioned in an arrangement that does not impede access to the workspace 108. For example, the number, size, shape, and positioning of the structures 102 can be such that a user can reach into the workspace 108 between the structures 102. A ceiling of the workspace 108 can be defined by top edges 103 a, 103 b of the structures 102 a, 102 b. For example, a plane extending from the top edge 103 a of the structure 102 a to the top edge 103 b of the structure 102 b can define the ceiling of the workspace 108. - In some examples, the structures 102 can be coupled to the
plate 104 at corners of the plate. For example, the plate 104 can have a rectangular shape in the x-y plane. A structure 102 can be coupled to the plate 104 at each of the four corners of the plate 104. -
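The geometry described above (the surface 112 as floor, planes rising from the edges 124 as walls, and the structures' top edges 103 a, 103 b as ceiling) can be sketched as a simple containment test. The axis-aligned box model and the parameter names below are illustrative assumptions, not a method the specification prescribes:

```python
def in_workspace(point, plate_width, plate_depth, ceiling_height):
    """Treat the workspace 108 as an axis-aligned box: floor at z=0 (the
    surface 112), walls rising from the side edges 124, and a ceiling at
    the structures' top edges. Illustrative model only."""
    x, y, z = point
    return (0.0 <= x <= plate_width
            and 0.0 <= y <= plate_depth
            and 0.0 <= z <= ceiling_height)
```

A point above the top edges of the structures would fall outside the modeled workspace even if it is above the plate.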
FIG. 1B illustrates a perspective view of the example build plate system 100. The build plate system 100 can track the block 110 as a user's hand 140 enters the workspace 108 while holding the block. The build plate system 100 can track the block 110 as the user's hand 140 places the block 110 on the surface 112. The build plate system 100 can determine a location and positioning of the block 110 on the surface 112. - The break beam sensors of the
build plate system 100 can be arranged in an array and supported by the structures 102 a, 102 b. Each break beam sensor includes an emitter, e.g., emitter 130, and a receiver, e.g., receiver 132. The break beam sensors can detect the presence and movement of objects within the workspace 108, e.g., block 110 being carried by a user's hand 140. The break beam sensors can detect and track movement of the user's hand 140 passing through the workspace 108. - In the example of
FIG. 1B, emitters are represented by white circles, and receivers are represented by black circles. In some examples, an emitter and corresponding receiver of a break beam sensor can be positioned across from each other and supported by opposing structures 102. In some examples, the direction of travel between an emitter and a receiver is parallel to the x-y plane and parallel to the x-z plane, e.g., path 152. In some examples, the direction of travel between an emitter and a receiver is parallel to the x-y plane and non-parallel to the x-z plane, e.g., path 154. In some examples, the direction of travel between an emitter and a receiver is non-parallel to the x-y plane and parallel to the x-z plane, e.g., path 156. In some examples, the direction of travel between an emitter and a receiver is non-parallel to the x-y plane and non-parallel to the x-z plane, e.g., path 158. - Each structure 102 supports an array of emitters, receivers, or both. In the example of
FIG. 1B, the structure 102 a supports an array 160 a including both emitters and receivers. The structure 102 b also supports an array 160 b including both emitters and receivers. In some examples, each structure 102 supports an array of only emitters or only receivers. -
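One way a controller might fuse simultaneously interrupted beams of different orientations (such as paths 152-158 above) into a rough three-dimensional location is to average the midpoints of the broken beam segments. This is an illustrative heuristic under an assumed beam-endpoint representation, not an algorithm the specification defines:

```python
def estimate_center(interrupted_beams):
    """Each beam is a pair of 3-D endpoints ((ex, ey, ez), (rx, ry, rz))
    for its emitter and receiver. Estimate the object's center as the
    mean of the midpoints of all simultaneously interrupted beams --
    a rough heuristic that improves as beam orientations vary."""
    midpoints = [tuple((e[i] + r[i]) / 2 for i in range(3))
                 for e, r in interrupted_beams]
    n = len(midpoints)
    return tuple(sum(m[i] for m in midpoints) / n for i in range(3))
```

With beams of a single orientation the estimate is only coarse along the beam direction; crossing diagonal beams (e.g., paths 154-158) tighten it.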
FIG. 2 shows a block diagram of the example build plate system 100. The build plate system 100 includes break beam sensors 204 and plate sensors 106. The build plate system 100 optionally includes a camera 206. The build plate system 100 includes a controller 210. The controller 210 includes a movement tracker 212 and a placement tracker 214. The build plate system 100 includes a memory 220. The build plate system 100 optionally includes a visualization generator 222 and a signal device 224. - The
break beam sensors 204 output break beam sensor data 234 to the controller 210. The break beam sensor data 234 can include data indicating, for a particular break beam sensor that detects the block 110, an array address of the break beam sensor, a start time of object detection by the break beam sensor, a duration of detection by the particular break beam sensor, or any combination of these. The array address of a break beam sensor can be, for example, a coordinate position of the emitter in the respective array, a coordinate position of the receiver in the respective array, or both.
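The data items just listed (array address, detection start time, detection duration) could be carried in a record such as the following. The field names and types are illustrative assumptions, not a data format the specification fixes:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class BreakBeamEvent:
    """One detection event reported to the controller 210 by a break
    beam sensor. Fields mirror the items listed in the text."""
    array_address: Tuple[int, int]  # (row, col) of the sensor in its array
    start_time: float               # seconds: when the beam was first interrupted
    duration: float                 # seconds the beam stayed interrupted

event = BreakBeamEvent(array_address=(2, 5), start_time=12.75, duration=0.4)
```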
In some examples, the break beam sensors 204 include infrared emitters and receivers. An example infrared emitter includes an infrared LED. The infrared LED can have a diameter of approximately 3.0 millimeters (mm) (e.g., 2.0 mm or greater, 2.5 mm or greater, 3.5 mm or greater). Infrared break beam sensors can be used to detect object presence and object motion. An infrared emitter sends out a beam of human-invisible infrared light. A receiver, such as a photodiode, that is sensitive to the infrared light is positioned across the workspace from the emitter. When the block 110 passes between the emitter and the receiver, and the object is not transparent to infrared, the beam is broken and the receiver detects the interruption. An array of break beam sensors can be used to detect and localize objects. The array of break beam sensors can be used to detect motion of objects, to determine speed of object motion, and to determine two-dimensional and three-dimensional direction of object motion. - The
build plate system 100 includes one or more plate sensors 106. A plate sensor 106 is a sensor that detects the presence of the block 110 on the plate 104. The plate sensor 106 can output data indicating a size, shape, location, weight, or any combination of these to the controller 210. The plate sensors 106 can include, for example, pressure sensors, weight sensors, proximity sensors, load cells, contact sensors, capacitive sensors, or any combination of these. - Each plate sensor can be configured to detect objects resting on the plate surface within a proper subset of the area of the plate surface. When a toy piece, e.g., block 110, is placed on the
surface 112, the plate sensors 106 can detect the presence of the toy piece on the surface 112. Plate sensor data 236 can be output to the controller 210. Plate sensor data 236 output by a particular plate sensor can include data indicating a location of the plate sensor that detected the object, a size of the object, a shape of the object, an orientation of the object, a start time of object detection by the plate sensor, a duration of object detection by the plate sensor, a weight of the object, or any combination of these. - The
plate sensors 106 can be arranged in an array. The build plate system 100 can include any appropriate number of plate sensors 106. The resolution of the build plate system 100 can depend on the number of plate sensors. For example, a greater number of plate sensors 106 results in a higher resolution, while a lesser number of plate sensors 106 results in a lower resolution. A build plate system having a higher resolution can determine more precise locations and sizes of objects resting on the plate 104, compared to a build plate system having a lower resolution. - The
controller 210 includes a movement tracker 212 and a placement tracker 214. Using the break beam sensor data 234 from the break beam sensors 204, the movement tracker 212 can track object paths within the workspace 108. For break beam sensor data 234 generated at a particular time, the movement tracker 212 can determine a coordinate location of the block 110 within the workspace 108 at the particular time. The coordinate location can be a three-dimensional coordinate location within the workspace 108 where the block 110 is detected. In some examples, the coordinate location can be a location of an estimated center of the block 110 at the particular time. - In some examples, the
movement tracker 212 can determine a size of the block 110 based on a number of break beam sensors 204 that are simultaneously interrupted at a given time. In some examples, the movement tracker 212 can determine a shape of the block 110, a location of an estimated center of the block 110, or both, using the break beam sensor data 234. The movement tracker 212 can determine the shape of the block 110 and/or the location of the center of the block 110 based on the number of break beam sensors 204 that are simultaneously interrupted at a given time, the array addresses of the break beam sensors 204 that are simultaneously interrupted at the given time, or both. - The
movement tracker 212 can determine a path of the block 110 moving through the workspace 108. The path can include a trajectory of the block 110, a speed of the block 110, a time of travel of the block 110, or any combination of these. In some examples, the trajectory of the block 110 includes a series of coordinate locations of the block 110 in the workspace 108, with each coordinate location being associated with a time of detection.
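A trajectory represented as timestamped coordinate locations, as described above, yields distance, travel time, and speed directly. The sample format below is an illustrative assumption:

```python
import math

def path_summary(trajectory):
    """trajectory: list of (t, (x, y, z)) samples, ordered by time t.
    Return (total distance, travel time, mean speed) -- the 'path'
    quantities the movement tracker 212 is described as determining."""
    distance = 0.0
    for (_, p0), (_, p1) in zip(trajectory, trajectory[1:]):
        distance += math.dist(p0, p1)
    travel_time = trajectory[-1][0] - trajectory[0][0]
    speed = distance / travel_time if travel_time else 0.0
    return distance, travel_time, speed
```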
Using the plate sensor data 236 from the plate sensors 106, the placement tracker 214 can track placement of objects on the surface 112. For plate sensor data 236 generated at a particular time, the placement tracker 214 can determine a coordinate location of the block on the plate 104. The coordinate location can be a two-dimensional coordinate location on the plate 104 where the block 110 is detected. In some examples, the coordinate location can be a location of an estimated center of the block 110 at the particular time. - In some examples, the
placement tracker 214 can determine a size of the block 110 based on a number of plate sensors 106 that detect the block 110. In some examples, the placement tracker 214 can determine a shape of the block 110, a location of an estimated center of the block 110, an orientation of the block 110, or any combination of these, using the plate sensor data 236. The placement tracker 214 can determine the shape, location, and orientation of the block 110 based on the number of plate sensors 106 that detect the block, the array addresses of the plate sensors 106 that detect the block, or both. - In an example scenario, using the break
beam sensor data 234 and the plate sensor data 236, the controller 210 can track motion of a user's hand 140 moving through the workspace 108 to place the block 110 at a particular location on the plate 104. The controller 210 can determine a time duration of movement of the hand 140 within the workspace 108, a speed of the hand 140 moving through the workspace 108, a path that the hand 140 travels through the workspace 108, and a coordinate address of the particular location on the plate 104. - In some examples, the
build plate system 100 includes a camera 206. The camera 206 can capture images of the workspace 108. The camera 206 can be activated when the user (or the user's parents or guardians) consent to having the camera capture user interaction with the build plate system 100 and toy pieces. - The
controller 210 can use camera image data 238 captured by the camera 206 to track movement and placement of objects within the workspace 108. In some examples, the controller 210 can overlay camera image data 238 captured by the camera 206 with break beam sensor data 234 from the break beam sensors 204, with plate sensor data 236 from the plate sensors 106, or both. Camera image data 238 captured by the camera 206 can be used to verify and/or validate trajectories and placement of objects determined by the controller 210. - The
build plate system 100 includes a memory 220. In some examples, the memory 220 can store calibration data. Calibration data can include data associating break beam sensor data with location and movement patterns of objects within the workspace 108. Calibration data can include data associating plate sensor data 236 with locations, shapes, and sizes of objects placed on the plate 104. - In some examples, the
memory 220 can store a building plan 202. In some examples, the building plan 202 can be loaded into the memory 220. The building plan 202 can include a plan for a construction to be built on the plate 104. The building plan 202 can include an arrangement of blocks. The arrangement can include a number of blocks to be placed on the plate 104, a location for each block on the plate 104, a type of block to be placed at each location, an orientation for each block on the plate 104, a number of blocks to be placed at each location of the plate 104, or any combination of these. - In some examples, the
building plan 202 includes a sequence of actions to be performed by the user within the workspace 108. In some examples, the building plan 202 includes a sequence of block movement within the workspace 108. In some examples, the building plan 202 includes a sequence of block placement on the plate 104. The building plan 202 can include an expected time duration for building the construction, an expected time duration for placing each block on the plate 104, an expected speed of the user's hand 140 through the workspace 108, or any of these. - The
controller 210 can access the building plan 202 and can compare object movement within the workspace 108, and object placement on the plate 104, to the building plan 202. For example, the controller 210 can compare a sequence of object movement within the workspace 108 to a sequence of object movement specified by the building plan 202. The controller 210 can compare a placement of a block on the plate 104 to a placement specified by the building plan 202. The controller 210 can determine whether detected movement and placement of objects satisfies similarity criteria for matching the movement and placement of objects specified by the building plan 202. - The
controller 210 can determine target motion patterns of the block 110 using the building plan 202. The controller 210 can then determine whether motion of an object, e.g., a trajectory of the block 110, satisfies similarity criteria for matching the target motion of the block 110 specified by the building plan 202. In response to determining that the motion of the block 110 satisfies the similarity criteria, the controller 210 can determine that the motion of the block 110 matches the building plan 202. In response to determining that the motion of the block 110 does not satisfy the similarity criteria, the controller 210 can determine that the motion of the block 110 does not match the building plan 202. - The
controller 210 can determine a target placement of the block 110 using the building plan 202. The controller 210 can determine whether placement of the block 110 on the plate 104 satisfies similarity criteria for matching the target placement of the block 110 specified by the building plan 202. Similarity criteria can include, for example, a threshold distance between the placement of the block 110 on the plate 104 and the target placement specified by the building plan 202. In response to determining that the placement of the block 110 satisfies the similarity criteria, the controller 210 can determine that the block 110 placement matches the building plan 202. In response to determining that the placement of the block 110 does not satisfy the similarity criteria, the controller 210 can determine that the placement of the block 110 does not match the building plan 202.
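The threshold-distance form of similarity criteria described above reduces to a one-line check. The Euclidean metric and default threshold are illustrative choices; the specification names only "a threshold distance":

```python
import math

def placement_matches(actual, target, threshold=0.5):
    """Return True if the placed position is within `threshold` distance
    of the building plan's target position -- one simple instance of the
    similarity criteria described in the text."""
    return math.dist(actual, target) <= threshold
```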
In some examples, the controller 210 can perform actions based on the tracked object motion. For example, the controller 210 may determine that the motion and/or placement of the block 110 is inaccurate based on determining that the motion and/or placement does not satisfy criteria for matching the building plan 202. In response to determining that the motion and/or placement of the block 110 does not match the building plan 202, the controller 210 can determine that the user requires assistance and can determine to perform an action to provide feedback and/or assistance to the user. In some examples, the controller 210 can determine that the user requires assistance based on determining that a placement of the block 110 differs from a target placement of the block 110, based on the user's hand 140 moving at a speed that is slower than a target speed within the workspace, based on the user's hand being within the workspace for longer than an expected time duration, based on the user moving a block repeatedly between multiple locations of the surface 112, and/or based on other detected object or user movements. - To guide and assist the user, the
controller 210 can perform one or more actions. An example action includes providing feedback to the user by activating a signal device 224. The signal device 224 can include, for example, a visual alarm, a light, an audible alarm, a speaker, a laser pointer, or any combination of these. In some examples, the controller 210 can activate the signal device 224 by activating a visible signal, such as a light or laser pointer, that illuminates a location of the plate 104 where the block 110 should be placed, e.g., in accordance with the building plan 202. In some examples, the controller 210 can activate the signal device 224 by illuminating a light of a particular color. For example, a red light can indicate that the block 110 has been placed incorrectly or is on the wrong path, and a green light can indicate that the block 110 has been placed correctly or is on the correct path. - In some examples, the
controller 210 can activate the signal device 224 by broadcasting audible sound through a speaker. The sound can include, for example, an alert sound indicating that the block 110 has been placed incorrectly. In some examples, the sound can include verbal instructions. In some examples, the controller 210 can activate the signal device 224 by displaying visual instructions on a display coupled to the build plate system 100. The visual instructions can include, for example, textual or graphical instructions. The instructions can specify one or more actions to be performed by the user in order to place the block 110 correctly per the building plan 202. - The
controller 210 can operate the build plate system 100 in different operating modes, e.g., an easy mode, a medium mode, and a hard mode. The controller 210 can be configured to provide different levels of feedback and/or assistance in the different operating modes. For example, in the easy mode, the controller 210 can provide more assistance to the user than in the medium and hard modes. In the medium mode, the controller 210 can provide more assistance to the user than in the hard mode, but less assistance than in the easy mode. For example, in the easy mode, the controller 210 can provide more assistance by providing assistance more quickly after determining that the user needs assistance, by providing more specific assistance, or both. - In an example scenario, the
controller 210 may determine that the user placed the block 110 at an incorrect or inaccurate location of the surface 112. In the easy mode, the controller 210 can perform an action to assist the user, e.g., by illuminating a light under the correct location of the surface 112 after a one-second delay. In the medium mode, the controller 210 can perform an action by illuminating a light supported by one of the structures 102 after a three-second delay. The light can indicate the incorrect location of the block 110 without revealing the correct location. In the hard mode, the controller 210 can perform an action by illuminating the light after a ten-second delay.
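The example scenario above maps each operating mode to a feedback delay and a level of specificity. A minimal sketch of that mapping, with key names invented for illustration:

```python
# Illustrative policy table following the delays given in the text
# (easy: 1 s, medium: 3 s, hard: 10 s). In easy mode the correct
# location is revealed; in medium/hard only the error is indicated.
FEEDBACK_POLICY = {
    "easy":   {"delay_s": 1.0,  "reveal_correct_location": True},
    "medium": {"delay_s": 3.0,  "reveal_correct_location": False},
    "hard":   {"delay_s": 10.0, "reveal_correct_location": False},
}

def feedback_for(mode):
    """Look up the feedback behavior for an operating mode."""
    return FEEDBACK_POLICY[mode]
```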
The build plate system 100 can operate in the different operating modes, e.g., based on the building plan 202, based on user input, or both. For example, the build plate system 100 can provide a user interface for receiving user input indicating the operating mode. In some examples, the build plate system 100 can receive user input, through a user interface, specifying various settings of operation. For example, the build plate system can receive user input specifying a setting for a preferred type of signal device 224 to be used for providing user feedback and assistance. The setting for the preferred type of signal device 224 can indicate, for example, a user preference for visual guidance over audible guidance. - The
build plate system 100 can include a visualization generator 222. In some examples, the visualization generator 222 can generate a visualization of object paths and placement within the workspace. The visualization can include, for example, a heat map showing the path of the block 110 through the workspace 108 and the placement location of the block 110 on the surface 112 during a user session. The visualization can be presented on a display device. In some examples, the visualization can be presented in near-real time, can be stored for later viewing by a user, or both. In some examples, the visualization generator 222 can generate an aggregated visualization of object paths and placement within the workspace. The aggregated visualization can represent multiple user sessions by a same user, or multiple user sessions by multiple users. - The
memory 220 can store data generated from user sessions. For example, the memory 220 can store, for a user session, break beam sensor data 234, plate sensor data 236, camera image data 238, or any of these. In some examples, the memory 220 can store data indicating object movement paths determined by the movement tracker 212, and object placement determined by the placement tracker 214. -
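The heat map the visualization generator 222 produces can be built by aggregating the per-session positions the memory 220 stores into a visit-count grid. The grid representation below is an illustrative sketch:

```python
def heat_map(positions, shape):
    """Aggregate (row, col) path or placement positions -- from one user
    session or several -- into a 2-D visit-count grid, a minimal sketch
    of the heat-map visualization described in the text."""
    rows, cols = shape
    grid = [[0] * cols for _ in range(rows)]
    for r, c in positions:
        grid[r][c] += 1
    return grid
```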
FIGS. 3A and 3B illustrate tracking of an object by the example interactive build plate system 100. Referring to FIG. 3A, the user's hand 140 holds the block 110 and moves the block 110 through the workspace 108. The build plate system 100 can track movement of the hand 140 and the block 110 using the break beam sensors. For example, the build plate system 100 can determine a time of interruption of the beam 302 between the emitter 120 and the receiver 122. In some examples, the build plate system 100 can determine a three-dimensional coordinate location of the block 110 within the workspace 108 at a particular time based on the array addresses of break beam sensors that detected the block at the particular time. In some examples, the build plate system 100 can determine a size of the block 110, an orientation of the block 110, a shape of the block 110, a speed of movement of the block 110, a direction of movement of the block 110, or any combination of these based on break beam sensor data generated by the break beam sensors while the block 110 is in the workspace 108. - Referring to
FIG. 3B, the block 110 is placed on the surface 112 of the plate 104. The build plate system 100 can determine the location and orientation of the block 110 based on plate sensor data 236 generated by the plate sensors 106. In some examples, the build plate system 100 can determine a size, shape, and weight of the block 110 based on the plate sensor data 236. -
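Determining a block's location and size from the plate sensors, as described above, can be sketched as a footprint computation over the addresses of the triggered sensors. The uniform-grid model and `pitch` parameter are illustrative assumptions:

```python
def footprint(triggered_cells, pitch=1.0):
    """triggered_cells: set of (row, col) plate-sensor array addresses
    that detect an object. Return an estimated center and bounding-box
    size on the plate, assuming sensors on a uniform grid with spacing
    `pitch`. Illustrative; the patent does not fix a specific method."""
    rows = [r for r, _ in triggered_cells]
    cols = [c for _, c in triggered_cells]
    center = (sum(cols) / len(cols) * pitch, sum(rows) / len(rows) * pitch)
    size = ((max(cols) - min(cols) + 1) * pitch,
            (max(rows) - min(rows) + 1) * pitch)
    return center, size
```

More triggered cells per object (a denser sensor array) sharpens both estimates, matching the resolution discussion earlier.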
FIGS. 4A and 4B illustrate tracking of multiple objects by the example interactive build plate system 100. Referring to FIG. 4A, blocks 110, 410 are placed side-by-side on the plate 104. The generated plate sensor data 236 can indicate array addresses of the plate sensors 106 that detect the presence of the blocks 110, 410. - The
build plate system 100 can determine, based on the plate sensor data 236, a number of objects resting on the plate 104, a size of each object, a weight of each object, an orientation of each object, a shape of each object, or any combination of these. The build plate system 100 can determine, based on the plate sensor data 236, that two blocks are resting on the plate 104. - The
build plate system 100 can determine an accuracy of block placement by comparing the placement of the blocks 110, 410 to target placements of the blocks specified by the building plan 202. Based on determining that the placement of the blocks 110, 410 does not satisfy similarity criteria for matching the target placements, the build plate system 100 can perform an action to provide feedback and assist the user in correct placement of the blocks 110, 410. - The
build plate system 100 can track individual objects moving between the workspace 108 and the surface 112. For example, a user may place the block 110 in a first location on the surface 112, place the block 410 in a second location on the surface 112, remove the block 110 from the first location, and place the block 110 in a third location. When a particular block transitions between the workspace 108 and the surface 112, the build plate system 100 can compare the break beam sensor data 234 with the plate sensor data 236 to determine which block, e.g., block 110 or 410, is being moved. - In some examples, the
build plate system 100 can generate an identifier for each object that enters the workspace 108. For example, when each block 110, 410 enters the workspace 108, the build plate system 100 can generate an identifier for each block. The identifier can be stored in the memory 220 with data indicating the location of the respective block and associated characteristics of the respective block. The associated characteristics can be determined using the break beam sensor data 234, the plate sensor data 236, the camera image data 238, or any of these. Characteristics can include, for example, a size, shape, weight, and/or color of the block. When the build plate system 100 detects movement of one of the blocks, the build plate system 100 can determine the identifier of the block that is moving. The build plate system 100 can determine the identifier of the block that is moving based on characteristics of the moving block, based on the starting location of the moving block, or both. The build plate system 100 can then track movement of the identified block from its starting location to an ending location. The build plate system 100 can thus track location and movement of individual objects by storing identifiers, locations, and/or characteristics of each object within the memory 220.
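The identifier scheme described above (assign an ID on entry, match a moving object back to its ID by starting location) can be sketched as a small registry. The class and method names are illustrative, not part of the specification:

```python
import itertools

class ObjectRegistry:
    """Assigns an identifier to each object entering the workspace and
    matches a moving object back to its identifier by starting location
    -- a simplified sketch of the tracking described in the text."""
    def __init__(self):
        self._ids = itertools.count(1)
        self.objects = {}  # id -> {"location": ..., plus characteristics}

    def register(self, location, **characteristics):
        """Record a new object; return its generated identifier."""
        oid = next(self._ids)
        self.objects[oid] = {"location": location, **characteristics}
        return oid

    def identify_by_location(self, location):
        """Return the identifier of the object stored at `location`."""
        for oid, attrs in self.objects.items():
            if attrs["location"] == location:
                return oid
        return None

    def move(self, oid, new_location):
        """Update an identified object's stored location."""
        self.objects[oid]["location"] = new_location
```

A fuller version might also match on stored characteristics (size, weight, color), as the text suggests.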
Referring to FIG. 4B, block 420 is stacked on top of block 110 on the plate 104. The build plate system 100 can determine, based on the plate sensor data 236, a number of objects placed in a same location, e.g., due to being stacked. For example, the plate sensor data 236 can indicate a first increase in weight detected by a first plate sensor at a first time. The plate sensor data 236 can indicate a second increase in weight detected by the first plate sensor at a second time after the first time. Based on the first increase in weight, followed by the second increase in weight, the build plate system 100 can determine that two objects are stacked at a location of the plate 104 corresponding to the first plate sensor. - In some examples, objects can be detected by break beam sensors while resting on the
plate 104. For example, the block 420, stacked on top of the block 110, breaks the beam 302. The block 420 is therefore detected by both the break beam sensors 204 and the plate sensors 106 while stacked on the block 110. The build plate system 100 can overlay the break beam sensor data 234 and the plate sensor data 236 to determine a precise location and other attributes of the block 420. For example, using the break beam sensor data 234 and the plate sensor data 236, the build plate system 100 can determine a three-dimensional coordinate location of the block 420, a height of the block 420 when stacked on the block 110, a shape of the block 420, a size of the block 420, a weight of the block 420, a three-dimensional orientation of the block 420, or any of these. Information about the block 420 determined from both the break beam sensor data 234 and the plate sensor data 236 can be more precise and accurate than information about the block 420 determined from only the break beam sensor data 234 or only the plate sensor data 236. -
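The two inferences above — counting stacked objects from successive weight changes at one plate sensor, and estimating stack height from which beams a resting stack interrupts — can be sketched as follows. Both functions are illustrative heuristics under assumed data formats:

```python
def stacked_count(weight_deltas):
    """weight_deltas: chronological weight changes at one plate sensor.
    Each increase is a placement, each decrease a removal; the net count
    is the number of objects currently stacked at that location."""
    return sum(1 if d > 0 else -1 for d in weight_deltas if d != 0)

def stack_top_height(broken_beam_heights, plate_z=0.0):
    """Estimate the top of a resting stack as the highest z of any beam
    it interrupts -- a rough overlay of break beam data on plate data."""
    return max(broken_beam_heights, default=plate_z)
```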
FIG. 5 is a flow diagram of an example process 500 for tracking object movement within a three-dimensional workspace. - The
process 500 includes receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors (502). The plurality of break beam sensors is configured to detect objects within the workspace. For example, the array of break beam sensors 205 can detect and track objects moving within the workspace 108. - The
process 500 includes receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors (504). The plurality of plate sensors is configured to detect objects resting on a surface that defines a floor of the workspace. For example, the plate sensors 106 can detect the block 110 resting on the surface 112 that defines the floor of the workspace 108. - The
process 500 includes determining, based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest on the surface (506). For example, the controller 210 can determine, based on the break beam sensor data 234 and the plate sensor data 236, that the block 110 passed through the workspace 108 to rest on the surface 112. The controller 210 can determine a trajectory of the block 110 through the workspace 108, and a location of placement of the block 110 on the surface 112. -
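Steps 502-506 can be sketched as a small decision function. This is an assumption-laden illustration, not the claimed implementation: the event tuple formats, the `max_gap` threshold, and the simple time-ordering heuristic are all hypothetical choices made for the example.

```python
def track_placement(beam_events, plate_events, max_gap=0.2):
    """Combine break beam and plate sensor data to decide whether an
    object passed through the workspace to rest on the surface.

    beam_events:  list of (time, (row, layer)) break beam detections.
    plate_events: list of (time, (x, y)) plate sensor detections.
    Returns (placed, trajectory, resting_cell), where trajectory is the
    sequence of beam addresses broken before the plate detection.
    """
    beams = sorted(beam_events)
    for t_plate, cell in sorted(plate_events):
        prior = [(t, addr) for t, addr in beams if t <= t_plate]
        # The object "passed through to rest" only if a beam was broken
        # shortly before the plate registered the new weight.
        if prior and t_plate - prior[-1][0] <= max_gap:
            return True, [addr for _, addr in prior], cell
    return False, [], None
```

A falling block that breaks beam layers 3, 2, and 1 in sequence and then triggers a plate sensor within `max_gap` seconds would yield `placed=True`, with the trajectory giving the path through the workspace and the cell giving the location of placement.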
FIG. 6 shows an example of a computing device 600 and a mobile computing device 650 that can be used to implement the techniques described here. The computing device 600 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The mobile computing device 650 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart-phones, and other similar computing devices. The components shown here, their connections and relationships, and their functions, are meant to be examples only, and are not meant to be limiting. - The
computing device 600 includes a processor 602, a memory 604, a storage device 606, a high-speed interface 608 connecting to the memory 604 and multiple high-speed expansion ports 610, and a low-speed interface 612 connecting to a low-speed expansion port 614 and the storage device 606. Each of the processor 602, the memory 604, the storage device 606, the high-speed interface 608, the high-speed expansion ports 610, and the low-speed interface 612 are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 602 can process instructions for execution within the computing device 600, including instructions stored in the memory 604 or on the storage device 606 to display graphical information for a GUI on an external input/output device, such as a display 616 coupled to the high-speed interface 608. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system). - The
memory 604 stores information within the computing device 600. In some implementations, the memory 604 is a volatile memory unit or units. In some implementations, the memory 604 is a non-volatile memory unit or units. The memory 604 may also be another form of computer-readable medium, such as a magnetic or optical disk. - The
storage device 606 is capable of providing mass storage for the computing device 600. In some implementations, the storage device 606 may be or contain a computer-readable medium, such as a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. Instructions can be stored in an information carrier. The instructions, when executed by one or more processing devices (for example, processor 602), perform one or more methods, such as those described above. The instructions can also be stored by one or more storage devices such as computer- or machine-readable mediums (for example, the memory 604, the storage device 606, or memory on the processor 602). - The high-
speed interface 608 manages bandwidth-intensive operations for the computing device 600, while the low-speed interface 612 manages lower bandwidth-intensive operations. Such allocation of functions is an example only. In some implementations, the high-speed interface 608 is coupled to the memory 604, the display 616 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 610, which may accept various expansion cards (not shown). In the implementation, the low-speed interface 612 is coupled to the storage device 606 and the low-speed expansion port 614. The low-speed expansion port 614, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter. - The
computing device 600 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 620, or multiple times in a group of such servers. In addition, it may be implemented in a personal computer such as a laptop computer 622. It may also be implemented as part of a rack server system 624. Alternatively, components from the computing device 600 may be combined with other components in a mobile device (not shown), such as a mobile computing device 650. Each of such devices may contain one or more of the computing device 600 and the mobile computing device 650, and an entire system may be made up of multiple computing devices communicating with each other. - The
mobile computing device 650 includes a processor 652, a memory 664, an input/output device such as a display 654, a communication interface 666, and a transceiver 668, among other components. The mobile computing device 650 may also be provided with a storage device, such as a micro-drive or other device, to provide additional storage. Each of the processor 652, the memory 664, the display 654, the communication interface 666, and the transceiver 668 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate. - The
processor 652 can execute instructions within the mobile computing device 650, including instructions stored in the memory 664. The processor 652 may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor 652 may provide, for example, for coordination of the other components of the mobile computing device 650, such as control of user interfaces, applications run by the mobile computing device 650, and wireless communication by the mobile computing device 650. - The
processor 652 may communicate with a user through a control interface 658 and a display interface 656 coupled to the display 654. The display 654 may be, for example, a TFT (Thin-Film-Transistor Liquid Crystal Display) display or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology. The display interface 656 may comprise appropriate circuitry for driving the display 654 to present graphical and other information to a user. The control interface 658 may receive commands from a user and convert them for submission to the processor 652. In addition, an external interface 662 may provide communication with the processor 652, so as to enable near area communication of the mobile computing device 650 with other devices. The external interface 662 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used. - The
memory 664 stores information within the mobile computing device 650. The memory 664 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units. An expansion memory 674 may also be provided and connected to the mobile computing device 650 through an expansion interface 672, which may include, for example, a SIMM (Single In Line Memory Module) card interface. The expansion memory 674 may provide extra storage space for the mobile computing device 650, or may also store applications or other information for the mobile computing device 650. Specifically, the expansion memory 674 may include instructions to carry out or supplement the processes described above, and may include secure information also. Thus, for example, the expansion memory 674 may be provided as a security module for the mobile computing device 650, and may be programmed with instructions that permit secure use of the mobile computing device 650. In addition, secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner. - The memory may include, for example, flash memory and/or NVRAM memory (non-volatile random access memory), as discussed below. In some implementations, instructions are stored in an information carrier. The instructions, when executed by one or more processing devices (for example, processor 652), perform one or more methods, such as those described above. The instructions can also be stored by one or more storage devices, such as one or more computer- or machine-readable mediums (for example, the
memory 664, the expansion memory 674, or memory on the processor 652). In some implementations, the instructions can be received in a propagated signal, for example, over the transceiver 668 or the external interface 662. - The
mobile computing device 650 may communicate wirelessly through the communication interface 666, which may include digital signal processing circuitry where necessary. The communication interface 666 may provide for communications under various modes or protocols, such as GSM voice calls (Global System for Mobile communications), SMS (Short Message Service), EMS (Enhanced Messaging Service), or MMS messaging (Multimedia Messaging Service), CDMA (code division multiple access), TDMA (time division multiple access), PDC (Personal Digital Cellular), WCDMA (Wideband Code Division Multiple Access), CDMA2000, or GPRS (General Packet Radio Service), among others. Such communication may occur, for example, through the transceiver 668 using a radio frequency. In addition, short-range communication may occur, such as using a Bluetooth, WiFi, or other such transceiver (not shown). In addition, a GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to the mobile computing device 650, which may be used as appropriate by applications running on the mobile computing device 650. - The
mobile computing device 650 may also communicate audibly using an audio codec 660, which may receive spoken information from a user and convert it to usable digital information. The audio codec 660 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of the mobile computing device 650. Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.), and may also include sound generated by applications operating on the mobile computing device 650. - The
mobile computing device 650 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 680. It may also be implemented as part of a smart-phone 682, personal digital assistant, or other similar mobile device. - Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
- These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms machine-readable medium and computer-readable medium refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term machine-readable signal refers to any signal used to provide machine instructions and/or data to a programmable processor.
- To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
- The systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), and the Internet.
- The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- Although a few implementations have been described in detail above, other modifications are possible. For example, while a client application is described as accessing the delegate(s), in other implementations the delegate(s) may be employed by other applications implemented by one or more processors, such as an application executing on one or more servers. In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other actions may be provided, or actions may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other implementations are within the scope of the following claims. What is claimed is:
Claims (20)
1. A system comprising:
a plate having a surface defining a floor of a three-dimensional workspace, wherein the plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces;
a plurality of plate sensors configured to detect objects resting on the surface of the plate;
a plurality of break beam sensors configured to detect objects within the workspace; and
a controller configured to perform operations comprising:
receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors,
receiving plate sensor data indicating object detection by at least one plate sensor of the plurality of plate sensors; and
determining, based on the break beam sensor data and the plate sensor data, that a toy piece passed through the workspace to rest on the surface.
2. The system of claim 1, comprising a plurality of support structures, each support structure extending from an edge of the plate in a non-parallel direction to a plane of the surface, wherein:
each break beam sensor comprises:
an emitter configured to emit electromagnetic radiation; and
a receiver configured to receive electromagnetic radiation emitted by the emitter, and
emitters and receivers of the plurality of break beam sensors are supported by the plurality of support structures.
3. The system of claim 1, wherein each break beam sensor of the plurality of break beam sensors comprises:
an emitter supported by a first support structure coupled to the plate; and
a receiver supported by a second support structure coupled to the plate, wherein electromagnetic energy traveling from the emitter to the receiver passes through the workspace.
4. The system of claim 1, wherein the plurality of break beam sensors are arranged in an array, the break beam sensor data comprising data indicating an array address of the at least one break beam sensor.
5. The system of claim 4, wherein receiving break beam sensor data indicating object detection by at least one break beam sensor of the plurality of break beam sensors comprises:
receiving break beam sensor data indicating simultaneous object detection by two or more break beam sensors at a first time, the break beam sensor data comprising data indicating an array address of the two or more break beam sensors; and
based on the break beam sensor data, determining a three-dimensional coordinate location of the toy piece within the three-dimensional workspace at the first time.
6. The system of claim 1, wherein the break beam sensor data comprises data indicating a time of object detection.
7. The system of claim 1, wherein the break beam sensor data includes data indicating a sequence of detections, the operations comprising determining, based on the break beam sensor data, a path traveled through the workspace by the toy piece.
8. The system of claim 1, wherein each plate sensor of the plurality of plate sensors is configured to detect objects resting on the surface within a respective proper subset of an area of the surface, the operations comprising:
determining, based on the plate sensor data, a location of the toy piece on the surface, the location comprising a particular proper subset of the area of the surface.
9. The system of claim 1, the operations comprising:
comparing a position of the toy piece on the surface to a target position of the toy piece on the surface; and
in response to determining that the position of the toy piece does not satisfy similarity criteria for matching the target position, performing one or more actions.
10. The system of claim 9, wherein the one or more actions comprise at least one of:
activating a visual alarm;
activating an audible alarm;
outputting visual instructions; or
outputting audible instructions.
11. The system of claim 1, wherein the plurality of plate sensors are integrated with the plate and arranged in an array.
12. The system of claim 1, the operations comprising determining, using the break beam sensor data and the plate sensor data, at least one of a size of the toy piece or a shape of the toy piece.
13. The system of claim 1, the operations comprising:
generating a visualization showing:
a path of the toy piece through the workspace; and
a placement of the toy piece on the surface; and
providing the visualization for presentation on a display.
14. The system of claim 1, wherein the plurality of break beam sensors comprise a plurality of infrared break beam sensors.
15. The system of claim 1, wherein the plurality of plate sensors comprises a plurality of weight sensors, proximity sensors, or contact sensors.
16. A computer-implemented method for tracking object movement within a three-dimensional workspace, the method comprising:
receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors, wherein the plurality of break beam sensors is configured to detect objects within the workspace;
receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors, wherein the plurality of plate sensors is configured to detect objects resting on a surface of a plate that defines a floor of the workspace;
determining, by the controller and based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest at a position on the surface;
comparing, by the controller, the position of the object on the surface to a target position of the object on the surface; and
in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing, by the controller, one or more actions.
17. The method of claim 16, wherein the one or more actions comprise at least one of:
activating a visual alarm;
activating an audible alarm;
outputting visual instructions; or
outputting audible instructions.
18. The method of claim 16, comprising:
obtaining data indicating a plan for a construction to be built on the plate; and
determining the target position of the object on the surface using the obtained data.
19. The method of claim 16, wherein the plate includes toy piece coupling elements and is configured to be used with releasably coupleable toy pieces.
20. A non-transitory computer storage medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations for tracking object movement within a three-dimensional workspace, the operations comprising:
receiving, by a controller, break beam sensor data indicating object detection by at least one break beam sensor of a plurality of break beam sensors, wherein the plurality of break beam sensors is configured to detect objects within the workspace;
receiving, by the controller, plate sensor data indicating object detection by at least one plate sensor of a plurality of plate sensors, wherein the plurality of plate sensors is configured to detect objects resting on a surface of a plate that defines a floor of the workspace;
determining, by the controller and based on the break beam sensor data and the plate sensor data, that an object passed through the workspace to rest at a position on the surface;
comparing, by the controller, the position of the object on the surface to a target position of the object on the surface; and
in response to determining that the position of the object does not satisfy similarity criteria for matching the target position, performing, by the controller, one or more actions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/846,407 US20230415051A1 (en) | 2022-06-22 | 2022-06-22 | Interactive build plate |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230415051A1 (en) | 2023-12-28 |
Family
ID=89324029
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/846,407 Pending US20230415051A1 (en) | 2022-06-22 | 2022-06-22 | Interactive build plate |
Country Status (1)
Country | Link |
---|---|
US (1) | US20230415051A1 (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: X DEVELOPMENT LLC, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KHAMIS, SAMUEL;MISKOVIC, VLADIMIR;STRATTON, KATIE B.;SIGNING DATES FROM 20220617 TO 20220621;REEL/FRAME:060279/0937 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |